Comment Re:Opensource and MPL? (Score 1) 140

| It's pretty hypocritical to criticize a license for requiring that redistribution of the source of that code or derivatives must be under the same license and then turn around and recommend everyone use the GPL instead.

This this this! Finally - someone points out the elephant in the room, the pot calling the kettle black. For once I'd like the GPL community to admit this. The tactics and business models supported by GPL-style licenses are why I never have, and never will, release my software under one of their licenses. It's either free - or it's not.

Either give it away free like you intended, or follow the GPL rabbit hole down and just guess what GPL 4, 5 and 6 will look like...

Comment Re:How is this quantifiable in any stretch? (Score 4, Interesting) 132

| User/Device security is no more or less "secure" than it was back in 1995,

I disagree. The amount of compute available rises dramatically each year (Moore's law), so it is not good enough to simply tread water, and just upping the key length is not sufficient. New techniques and systems are constantly being built to attack these methods. While I'm not saying SSH is bad or outdated, I am saying that cryptanalysis and raw compute have not stopped chipping away at the corners and weak spots. What if, at 51200-bit security, an awful and damning pattern appears in the math? We still cannot prove that any of today's cryptographic methods couldn't be broken wide open by a mathematical discovery tomorrow (even though we're pretty sure they can't).
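To put some rough numbers on the compute-growth point, here's a back-of-the-envelope sketch. The guess rate and doubling period are made-up assumptions for illustration, not measurements of any real attacker:

```python
# Rough sketch: expected years to brute-force an n-bit keyspace if attacker
# throughput doubles every two years (a Moore's-law-style assumption).
def years_to_brute_force(key_bits, guesses_per_sec=1e12, doubling_years=2.0,
                         horizon_years=0):
    # Effective throughput after `horizon_years` of hardware growth.
    rate = guesses_per_sec * 2 ** (horizon_years / doubling_years)
    seconds = 2 ** (key_bits - 1) / rate   # expect to try half the keyspace
    return seconds / (365.25 * 24 * 3600)

# A 56-bit key (old DES) falls in hours at these rates; a 128-bit key does
# not, even after granting decades of compute growth.
print(years_to_brute_force(56))                    # well under a year
print(years_to_brute_force(128))                   # astronomically large
print(years_to_brute_force(128, horizon_years=60)) # still astronomically large
```

The point isn't the exact figures - it's that exponent growth in the attacker's hardware only shaves a constant number of bits per doubling, which is exactly why the real risk is a mathematical break, not raw compute.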

We mustn't fall into the trap of thinking that what is good enough today is good forever. Having as many irons in the fire as possible, being tested and competing, is the best way to protect yourself today and tomorrow.

Comment Security getting worse (Score 5, Interesting) 132

I would largely agree. Unfortunately, I believe it is because real security - cryptography and end-to-end security and privacy - is very difficult, and hence very expensive, to develop, implement, and test. My experience with such coding is that it's every bit as rigorous as code written for medical devices or flight control software, if not more so. It simply has to be bulletproof. Any one hole in the theory, algorithm, or implementation and the whole thing comes apart. Learning about all those possible holes and plugging them is a herculean task. One can point to the near-constant stream of security patches for every browser, app, and OS on the market. And these are the best-funded commercial enterprises around.

Another huge problem is the 'meh' attitude people have towards their personal information. We throw our data around willy-nilly on smartphones and social networks. We check in at places that tell everyone where we are (or are not: http://pleaserobme.com/ ), publicly publish our most intimate family and friend relationships, report where we live and work, and even identify people to image recognition software. One expert I heard said that he could not imagine a more dastardly personal-information monitoring system than Facebook. And we WILLINGLY give that information away. Google reads your emails and all the documents you upload to their 'free' services. Websites use everything they can to target ads at you, etc.

The unfortunate part, as my CS security professor pointed out, is that by the time it crosses an ethical line, it's nearly impossible to stop. Even worse, what if the company you gave all that info to gets sold to a very unscrupulous person in a country with no protections? What if your government is taken over and they raid these databases for information about dissenters? All of these things are real and happen today, and yet we consider it more important to brag to our friends and family about what we had for dinner last night than to protect ourselves.

Comment Re:You get what you pay/wait for (Score 1) 491

| In my experience, Agile projects tend to run longer than they would have under waterfall, but the end product is usually closer to what the customer needs

Great analysis - I would agree.

The other pitfall I've seen is that 'agile' has become a code word for "you're about to give your life and soul to this startup". I think a fair number of places claim to be running an Agile process, but in reality use it as cover for busting your hump as hard as they can. And run as fast as you can if you ever see an ad looking to hire you for "a fast-paced, dynamic, agile environment".

Comment Re:SSD? (Score 2) 292

This has also dramatically changed. Yes, early drives suffered from a number of problems - many of which have since been solved or greatly reduced. The biggest difference with SSDs is that their memory cells only allow a finite number of writes before they wear out, much like USB sticks. The problems came from both firmware bugs in early designs and the inherent endurance limits of the memory.

Here are the salient hardware points:
-Intel SSD drives now carry a 3-year warranty, now as good as any platter drive's (sadly).
-Drives have built-in error correction and failure tolerance, with replacement blocks if a cell dies.
-Newer processing techniques have greatly improved reliability.
-Many fixes for early firmware bugs, such as wear-leveling that prevents certain cells from getting hammered while the rest of the drive sits unused.
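For illustration, here's a toy sketch of the idea behind wear-leveling - not any vendor's actual firmware, just the basic principle of steering each write to the least-worn cell instead of hammering one spot:

```python
# Toy wear-leveling model: compare "always write the same cell" against
# "redirect each write to whichever cell has worn the least".
class ToyFlash:
    def __init__(self, num_blocks):
        self.erase_counts = [0] * num_blocks  # wear per physical cell

    def write_naive(self, logical_block):
        # No leveling: a hot logical block always lands on the same cell.
        self.erase_counts[logical_block] += 1

    def write_leveled(self, logical_block):
        # Leveling: remap the write to the least-worn physical cell.
        target = self.erase_counts.index(min(self.erase_counts))
        self.erase_counts[target] += 1

naive, leveled = ToyFlash(8), ToyFlash(8)
for _ in range(800):          # hammer logical block 0 over and over
    naive.write_naive(0)
    leveled.write_leveled(0)

print(max(naive.erase_counts))    # 800 -- one cell takes all the wear
print(max(leveled.erase_counts))  # 100 -- wear spread evenly over 8 cells
```

Real firmware also has to track the logical-to-physical remapping and handle erase-block granularity, but this is the core reason a hot file no longer kills a single spot on the drive.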

Some of the software points:
-OSes have long been optimized for slow platter performance. Newer OSes (Win7 is one) detect the use of an SSD and turn off the features that actually hurt the drive: disk defragmenting is disabled, and the system is arranged so that caches and frequently written data live on a platter drive while the OS and the programs you mostly read sit on the SSD.

There are many good articles and write-ups on these topics. I finally made the jump and would never look back. I just got a 180GB Intel SSD for $129, and it's the single most transformative upgrade to my computing experience since the Core line of processors was released. The sheer speed your machine runs at - apps start instantly and the system boots in a few seconds - is just mind-boggling. You start feeling the power of modern processors like you never have before.

Comment Why so much arguing and no data? (Score 1) 256

Tom's Hardware and other sites have done extensive tests of these hybrid drives, with a very mixed bag of results. Usually they are still an order of magnitude slower than SSDs, and they only achieve their near-SSD performance after being 'trained' for a while on your data.
http://www.tomshardware.com/reviews/momentus-xt-750gb-review,3223.html

And for all the complaining on here about SSD failure rates, wouldn't the lifetime of the solid-state memory in a hybrid drive be even worse, because it's a much smaller block of memory and therefore must swap much more data in and out relative to its capacity? What's the point of a hybrid drive if the caching part is likely to fail much faster than the SSD equivalent?
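To make that cache-wear argument concrete, a quick back-of-the-envelope comparison. The capacities and workload below are made-up numbers for illustration only:

```python
# How many full program/erase passes per day does the NAND absorb, for a
# given daily write volume? A smaller cache cycles proportionally faster.
def full_rewrites_per_day(writes_gb_per_day, nand_capacity_gb):
    return writes_gb_per_day / nand_capacity_gb

daily_writes = 20  # GB/day -- a hypothetical workload

ssd_cycles = full_rewrites_per_day(daily_writes, 180)   # 180 GB SSD
hybrid_cycles = full_rewrites_per_day(daily_writes, 8)  # 8 GB hybrid cache

# The small cache wears ~22x faster for the same workload.
print(round(hybrid_cycles / ssd_cycles, 1))  # 22.5
```

Same total writes, but the hybrid's small NAND pool has to absorb all of them, so each cell gets erased far more often.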

I finally made the jump to buying an SSD and have a 180GB Intel one coming from Amazon's super-sale last week. I'll be installing my OS on it and using the procedure that many others have used with great success. Maybe you can find some real solutions to some of your counter-arguments there:
http://www.sevenforums.com/tutorials/70822-ssd-tweaks-optimizations-windows-7-a.html

After having a work laptop with one, I am simply amazed at how much more real-life usable my machine is. Waking from sleep is near-instantaneous, and I no longer wonder if it's worth waking my machine and waiting 20 seconds to check something before walking out the door. Battery life is nearly doubled. I don't hear clicks as the drive parks itself every time it gets a chance. I don't have to worry about jostling or even *dropping* the laptop while it's on. It's honestly the most amazing upgrade to PC technology and usability I've seen in years. It beats any graphics card upgrade since first going to VGA and any processor upgrade since the original Pentium.

Sure, its life might be shorter; but I've been watching my hard drives have shorter and shorter lives too. Warranties are no longer 5 years; they're 2 if you're lucky. Out of 5 Seagates I've owned in the last few years, 3 have died. One of the replacements even died. I switched to Hitachis and have done better. So an SSD won't make any difference to the regular backups I do now anyway.

We tech nerds often forget: it's about the usability, stupid. Usability as your mom and girlfriend see it. I.e. - you turn it on and it works. You don't see all kinds of cryptic, mysterious stuff happening, or cross your fingers and hope something works. You press the button on the appliance and the toast comes out. Until we really grok that, these arguments are pedantic to non-techs, who are just as likely to toss an old computer or give it to the kid because it's getting slow or the drive dies in it.

Comment It's not really about the language (Score 1) 437

Not meaning any disrespect to the creators or the language, but Objective-C has grown because it's what you program iOS devices in. If Apple had used Python or C++, we'd be using that instead. The rise and fall of languages now has just as much (if not more) to do with what devices they're put on as with any construct of the language.

I swear that at times Slashdot feels like a bunch of stuffy comp-sci folks stuck in the 80's. You'll argue day and night about minor technical details, but it's none of those details and minutiae that really make a product take off. Sure, flaws will kill a product, but beyond a fairly basic point, languages are just tools to get a job done. Steve Jobs understood that; maybe we should too, and stop the language circle-jerks.

Comment Re:Engineering shortage? (Score 2) 375

I think you've made a fair assessment, in my experience. One addition/correction I would make - and it's a big one.

Finding *qualified* candidates is the key phrase. We get lots of interviews with recent graduates, but many of us lament how bad a great number of them are. The ones that went to U of Phoenix or other trade schools are just not suitable as systems coders (jobs we pay very well for). I don't know if it's universities failing to educate or standards going up; I would guess a little of both. We need device driver writers, multithreaded systems programmers, graphics people, etc., and it's very hard to find even experienced coders who are good at that. Most fail our screenings. It's not good enough anymore just to show up with a sheepskin in CS. You need to demonstrate you can actually build and debug something. For candidates who show initiative with personal projects, can demonstrate solid coding/debugging, and can talk intelligently about architecture - we'll hire you in a MINUTE with some of the highest salaries in the industry. If you can't, then you're one of the 10,000 other global coders anyone can get. We do hire lots of promising newbies all the time - but they have to show that initial spark and understanding at least. You can't just show up and say "gimme a job 'cause I got a diploma".

I think this brings us to a great career point. You don't want to be floating in the middle as one of the masses in engineering, just following the trend lines. Sure, you need relevant skills, so don't run out and become the Ada expert; but here's an example. A great number of the Java/web coders from the very early 2000's never found work again after the bubble popped and left the field. I felt bad for all the CS grads from my school, which had just been teaching them Java (it doesn't now; it primarily teaches C/C++ again, and now we hire them). Unfortunately, when you're part of the big crowd, your career and pay are driven largely by the crowd, and you must be really, really exceptional to stand out.

What will make you OODLES of money in engineering is being a master of a much-needed, but not well-peopled, technique. I knew several guys who were above-average coders but commanded insane amounts of money to come in and deliver a certain kind of feature. They went the route of contracting themselves out and simply regurgitated one of the five or so techniques they had implemented really well, picked up a fat check, then wash, rinse, and repeat. I know personally of at least 2 different sets of people doing this with 2 completely different technologies. Sure, every year or two they needed to spend some time updating and revamping the codebase, but they worked about 2-3 jobs a year, then took 3-6 months off - and made large 6-figure salaries. Those are the people who get rich in engineering. You simply won't get rich working for a company unless you've got some kind of share of the sales.

So it is possible to get a job even as a new grad, but you need to stand out more now than before.

Comment Re:Who can blame them? (Score 1) 649

Amen sitkill - this has been my experience too. I'm absolutely no Apple fan - but the amount of anti-iPhone FUD in this thread is pretty terrible and continues a line of denial I've seen again and again. I've developed commercially for Android, Win7Phone, and yes, iPhone. When I test an app for Apple, I need *at most* 2 iPads and 2-3 different iPhones. When I test for Android or Win7Phone, I need a box full of phones. Each one has a slightly different control layout, slightly different sensitivities, driver and version bugs, power-sucking hot paths, etc. Anyone who says it's as easy to validate for the Google phone market as for iPhone has no idea what they are talking about. Sure, anybody can *develop* an app that should work just fine on any of those devices, but the reality is that when you have a revenue stream on the line, you absolutely must validate it. In a marketplace where people will delete and badmouth your app if you so much as use their battery badly (let alone bugs or usability issues), you can't afford to live in an ivory tower. This is a huge problem for both Google and Win7/8 phones. It's certainly an additional cost to developers, and I can only see a few solutions:

1. Common platform - require the use of one or only a few chip vendors and hardware manufacturers with validated hardware/driver stacks.
2. Simulators that can mimic all the hardware out there, and all the driver stacks/OS versions, and warn you of upcoming problems with power issues, rendering bugs, etc.
3. A defined set of control layouts guaranteed to give good performance and responsiveness, plus a means of changing control schemes in your game to whatever the user wants.

This should help, but it sure seems like a very tall order to me, since Google already has a slew of OS drops out there and some pretty badly behaving phone vendors racing to the bottom on some of their models.

Comment Re:Well, they're a good indicator of intelligence (Score 1) 672

It helps if you have 3 months' salary in reserve for emergencies, like you should, so you don't end up entering a bad situation out of desperation.

Currently, I would recommend people have 6-12 months saved up. The August jobs report said the AVERAGE length of unemployment is 40.4 weeks. Granted, that's across all industries, but the reality is that it's worse out there than people think.
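For what it's worth, converting that 40.4-week figure to months shows why the usual "3 months of savings" advice falls short:

```python
# Convert the average unemployment spell (40.4 weeks) into months of
# savings runway needed to ride it out.
avg_unemployment_weeks = 40.4
months_of_runway = avg_unemployment_weeks * 12 / 52  # 52 weeks = 12 months

print(round(months_of_runway, 1))  # 9.3
```

So the *average* spell already eats more than 9 months of runway, which is why 6-12 months is the safer target.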

Comment What about strange quantum effects? (Score 1) 197

What about this: since they're taking photos from multiple 'flashes' of the illuminating laser over time, shouldn't the quantum properties of light bending and scattering - conceptually - be visible?

What if we used this to film the standard diffraction-grating quantum experiment or other examples of strange quantum properties? Would we see frame-to-frame quantum discontinuities based on when the sampling occurred?
