Comment Re:What a load of FUD! (Score 4, Interesting) 150

A NAS device is not a toaster. It's a file server running a lightweight but fully-featured operating system. You don't need to be a professional network administrator, but you do need to be careful enough to at least check in regularly for updates. One presumes such hardware was purchased because you had valuable data you wished to manage or protect. Honestly, a NAS is really not a purchase for "normal" people. Power-users and up, I'd say, are the minimum personnel requirements.

Even so, Synology machines are not hard to patch. They download OS updates automatically by default. All you have to do is log in to the administration page once in a while and click the "update" button, which pops up right on the page once an update is available. And every update has a link right next to it pointing to a web page detailing exactly what changed or what was fixed. I'd guess the reason there's no "auto-update" is that an update requires a 5-10 minute patch-and-reboot cycle, and you generally don't want your file server automatically rebooting at its own convenience.

I'm presuming (since information is a bit scarce) that users either failed to patch their machines for six months or longer due to neglect, or they made a deliberate choice not to do so for some reason, yet kept their internet-facing services wide open (note that these are not installed or enabled by default). Unfortunately, that's pretty much a guaranteed recipe for an attack of this sort. It's a crappy way to have to learn a lesson.

Comment Re:Nuke it from orbit, then restore from backups. (Score 5, Informative) 150

My Synology NAS is my home-based business' file server, a local machine backup (for my development machine and my digital audio workstation), and a media server for my ripped DVDs and Blurays, although this third function is just a nice bonus for me. Synology NAS devices have a very handy cloud backup application as well, which I use to backup all my most critical files to Amazon S3 services. I hope most people made use of this, because if Cryptolocker has taught us anything, it's that you absolutely need offsite backups that are NOT connected to your network.
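The core of a multi-tiered backup like that is incremental: keep a manifest of content hashes and only ship files that actually changed to the offsite tier. Here's a minimal local sketch of that step (this is just the general idea, not Synology's actual Cloud Sync implementation — the function name and manifest format are my own invention):

```python
import hashlib
import os

def changed_files(root, manifest):
    """Return files under `root` whose content hash differs from the
    recorded manifest -- the set an incremental offsite backup would
    upload. Updates the manifest in place as a side effect."""
    changed = []
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if manifest.get(path) != digest:
                changed.append(path)
                manifest[path] = digest
            # unchanged files are skipped entirely, which is what makes
            # nightly offsite runs cheap even over a slow uplink
    return changed
```

The offsite copy staying disconnected from the network is the part that actually defeats ransomware; the hashing just keeps the upload bill sane.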

I bought it specifically because it makes it easy to set up a multi-tiered backup strategy like that - something that takes on new importance when you spend a few years writing code on your own dime. As a file server, it's fantastic for small operations. I had a drive begin to fail last year, and so had a chance to test out the hot-swapping / RAID rebuilding feature. Worked like a charm - was super simple and zero down-time.

Personally, I've never once considered opening up my NAS to the outside internet. That always seemed crazy risky to me - after all, a single software mistake, a buffer overrun in a protocol stack of some sort, and *poof*, there's direct access to your file server and all its critical data. I guess sometimes being paranoid pays off, but it gives me no pleasure to say so.

Comment Re:Sponsors (Score 3, Insightful) 138

Internet of things, huh? I think I'll wait a generation or two until they hammer out the worst of the security issues. One of the latest missteps was caused by a smart bulb that embedded the encryption key in the firmware. Oops. Yeah, no one would think to look there, right? There's likely going to be an entire generation of devices that will have the same sort of flaws that early wireless routers had - essentially, the result of average programmers (i.e. non-cryptographic experts) trying to invent cryptographic solutions.

Comment Re:Well at least they saved the children! (Score 1) 790

If you have a gmail account google could bomb your computer with tons of child porn the next time you check your email. They could also serve up search results from their search engine with hidden images that your browser will cache. If you've got google drive or whatever it's called then quite clearly you are fucked if google wants you to be fucked.

You're playing the "Could this theoretically happen?" game again. Yes, that's a possibility. One has to say that and remain intellectually honest. In the real world, not some Hollywood thriller, what are the odds that Google, or some Google employee, is actually going to target an individual in such a manner, electronically forging content and framing him? Or more to the point, what are the odds of it happening in any one individual case? Exceedingly low, in my opinion. The odds go up dramatically if you figure the possibility of it *ever* happening, but you can't really work that way. It's like saying that the odds of winning the lottery are very high because someone, somewhere, at some time is virtually guaranteed to win it.

As such, we treat evidence from Google *or any other single source* as potentially suspect in line with the odds of that information being compromised in that instance. Just like a good academic paper will cite from multiple sources, so too should a criminal conviction require corroboration of evidence from multiple sources. The great thing is that mathematically, if we consider the odds of a Google employee or Google as a company working to ruin an individual as "low", and the probability of the police planting evidence at the scene of the crime as "low", the odds of both of those occurring in a single case is now *exceedingly low*.
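The arithmetic behind both points is simple enough to write down. The specific probabilities below are hypothetical stand-ins for "low" - the point is the shape of the math, not the numbers:

```python
# Two independent low-probability events both hitting the SAME case
# multiply together, which is why corroboration from independent
# sources makes wrongful conviction exceedingly unlikely.
p_google_frames = 0.001   # hypothetical: insider fabricates evidence
p_police_plant = 0.001    # hypothetical: police plant evidence
p_both_same_case = p_google_frames * p_police_plant   # ~one in a million

# The lottery point: the chance that SOME case out of many is tainted
# is far higher than the chance that any one particular case is.
cases = 1_000
p_any_case_tainted = 1 - (1 - p_google_frames) ** cases   # roughly 63%
```

Which is exactly why "it could happen somewhere" tells you almost nothing about whether it happened *here*.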

To answer your question directly, if the data all came from Google or could be manipulated directly in some fashion by Google, then it's essentially a single source, and a highly malleable source at that (electronic data). It would depend on the details of the case, of course, but my gut answer is that I'd really want to see some external corroboration before I'd feel comfortable with a conviction. And, in line with where I think you're leading with this, yes, I agree that it's dangerous for one company to control too much of the Internet or the data in our personal lives for exactly the reasons you give.

Comment Re:"mobile first" strategy (Score 5, Insightful) 151

I learned quite a bit about Nadella from his e-mail which notified around eighteen thousand employees of impending layoffs and contained the word "synergies" no less than three times. Even his buzzwords are stale and unimaginative. This man either has no real vision, or he's very bad at communicating a clear vision. The article was correct in giving him a very bad grade in communication.

The one-platform tech base strategy actually seems sound, though, and in truth, is how they should have been pushing Windows 8 - not as a touch-first OS like we got, but one that's touch-capable, able to integrate seamlessly with other small form factor touch-focused Microsoft devices by using a unified API (write once, deploy everywhere). There are a lot of legacy products out there that people will still depend on for decades to come, and businesses get nervous when the creator of the OS on which they depend veers off in a new direction, seemingly abandoning the current platform on which they rely.

It's a bit ironic to me that in trying to aim for the future, Microsoft is taking for granted and ultimately risking the core audience on which they've had a solid lock for the past twenty years. We'll see if Nadella manages to remember that while the desktop is no longer the face of new technology and is dwindling in importance, it's also a platform which is not likely to disappear as a significant market anytime in the near future. Rather than using that platform as a bully-pulpit to push its other platforms, Microsoft needs to make its other platforms compelling and attractive in their own right, and then demonstrate to businesses the value of a simple cross-platform deployment strategy, all while leaving its "legacy" desktop platform in place in order to support more heavyweight computing tasks that individuals and businesses will still inevitably need.

Comment Re:Well at least they saved the children! (Score 4, Insightful) 790

The threshold is "beyond a reasonable doubt", which means that we have to weigh the possibility of a conspiracy to fake evidence by some random employee at Google and police who found evidence at his house, versus the probability that this person was guilty of a crime - one he was convicted of previously, incidentally.

That there's a Google employee or outside hacker who wants to see this person go back to jail does not imply there has to be a conspiracy. That the person was previously convicted would, I believe, make it more likely that he'd be framed, not less. There are enough people who think anything less than a life sentence is too mild, and some of those are more than willing to "do what it takes".

Well, if the Google evidence was the sole evidence used to try to convict someone, I'd hope that the accused would walk free. One would hope that a case wouldn't depend on a single piece of ANY evidence, because that brings up the obvious reasonable doubt. If the Google evidence is used in conjunction with evidence also found at a local residence by law enforcement, that obviously makes for a much stronger case.

I don't think it's unreasonable to apply Occam's razor to these scenarios. It's perhaps entertaining to imagine all sorts of crazy conspiracy theories that *might* occur, but the reality is that these sorts of things are undoubtedly *extremely unlikely* to actually occur. If we dismissed every case because of improbable scenarios that could theoretically punch holes in a case, we'd never convict anyone.

We have to draw a line somewhere so that innocent people wrongly accused are protected, yet standards aren't so impossible that we can never actually convict anyone who has actually committed a crime.

Comment Re:Well at least they saved the children! (Score 2) 790

Because, as an occasional server admin, I'm perfectly aware that it's easy to change the logs, timestamps, and permissions. Do you not know what a computer is? It's a tool for manipulating data. This is not reliable forensic evidence, it's something that anyone with fairly modest skills could fake up in fifteen minutes.

Sure, but it's no different than most other physical evidence, in that it's dependent upon the trustworthiness of the person presenting the information. That's why there are strict procedures dealing with evidence. It sets up a chain of trust which is used to gauge the validity of the evidence. You're making the mistake of trying to apply black and white rules in a matter that is, by its very nature, a grey area.

Note that a conviction of a crime doesn't require "100% proof", because there's no such thing in this world. In theory, pretty much all evidence could be tampered with. The threshold is "beyond a reasonable doubt", which means that we have to weigh the possibility of a conspiracy to fake evidence by some random employee at Google and police who found evidence at his house, versus the probability that this person was guilty of a crime - one he was convicted of previously, incidentally. That's what a jury of his peers will have to decide.
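The quoted point about malleable timestamps is trivially true, for what it's worth - back-dating a file takes one library call, which is exactly why such data is only as trustworthy as its chain of custody:

```python
import os
import tempfile
import time

# Create a file, then back-date its modification time by a year -
# a one-line demonstration that filesystem timestamps are ordinary
# mutable data, not forensic ground truth.
fd, path = tempfile.mkstemp()
os.close(fd)
year_ago = time.time() - 365 * 24 * 3600
os.utime(path, (year_ago, year_ago))   # (atime, mtime)
backdated_mtime = os.path.getmtime(path)
os.remove(path)
```

Hence the chain-of-trust procedures: the data can't vouch for itself.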

Comment Re:correlation, causation (Score 1) 387

A single story about someone getting fired over saying "dongle"

That sort of reminded me of when my sister-in-law first started working in IT. She pretty much learned everything on the job, as she had a non-technical degree and sort of accidentally fell into that field. And of course, as is typical, she's the only female in the department. She was initially surprised at the use of the terms "male" and "female" to describe cabling, plugs, and sockets, thinking the guys were having a chuckle at her expense before realizing that that was, in fact, official nomenclature.

Fortunately, she's a pretty level-headed person, gets along well with others, and isn't looking for reasons to be offended.

Comment Re:Sorry, but... why? (Score 1) 180

Honestly, forcing computer programming on kids will have the same effect as forcing math on them.

You mean introducing it to them? Without school "forcing" topics on kids, many wouldn't know that those topics even existed.

By "forcing it on them", I simply meant that it shouldn't be mandatory. A computer science course should be an optional track, same as high-level math. There's no need to push it on students who don't have an interest or inclination for it.

Don't misunderstand, I absolutely support the notion of giving kids the option of taking CS courses as early as possible. I started programming around the sixth or seventh grade myself, if I remember correctly, and it was great to have that early head start when I took my first real CS courses in college.

Comment Re:XP losing Market share is not bad news. (Score 1) 336

Yeah, I think you're correct. Windows 9 is just Windows 8.3 released as a new OS, since MS seems desperate to wash the taste of Windows 8 out of their mouths.

Unfortunately, it appears to be just Windows 8 with most of the glaring problems removed, but probably not compelling enough to make it anything of a must-buy, except for Windows 8 users. They're still too firmly focused on their app store as a means to prop up their phone and tablet sales, rather than making actual improvements for their core users. It's sad that the features most anticipated are the return of the start menu, the ability to run Metro apps in a window, and generally not acting so much like a tablet OS. In other words, Windows 7, but with a flat, ugly UI. Whee.

I'm betting that Windows 10 will shift focus back to the desktop where it should have been all along, and we'll have finally broken Microsoft's "even=bad, odd=good" cycle, not with two successes in a row, but two failures. Apparently, they now have to release two OS failures in a row for the lesson to sink in. Probably the only way to avoid that will be if they give Windows 9 away as a free or very low-cost upgrade to Windows 7 and 8 users, in which case adoption rates might be boosted at the expense of sales revenue.

Comment Re:Who has the market share? (Score 3, Insightful) 336

Had they limited Windows 8 to touchscreens and digitizers only, it would have made things worse. Poor adoption rate is their big problem, and further limiting your user base with hardware restrictions would only exacerbate the situation. The platform doesn't move forward in practice if people don't actually upgrade. Here's the issue: Touch screens make sense for certain form factors, but not for desktops. Search the term "gorilla arm" to see why.

Even beyond that, the "metro" concept of full screen apps runs counter to what desktop users actually need for productivity. The desktop is not a "legacy" platform. It's a platform that's very specifically optimized for getting work done with a keyboard, mouse, and large form factor screen. That sort of work is not going away anytime soon, as the business world has demonstrated loud and clear by their absolute refusal to move to Windows 8. Naturally, the relevance of PCs is diminishing among home and casual users - people who didn't use the PC for production purposes, but mostly as a consumption, communications, and entertainment device. Smartphones and tablets are perfect for that. For actual production work, the desktop/laptop will remain king for the foreseeable future, albeit in much more of a specialized role than before.

Windows 8 would have been a fine OS had they discarded the idea of one-UI-fits-all devices, and instead focused on the coolness of Metro as a side-channel application experience. That would have meant allowing cross-platform tablet and phone apps to run on your desktop seamlessly with native or managed desktop applications, but without trying to make the whole OS touch-focused. Instead, the marketing hype overtook common sense and usability concerns, and they began touting it as the future replacement of the desktop, which is absurd. Not surprisingly, after the actual market kicked the marketing department's ass, they're starting to move in a sensible direction with Windows 9 by focusing on the benefits of cross-platform application development, and they're slowly backing off of the ridiculous notion that their desktop OS should behave like a tablet.

Comment Re:Who has the market share? (Score 2) 336

I don't think skeuomorphic means what you think it does.

Gah, you're right. I meant the move away from skeuomorphic interfaces and toward... does the new flat, simple, textureless aesthetic have a name other than anti-skeuomorphic? If it does, I can't think of it. Nothing like a lack of an edit function to make you look silly.

Comment Re:Who has the market share? (Score 4, Insightful) 336

From the article:

Microsoft will likely one day struggle to woo users off Windows 7, just like it is currently trying to do with the headache that is Windows XP.

I wonder if Microsoft is learning the wrong lessons from their "good" versions. They're having a hell of a time getting people to leave them. In the future, if people hate the version they're on, they'll be much more likely to buy a new version in the hopes that it's better. Brilliant!

That's the only thing I can think of to fully explain Windows 8, and why even now they're refusing to admit that Metro apps are a steaming turd on top of an otherwise competent OS. The only idiots who like using those "apps" are the ones who would probably be better off with a tablet or smartphone instead of an actual desktop computer, for whom the actual power of a desktop is apparently wasted.

Ok, maybe I'm just a bitter throwback who's resentful that my desktop is being marginalized. Maybe it's also because I hate the new skeuomorphic design aesthetic. What's wrong with gloss, gradients, transparency, and attractive animations, or even a bevel or link here and there so we can actually tell something is clickable rather than playing mystery-meat navigation? I swear, everything is going flat-shaded, blocky, ugly, and indistinguishable, all because that's now the new "hip" look.

Comment Re:Sorry, but... why? (Score 4, Interesting) 180

Honestly, forcing computer programming on kids will have the same effect as forcing math on them. It's no different - it's a way of solving problems, except by using logic and algorithms rather than equations. It's still a general solution in search of a problem. I think a lot of tech people tend to view technology as being interesting or important for its own sake, but that's not how the world at large sees technology (nor should it). Its only real-world value is in what it can do for us that we couldn't do before without it, as heretical as that may sound here on slashdot.

My advice to educators is to let kids make their own computer games. You'll sucker them into doing some of the hardest sort of programming there is, and they'll likely enjoy it. More importantly, for those that are more artistically or creatively inclined rather than technically inclined, they can help out with artwork, story, music, and sound effects. Videogame programming is a fantastic cross-discipline project that can involve all sorts of different skills and abilities. So, not everyone has to write code, but everyone can contribute in some way.

I've found that computer games are great at driving home the need to learn higher math as well. Geometry, linear algebra, and matrix math are all used extensively in many types of games. Kids will naturally run into these problems, and probably work in vain to come up with a home-grown solution. At that point they're primed to learn a more elegant solution using higher math. The big advantage is that this lesson concretely demonstrates the value and usefulness of that math, making it much easier to learn and appreciate since it has a useful context.
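A concrete instance of the kind of problem I mean: every kid who tries to make a sprite turn or orbit eventually rediscovers rotation. The home-grown attempts get ugly fast, and then the standard trig/matrix solution lands with real force. A minimal sketch of that "elegant solution" (function name is mine, the math is standard 2D rotation):

```python
import math

def rotate2d(x, y, degrees):
    """Rotate a point about the origin by the given angle - the 2x2
    rotation matrix applied by hand. Exactly the problem a kid hits
    the first time a game sprite needs to turn or orbit something."""
    r = math.radians(degrees)
    return (x * math.cos(r) - y * math.sin(r),
            x * math.sin(r) + y * math.cos(r))

# Rotating the point (1, 0) by 90 degrees carries it onto (0, 1).
nx, ny = rotate2d(1, 0, 90)
```

Two lines of trig replacing pages of special-cased fudging is the kind of payoff that makes the math stick.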
