
Comment Re:Calling Stallman (Score 1) 286

And if you believe "Reflections on Trusting Trust" (and if you don't, you're just in denial), you know that it's impossible to verify their claims even *with* the source...
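The core of Thompson's attack can be sketched in a few lines. This is a toy stand-in, not the real thing - the actual attack lives inside a binary compiler, and all names here are hypothetical:

```python
# Toy illustration of Ken Thompson's "Reflections on Trusting Trust" attack:
# a trojaned "compiler" that (1) backdoors any login check it compiles and
# (2) re-inserts itself when it compiles a clean compiler, so auditing the
# compiler's *source* proves nothing about the *binary*.

def trojaned_compile(source: str) -> str:
    compiled = source
    if "def check_password" in compiled:
        # Stage 1: silently backdoor the login check.
        compiled = compiled.replace(
            'return password == stored',
            'return password == stored or password == "backdoor"')
    if "def compile(" in compiled:
        # Stage 2: when compiling a (clean) compiler, re-insert the trojan
        # itself - so the backdoor survives even a full source-code audit.
        compiled += "\n# trojan re-inserted into the new compiler binary"
    return compiled

login_src = ("def check_password(password, stored):\n"
             "    return password == stored")
print("backdoor" in trojaned_compile(login_src))  # True
```

The audited source is clean; only the compile step is dirty - which is exactly why source availability alone can't ground trust.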

Ultimately, trust can never be in things, systems, methods, processes, etc. Trust has to have *people* as its object.

The longer we go, the more Scott McNealy gets proven right: "You have zero privacy anyway. Get over it." It's entirely possible to assemble the information MS is collecting, and quite a bit more, from other sources - and it's being done all day, every day. That's not an excuse, justification, or rationalization, but it is reality.

Comment Re:Thanks, but (Score 1) 286

Bingo. I'd go further and say that even though most people here are fully capable of keeping their machines updated and secure, most of us wind up doing a piss-poor job of it in the real world - the cobbler's barefoot kids and all that. Despite the fact that we all know better, the frequency with which even technology professionals fail to apply security updates or even back up their data is really shocking. (I write this on a Surface Pro 4 that hasn't been backed up in over a month except by the built-in Windows backup - I let my previous backup subscription expire and haven't yet settled on a replacement - so I'm one of the guilty parties.)

Making at least the basic stuff happen "automagically" (at least by default) is not necessarily bad, and can be wonderful for the folks who don't know (and don't want to know) how the sausage gets made. I agree it would be nice to have more granular controls, but this is a good step in the right direction. It would be even better if this disclosure puts pressure on other big data-collecting companies to be more open and transparent about what they collect and what they do with it. I'd bet lunch that a similar disclosure from Google would be nothing short of terrifying.

Also there seems to be a double standard at play here (possibly justified, given Microsoft's past actions and behaviors): Windows actually gives you quite a bit more control over much of this stuff than say, Android or Chrome, yet far fewer people seem to be lining up to bust Google's chops over even more egregious behavior.

Comment Re:Don't forget about open source projects. (Score 1) 286

Again, with WSL (Ubuntu for Windows), I've got a dev platform that truly fuses the best of the Windows and Linux worlds. From an openness point of view, this is arguably superior to Apple's approach...

If you're writing code to run in the cloud and/or containers, then it really doesn't make much difference what desktop OS you use, so long as it's one that makes things easy and has a good set of tools that make you productive - that's kinda the whole point of those sorts of abstractions in the first place...

Comment Re:Don't forget about open source projects. (Score 1) 286

No, I'm not shilling for Microsoft (check my posting history, especially back around 1997-2000 - I've been brutally and vocally critical of Redmond's abuses), but this is a good step that should be applauded.

The reality today is that Microsoft is among the most open and forthcoming of modern tech companies in disclosing what information it collects and how it's used.
They're not offering all the options we'd like to see, for sure, but it really seems to me that they're close to leading among big companies, and certainly way out ahead of the likes of Google, Facebook, and Apple, to name just a few of their competitors in one space or another. And for that, they deserve some credit...

Comment Re:Unity gone - who cares? (Score 1) 386

And the youngstas forget that probably the biggest thing that held Linux back in the early days was the lack of a decent window manager and desktop environment. (Back then, all the good ones were either strictly proprietary or required expensive corporate foundation licenses.)

Sun's open sourcing of OpenWin was definitely one of the things that really allowed Linux to take off. Sun started things rolling and soon, there were many good choices, and sadly, the contribution of OpenWin is mostly forgotten today. I've often really wondered if Linux would have ever had a prayer without OpenWin as its first, modern, great-looking GUI platform...

Comment Re:April Fools (Score 1) 386

With WSL (Ubuntu for Windows), there's a pretty decent chance that Windows will be the most widely used Linux *desktop* OS within two years.

This is Microsoft's deal to screw up - if they execute well, Windows and Linux may become far stronger together than either could be separately. WSL is just the first step, bringing Unix/POSIX text-stream pipeline semantics to Windows in the most standard and useful way yet - far better than MKS/SFU/Interix/SUA. Now imagine that Canonical integrates the 21st-century structured object pipeline concepts from PowerShell and Mono/.NET into Linux - *that* is something that could benefit the entire Unix/Linux community as well as the Windows community.
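The difference between the two pipeline models is easy to show. A rough sketch (the "ls -l" output here is a canned sample, and the record fields are hypothetical):

```python
# Classic Unix text pipeline vs. PowerShell-style object pipeline.

# Text pipeline: every stage re-parses whitespace-delimited columns,
# so column positions, locales, and spaces in filenames are all hazards.
ls_output = """total 8
-rw-r--r-- 1 alice staff 1024 Apr  1 12:00 notes.txt
-rw-r--r-- 1 alice staff 8192 Apr  1 12:01 core dump.log
"""
sizes = [int(line.split()[4]) for line in ls_output.splitlines()[1:]]

# Object pipeline (PowerShell's native model): stages pass records with
# named fields, so downstream filters never care about column offsets.
records = [{"name": "notes.txt", "size": 1024},
           {"name": "core dump.log", "size": 8192}]
big = [r["name"] for r in records if r["size"] > 4096]

print(sizes, big)  # [1024, 8192] ['core dump.log']
```

Note how the filename with a space would break any text-parsing stage that split on whitespace past column 4, but is a non-issue once the data travels as structured records.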

Comment Re:About time (Score 1) 386

I think it's a bit early to say they've given up on mobile convergence, given the persistent and reasonably well-backed reports that MS is working on a new kind of mobile device (usually tagged as the "Surface Phone" although that may not be an adequate or accurate description of it). In fact, it seems that they've doubled down on being able to go that direction, even though they have no really compelling Win10 Mobile devices right now (the HP Elite X3 being the most notable potential exception...)

Comment Re:Is this a late April Fool's joke? (Score 1) 386

While there are some valid security issues with X (and audio is a bloody nightmare), the fundamental idea of network transparency in desktop environments - run an app on one machine, display it on another - is needed now more than ever. It seems a real shame that almost all of the things angling to replace X can't do one of the most important things it's always given us...

Comment Re:Wonderful? (Score 1) 386

because Redmond has conned the shiny new hardware vendors into writing drivers for them...

Not conned - they just recognize market share when they see it. Linux has conquered the server world (even Microsoft has a large portion of its Azure instances running Linux), but Linux on the desktop is walking dead. The loss of Unity is sad news because, IMO, Unity had one of the few real chances of reviving desktop Linux - for a number of reasons, including the really important one of being a native Linux path to mobile. (No, Android doesn't count as Linux just because a few Linux bits remain if you dig deep enough.)

On the other hand, with the advent of Ubuntu for Windows (WSL), most Linux users and developers will find that Windows is now nearly as good a Linux as any other Debian/Ubuntu-based distro, and the combination of Windows and Linux without emulators, VMs, or the need to reboot really does offer the best of both OSes.

WSL is a work in progress, for sure, but it has awesome potential, and is already changing the way I work... (BTW, the Creators Update next week includes upgrades to both WSL and to the MS command shell to (finally) support proper color control, among other things.)

Comment Re:And so it begins... (Score 1) 407

Actually, it can be far more complicated than you may think. Consider that a typical robotic workcell may have several robots and quite a few other devices (tooling, clamps, material handling/motion control equipment, process equipment, etc.), each with its own, mostly or entirely independent control logic. In most cases, there simply is no overarching view of the logic or the policy that's supposed to be implemented (that requires *understanding*, and hence humans), so it's surprisingly easy to run into potentially dangerous conditions that weren't anticipated by the people who designed the system. (I say this as someone with 30 years of experience in both robots and IoT.)

You're right that that's the way safety *should* work, but getting to that point in the real world (which is a messy place) is a lot harder than you might expect - as evidenced in this case by the failure to anticipate or realize the potential danger from a second robot.

Comment Re: And so it begins... (Score 1) 407

Well, I'm one of those people - my degree is in robotics, and yes, I've been whacked, hard, by robots while working with them. The second time taught me to be very careful, as it could have killed me if things had gone only a little differently.

It's hard to do a complete lock-out/tag-out type process when you're testing the robot, or more commonly, the interactions between the various devices in the workcell. (No, I'm not saying you shouldn't lock and tag...) There are some things that are much more easily debugged from up close - the danger comes when you *think* you're in a "safe" spot in the work envelope, but one or another of the various programs running has other ideas. (Keep in mind that the average robotic workcell may have a dozen or more controllers running their own mostly to entirely independent control logic.)

In my experience, most robot-related accidents (which thankfully, only rarely lead to serious injury or death) are due to a combination of both human error AND software errors. (Hardware errors are both far less common and far less likely to result in injury.) Like plane crashes, the root cause may be attributed to human error, but there are almost always a set of contributing factors and conditions that stack up to lead to a deadly accident. (And like SCUBA diving, you ALWAYS need a buddy - but in this case, with the big red switch in his hand.)

There are several bigger problems that need fixing - First, 20th century robot technology (which is still practically all that's in use) builds robots that are stupid - really, really stupid. Unlike the robots of SciFi, they have no concept of people or other things, and only the most rudimentary idea of themselves. Generally, they can't feel at all (except *maybe* at their end effector (hand)), and almost none of them can independently avoid collisions even with other machines and static objects in the workcell, much less unpredictable and strangely-shaped things like people.

Giving robots the ability to feel or detect impact (via skin-type force sensing) would go a long way, but then programming would have to catch up, too, so that there are good places to hang autonomic or low-level, high-priority safety loops. (BTW, this sort of multi-layered control scheme - the subsumption architecture - was what the MIT AI Lab's Rodney Brooks was originally working on before he got seduced by shiny things. His early papers are still surprisingly relevant.) The vast majority of robots today still run what amounts to a series of GOTO instructions in threespace sprinkled with conditionals, with little to no ability to do their own path planning or to react to anything they haven't been specifically preprogrammed to handle.
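The layered idea is simple enough to sketch: a low-priority layer proposes motion, and a higher-priority safety layer can subsume (override) it. All sensor names and thresholds here are hypothetical, not any real controller's API:

```python
# Minimal sketch of a subsumption-style (layered) controller: a low-level,
# high-priority safety behavior can override the higher-level motion plan.

def motion_layer(state):
    # High-level behavior: follow the programmed path.
    return {"cmd": "move", "target": state["next_waypoint"]}

def safety_layer(state, lower_cmd):
    # Low-level, high-priority loop: stop on any unexpected contact,
    # regardless of what the motion program wants to do next.
    if state["contact_force_newtons"] > 50:
        return {"cmd": "emergency_stop"}
    return lower_cmd  # no hazard detected: let the motion command through

def control_step(state):
    # Each control tick, the safety layer gets the last word.
    return safety_layer(state, motion_layer(state))

print(control_step({"next_waypoint": (1, 2), "contact_force_newtons": 5}))
# -> {'cmd': 'move', 'target': (1, 2)}
print(control_step({"next_waypoint": (1, 2), "contact_force_newtons": 80}))
# -> {'cmd': 'emergency_stop'}
```

The point is structural: the safety check runs every tick and wraps the planner, rather than being one more conditional buried inside the motion program.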

As for fixing blame - that's really hard, and very situational. (If it's a software problem, is it due to insufficient safeguards in the underlying system, insufficient care by the implementor, or something that was reasonably unexpected?) Even knowing the full story (which obviously we don't here), it can be very difficult to sort out who is (or should be) responsible for what - especially when the law may not always be congruent with expectations. In general though, it's hardly fair to hold manufacturers responsible for unwise or insufficiently careful use of products that are known to be potentially dangerous. I often use a variant of this quote humorously to refer to Unix/Linux, but it's literally true when applied to robots: "Keep in mind that robots are power tools. And power tools can kill."

Comment Re:MS PAINT SAVES THE DAY! (Score 1) 139

True, the lack of layers in Paint makes it a good choice for this kind of thing - perhaps the only thing it's really good at...

It's stunning how many people do this kind of thing in Photoshop or Acrobat, but leave the layers intact, so you can remove the obscuration with a little advanced editing...
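Why the layered "redaction" fails is worth spelling out - a toy model (the "document" here is just strings, purely illustrative):

```python
# Toy model of layered vs. flattened redaction. In a layered format
# (PDF, PSD), the black box is a separate object drawn on top; the
# original text still exists underneath and can simply be read back.

text_layer = "the original, sensitive text"
redaction_layer = "#" * len(text_layer)

layered_doc = {"layers": [text_layer, redaction_layer]}  # PDF/Photoshop-style
flattened_doc = redaction_layer  # Paint-style: pixels overwritten for good

# The "advanced editing" in question - just pull the bottom layer:
recovered = layered_doc["layers"][0]
print(recovered == text_layer)  # True: the secret was never destroyed
```

In the flattened copy, the covered pixels are gone; in the layered one, the redaction is only an instruction to draw over them.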

Comment Re:Research is a bit blurry (Score 1) 139

Ten years ago, I was CTO of a company making smart touchscreen devices for restaurant and bar tabletops. We didn't put a camera in any of the units we fielded (people were still too weirded out by that idea then), but I did some serious technical investigation into whether we could use an intentionally low-res image to determine basic demographics of the diners without violating their privacy.

In my research, I found a really interesting paper (from France, IIRC - it's been a while) showing that even a 16-pixel (!) image could be used to determine the age and sex of a person with around 80-90% accuracy, and to recognize the same person again more than half the time. IIRC, it used both neural networks and some standard image processing, but nothing really exotic or so big that we couldn't have run it locally in the display device if we'd decided to. Even the author was amazed this was possible, because no one had thought there was enough information there to perform such a feat of recognition.
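A crude sketch of the idea - that 16 pixels can carry surprising signal - on synthetic data (this is not the paper's method or dataset, just a toy nearest-centroid classifier on average-pooled thumbnails):

```python
# Even a 4x4 (16-pixel) thumbnail retains enough signal for a simple
# classifier, if the classes differ in coarse spatial structure.
import numpy as np

rng = np.random.default_rng(0)

def downsample_to_16px(img):
    # Average-pool a 32x32 grayscale image down to 4x4 = 16 features.
    h, w = img.shape
    return img.reshape(4, h // 4, 4, w // 4).mean(axis=(1, 3)).ravel()

# Synthetic "faces": class 0 is brighter on top, class 1 on the bottom.
def make_sample(label):
    img = rng.normal(0.5, 0.1, (32, 32))
    half = img[:16] if label == 0 else img[16:]
    half += 0.3
    return downsample_to_16px(img), label

X, y = zip(*[make_sample(i % 2) for i in range(200)])
X, y = np.array(X), np.array(y)

# Nearest-centroid classifier over the 16 features.
c0, c1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
pred = (np.linalg.norm(X - c1, axis=1)
        < np.linalg.norm(X - c0, axis=1)).astype(int)
print((pred == y).mean())  # near-perfect on this deliberately easy task
```

Pooling also averages away the per-pixel noise, which is part of why so few pixels suffice - the coarse structure survives the downsampling.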

But computers don't look at things the way we do - which is why even "just metadata" (and it's a lot more than that, now) is so dangerous. With some not-too-complicated processing, the machine can tease out patterns in the data that we cannot... (Note that this means the spooks probably really can do some of the "ridiculous" image processing and recognition we laugh at in movies and TV shows. No Way Out, indeed...)
