
Comment News for (computer architecture) nerds... (Score 3, Interesting) 179

While supercomputing is a very small section of the computing world, it's not that hard to understand.

First of all, this would make for a terrible graphics card. This (deliberately) sits between a CPU and a GPU. Each core in a Phi has more branching support, a larger memory space, more complex instructions, and so on than a GPU core, but is still more limited than a Xeon core (though it has wider SIMD paths).

A GPU has many more cores with a much more limited set of operations, which is what is needed for rapid graphics rendering. But those limited sets of operations can also be very useful in scientific computing.

I haven't seen anybody try a three pronged approach (CPU/Phi/Nvidia Tesla), but I will admit I didn't look very hard. This is all in the name of solving really big problems.

Comment Re:CISC? (Score 3, Informative) 179

Kind of. The advantages of RISC faded pretty fast. The difference in decoder footprint between something like x86 and, say, ARM is really not that much, and a decoder is just a small part of a core these days. Clock speed is an issue of thermal footprint. So, all the disadvantages of the x86 (and its x64 extensions) faded in the face of Intel's focus on process improvements. In the end, not even the Itanium could eke out enough of a win to dethrone the x86 architecture.

Comment Re:His analysis is wrong (Score 1) 208

Actually, the average-case estimate is correct and is the most common analysis. Your statement that hash insert and find are really O(log N) or O(N) is not true at all. Two things: a perfect hash function for all integers of a given set is pretty trivial to construct. Given that, O(1) worst-case time is a given.

More generally, a cryptographic hash will (provably) have very few collisions, so it's not hard to create a hash table that really performs in constant time in all cases. The tradeoff is in space complexity.

In this case, using a hash table is indeed the best solution. Your claim that hashing insert and find aren't O(1) reflects only a basic understanding of hash tables. Modern implementations avoid non-constant performance with resizing/rehashing, open-addressing schemes and so on.
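To make the perfect-hash point concrete, here is a toy Python sketch (not from the original comment): for a fixed set of integers, we can search for a table size m such that key % m is collision-free, which gives a genuine O(1) worst-case lookup of exactly one probe. The space-for-time tradeoff mentioned above shows up as m growing past the number of keys.

```python
# Toy "perfect hash" for a fixed set of integers: find a table size m
# such that key % m maps every key to a distinct slot. Lookups then take
# exactly one probe, i.e. O(1) in the worst case for this key set.

def build_perfect_table(keys):
    m = len(keys)
    while True:
        slots = [None] * m
        ok = True
        for k in keys:
            i = k % m
            if slots[i] is not None:   # collision: try a bigger table
                ok = False
                break
            slots[i] = k
        if ok:
            return m, slots
        m += 1

def contains(m, slots, k):
    return slots[k % m] == k           # exactly one probe, always

keys = [3, 17, 42, 99, 1024]
m, slots = build_perfect_table(keys)
print(all(contains(m, slots, k) for k in keys))  # True
print(contains(m, slots, 5))                     # False
```

Real perfect-hash generators (gperf, CHD) are far cleverer about keeping m small, but the principle is the same: spend extra table space once, get constant-time lookups forever after.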

Comment First things first... (Score 1) 696

Which is convincing our field that diversity helps and that a lack of it has a real impact on the people in it and the society we contribute to.

Because, if you look at discussions like this, too few are willing to admit it is an issue. Too many get defensive and throw up a ton of unrelated issues.

Because it's scary to admit you may not be as rational as you think you are, that you act on societal biases like most people. Even worse, it's really hard to have a moment of doubt that you actually may not be a very good person, and that all your technical skills and accomplishments do nothing to change that.

That's what this is about. Addressing diversity issues makes things better for everybody. But, no, too many steadfastly refuse to accept anybody else's perspective if it is different from their own.

Look at this discussion. Not one well-moderated comment contained anything that remotely looked like an answer to the question in the post itself. Not surprising, because it is clear that not enough people see the issues that do exist and wouldn't be that difficult to solve.

From this, it is clear that there are plenty of programmers that are threatened by the mere idea that they, as a man, could have to work with (or be replaced by) a woman that is just as skilled as they are. This is an ages old theme that plays out over and over again in every aspect of society.

The people that do really well welcome the challenge as an opportunity to improve and learn from all sides. I saw it in the CS lab when I was in school. Most were clueless, many resentful, but the really smart ones just worked *with* the women in our class. They didn't tell them what to do, didn't judge, but just collaborated.

To this day, I know a few that resent that the women in our class ended up in the positions they did (most notably, one at JPL working on the Pathfinder mission). The simple fact is that she was that good and they weren't.

My career isn't going well at all, but I'm not going to blame it on so-called social justice warriors and affirmative action just to feel better. That would be easy for me to do. One of my PhD cohort who worked with my advisor was a woman. She got a faculty position somewhere; I didn't. In the end, I didn't set myself up like I thought I did, and she did better. End of story.

Making yourself feel better by dismissing progress as the action of out-of-control protestors (SJWs) or as affirmative action gone out of control doesn't actually work in the long run.

Comment There's more to desktops than Gaming... (Score 1) 350

Sure, PC gaming tends to drive the high end market, but there are people that can use as much memory as you can get.

I just finished a new PC build. When I was looking at DDR4 memory, I decided to spend the extra $180 to go from 32GB to 64GB. Here's the thing: 32GB of DDR4-2133 memory was $180. Memory is not nearly as expensive as PC manufacturers make it seem.

I know gamers are really obsessed with memory speeds and will pay a very large premium for higher clock speeds on everything. But, some of us do development and other PC tasks and need all the memory for VMs and so on, and we don't overclock because we choose stability over trying to get the last extra FPS out of a game.

Comment Re:Doesn't surprise me (Score 1) 378

Um, I'm not sure what laptops/desktops you are using, but all my laptops (from various makers, as well as custom desktops) work just fine with hardware sleep modes. I've had my Dell sleep and wake up more than a thousand times with no problems. Sure, I do have to shut down and restart for updates, but I'd have multiple sleep and wake cycles in a day, no problem. It can be done.

Comment I'm surprised... (Score 1) 155

I guess Android has the needed hooks to make this happen. That's good. I think that Microsoft would allow Cortana to be replaced as a default on Windows 10 Mobile, if anybody would care enough for it to happen (I doubt it).

Windows 10? Probably not, but again, if people really wanted it, they'd probably do it, though I doubt there'd be a call for it. You can use Google Now via Chrome, and Google has shown little interest in native apps for any platform versus Chrome add-ins, and I think that's probably all any Google Now user would really want anyway.

Of course, Apple has no interest or motivation to bring Siri outside of the iOS/OS X ecosystem.

Oddly, I wouldn't be surprised if Cortana ends up being the most flexible service in terms of third party support and applications, because in a real sense, it has to be.

Comment Not too bad, we will see what sticks... (Score 4, Interesting) 132

To be honest, I couldn't have seen even half of the stuff that they shipped ever being there when Visual Studio 2013 came out. An Android emulator? Okay. Upcoming Objective-C support? Hum.

It's a big bet that there is enough demand for better cross-platform code sharing for people to start using the Xamarin environment, and it's even a bigger bet that mobile developers will want to bring iOS and Android applications onto Windows.

There is some method to the madness. The Windows Runtime (the engine underneath Universal Apps) and the Core CLR have some compelling technologies that may have appeal outside the Windows ecosystem.

The Windows Runtime is interesting. It is almost completely oriented around asynchronous APIs. Any operation that will (or can) take more than about 50-100 milliseconds will need to have an asynchronous form. Now, the trick is that async/await in C#, promises in JavaScript and futures in C++ make consuming that API tolerable (in C#, it's really not hard at all). It is oriented completely around trying to make sure that applications can't block and become unresponsive. In short, if you make it harder to do the wrong thing, it will happen less often.
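The WinRT pattern itself is C#/C++, but the shape of an async-only API can be sketched in Python's asyncio purely as an illustration (the function names here are made up, not part of any Microsoft API):

```python
# Illustrative sketch of the "async-only for slow operations" rule:
# anything that might take more than ~50ms is exposed only in an
# asynchronous form, so a caller cannot accidentally block on it.
import asyncio

async def read_file_async(path):
    # Hypothetical slow I/O; sleep stands in for real disk access.
    await asyncio.sleep(0.05)
    return f"contents of {path}"

async def main():
    # The await suspends this coroutine without blocking the event
    # loop, which is what keeps an application responsive.
    data = await read_file_async("settings.ini")
    print(data)

asyncio.run(main())
```

The design point survives the translation: because no blocking variant exists, the easy path is also the responsive path.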

But, the first form was oriented only towards Modern (Metro) applications, and we all know how that turned out. Universal Apps doubles down on the underlying runtime and support to see if they can get better adoption. Hard to say, but it'll be interesting to see how it turns out.

The other interesting front is Android; there's a bunch of libraries that provide alternatives to core Google APIs. I'm fine with that; alternatives are always good. And the Android subsystem in Windows 10, that's interesting.

Anyway, it may bring some hard-core Visual Studio shops into the mobile space, because you can still say "it's all VS". Lastly, there was a price drop. Ultimate doesn't exist anymore, and its replacement is half the price. Even Premium was more expensive. I half expect more price drops and incentives to drive more people into the ecosystem.

Comment Holding things back... (Score 2) 166

The main problem with the environment is that between Epic Systems and VistA (the VA system), MUMPS holds back some real innovation. Sure, you'll hear tons of success stories about the VA or from Epic, but the fact is that this vendor lock-in has huge costs. A major hospital chain I worked with spent a billion-plus on an Epic implementation. Did it improve interoperability with other hospital systems? Nope. Even two Epic implementations will have a very hard time sharing records.

And that's the whole point of an EMR: to have a consistent version of a patient's history throughout their life. And these systems can't support that model in our current system. If every doctor had to use one system (like in some national health care systems), it'd be better, but that's not what we have. Sadly, the other attempt to standardize healthcare systems and interoperability (HL7) is an equally convoluted mess.

What is needed (but we won't get because of the players) is a standards driven process that is focused on building up a workable ecosystem for exchanging information.

It'd be hard to create a standard that lets a healthcare professional prove they are a provider, present electronic evidence (via a secure exchange) that they have a given patient, and then create a unique identifier for that patient/provider pair, but it is doable. Solve that, then move on to exchanging information between providers and allowing providers to mark information as not shareable with the patient or other providers. All of this could be done, and it would provide a real basis for a proper EMR system.
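As a purely hypothetical sketch of the identifier step (this is not any real EMR or HL7 standard, and every name and field here is invented for illustration): once provider and patient have authenticated through the secure exchange, a keyed hash over the pair gives a stable identifier that neither side can forge or reverse without the exchanged secret.

```python
# Hypothetical sketch: deriving a stable patient/provider identifier
# after a secure exchange. An HMAC keyed with a secret negotiated
# during that exchange means the identifier is deterministic for the
# pair but cannot be forged or reversed by outsiders. Not a real
# healthcare standard; field names are invented.
import hmac
import hashlib

def patient_provider_id(shared_key: bytes, provider_npi: str, patient_mrn: str) -> str:
    msg = f"{provider_npi}|{patient_mrn}".encode()
    return hmac.new(shared_key, msg, hashlib.sha256).hexdigest()

key = b"negotiated-during-secure-exchange"   # placeholder secret
pid = patient_provider_id(key, "1234567890", "MRN-0042")
print(pid[:16])  # stable identifier, truncated for display
```

The same inputs always yield the same identifier, so both parties can independently reconstruct it, while a different patient or provider produces an unrelated value.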

Comment It's all about the environment... (Score 1) 126

Seriously, I'm still waiting for a company that realizes having private offices plus collaborative spaces (you know, an old-school office) is the best way to go.

You need quiet to concentrate on a tricky problem. You have it. You need to get together as a team and work on something, you have it too. You have rooms with actual doors, you train people to use a proper conversational (not cell phone loud) tone and boom, productivity. Not chaos that mimics the appearance of creative work, but actual work.

Seriously, hire a developer for six figures and give him a few hundred bucks in desk space that doesn't even have four cube walls? That makes all the sense in the world, right? Argh.

Submission + - Interview: Ask Linus Torvalds a Question

samzenpus writes: Linus Torvalds, the man behind the development of the Linux kernel, needs no introduction to Slashdot readers. Recently, we talked about his opinion on C++, and he talked about the future of Linux when he's gone. It's been a while since we sat down with Linus to ask him questions, so he's agreed to do it again and answer any you may have. Ask as many questions as you'd like, but please keep them to one per post.

Submission + - Tech company finds stolen government log-ins all over Web

schwit1 writes: A CIA-backed technology company has found logins and passwords for 47 government agencies strewn across the Web — available for hackers, spies and thieves.

Recorded Future, a social media data mining firm backed by the CIA's venture capital arm, says in a report that login credentials for nearly every federal agency have been posted on open Internet sites for those who know where to look.

According to the company, at least 12 agencies don't require authentication beyond passwords to access their networks, so those agencies are vulnerable to espionage and cyberattacks.

Comment Makes sense... (Score 2) 124

The R&D department for this lives in Portland (Perceptive Pixel, acquired by MS). Plenty of room in Wilsonville. Power is still fairly cheap here (hydro power from the Columbia dams). So, yeah, it makes sense. Sure, milk it for media points, but in the end, it's just a business decision.

Comment Hope it pans out... (Score 4, Insightful) 82

Seriously, severe migraine sufferers and those who suffer from cluster headaches need all the tools we can give them. As noted, if you really read about cluster headaches, it is truly shocking. It is noted that sufferers are at high risk of suicide; after I read what they go through, I was surprised the rate is not even higher.

I suffered from migraines, but on the mild to moderate scale. I was lucky, I found a preventative regimen that works very, very well for me. For those with more severe cases, I do hope this is a successful treatment option.
