Comment That's good, but EHRs are needed... (Score 1) 111

Look, the truly awful, horribly expensive solutions that lock people into insanely overpriced development projects are bad. Federal-investigation-for-ripping-people-off bad. No question. The very few hospital systems that built their own home-grown systems did okay and still do.

But, the law had a purpose. Not having access to a comprehensive medical record causes injury and death from decisions made without the full picture. That's a fairly well-researched fact. But nothing about the current systems addresses that need in any real way. Frankly, vendors have made claims that are (in my mind) almost criminally false.

It'd be nice to point and say that engineers or programmers are at fault, but if you look at those vendors and the medical informatics field in general, the people making the decisions are often doctors, nurses and other health professionals. The field is littered with those who are unhappy practicing medicine and think they can be software engineers and researchers instead, and the result is an unspeakable mess.

The first step is that doctors, nurses, and so on will have to work with software engineers, system analysts, interaction designers and so on as peers. Not as contractors, not as subordinates, but as peers. And there are way too few doctors, etc. that can accept that a set of programmers can have just as much of an impact on the health of our population as they can. For better or for worse. And, yes, sadly, too many engineers have too much hubris and disrespect for how hard the care of other human beings truly is.

And, until we in America accept that access to universal, affordable healthcare is a fundamental right, we won't look far enough past profit to make a difference anyway. So, in the meantime, people will get needlessly hurt, will needlessly suffer and needlessly die.

Comment Waste is still a problem... (Score 1) 645

While there are a lot of newly proposed reactor designs, the nature of nuclear energy is that there is waste. Waste that is insanely toxic, some of it for an incredibly long time. In the end, all nuclear energy creates a very, very long-term problem that is much harder to address. Building a reactor is fairly easy compared to figuring out what to do with what it leaves behind.

Comment It is food for thought... (Score 1) 100

The basic point of the article is dead on. The assumption that I/O is extremely slow has driven the organization of computer architecture from the beginning. But, as the article notes, in the last few years that equation can change drastically. The memory hierarchy is going to get more complicated: DRAM, NVDIMM, NVM, SSD, HD, optical/tape, and using that hierarchy well means changes need to be made.

For one, I think there will be a lot of research in this area. Just as modern network cards do a lot of processing before involving the CPU, other devices may need similar abilities: letting the network and I/O devices work with each other with much less CPU intervention, or making I/O controllers smart enough that computation can be moved to the disks. And then there's the question of how best to organize operating systems to use this new memory hierarchy.
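
As a small, concrete illustration of storage starting to look more like memory, here's a minimal Java sketch (the file name is made up) that memory-maps a file so access looks like loads and stores rather than explicit read()/write() calls:

    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;

    public class MappedExample {
        public static void main(String[] args) throws Exception {
            // Open (or create) a file and map 4 KiB of it into the address space.
            try (FileChannel ch = FileChannel.open(Paths.get("data.bin"),
                    StandardOpenOption.CREATE, StandardOpenOption.READ, StandardOpenOption.WRITE)) {
                MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_WRITE, 0, 4096);
                buf.putLong(0, 42L);              // looks like a memory write...
                buf.force();                      // ...but can be flushed to persistent storage
                System.out.println(buf.getLong(0));
            }
        }
    }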

Comment Widely used, but vulnerable. (Score 1) 358

The Java language and frameworks still have their fans, but the platform has ground to a halt and ossified into an overly rigid, verbose environment in which it takes increasingly long to get anything done compared to the alternatives.

Type erasure was hailed as the right thing. Time has shown it was not, not at all. C#, Swift and Scala have all shown that proper generics really make for better APIs and programming models. More and more programmers are comfortable with higher-order programming, and Java is just behind in this respect, even with regard to C++14.
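
A quick, minimal sketch of what erasure costs you:

    import java.util.ArrayList;
    import java.util.List;

    public class ErasureDemo {
        public static void main(String[] args) {
            List<String> strings = new ArrayList<>();
            List<Integer> ints = new ArrayList<>();
            // At runtime both lists are the same class; the type parameter is
            // erased, so the VM can't tell them apart.
            System.out.println(strings.getClass() == ints.getClass()); // prints true
        }

        // And you can't write things like this, because T doesn't exist at runtime:
        // static <T> T[] newArray(int n) { return new T[n]; }   // does not compile
    }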

Build tools are still a hodgepodge of Ant, Maven and Gradle. Maven can make even the worst MSBuild file seem reasonable. And package management is just a mess. Even Microsoft is managing to use NuGet to make projects more manageable and modular. And Java, well, its module improvements will be too little, too late. Java projects are just huge monoliths that are hard to maintain and improve.

The JCP has ground to a complete halt, and once-leading-edge frameworks have lagged in terms of new ideas and innovation. I know Java programmers who loathe using relational databases. Object-relational mapping was once a strength of Java, but things like JPA have made it verbose and clunky. Entity Framework just makes anything in the Java space look outdated.
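
For a sense of what I mean by verbose, here's roughly what a trivial JPA entity and query look like (a minimal sketch; the Customer entity and its fields are made up, and the persistence.xml, EntityManagerFactory and transaction plumbing are left out):

    import java.util.List;
    import javax.persistence.Entity;
    import javax.persistence.EntityManager;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;

    @Entity
    class Customer {
        @Id @GeneratedValue Long id;
        String name;

        protected Customer() {}                  // JPA requires a no-arg constructor
        Customer(String name) { this.name = name; }
    }

    class CustomerQueries {
        // Caller supplies an EntityManager obtained from an EntityManagerFactory.
        static List<Customer> byPrefix(EntityManager em, String prefix) {
            return em.createQuery(
                    "SELECT c FROM Customer c WHERE c.name LIKE :p", Customer.class)
                .setParameter("p", prefix + "%")
                .getResultList();
        }
    }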

It is true that the JVM itself has proven to be very effective, and its use will continue. But it has improving contenders in LLVM and the CLR (via .NET Core). And in many ways, Java holds the JVM back from evolving (invokedynamic is a great example).

Between Scala, Clojure and C#, there are better alternatives to Java in the enterprise space. And in mobile, Google has no reason to keep using Java. Swift has shown that developers want better, more expressive languages, and Google has plenty of options to pick from.

The reasons for using C and C++ haven't changed. The domains they serve, they serve well, and that will continue. But there are fewer and fewer reasons to use Java, and it is much easier to replace and displace than COBOL and related mainframe tech.

Comment Re:NoSQL is amateur land. (Score 1) 96

I have to agree. It's crazy how far people go to avoid dealing with relational databases in OO languages. Sure, things like Hibernate/JPA can be clunky, but there are better models out there.

I can't count how many times I've had to explain to people that, because we are using .NET, it's really not hard to use relational databases, and we don't need all those things the NoSQL crowd thinks are revolutionary and groundbreaking. LINQ has been working fine for years, and when it comes down to it, you've got SQL as an option too.
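
And plain SQL from application code really is just a few lines. The original point here is about LINQ on .NET, but for consistency with the rest of these comments here's the same idea sketched in Java with JDBC (the connection string, credentials and table are all invented, and it assumes a JDBC driver on the classpath):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class PlainSql {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:postgresql://localhost/orders";   // made-up connection string
            try (Connection conn = DriverManager.getConnection(url, "app", "secret");
                 PreparedStatement ps = conn.prepareStatement(
                         "SELECT id, total FROM orders WHERE total > ?")) {
                ps.setBigDecimal(1, new java.math.BigDecimal("100.00"));
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getLong("id") + " " + rs.getBigDecimal("total"));
                    }
                }
            }
        }
    }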

Does anybody remember when the NoSQL people swore they would never create new query languages, that they weren't reinventing the wheel, that this was an evolution? Guess what: everybody is now inventing new query languages and rediscovering ideas and features that relational databases have had for ages.

Comment A start, but not enough (Score 1) 52

The NIH and NSF budgets would need a much larger bump to really kick off major new initiatives, much less restore funding to useful programs. Translational medicine research programs have stalled, and major disease foundations are having to fund tons of foundational work. There are also no "moonshot"-type projects: for example, setting a goal of a battery with 50% more capacity at the same weight (versus the current best technology), notable gains in wind or solar efficiency, massive improvements to the power grid, and so on.

Comment Re:Intel not the only factor... (Score 1) 225

I disagree. AMD played the largest role by failing to press the advantages it had in growing markets up through the K10 microarchitecture (2006/2007). The ATI acquisition split their focus and reduced their resources for CPU microarchitecture, as well as tying up a lot of cash. The APU idea didn't play out nearly as well as it needed to. Then came the spinoff of Global Foundries and some process misses. Then Bulldozer. Intel killed NetBurst pretty fast. They had to; AMD was beating them on all fronts. But they got Core online pretty quickly, then Nehalem. Bulldozer was introduced in 2011, and Zen won't be online until 2017 at the earliest. So Intel executed and AMD didn't. In the end, the antitrust settlement was a moot point. The damage was done, and the biggest blows were self-inflicted.

Comment Intel not the only factor... (Score 3, Interesting) 225

Sure, there was lots of controversy over their actions in the late '90s and early 2000s, but by 2005 Intel had recovered from the mistakes made with NetBurst. Starting with the Core microarchitecture, Intel made very strong advances in process and real gains in their CPU architectures in both the consumer and server spaces. AMD got distracted with the APU designs and made a huge misstep with the Bulldozer line; I think the ATI acquisition was a distraction as well. Meanwhile, Sandy Bridge was in place and allowed Intel to make gains all around. By the time Haswell arrived, their entire lineup was solid. They had the core counts to match the high-end Opterons, they were pushing ahead on virtualization (VT-d, APICv), and AMD was, and is, in a rough spot.

Zen needs rough parity with Skylake for AMD to regain market share, and that's a tough task. Intel also has major process advantages: they are at 14nm already, which helps keep yields up as transistor counts (core counts) rise. AMD does have an advantage in the all-in-one market and does very well in the budget segments. We will see whether their ARM-based assets pay off, but it's going to be tough going for AMD with Intel on one side and Nvidia on the other.

Comment News for (computer architecture) nerds... (Score 3, Interesting) 179

While supercomputing is a very small section of the computing world, it's not that hard to understand.

First of all, this would make for a terrible graphics card. The Phi (deliberately) sits between a CPU and a GPU. Each Phi core has more branching support, more memory/address space, more complex instructions, etc. than a GPU core, but is still simpler than a Xeon core (though with wider SIMD paths).

A GPU has many more cores, each with a much more limited set of operations, which is what is needed for rapid graphics rendering. But those limited sets of operations can also be very useful in scientific computing.

I haven't seen anybody try a three-pronged approach (CPU/Phi/Nvidia Tesla), but I will admit I didn't look very hard. This is all in the name of solving really big problems.

Comment Re:CISC? (Score 3, Informative) 179

Kind of. The advantages of RISC faded pretty fast. The difference in decoder footprint between something like x86 and, say, ARM is really not that much, and the decoder is just a small part of a core these days. Clock speed is an issue of thermal footprint. So all the disadvantages of x86 (and its x64 extensions) faded in the face of Intel's focus on process improvements. In the end, not even Itanium could eke out enough of a win to dethrone the x86 architecture.

Comment Re:His analysis is wrong (Score 1) 208

Actually, the average-case estimate is correct and is the one most commonly used. Your statement that hash insert and find are really O(N) or O(log N) is not true at all. Two things: first, a perfect hash function for the integers in a given, known set is pretty trivial. Given that, O(1) worst-case time is a given.

More generally, a good (e.g., cryptographic) hash will have very few collisions in practice, so it's not hard to create a hash table that really does perform in constant time in essentially all cases. The tradeoff is in space complexity.

In this case, using a hash table is indeed the best solution. Your claim that hash insert and find aren't O(1) reflects only a textbook-basic view of hash tables. Modern implementations avoid degenerate performance with resizing/rehashing, open-addressing schemes and so on.
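
To make the integer case concrete, here's a minimal sketch of a direct-address table for keys known to lie in a bounded range (the capacity below is arbitrary); insert and lookup are each a single array access, so O(1) even in the worst case:

    // Direct addressing: for keys known to lie in [0, capacity), the key is its
    // own perfect hash, so insert/contains are one array index each.
    public class DirectAddressSet {
        private final boolean[] present;

        public DirectAddressSet(int capacity) {
            present = new boolean[capacity];
        }

        public void insert(int key)      { present[key] = true; }  // O(1) worst case
        public boolean contains(int key) { return present[key]; }  // O(1) worst case

        public static void main(String[] args) {
            DirectAddressSet set = new DirectAddressSet(1_000);
            set.insert(42);
            System.out.println(set.contains(42)); // true
            System.out.println(set.contains(7));  // false
        }
    }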

Comment First things first... (Score 1) 696

Which is convincing our field that diversity helps, and that the lack of it has a real impact on the people in the field and on the society we contribute to.

Because if you look at discussions like this, too few are willing to admit it is an issue. Too many get defensive and throw up a ton of unrelated issues.

Because it's scary to admit you may not be as rational as you think you are, that you act on societal biases like most people. Even worse, it's really hard to sit with the doubt that you may not actually be a very good person, and that all your technical skills and accomplishments do nothing to change that.

That's what this is about. Addressing diversity issues makes things better for everybody. But no, too many steadfastly refuse to accept anybody else's perspective if it is different from their own.

Look at this discussion. Not one well-moderated comment had anything that remotely looked like an answer to the question posed in the post itself. Not surprising, because it is clear that not enough people see the issues that do exist and that wouldn't be that difficult to solve.

From this, it is clear that there are plenty of programmers who are threatened by the mere idea that they, as men, could have to work with (or be replaced by) a woman who is just as skilled as they are. This is an ages-old theme that plays out over and over again in every aspect of society.

The people who do really well welcome the challenge as an opportunity to improve and learn from all sides. I saw it in the CS lab when I was in school. Most were clueless, many resentful, but the really smart ones just worked *with* the women in our class. They didn't tell them what to do, didn't judge, but just collaborated.

To this day, I know a few that resent that the women in our class ended up in the positions they did (most notably, one at JPL working on the Pathfinder mission). The simple fact is that she was that good and they weren't.

My career isn't going well at all, but I'm not going to blame that on so-called social justice warriors and affirmative action just to feel better, easy as that would be. One of the PhD students in my cohort who worked with my advisor was a woman. She got a faculty position somewhere, I didn't. In the end, I didn't set myself up as well as I thought I had, and she did better. End of story.

Making yourself feel better by dismissing progress as the work of out-of-control protestors (SJWs) or as affirmative action run amok doesn't actually work in the long run.

Comment There's more to desktops than Gaming... (Score 1) 350

Sure, PC gaming tends to drive the high-end market, but there are people who can use as much memory as they can get.

I just finished a new PC build. When I was looking at DDR4 memory, I decided to spend an extra $180 to go from 32 GB to 64 GB. Here's the thing: 32 GB of DDR4-2133 memory was $180. Memory is not nearly as expensive as PC manufacturers make it seem.

I know gamers are really obsessed with memory speeds and will pay a very large premium for higher clock speeds on everything. But, some of us do development and other PC tasks and need all the memory for VMs and so on, and we don't overclock because we choose stability over trying to get the last extra FPS out of a game.

Comment Re:Doesn't surprise me (Score 1) 378

Um, I'm not sure what laptops/desktops you are using, but all my laptops (from various makers, as well as custom desktops) work just fine with hardware sleep modes. I've had my Dell sleep and wake more than a thousand times with no problems. Sure, I do have to shut down and restart for updates, but I'll go through multiple sleep/wake cycles in a day, no problem. It can be done.
