Parallelism -- the problem with parallelism is that everyone assumes that all problems can be decomposed into problems which can be solved in parallel. This is the "if all you have is a hammer, everything looks like a nail" problem. There hasn't been a lot of progress on the P vs. NP front, nor does it look like there's likely to be any soon, short of true quantum computing. And no, D-Wave fans: quantum annealing is not the same thing as collapsing the composite wave form into the correct answer because you happen to own the computer in "the most sincere universe".
Productive programming -- It's amusing that a semiconductor vendor would complain about programming productivity. The main barrier to programming productivity is that the silicon doesn't think about problem solving the way you have to think about problem solving in order to get a stepwise improvement. In other words: the chip vendors are making the wrong chips. This is really easy to see if you've done VLSI design in Verilog or VHDL, or even if you've only had to deal with an FPGA. The primary difference is that the chip folks never have to deal with "can't happen" states -- their silicon compilers simply ignore them, because you only ever correctly hook up a chip one way. Take a software engineer and have them code up a bit decoder in VHDL -- it's going to be 10 times larger than what a chip designer would produce, because the chip designer collapses the "don't care" states down to something reasonable.
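The "don't care" point above can be sketched in plain Python (my own illustration, not from the parent post), using a hypothetical 4-to-2 one-hot encoder. The defensive "software" version checks every can't-happen input; the "hardware" version treats the 12 invalid states as don't-cares, so the logic collapses to a couple of OR gates:

```python
# "Defensive software" style: every can't-happen input is explicitly handled.
def encode_checked(x):
    mapping = {0b0001: 0, 0b0010: 1, 0b0100: 2, 0b1000: 3}
    if x not in mapping:
        raise ValueError("not one-hot")  # the 12 "can't happen" states
    return mapping[x]

# "Hardware" style: invalid inputs are don't-cares, so the logic minimizer
# collapses the function to two OR gates. Garbage in, garbage out -- but tiny.
def encode_dont_care(x):
    b0 = (x >> 1 | x >> 3) & 1  # set for inputs 0b0010 and 0b1000
    b1 = (x >> 2 | x >> 3) & 1  # set for inputs 0b0100 and 0b1000
    return b1 << 1 | b0

# Both agree on every input that can actually happen.
for v in (0b0001, 0b0010, 0b0100, 0b1000):
    assert encode_checked(v) == encode_dont_care(v)
```

On the 12 invalid inputs the two functions diverge wildly, and the hardware version doesn't care -- which is exactly the freedom a software engineer writing HDL tends not to exploit.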
Other than that... interesting interview, even if it doesn't cover a lot of ground, overall.
Oh no, a system makes an improvement, but not a perfect, 100 percent improvement, so what, let's throw out the improvement it *does* make?
It's not an improvement across the board. It's likely not an improvement at all if you're listening to elevator music to make you calm enough to drive in the first place, and suddenly there's a startling "BRAAAAAAAAAAAAATTTTTTT!" that could just as easily have come out of the ambulance's horn, but didn't -- it came out of your radio.
Also: call me back when it can switch on a radio that's off, or force your stereo away from whatever you're listening to, over to the FM band, so the ambulance can scream at you even more than the flashing lights, siren, and horn are already screaming at you.
Also also: so I assume the computer in self-driving cars will now keep the FM radio tuned to NPR most of the time, so that the car's "driver" -- a computer that apparently likes "Lake Woebegone Tales" -- will "hear" the ambulance.
5G: 0 to data cap in 30 seconds! Now that's a fast connection!
It doesn't. Someone has to authorize it with the admin password.
Is this based on anything, or are you just guessing?
The article makes it clear that in order to extract and run the malware, you have to extract and install other malware named "Java".
This "Java" is apparently malware developed by a large database company in order to install security holes in otherwise secure computers, and is so named to trick tired programmers into believing that they are installing coffee.
Whoa, whoa, whoa... back up a bit with the gender mud, I'm pretty sure Obama wants to be called he.
Doesn't matter. They'll just be calling us all "cucks" since we don't beat our wives or espouse genocide against people of color.
As of 2004, there were ~530,000 deaf and hard-of-hearing people in Sweden (Encyclopedia of Deafness and Hearing Disorders, p.197.)
So roughly 5% of the population isn't going to hear the radio announcements, even if they have their radio on. Which they probably don't, or it's tuned to Sirius Satellite or plugged into their iPod/iPhone.
About as useful as touch screens for amputees whose prosthetic hands can't capacitively couple with trackpads or iPhones...
This is silly. It's like saying the nation's librarians need to teach kids to perform appendectomies, or how to fly a jet airplane, or how to speak Swahili. There's no way that the majority of librarians are qualified to teach programming. If they were, they probably would be doing something related to writing software and not related to library science. And learning to code is no different than learning to engineer a bridge or learning to perform brain surgery. It requires aptitude in the student and competency in the teacher and years of hard work. Trivializing "coding" as if it were something like "typing" or "burger flipping" shows how out of touch the people proposing this actually are. Shame on them for wasting our time and money.
In other news... boomers 20% more valuable at every stage of life than millennials.
"Obama Promises, Including Whistleblower Protections, Disappear From Website"
Who else read this as... "Obama OK's 16 more agencies to engage in domestic spying, in addition to the NSA, which already engaged in the practice"?
You need tender loving care once a week - so that I can slap you into shape. - Ellyn Mustard