ITT: people bragging about not watching TV for a decade...
Also, I forgot to mention: Puppet Labs' IT automation and config management is pretty effing amazing. I've been to a few classes at their office in Portland, but I haven't yet managed to dedicate real time to ramping up on it; it's definitely on my list of tools to learn.
A bit tongue-in-cheek, but...
A package.json plus `npm install` is a lot easier than dealing with zypper, yum, rpm, and the 30 other package managers I'm forced to juggle on all the different distros I encounter. Obviously I'm brainwashed, but I've been 100%* node for over a year.
Granted, setting up ___sql will pretty much always be a one-hour job, but I'm glad to be free of the A and P in LAMP.
* except when a new contract requires me to dive into LAMP again
I ask, as my computer churns away deleting the millions of temp files that a buggy printer subsystem created.
Stupid software must have been doing what its programmer told it to do instead of doing what its programmer intended it to do. Is the alternative, perfectly bug-free software, almost here yet? If not, then it's not silly to worry about what happens when software has write access not only to
"Self-interest" is an instrumental goal toward any terminal goal whatsoever, because "I want X" implies "I want to help with X" for any goal set X the AI can positively affect, and "I want to help with X" entails "I want to exist". You can avoid this by creating software which isn't smart enough to independently identify such obvious subgoals, but then calling the result "AI" is a bit of a stretch.
Thanks for clarifying. I misunderstood in my original post and went "full-rant".
This whole discussion just made me laugh whilst remembering the hype around the Transmeta / Torvalds code-morphing engine.
Ah, the 90's. They were fun.
CPUs have been "general purpose" since day one. The only non-general-purpose hardware is ASICs (as the article says). Everything else is just marketing hype from Intel, et al.
This is such an amazing rehash of what Intel used to call *T technologies. It goes back to the '80s, when coprocessors (the x87) first appeared. The big trend in the '90s was supposed to be DSPs, but that never happened; instead they pushed new hardware like MMX, SSE, and now vector processors. That's why we have graphics processors as non-general-purpose CPUs.
To call something a GPGPU is just an egregious assault on common sense.
"Dark silicon", while a catchy name, is simply a side effect of latency, something the article mostly skips (it hints at it with locality): the memory hierarchy exists, and dark silicon is the result. If latency were zero, more of the silicon would be engaged.
While one could easily claim that a chip isn't general purpose because parts of it power down, that's an oversimplification: 100% utilization is fundamentally impossible because problems aren't solved that way; there is no infinite parallelism.
I really think the author's analysis isn't fully developed. While the conclusion that the hardware looks like the software may be a pleasant tautology, it overlooks Turing's thesis entirely. Which is odd, because that's what the author *started* with!
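The "no infinite parallelism" point is basically Amdahl's law. A minimal sketch (my own illustration, not from the article):

```python
def amdahl_speedup(parallel_fraction, n_units):
    """Amdahl's law: overall speedup is capped by the serial fraction,
    no matter how many parallel units (or how much silicon) you add."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_units)

# Even if 95% of the work parallelizes, a billion units tops out near 20x,
# so most of the extra hardware sits dark.
print(round(amdahl_speedup(0.95, 10**9), 2))  # → 20.0
```

That 20x ceiling is why throwing more transistors at a fixed workload stops engaging them past a point.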
Bank account or debit card?!? That's audacious.
Wow. I'm happy with my old fashioned pieces of plastic.
I have 22 years of Quicken data, 20 years with a credit card, and only 3 fraudulent transactions out of ~16,000.
Financial security: check.
Personal information security: ah jeez....
Judging by the number of people on social networks using their real names, I suspect the vast majority of the world will trust the corporation giving you something for "free".
Checks?
What are these "checks" you speak of?
Seriously, I've written one check in 7 years. I thought people under 50 pretty much stopped using them since credit cards and electronic (non-check) transfers are so easy.
Validation is far more important than writing code. Coding is grunt work that almost anyone can do. There is huge demand for programmers, and very few are "good" programmers; 90% are just grunts who will never get any better, and that's life given the demand. So you need validation. I wrote and managed RTL development at Intel for 15 years, and code coverage is simply mission critical. There's no way around it.
If you think being able to "read code" is enough to see all the corner cases, you're either very young, or one of the aforementioned grunts.
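To make the corner-case point concrete, here's a hypothetical toy function (my own example, nothing to do with RTL): on a casual read it looks fine, but coverage-driven tests force you to exercise the branches a reviewer skims past.

```python
def clamp_average(values, lo, hi):
    """Average `values`, clamped to the range [lo, hi]."""
    if not values:                 # the corner case a quick read misses:
        return lo                  # without this guard, sum()/len() divides by zero
    avg = sum(values) / len(values)
    return max(lo, min(hi, avg))

# Tests written to cover every branch, not just the happy path:
assert clamp_average([5, 7], 0, 10) == 6     # normal case
assert clamp_average([100], 0, 10) == 10     # clamped at the top
assert clamp_average([-5], 0, 10) == 0       # clamped at the bottom
assert clamp_average([], 0, 10) == 0         # empty input
```

A coverage report would show the empty-input branch unexecuted by happy-path tests, which is exactly the signal "reading the code" doesn't give you.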
I'd hire the person in the blink of an eye. That kind of discipline is sorely missing among younger programmers these days.
It would be a huge help to the community if you would read the paper and point out where the study's methods, analysis, or computations are flawed. You lead on like you know quite a bit about this.
Pssst... this is
#gamergate is over thataway, young man.
I could harvest 5M Gmail addresses from Google searches, then publish them with bogus passwords and create a panic. Is there some statistic on how many of these were real passwords? Because wouldn't it be illegal to use them (accessing another person's account without their permission is a crime in the USA)?
Seems like it would be easy to manufacture a lot of FUD by making these claims without really having any passwords at all, and no one could verify it.
Anything free is worth what you pay for it.