Maybe the drones can fly around picking up rubbish. That's one thing the side roads are filled with where I live.
"only a tiny percentage of kids have a natural aptitude for it."
You have it completely backwards. Coding is like playing the guitar: nearly anyone can do it to a satisfactory level.
"The average Global Warming advocate that would call you an industry shill, doesn't even know what the Garbage Patch is"
Brah, do you even THINK?
What exactly is coding talent?
I'm being a bit coy, but mostly to spur discussion: I've been coding since the late 70's, and I think of coding like playing guitar: just about anyone can do it to a reasonable level, most people think they're rockstars, but only a handful really are.
When I was first interviewing for jobs circa 1990 there weren't many people who knew x86 protected mode, so there was always work writing hardware drivers. I was mediocre, I'll admit it, and so were most of my peers, but we got the job done.
Today there are literally thousands of languages, frameworks and tools depending on the application. Ironically, "talent" seems largely the same today as it was in the 80's: if you understand the unique collection (and versions!) of tools a company uses, you're in.
When I hired programmers in the 90's and 00's it was clear some folks got it, and some folks didn't. But even the folks that didn't still got high-paying jobs.
So it really raises the question: what is talent? How do you measure it, and how much do you need? Finding talent means rating talent, and therein lies a loaded debate.
ITT: people bragging about not watching TV for a decade...
Also, I forgot to mention, Puppet Labs' automated IT config is pretty effing amazing. I've been trying to dedicate time to ramping up on it, and have been to a few classes at their office in Portland, but it is definitely still on my list of tools to learn.
A bit tongue-in-cheek, but...
package.json + npm install is a lot easier than dealing with zypper, yum, rpm, and the 30 other package managers I'm forced to juggle on all the different distros I encounter. Obviously I'm brainwashed, but I've been 100%* node for over a year.
Granted, setting up ___sql will pretty much always be a one-hour job; I'm glad to be free of the A and P in LAMP.
* except when a new contract requires me to dive into LAMP again
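For anyone who hasn't seen it, this is roughly what that convenience looks like. A minimal, hypothetical package.json (the project name and dependency version are placeholders of my own, not from any real project):

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.17.0"
  }
}
```

With that one file checked in, `npm install` pulls the same dependency tree on any distro, no zypper/yum/rpm involved.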
Thanks for clarifying. I misunderstood in my original post and went "full-rant".
This whole discussion just made me laugh whilst remembering the hype around the Transmeta / Torvalds code-morphing engine.
Ah, the 90's. They were fun.
CPUs have been "general purpose" since day one. The only non-general-purpose hardware is ASICs (as the article says). Everything else is just marketing hype from Intel et al.
This is such an amazing rehash of what Intel used to call the *T technologies in the 90's. It goes back to the 80's, when coprocessors started appearing (the x87). The big trend in the 90's was supposed to be DSPs, but that never happened; instead they pushed new hardware like MMX, SSE, and now vector processors. That's why we have graphics processors as non-general-purpose CPUs.
To call something a GPGPU is just an egregious assault on common sense.
"Dark silicon", while a catchy name, is simply a side effect of latency, something the article mostly skips (hints at it with locality): the memory hierarchy exists and dark silicon is a result. When latency is zero, more of the silicon will be engaged.
While one could easily claim that because parts of any chip power down, the chip is not general purpose, that's an oversimplification: 100% utilization is fundamentally impossible because problems aren't solved that way; there is no infinite parallelism.
I really think the author's analysis isn't fully developed. While the conclusion that the hardware looks like the software may be a pleasant tautology, it overlooks Turing's thesis entirely, which is odd, because that's what the author *started* with!
Bank account or debit card?!? That's audacious.
Wow. I'm happy with my old fashioned pieces of plastic.
I have 22 years of Quicken data, 20 years with a credit card, and only 3 fraudulent transactions out of ~16,000.
Financial security: check.
Personal information security: ah jeez....
Judging by the # of people on social networks using their real names, I suspect the vast majority of the world will trust the corporation giving you something for "free".
What are these "checks" you speak of?
Seriously, I've written one check in 7 years. I thought people under 50 pretty much stopped using them since credit cards and electronic (non-check) transfers are so easy.
Validation is way more important than writing code. Coding is grunt work that almost anyone can do. There is huge demand for programmers, and very few are "good" programmers; 90% are grunts who will never get any better, and that's life, given the demand. So you need validation. I wrote and managed RTL development for 15 years at Intel, and code coverage is simply mission critical. There is no way around it.
If you think being able to "read code" is enough to see all the corner cases, you're either very young, or one of the aforementioned grunts.
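To make the point concrete, here's a toy example of my own (not from any real RTL flow, just an illustration): a function that reads as obviously correct, yet has a corner case you only notice by actually exercising it.

```javascript
// A function that "reads fine" -- most reviewers would wave it through.
function clamp(value, lo, hi) {
  // Clamp value into the range [lo, hi].
  if (value < lo) return lo;
  if (value > hi) return hi;
  return value;
}

// The happy paths behave exactly as a code read predicts:
console.log(clamp(5, 0, 10));   // 5
console.log(clamp(-1, 0, 10));  // 0

// Corner case: a caller that swaps the bounds (lo > hi) gets lo back
// silently instead of an error. Only validation surfaces this.
console.log(clamp(5, 10, 0));   // 10
```

The bug isn't in the lines you read; it's in an input combination you didn't think to read *for*. That's the gap coverage is meant to close.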
I'd hire the person in the blink of an eye. That kind of discipline is sorely missing among younger programmers these days.
It would be a huge help to the community if you would read the paper and point out where the study's methods, analysis, or computations are flawed. You let on like you know quite a bit about this.