
Comment Re:Bandwagon (Score 1) 311

I'm one of those people who just listens to opinions and picks the best-sounding one with very little thinking, mostly emotion. And I'm OK with that, because it has served me well for 54 years. Caring about politics is as dumb now as it was the first time I voted almost 40 years ago, and not having my own opinion hasn't changed dick.

Comment Re:Weak Premise (Score 1) 398

What do you mean "best people" and "best universities"?

Why do you assume the best people come from the best universities?

Did you come from the "best" university?

If not, does that mean you're not the "best"? So you don't deserve a job?

If yes, then no one else but your peers from the "Best" universities deserve to be hired?

You make a lot of assumptions: a "best" university, everyone else is not the "best", even the idea of "best" ... this makes for a meaningless argument.

You're arguing that only the "top" people should have jobs, which is utterly laughable, because (a) good luck measuring that, and (b) it would ultimately mean only one person deserves a job.

This is a big part of the meritocracy myth that drives inequity. And you buy it hook, line, and sinker.

Comment Why so angry? (Score 0) 398

The reason why Silicon Valley is struggling is very clear: look at the rage in this thread. These are the same people who think diversity is bad because minorities are too stupid to be a part of technology. They're still humping the meritocracy myth.

If you are angry, it means you are smart enough to know they are right, but too worried about your identity to do anything about it. And it is easy for you to do nothing, because it doesn't affect you. But it's not about making you look bad; it's about helping people who don't have the advantages you were born with.

Realizing you're acting in a way that makes life harder for strangers doesn't make you a bad person; it's what makes you wise.

Comment Define Coding Talent (Score 1, Interesting) 23

What exactly is coding talent?

I'm being a bit coy, but mostly to spur discussion: I've been coding since the late 70's, and I think of coding like playing guitar: just about anyone can do it to a reasonable level, most people think they are rock stars, but only a handful really are.

When I was first interviewing for jobs circa 1990 there weren't many people who knew x86 protected mode, so there was always work writing hardware drivers. I was mediocre, I'll admit it, and so were most of my peers, but we got the job done.

Today there are literally thousands of languages, frameworks and tools depending on the application. Ironically, "talent" seems largely the same today as it was in the 80's: if you understand the unique collection (and versions!) of tools a company uses, you're in.

When I hired programmers in the 90's and 00's it was clear some folks got it, and some folks didn't. But even the folks that didn't still got high-paying jobs.

So it really raises the question: what is talent, how do you measure it, and how much do you need? Finding talent means rating talent, and therein lies a loaded debate.

Comment LAMP? There's your problem... (Score 0) 136

A bit tongue-in-cheek, but...

package.json + npm install is a lot easier than dealing with zypper, yum, rpm, and the 30 other package managers I'm forced to juggle on all the different distros I encounter. Obviously I'm brainwashed, but I've been 100%* node for over a year.
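To illustrate what I mean, here's a minimal sketch of that workflow (the package name and dependency are made up, not from any real project): declare everything in one file, and a single command fetches it on any distro.

```json
{
  "name": "example-app",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.18.0"
  }
}
```

With that in place, `npm install` pulls the dependency tree into `node_modules` identically on openSUSE, Fedora, Debian, or anything else with node, which is exactly the per-distro juggling I'm glad to skip.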

Granted, setting up ___sql will pretty much always be a one-hour job, but I'm glad to be free of the A and P in LAMP.

* except when a new contract requires me to dive into LAMP again

Comment Transmeta (Score 2) 181

This whole discussion just made me laugh whilst remembering the hype around the Transmeta / Torvalds code-morphing engine.

Ah, the 90's. They were fun.

CPUs have been "general purpose" since day one. The only non-general-purpose hardware is ASICs (like the article says). Everything else is just marketing hype from Intel, et al.

This is such an amazing rehash of what Intel used to call its *T technologies in the 90's. The trend really goes back to the 80's, when coprocessors started appearing (the x87). The big push in the 90's was supposed to be toward DSPs, but that never happened; instead they kept bolting on new hardware like MMX, SSE, and now vector units. That's how we ended up with graphics processors as non-general-purpose CPUs.

To call something a GPGPU is just an egregious assault on common sense.

"Dark silicon", while a catchy name, is simply a side effect of latency, something the article mostly skips (hints at it with locality): the memory hierarchy exists and dark silicon is a result. When latency is zero, more of the silicon will be engaged.

One could easily claim that because parts of any chip power down, it isn't general purpose, but that's an oversimplification: 100% utilization is fundamentally impossible, because problems aren't solved that way; there is no infinite parallelism.
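A back-of-the-envelope way to see this (my framing, not the article's) is Amdahl's law: any serial fraction in a workload caps speedup, so as you add cores, the fraction of silicon doing useful work at any instant collapses. The 5% serial fraction below is an arbitrary illustrative number.

```python
# Amdahl's law: speedup with n cores when a fraction of the work
# cannot be parallelized. Utilization = speedup / cores, i.e. the
# share of the chip actually doing useful work at once.
def amdahl_speedup(serial_fraction, n_cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

for cores in (8, 64, 1024):
    s = amdahl_speedup(0.05, cores)  # assume 5% of the work is serial
    print(f"{cores:5d} cores: speedup {s:6.2f}x, utilization {s / cores:6.2%}")
```

Even with only 5% serial work, speedup is capped near 20x, so at 1024 cores under 2% of them are busy on average; the rest is, in effect, dark.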

I really think the author's analysis isn't fully developed. While the conclusion that hardware looks like the software may be a pleasant tautology, it overlooks Turing's thesis entirely. Which is odd, because that's what the author -started- with!
