Comment Have times changed that much? (Score 1) 280

It used to be that any degree would get your "foot in the door" with HR. Some of the best programmers I worked with over the years had degrees in English, Philosophy, and even History.

University teaches you how to learn new material, how to prioritize it, how to summarize it, and how to find the meat amid the chaff. It does not teach you how to program. While there are benefits to knowing computing theory, it's not theory that gets the job done -- experience does.

I'm surprised you're having such a tough time finding work if you're actually good at programming. Perhaps it's the way you're presenting yourself in your resume, because, as I said, it doesn't really matter what your degree is in for getting your foot in the door.

Comment And 1...2...3... (Score 0) 433

And in 1...2...3...

Cue all the math junkies who claim there is "proof" you can't hear the difference between a 44.1 kHz/16-bit audio stream and higher-quality formats like 192 kHz/24-bit or analogue -- because the math "proves" that the thousands upon thousands of people who claim to hear a difference are "delusional liars."
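
For reference, the math being invoked is just the Nyquist limit (half the sample rate) and the rule-of-thumb dynamic range of an ideal converter (about 6.02 dB per bit plus 1.76 dB). A minimal sketch of that arithmetic, purely illustrative (the describe helper is made up for this example):

    #include <stdio.h>

    /* The two figures usually quoted in these debates:
     *  - Nyquist limit: highest representable frequency = sample rate / 2
     *  - Dynamic range of an ideal N-bit quantizer: ~6.02*N + 1.76 dB     */
    static void describe(double rate_hz, int bits)
    {
        printf("%.1f kHz / %d-bit: Nyquist %.2f kHz, ~%.1f dB dynamic range\n",
               rate_hz / 1000.0, bits, rate_hz / 2000.0, 6.02 * bits + 1.76);
    }

    int main(void)
    {
        describe(44100.0, 16);   /* CD audio     */
        describe(192000.0, 24);  /* "hi-res" PCM */
        return 0;
    }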

I am neither delusional nor a liar. I hear the difference. It's clear as night and day.

Comment Re:Doubt it (Score 1) 299

I disagree completely. Good science fiction has never been about the technology, but about human and alien personalities and moral questions brought about by the technology. Good science fiction explores interpersonal relationships, character traits, philosophical stances, and other such subject matter.

The science fiction of the mid-to-late '80s made good movies because the directors and scriptwriters were selecting stories with deep connotations, instead of viewing them with an eye toward turning them into CGI action flicks that emphasize trivia like "the technology" instead of the plot.

There is a tremendous amount of good science fiction, written over the years, that would make terrific movies. But Hollywood won't back those "risks" -- they're too busy investing in action movies pretending to be science fiction. There are exceptions, but for the most part you know it's true: Hollywood doesn't want to discuss morality, philosophy, and personal interactions in a script. They want a nice "safe" piece of pablum that will make audiences go "ooh" and "aah" over mindless special F/X, not cause them to think for themselves.

Comment The problem has never changed (Score 2) 241

The problem has been the same since the PC first came out: users can "do things" with a PC/laptop/smartphone/tablet and think that "doing things" makes them experts on IT. So when they come up with a "great idea for a new application," they cannot and will not fathom that it can take months or years to implement, will cost hundreds of thousands if not millions of dollars, and will be obsolete before it ever hits production because of changing business needs.

There is no cure for the "wisdom" of people who tell you how to do your job, or how their 14-year-old nephew could write the application in a few weeks. They've made up their minds that you're just a lazy SOB trying to milk the company for money and a cushy job, and they will never, ever understand just how much effort goes into security, design, testing, porting, etc. To them, everything is "easy."

The real problem is that companies let such users and managers make business decisions based on "gut instinct" instead of properly planned and projected schedules. Because heaven forbid you should ever tell the marketing manager that he can't have his shiny SharePoint solution because it doesn't provide anything useful to the company that couldn't be accomplished with a properly organized set of folders on a shared drive/server somewhere.

No, they're the ones who sign off on the budgets, and they're the ones who like the "shiny," so you're the one who gets stuck trying to make the shiny work with all the line-of-business systems that actually matter to the operation of the business.

And if you even hint that you can't do it, well, there's a company overseas that's promising to do it in a month as an offshore service, so you're fired.

Which, in a nutshell, is how the bean counters and their ilk get away with their bad business decisions: they constantly hold the threat of offshoring and termination over your head to beat Mr. IT into submission.

Comment Intelligence does not imply volition (Score 1) 417

Artificial Intelligence does not imply volition. I know of no reason to expect an early AI to have a will, or to come up with results except in response to the events and information it's designed to respond to. While some might try to simulate the volition of a living entity, I don't feel such a component is necessary for something to qualify as an Artificial Intelligence.

Artificial Intelligence just means artificial thought about something: sufficient understanding of the subject matter to reach conclusions and produce outputs relevant to what is known or implied. Creativity and volition are another kettle of fish entirely.

Comment I fail to see how they're "losing money" (Score 1) 280

I can see them losing market share to renewables, but that's not the same as losing money.

There is no legislation anywhere in North America that guarantees the continued success of an obsolete business model -- no matter how many congressmen and senators the MPAA and RIAA have bought off.

Comment It was and is the ultimate macro assembler (Score 2) 641

C started out with high-level "constructs" that were basically the operators and addressing modes of DEC's PDP-11 (and later VAX) processors. While those constructs have mapped well to other processors, many a simple C statement originally compiled to a single instruction on those machines.
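
As a rough sketch of that mapping (the exact instruction sequence depends on the compiler, and the copy_string name is just for this example), the classic pointer-walking string copy is the textbook case: the post-increment dereferences correspond almost directly to the PDP-11's auto-increment addressing mode, so each pass through the loop is essentially one MOVB plus a branch.

    #include <stdio.h>

    /* Classic K&R-style string copy: the post-increment dereferences map
     * almost one-for-one onto the PDP-11's auto-increment addressing mode,
     * so the loop body boils down to a MOVB and a conditional branch.      */
    static void copy_string(char *dst, const char *src)
    {
        while ((*dst++ = *src++) != '\0')
            ;   /* empty body: all the work happens in the condition */
    }

    int main(void)
    {
        char buf[32];
        copy_string(buf, "hello, PDP-11");
        puts(buf);
        return 0;
    }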

To this day, C still gives you the power and flexibility of any low-level language worth its salt, and ample opportunity to hang yourself and your code. Don't forget -- C++ originally targeted C as its output, not machine code. C has similarly provided the "back end" for no small number of special-purpose compilers.

Then there are the operating systems and device drivers that have been written in it, and all the embedded-systems logic for all manner of devices.

C will never die, any more than assembly, COBOL, or FORTRAN will. There will always be special-purpose, high-performance tasks where it is worthwhile to use C instead of a higher-level language, just as there are still times when it makes sense to drop even lower, into assembly.

You go ahead and try to bootstrap an entire C++ implementation so you can write a kernel in it. Good luck with that. Getting a C library framework running on a bootstrapped kernel is hard enough. C++ would be orders of magnitude harder.

Comment He doesn't get it (Score 2) 205

I work on my pet project (http://msscodefactory.sourceforge.net) because it's a fun challenge I set myself many years ago. Whether others use it is irrelevant. Whether I ever make money off it is irrelevant. There is only one thing that matters to me:

Having fun coding.

That's it. Beginning and end of story. I work on it for fun.

Comment Re:What's wrong with emacs and make ? (Score 1) 115

Use an IDE to edit? You're kidding, right?

Why in all that's holy would I load up a multi-megabyte behemoth instead of using a text editor for editing code? I use the IDE to fix build errors that result, and to do the debugging.

But with ant handling the build process and a decent debugger, I see absolutely no need for an IDE. In fact, Eclipse crashes about half the time I try to use it, so I can't use it as a build manager for projects the size I work on. It pukes far too often, forcing a complete rebuild every time -- and the more code that has to be rebuilt, the more likely it is to puke again.

No, man. A decent editor like vi or emacs, a build manager, and a debugger are all you need. Loading up a whole IDE is overkill.

But then again, I've never seen any debuggers other than IDEs for Java.

Comment Fluff piece (Score 4, Informative) 56

The summary and the article itself are so fluffy and short that they don't give any useful information about how this material relates to quantum computing, nor why its properties are significant. There is mention of a class of electrons involved, but not how or why this particular type of electron is relevant to quantum computing.

It sounds interesting and all, but it would have been nice to have enough information to give one something to think about, instead of just having to assume that the highfalutin professors know their shit and must be right. :P
