Comment: Have times changed that much? (Score 1) 279

by msobkow (#48613071) Attached to: Ask Slashdot: How Should a Liberal Arts Major Get Into STEM?

It used to be that any degree would get your "foot in the door" with HR. Some of the best programmers I worked with over the years had degrees in English, Philosophy, and even History.

University teaches you how to learn new material, how to prioritize it, how to summarize, how to separate the wheat from the chaff. It does not teach you how to program. While there are benefits to knowing computing theory, it's not theory that gets the job done -- experience does that.

I'm surprised you're having such a tough time finding work if you're actually good at programming. Perhaps it's the way you're presenting yourself in your resume, because, as I said, it doesn't really matter what your degree is in for getting your foot in the door.

Comment: Ph.D. Program? (Score 1) 279

by kramer2718 (#48612027) Attached to: Ask Slashdot: How Should a Liberal Arts Major Get Into STEM?

You might consider a Ph.D. program. If your grades are good and you have the basics, and you can tell the department a good story, you can get admitted and get funding in many STEM disciplines.

You'll have to spend a long time getting your Ph.D., but if it's what you want to do, it may be worth it. You should probably choose a program that grants a Master's along the way so that if you don't finish, you'll have something to show for your time.

Comment: And 1...2...3... (Score 0) 433

by msobkow (#48594803) Attached to: Vinyl Record Pressing Plants Struggle To Keep Up With Demand

And in 1...2...3...

Cue all the math junkies who claim there is "proof" that you can't hear the difference between 44.1 kHz/16-bit audio streams and higher-quality rates like 192 kHz/24-bit or analogue -- because the math "proves" that the thousands upon thousands of people who claim to hear a difference are "delusional liars."

I am neither delusional nor a liar. I hear the difference. It's as clear as night and day.

Comment: Re:Doubt it (Score 1) 297

by msobkow (#48594575) Attached to: Blade Runner 2 Script Done, Harrison Ford Says "the Best Ever"

I disagree completely. Good science fiction has never been about the technology, but about human and alien personalities and moral questions brought about by the technology. Good science fiction explores interpersonal relationships, character traits, philosophical stances, and other such subject matter.

The science fiction of the mid-to-late '80s made good movies because the directors and scriptwriters selected stories with deep connotations, instead of viewing them with an eye toward turning them into CGI action flicks that emphasize trivia like "the technology" instead of the plot.

There is still a tremendous amount of good science fiction written over the years that would make terrific movies. But Hollywood won't back those "risks" -- it's too busy investing in action movies pretending to be science fiction. There are exceptions, but for the most part you know it's true: Hollywood doesn't want to discuss morality, philosophy, and personal interactions in a script. It wants a nice "safe" piece of pablum that will make audiences go "ooh" and "aah" over the mindless special F/X, not cause them to think for themselves.

Comment: The problem has never changed (Score 2) 240

by msobkow (#48588417) Attached to: Is Enterprise IT More Difficult To Manage Now Than Ever?

The problem has been the same since the PC first came out: users can "do things" with a PC/laptop/smartphone/tablet and think that "doing things" makes them an expert on IT. So when they come up with a "great idea for a new application," they cannot and will not fathom that it can take months or years to implement, will cost hundreds of thousands if not millions of dollars, and will be obsolete before it ever hits production due to changing business needs.

There is no cure for the "wisdom" of people who tell you how to do your job, or how their 14-year-old nephew could write the application in a few weeks. They've made up their minds that you're just a lazy SOB trying to milk the company for money and a cushy job, and they will never, ever understand just how much effort goes into security, design, testing, porting, etc. To them, everything is "easy."

The real problem is that companies let such users and managers make business decisions based on "gut instinct" instead of properly planned and projected schedules. Because heaven forbid you should ever tell the marketing manager that he can't have his shiny SharePoint solution because it doesn't provide anything useful to the company that couldn't be accomplished with a properly organized set of folders on a shared drive/server somewhere.

No, they're the ones who sign for the budgets, and they're the ones who like the "shiny", so you're the one who gets stuck trying to make the shiny work with all the line of business systems that are actually important to the operation of the business.

And if you even hint that you can't do it, well, there's a company overseas that's promising to do it in a month as an offshore service, so you're fired.

Which, in a nutshell, is how the bean counters and their ilk get away with their bad business decisions: they constantly hold the threat of offshoring and termination over your head to beat Mr. IT into submission.

Comment: Intelligence does not imply volition (Score 1) 416

by msobkow (#48571093) Attached to: AI Expert: AI Won't Exterminate Us -- It Will Empower Us

Artificial Intelligence does not imply volition. I know of no reason to expect an early AI to have a will, or to come up with results except in response to events and information it's designed to respond to. While some might try to simulate the volition of a live entity, I do not feel it's necessary to include such a component in order to qualify something as an Artificial Intelligence.

Artificial Intelligence just means artificial thought about something: sufficient understanding of the subject matter to reach conclusions and produce outputs relevant to what is known or implied. Creativity and volition are another kettle of fish entirely.

Comment: It will empower the people who own/direct it (Score 1) 416

by WillAdams (#48569437) Attached to: AI Expert: AI Won't Exterminate Us -- It Will Empower Us

the people who are paying for the development and paying the power bills. Everyone else will be viewed as just a resource to be exploited.

A fictional take on this -- Marshall Brain's novella _Manna_ -- is available free online: http://marshallbrain.com/manna...

The first half seems all-too-likely, the second, likely impossible.

Comment: I fail to see how they're "losing money" (Score 1) 280

by msobkow (#48560899) Attached to: Utilities Face Billions In Losses From Distributed Renewables

I can see them losing market share to renewables, but that's not the same as losing money.

There is nothing about legislation anywhere in North America that guarantees the continued success of an obsolete business model. No matter how many congressmen and senators the MPAA and RIAA have bought off.

Comment: It was and is the ultimate macro assembler (Score 2) 641

by msobkow (#48554011) Attached to: How Relevant is C in 2014?

C started out with high-level "constructs" that were basically the operators of DEC's PDP-11 (and later VAX) processors. While those constructs have mapped well to other processors, virtually every statement in C originally compiled to one instruction on those machines.

To this day, C still gives you the power and flexibility of any low-level language worth its salt, and ample opportunity to hang yourself and your code. Don't forget -- C++ originally targeted C as its output, not machine code. C has similarly provided the "back end" for no small number of special-purpose compilers.

Then there are the operating systems and device drivers that have been written in it, and all the embedded-systems logic for all manner of devices.

C will never die any more than assembly, COBOL, or FORTRAN will. There will always be those special-purpose high-performance tasks where it is worthwhile to use C instead of a higher-level language, just as there are times where it still makes sense to drop even lower, into assembly.

You go ahead and try to bootstrap an entire C++ implementation so you can write a kernel in it. Good luck with that. Getting a C library framework running on a bootstrapped kernel is hard enough. C++ would be orders of magnitude harder.

Comment: He doesn't get it (Score 2) 205

by msobkow (#48552731) Attached to: The Failed Economics of Our Software Commons

I work on my pet project (http://msscodefactory.sourceforge.net) because it's a fun challenge I set myself many years ago. Whether others use it is irrelevant. Whether I ever make money off it is irrelevant. There is only one thing that matters to me:

Having fun coding.

That's it. Beginning and end of story. I work on it for fun.

