
Comment Clone a Pet (Score 1) 350

Considering that a cat or dog matures in about 6 months while a human matures (intellectually and emotionally, not just physically) in perhaps 25 years, a cloned pet would probably be much closer in personality to the "original" than a cloned human would, and you (as the cloner) would be able to enjoy its companionship much sooner.

Comment Thanks! (Score 1) 575

OP here. Wow! Thanks for the amazing and insightful responses, everyone.

As I read the comments, many of them reminded me of an old programmer's aphorism: "You can program in assembly in any language." I need to stop trying to program in Java in JavaScript, get past the superficial syntactic similarities, and immerse myself in JavaScript and its paradigm to learn to "speak it like a native."
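To make that shift concrete, here is a minimal sketch (the function names are my own invention) contrasting the Java-in-JavaScript style I keep falling into with the idiomatic equivalent that treats functions as first-class values:

```javascript
// Java-in-JavaScript: a class-like wrapper and an explicit index loop.
function SquareList(values) {
  this.values = values;
}
SquareList.prototype.compute = function () {
  var result = [];
  for (var i = 0; i < this.values.length; i++) {
    result.push(this.values[i] * this.values[i]);
  }
  return result;
};

// Idiomatic JavaScript: a plain function and a higher-order map.
var squares = function (values) {
  return values.map(function (v) { return v * v; });
};

console.log(new SquareList([1, 2, 3]).compute()); // [1, 4, 9]
console.log(squares([1, 2, 3]));                  // [1, 4, 9]
```

Both produce the same result, but the second version is what the language wants you to write.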

I had already discovered Douglas Crockford's JavaScript: The Good Parts and jQuery, but I will grab some of the other books and learning resources. The explanation of how the various other frameworks, like MooTools, CoreJS, Dojo, ExtJS, etc., relate to each other was also helpful in understanding the overall JS ecosystem.

Finally, I will grab JSLint, JSCoverage, and QUnit (as one poster said, "QUnit is now your compiler"). I've been oscillating between Komodo and WebStorm, but neither feels quite right for me, so I will give Aptana a look.
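"QUnit is now your compiler" makes sense once you see the kind of weak-typing surprise that only a test suite (or JSLint) will catch, since there is no build step to reject it. A minimal sketch, with a hypothetical addStrict helper standing in for the kind of guard a test-first style encourages:

```javascript
// Weak typing in action: '+' on a string concatenates instead of
// adding, and '==' silently coerces types. A Java compiler would
// reject the equivalent at build time; here only tests catch it.
function addStrict(a, b) {
  if (typeof a !== "number" || typeof b !== "number") {
    throw new TypeError("addStrict expects numbers");
  }
  return a + b;
}

console.log(1 + 1);     // 2
console.log("1" + 1);   // "11" -- silent concatenation
console.log("1" == 1);  // true  -- coercing equality
console.log("1" === 1); // false -- strict equality, what JSLint insists on

console.log(addStrict(1, 1)); // 2
```

The tests play the role the compiler used to: every coercion pitfall becomes an assertion instead of a type error.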

Again, thanks for all the feedback!


Submission + - Making JavaScript Tolerable for a Dyed-in-the-Wool C/C++/Java Guy 6

DocDyson writes: I'm a dyed-in-the-wool C/C++/Java developer with over 20 years of experience. I'm making a good living and having fun doing back-end Java work right now, but I strongly believe in being a generalist, so I'm finally trying to learn the HTML5/CSS3/JavaScript future of the Web. However, I find JavaScript's weak typing and dynamic nature difficult to adapt to because I'm so used to strongly-typed, compiled languages with lots of compile-time error-checking and help from the IDE. Does anyone out there who has made this transition have any tips in terms of the best tools and libraries to use to make JavaScript more palatable to us old-school developers?

Comment It Already Is (Score 1) 317

(more or less). I went through school in the 1970s and 1980s, and the last important things I learned were how to read, write, and do basic arithmetic. I self-educated on everything else. At the insistence of my wife, who is less enthusiastic about home-schooling and non-traditional learning than I am, I am now watching my son suffer through the same thing, and he has reached the same point where the learning stops and the boredom and misery begin: he knows enough reading, writing, and arithmetic to learn everything else on his own, on demand, when he wants or needs to, with just a little guidance from us as parents and from mentors.

Comment Re:Sigh (Score 1) 348

You have just made the point that I was trying to make. "Licensing, permits, regulations, insurances, bonds, etc." "Approved tools and technologies." Compliance has seemingly become the primary function of most enterprises. What happened to innovation? What happened to freedom? What happened to creativity? What happened to the power and potential that drew so many of us to computers in the first place? Where are the flying cars?

Comment Re:Hardly. (Score 3, Informative) 348

IT security has to be about risk management, not absolute risk avoidance. I've worked in organizations where security paranoia dominated all IT decision-making, and it cost them dearly: tons and tons of money spent on IT, and all it really delivered to the end-users was email and the Office suite. The organization had enterprise licenses for Visio, the Adobe Creative Suite, Visual Studio, CASE tools, and all kinds of other goodies, but it effectively took an act of God to get them installed on your machine, so most people just gave up. IT spent all its time resetting people's ridiculously long, impossible-to-remember, always-expiring passwords. Right after Windows 7 came out, they finally "upgraded" to Vista. We probably would have been better off with a notepad, a bunch of inter-office mailers, and a nice mechanical pencil.

The cat, however, is out of the bag. The managers and executives who had a little vision (almost all in the business side, almost none from IT) leave the office, use all this cool tech in their personal lives, and start asking questions:

"Why does Quicken give me more insight into my personal finances than SAP gives me into my company's finances?"

"How come I have to send my people to a week of training on SAP anyway? Nobody came to my house and showed me how to use Quicken."

"How come I've never had a virus infection on my PC at home? All I do is keep the OS and apps updated and run a decent, up-to-date anti-virus package that cost me like $50. We spend a small fortune on anti-virus software at work, IT has gotten so paranoid they've disabled flash drives, and we still get viruses all the time!"

I understand that losing thousands of credit card numbers is a Bad Thing. But very few end-point devices, users, or applications should have access to that kind of data. Not even the CEO needs it and a sane CEO wouldn't even want it. For that matter, do you REALLY have to be storing credit card numbers?

Of course, there are other kinds of confidential data. But it would seem to me (as a developer, admittedly not a security guy) that there should be different levels of security for different kinds of data and different applications. Truly confidential data could, for example, require two-factor authentication with a smartcard, biometrics, or whatever. You could require digital signatures and encryption on confidential email. But giving every user a crippled Blackberry to carry around when what they really want to be able to do is see their (unencrypted) work email and calendar on the iPhone or Android device that they love and already own is just not acceptable any more.

Both sides are going to have to meet in the middle. Freedom and responsibility go together. Users are going to have to step up, get educated, take more responsibility for their IT, and exercise the common sense that stops the vast majority of common threats like virus infections. IT is going to have to figure out how to be responsive to the users and add value to the business. Otherwise, it's just going to be bypassed, have its budget cut, and, as an AC below said, the business will just go "to the cloud."

Comment Re:Sigh (Score 2) 348

So, in essence, our litigious society and the risk-averse enterprise culture that litigation and regulation foster are the reason why enterprise IT is, in many organizations, in the Dark Ages compared to what a tech-savvy user can do with their personal IT.

Submission + - Lisp Creator John McCarthy Dead at 84

johnjaydk writes: The creator of Lisp and arguably the father of modern artificial intelligence, John McCarthy, died last night.

Lisp was (and to some extent still is) a radical leap forward and has had a strong influence on many other languages, although many still refuse to see beauty between parens.


Submission + - John McCarthy, creator of Lisp, has died

mikejuk writes: John McCarthy, the man who, among other things, first coined the term "Artificial Intelligence" and who invented the Lisp programming language, died, aged 84, on October 23, 2011. The first use of the term "Artificial Intelligence" came in John McCarthy's proposal for a two-month, ten-man workshop to be carried out at Dartmouth College in 1956. The event went ahead, with Marvin Minsky, Claude Shannon, Nathaniel Rochester, Arthur Samuel, Allen Newell, Herbert Simon, Trenchard More, Ray Solomonoff and Oliver Selfridge, and is considered "the birth" of AI.
McCarthy went on to create Lisp, motivated by his "desire for an algebraic list processing language for artificial intelligence work". Best known as a way to torment students with brackets, it is still considered the language of AI, and it has influenced languages as different as JavaScript and Clojure.

Comment Where To? (Score 1) 314

I am a long-time Windows/.NET developer, but I have reached a point where I want to become part of the much stronger, more vibrant open-source community that has developed around Linux, Java, Apache, MySQL, etc. Just as I started making this transition, Oracle's acquisition of two of the key pieces of this ecosystem (Java and MySQL) seems to be disrupting this (comparative) paradise. What's the consensus of the hive-mind on the future? Can the Linux vendors, the Apache Foundation, and their allies sustain the Java ecosystem without, or in spite of, Oracle? If not, where do we go from here? Dust off our old C++ skills? Adopt Google Go, Haskell, or some other next-generation language and rebuild the ecosystem around it? Or are we collectively doomed to fragmentation again?
