
Comment: Re: Paul Graham: Let the Other 95% of Great Progra (Score 1) 318

by aardvarkjoe (#48678143) Attached to: Paul Graham: Let the Other 95% of Great Programmers In

Exceptional people are exceptional because of their obsessive, unquenchable interest in a subject. Exceptional people don't need "training", they just need experience, and even without experience they can still be much better than "normals". A lot of training focuses on rules of thumb, and dumbs those rules down to the point of being "written in stone". There are so many things taught as "never do this" when what is really meant is "you're too stupid to know when to do this correctly". Training can help, but much of it is a waste of time.

In every other discipline that I can think of, the people who are exceptional in their field have undergone extensive training. Scientists, engineers, musicians, artists, athletes -- the list goes on and on. It takes training for people to reach the potential of their innate ability. Why do you think that programming is somehow different?

Comment: Re: Jobs is Jesus (Score 1) 118

I think what's relevant to your claim is that you lack something called "a sense of humor". Humor can commonly be recognized as something outrageous which is clearly untrue -- a statement like "NASA invented computers to go to the Moon, everyone knows that". You see, the "everyone knows that" portion is the hint that it was a joke.

Yes, and I pointed out that while ACs on /. for some incomprehensible reason often throw around sentences like these, there is actually a significant grain of truth in saying that the Apollo project stimulated certain major developments in the realm of computing (especially real-time computing, but not only there).

And if you're going to go for serious stuff, I think you'll find that the IC was invented at TI by a thirty-something engineer by the name of Jack Kilby, who had just been hired.

Kilby's design proved to be an evolutionary dead end. His handcrafted germanium stick design was unsuitable for mass production. It was more of a gimmick than anything else. The independent and contemporary planar silicon design by Hoerni and Noyce was vastly superior, and became the foundation for the Fairchild Micrologic line of circuits, which, second-sourced from Philco, were actually flown to the Moon. In fact, it's basically the ancestor of the silicon technology we're still using today.

The first to use it was the Air Force. You notice MIT hasn't appeared yet?

The AGC, designed by MIT IL and manufactured by Raytheon, was the first serially manufactured, stored-program, general-purpose digital computer with logic fully fabricated from standardized IC gate circuits, with the ICs sourced in 1962 and the first unit built in 1963. Do you have anything against that claim?

That being said, something like that has a massive history that can hardly be fully attributed to a single person, establishment, country or even time.

Yes, that must be the reason why you mention the historically less important Kilby, without whom the development at Fairchild would have continued unimpeded. Talk about simplifications of history...

As for the software portion you attribute to MIT, well, those are constructs used in generic processing units. Apollo was one of the first projects to adopt ICs, but the transistor counts were too small for actual general-purpose processors, so an RTOS and priority scheduling wouldn't have been issues; it would all have been interrupt driven, processing the interrupts in a given sequence, with no software scheduling at all. All of this would have been in the form of ASICs (application-specific integrated circuits), which, though they now often have a programmable element, are really pieces of hardware geared towards performing a single task or a small set of tasks.

What are you going on about? The AGC was a fully general, stored-program digital computer with a priority-driven scheduler implemented in software by Hal Laning, so what is this "no software scheduling at all" supposed to mean? And "processing the interrupts in a given sequence"? The design was asynchronous; that was precisely the novelty of the whole design compared to the state of the art.
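For the curious: the essence of a priority-driven executive like Laning's is just "always dispatch the highest-priority queued job next", rather than servicing events in a fixed sequence. Here's a toy sketch of that idea in Python; the class and job names are purely illustrative, and this is of course nothing like the actual AGC code.

```python
import heapq


class Executive:
    """Toy priority-driven job scheduler, loosely in the spirit of the
    AGC Executive: jobs are queued with a priority, and the highest-
    priority job is always dispatched first, regardless of the order
    in which they arrived."""

    def __init__(self):
        self._queue = []    # min-heap; priorities are negated for max-first
        self._counter = 0   # tie-breaker: FIFO among equal priorities

    def schedule(self, priority, job):
        heapq.heappush(self._queue, (-priority, self._counter, job))
        self._counter += 1

    def run(self):
        """Drain the queue, running jobs in priority order."""
        results = []
        while self._queue:
            _, _, job = heapq.heappop(self._queue)
            results.append(job())
        return results


executive = Executive()
executive.schedule(10, lambda: "telemetry")
executive.schedule(30, lambda: "landing guidance")
executive.schedule(20, lambda: "display update")
results = executive.run()
# Highest priority runs first: landing guidance, then display update,
# then telemetry -- not the order in which the jobs were scheduled.
```

The point of the asynchronous design is exactly this decoupling: arrival order and execution order are independent, so an urgent job can preempt the queue position of routine work.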

So, I ask you, how's your making up history lately? And how's your understanding a joke lately?

I'm not making up history, I'm simply fact-checking extensively. And despite being a schizoid, I actually enjoy humor, but I do have some standards for it.

Comment: Re: Paul Graham: Let the Other 95% of Great Progra (Score 2) 318

by aardvarkjoe (#48676963) Attached to: Paul Graham: Let the Other 95% of Great Programmers In

...while you can train people to be competent, you can't train them to be exceptional.

Two things:

First off, American companies love programmers that are merely "competent" -- or that don't even meet that standard. That's why jobs keep getting shipped overseas to shops that can hire three incompetent programmers for the cost of one good programmer here in the US. The tech companies' actions speak louder than their words here.

Second, while you might not be able to train everyone to become exceptional, it's safe to say that most people with the ability to become exceptional will not do so without training. Mr. Graham is relying on the argument that the only way to get more exceptional programmers in the US is to import them. That is flat-out not true.

Comment: Re: Jobs is Jesus (Score 2) 118

How is that relevant to my claim that MIT pushed for a modern-style embedded system (an IC-fabricated, stored-program digital computer with a priority-scheduling, event-processing RTOS) as the solution for the control, navigation, and guidance of Apollo?

How's your reading comprehension been lately?

Comment: Re:W3C, please. (Score 2) 167

by K. S. Kyosuke (#48676161) Attached to: MIT Unifies Web Development In Single, Speedy New Language
Except that both languages and "application architectures" are, so to speak, based on usefully constraining the set of valid programs. In the long run, though, stuff tends to move into languages, among other things because it allows correctness to be checked at the earliest possible moment during development.

Comment: Re:Don't try to abstract a web page (Score 1) 167

by K. S. Kyosuke (#48675539) Attached to: MIT Unifies Web Development In Single, Speedy New Language
Could you explain what you mean by "old desktop program flow"? And what you mean by "working way different than that"? Since the only way I'm aware of doing web applications that work in the way that the web itself works is continuation-based frameworks, and those really don't get used by a lot of people. Most people aren't even aware of them, I'd think.
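To make "continuation-based" concrete: in frameworks of that style (Seaside, the PLT/Racket web server, and the like), the server captures the rest of the computation at each page it sends, and the next HTTP request resumes that captured continuation. A Python generator can stand in for the continuation in a toy sketch; every name below is illustrative, not any real framework's API.

```python
# Toy sketch of a continuation-style request flow. Each `yield` plays
# the role of sending a page to the browser and suspending; resuming
# the generator with send() plays the role of the follow-up request
# invoking the captured continuation.

def checkout_flow():
    name = yield "Ask for name"                 # first page out
    qty = yield f"Ask {name} for quantity"      # second page out
    return f"{name} ordered {qty} items"        # flow complete


# Simulate the browser round-trips:
flow = checkout_flow()
page1 = next(flow)            # server renders the first page
page2 = flow.send("Alice")    # user submits "Alice"; flow resumes
try:
    flow.send(3)              # user submits 3; flow runs to completion
    result = None
except StopIteration as done:
    result = done.value
```

The appeal is that a multi-page interaction reads as one straight-line function, with the suspend/resume plumbing hidden; the cost (and one reason such frameworks stay niche) is that the server must keep those suspended continuations around between requests.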

Comment: Re:W3C, please. (Score 1) 167

by K. S. Kyosuke (#48675515) Attached to: MIT Unifies Web Development In Single, Speedy New Language

I'm really sick of languages that are going to solve all our so-called problems.

Languages have already solved for us the problems of numerically addressed storage, register allocation, memory allocation, compound data types, structured control flow, genericity/polymorphism of pieces of code, and so on. Virtually all -- if not all -- pesky problems in day-to-day programming have been solved by appropriate language design. Why shouldn't a language solve the problems of concurrency and distributed applications?
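Concurrency is in fact already partway through this same migration into languages: async/await syntax turns what used to be hand-rolled callback plumbing into ordinary structured control flow, much as garbage collection absorbed manual memory management. A minimal Python illustration (the task names and delays are made up):

```python
# Concurrency expressed at the language level: two tasks run
# concurrently, yet the code reads as plain sequential control flow.
import asyncio


async def fetch(name, delay):
    # Stand-in for some I/O-bound work, e.g. a network request.
    await asyncio.sleep(delay)
    return f"{name} done"


async def main():
    # gather() runs both coroutines concurrently and preserves
    # the argument order in its result list.
    return await asyncio.gather(fetch("a", 0.01), fetch("b", 0.005))


results = asyncio.run(main())
```

No explicit threads, locks, or callbacks appear in the user's code; the scheduling is a language/runtime concern, which is exactly the pattern the comment describes.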

Comment: Re: The Interview hits warez sites (Score 1) 155

Hah, time to rewrite the major codecs and file formats in Lisp? (:-) I find it rather amusing that exploits of data file formats should even be technically possible; one feels like living in the 20th century. Is there any list or summary of those 300 issues found by Google? Just for me to check what kinds of problems were found -- it seems intriguing.

Comment: Re: The Interview hits warez sites (Score 1) 155

Wouldn't a file like this have to exploit a whole variety of codecs simultaneously? Surely there must be many decoders on the market, some of them even in hardware. Or has libavcodec recently become the most popular target? I would have thought that an attacker would go after the Windows Media Player instead, simply because of the installed base.
