
Comment: Re: Jobs is Jesus (Score 1) 100

I think what's relevant to your claim is that you lack something called "a sense of humor". It commonly takes the form of an outrageous statement which is clearly untrue, something like "NASA invented computers to go to the Moon, everyone knows that". You see, the "everyone knows that" portion is the hint that it was a joke.

Yes, and I pointed out that while ACs on /. for some incomprehensible reason often throw around sentences like these, there is actually a significant grain of truth in saying that the Apollo project stimulated certain major developments in the realm of computing (especially real-time computing, but not only there).

And if you're going to go for serious stuff, I think you'll find that the IC was invented at TI by a 30-something engineer by the name of Jack Kilby who had just been hired.

Kilby's design proved to be an evolutionary dead end. His handcrafted germanium stick design was unsuitable for mass production. It was more of a gimmick than anything else. The independent and contemporary planar silicon design by Hoerni and Noyce was vastly superior, and became the foundation for the Fairchild Micrologic line of circuits, which, second-sourced from Philco, were actually flown to the Moon. In fact, it's basically the ancestor of the silicon technology we're still using today.

The first to use it was the Air Force. You notice MIT hasn't appeared yet?

The AGC, designed by MIT IL and manufactured by Raytheon, was the first serially manufactured stored-program, general-purpose digital computer with logic fully fabricated from standardized IC gate circuits, with the ICs sourced in 1962 and the first unit built in 1963. Do you have anything against that claim?

That being said, something like that has a massive history that can hardly be fully attributed to a single person, establishment, country or even time.

Yes, that must be the reason why you mention the historically less important Kilby, without whom the development at Fairchild would have continued unimpeded. Talk about simplifications of history...

As for the software portion you attribute to MIT, well, those are constructs used in generic processing units, and Apollo was one of the first to adopt ICs, but the number of transistors was too small for them to actually be generic processors, so RTOS and priority scheduling wouldn't have been issues; it would have all been interrupt driven, processing the interrupts in a given sequence, with no software scheduling at all. All of these would have been in the form of ASICs (application-specific integrated circuits), which, though they now often have a programmable element, are really pieces of hardware geared towards performing a single task or small set of tasks.

What are you going on about? The AGC was a fully general, stored-program digital computer with a priority-driven scheduler implemented in software by Hal Laning, so what is this "no software scheduling at all" supposed to mean? And "processing the interrupts in a given sequence"? The design was asynchronous; that was precisely the novelty of the whole design compared to the state of the art.
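To make the distinction concrete, here is a minimal, purely illustrative sketch of a priority-driven executive in the spirit Laning's design is described above: the highest-priority ready job always runs next, regardless of the order in which jobs were scheduled. This is not AGC code; the job names and the `Executive` class are made up for the example.

```python
import heapq

class Executive:
    """Toy priority-driven scheduler: the highest-priority ready job
    always runs next; jobs may schedule further work when they run."""
    def __init__(self):
        self._ready = []  # min-heap of (-priority, seq, job)
        self._seq = 0     # tie-breaker so equal priorities stay FIFO

    def schedule(self, priority, job):
        heapq.heappush(self._ready, (-priority, self._seq, job))
        self._seq += 1

    def run(self):
        order = []
        while self._ready:
            _, _, job = heapq.heappop(self._ready)
            order.append(job.__name__)
            job(self)  # a job may call self.schedule() here
        return order

# Hypothetical jobs, named loosely after AGC-era tasks.
def servicer(ex): pass
def autopilot(ex): pass
def telemetry(ex): pass

ex = Executive()
ex.schedule(5, telemetry)
ex.schedule(20, autopilot)
ex.schedule(30, servicer)
order = ex.run()
print(order)  # ['servicer', 'autopilot', 'telemetry']
```

Note that the execution order is decided by priority in software, not by the arrival order of events, which is exactly what "priority-driven scheduler implemented in software" means as opposed to "processing the interrupts in a given sequence".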

So, I ask you, how's your making up history lately? And how's your understanding a joke lately?

I'm not making up history, I'm simply fact-checking extensively. And despite being a schizoid, I actually enjoy humor, but I do have some standards for it.

Comment: Re: Jobs is Jesus (Score 2) 100

How is that relevant to my claim that MIT pushed for a modern-style embedded system (an IC-fabricated, stored-program digital computer with a priority-scheduling, event-processing RTOS) as the solution for the control, navigation, and guidance of Apollo?

How has your reading comprehension been lately?

Comment: Re:W3C, please. (Score 1) 154

by K. S. Kyosuke (#48676161) Attached to: MIT Unifies Web Development In Single, Speedy New Language
Except that both languages and "application architectures" are, so to speak, based on usefully constraining the set of valid programs. In the long run, though, stuff tends to move into languages, among other things because it allows checking of correctness at the earliest possible moment during development.

Comment: Re:Don't try to abstract a web page (Score 1) 154

by K. S. Kyosuke (#48675539) Attached to: MIT Unifies Web Development In Single, Speedy New Language
Could you explain what you mean by "old desktop program flow"? And what you mean by "working way different than that"? The only way I'm aware of to build web applications that work the way the web itself works is continuation-based frameworks, and those really don't get used by a lot of people. Most people aren't even aware of them, I'd think.
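For readers unfamiliar with the idea, here is a minimal sketch of the continuation-based style using a Python generator as the "continuation": the handler suspends at each page, and the next request resumes the conversation exactly where it left off. The `checkout_flow` dialogue and the `handle_request` API are hypothetical, chosen only to illustrate the control-flow shape.

```python
# Each yield is a "page" shown to the user; the generator's saved
# state acts as the continuation between HTTP requests.
def checkout_flow():
    name = yield "What is your name?"
    qty = yield f"How many items, {name}?"
    yield f"Order confirmed: {qty} item(s) for {name}."

sessions = {}  # session id -> suspended continuation

def handle_request(session_id, answer=None):
    """Resume (or start) the stored continuation for this session."""
    if session_id not in sessions:
        flow = checkout_flow()
        sessions[session_id] = flow
        return next(flow)              # run to the first yield
    return sessions[session_id].send(answer)

print(handle_request("s1"))            # What is your name?
print(handle_request("s1", "Ada"))     # How many items, Ada?
print(handle_request("s1", 3))         # Order confirmed: 3 item(s) for Ada.
```

The point is that the multi-step interaction reads as one straight-line function, while the framework handles suspending and resuming it across stateless requests; real continuation-based frameworks (e.g. in the Scheme/Smalltalk tradition) generalize this idea.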

Comment: Re:W3C, please. (Score 1) 154

by K. S. Kyosuke (#48675515) Attached to: MIT Unifies Web Development In Single, Speedy New Language

I'm really sick of languages that are going to solve all our so-called problems.

Languages have already solved for us the problems of numerically addressed storage, register allocation, memory allocation, compound data types, structured control flow, genericity/polymorphism of pieces of code etc. etc. Virtually all - if not all - pesky problems in day-to-day programming have been solved by appropriate language design. Why shouldn't a language solve the problem of concurrency and distributed applications?
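As a small illustration of concurrency moving into the language itself, here is a sketch using Python's async/await, where overlapping I/O waits are expressed as ordinary-looking sequential code rather than hand-wired callbacks. The task names and delays are invented for the example.

```python
import asyncio

# async/await makes concurrency a language construct: the two "fetches"
# below overlap their waits, so total time is ~max(delay), not the sum.
async def fetch(name, delay):
    await asyncio.sleep(delay)
    return f"{name} done"

async def main():
    # gather() runs both coroutines concurrently, preserving order.
    return await asyncio.gather(fetch("a", 0.01), fetch("b", 0.02))

results = asyncio.run(main())
print(results)  # ['a done', 'b done']
```

Earlier generations solved register and memory allocation the same way, by absorbing a once-manual chore into the language; this is the kind of absorption of concurrency the comment argues for.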

Comment: Re: The Interview hits warez sites (Score 1) 154

Hah, time to rewrite major codecs and file formats in Lisp? (:-) I find it somewhat amusing that such things as exploits of data file formats should be even technically possible. It feels like living in the 20th century. Is there any list or summary of those 300 issues found by Google? Just for me to check what kinds of problems were found. It seems intriguing.

Comment: Re: The Interview hits warez sites (Score 1) 154

Wouldn't a file like this have to exploit a whole variety of codecs simultaneously? Surely there must be many decoders on the market, some of them even in hardware. Or has libavcodec recently become the most popular target? I would have thought that an attacker would go after the Windows Media Player instead, simply because of the installed base.

Comment: Re:Slashdot is exceeding itself lately... (Score 1) 216

by K. S. Kyosuke (#48666255) Attached to: Tech's Gender Gap Started At Stanford

1943 to 1945 - women were about 95% of the computing workforce.

And by computing workforce, you mean this?

Furthermore, even quite some time after the advent of digital computers, there was a period in which the prevalent opinion in the field was that computer hardware design happened to be the actually important (and perhaps prestigious) job, whereas programming said computers was lowly, clerical work... I don't think that anyone should be surprised at whom these new jobs initially went to. In other words, I find it plausible that the initial high involvement of women in early computing was actually a symptom of pre-existing sexism rather than of later-lost equality.
