I think what's relevant to your claim is that you lack something called "a sense of humor". It commonly takes the form of an outrageous statement which is clearly untrue, something like "NASA invented computers to go to the Moon, everyone knows that". You see, the "everyone knows that" portion is the hint that it was a joke.
Yes, and I pointed out that while ACs on /. for some incomprehensible reason often throw around sentences like these, there is actually a significant grain of truth in saying that the Apollo project stimulated certain major developments in the realm of computing (especially real-time computing, but not only there).
And if you're going to go for the serious stuff, I think you'll find that the IC was invented at TI by a thirty-something engineer by the name of Jack Kilby, who had just been hired.
Kilby's design proved to be an evolutionary dead end. His handcrafted germanium stick design was unsuitable for mass production. It was more of a gimmick than anything else. The independent and contemporary planar silicon design by Hoerni and Noyce was vastly superior, and became the foundation for the Fairchild Micrologic line of circuits, which, second-sourced from Philco, were actually flown to the Moon. In fact, it's basically the ancestor of the silicon technology we're still using today.
The first to use it was the Air Force. You notice MIT hasn't appeared yet?
The AGC, designed by MIT IL and manufactured by Raytheon, was the first serially manufactured stored-program, general-purpose digital computer with logic fully fabricated from standardized IC gate circuits, with the ICs sourced in 1962 and the first unit built in 1963. Do you have anything against that claim?
That being said, something like that has a massive history that can hardly be fully attributed to a single person, establishment, country or even time.
Yes, that must be the reason why you mention the historically less important Kilby, without whom the development at Fairchild would have continued unimpeded. Talk about simplifications of history...
As for the software portion you attribute to MIT: those are constructs used in generic processing units. Apollo was one of the first to adopt ICs, but the transistor count was too small for an actual generic processor, so an RTOS and priority scheduling wouldn't have been issues; it would all have been interrupt driven, processing the interrupts in a fixed sequence, with no software scheduling at all. All of it would have been in the form of ASICs (application-specific integrated circuits), which, though they now often include a programmable element, are really pieces of hardware geared towards performing a single task or a small set of tasks.
What are you going on about? The AGC was a fully general, stored-program digital computer with a priority-driven scheduler implemented in software by Hal Laning, so what is this "no software scheduling at all" supposed to mean? And "processing the interrupts in a given sequence"? The design was asynchronous; that was precisely the novelty of the whole design compared to the state of the art.
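To make the distinction concrete, here is a minimal sketch of what "priority-driven scheduling in software" means, as opposed to servicing interrupts in a fixed hardware sequence: jobs are requested in arbitrary order, and the dispatcher always picks the highest-priority runnable job next. This is just an illustration of the concept, not actual AGC/Executive code, and all names and job labels in it are hypothetical.

```python
import heapq

class Executive:
    """Toy priority-driven dispatcher (illustrative only, not the AGC's)."""

    def __init__(self):
        self._queue = []  # min-heap of (priority, sequence, job)
        self._seq = 0     # tie-breaker so equal priorities run FIFO

    def schedule(self, priority, job):
        # Lower number = higher priority, matching heapq's min-heap order.
        heapq.heappush(self._queue, (priority, self._seq, job))
        self._seq += 1

    def run(self):
        # Always dispatch the highest-priority job next, regardless of
        # the order in which the requests arrived.
        order = []
        while self._queue:
            _, _, job = heapq.heappop(self._queue)
            order.append(job())
        return order

ex = Executive()
# Requests arrive in arbitrary order...
ex.schedule(3, lambda: "telemetry")
ex.schedule(1, lambda: "attitude control")
ex.schedule(2, lambda: "navigation update")
# ...but run strictly by priority, not arrival order.
print(ex.run())  # ['attitude control', 'navigation update', 'telemetry']
```

A fixed interrupt sequence, by contrast, would always service the same sources in the same order; the point of a software scheduler is that priorities can be assigned and reassigned per job at run time.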
So, I ask you, how's your making up history lately? And how's your understanding a joke lately?
I'm not making up history, I'm simply fact-checking extensively. And despite being a schizoid, I actually enjoy humor, but I do have some standards for it.