I think what's relevant to your claim is that you lack something called "a sense of humor". A joke can commonly be spotted as something outrageous that is clearly untrue, a statement sort of like "NASA invented computers to go to the Moon, everyone knows that". You see, the "everyone knows that" portion is the hint that it was a joke.
Yes, and I pointed out that while ACs on
And if you're going to go for serious stuff, I think you'll find that the IC was invented at TI by a thirty-something engineer named Jack Kilby, who had just been hired.
Kilby's design proved to be an evolutionary dead end. His handcrafted germanium stick design was unsuitable for mass production. It was more of a gimmick than anything else. The independent and contemporary planar silicon design by Hoerni and Noyce was vastly superior, and became the foundation for the Fairchild Micrologic line of circuits, which, second-sourced from Philco, were actually flown to the Moon. In fact, it's basically the ancestor of the silicon technology we're still using today.
The first to use it was the Air Force. You notice MIT hasn't appeared yet?
The AGC, designed by MIT IL and manufactured by Raytheon, was the first serially manufactured stored-program, general-purpose digital computer with its logic fully fabricated from standardized IC gate circuits; the ICs were sourced in 1962 and the first unit was built in 1963. Do you have anything against that claim?
That being said, something like that has a massive history that can hardly be fully attributed to a single person, establishment, country or even time.
Yes, that must be the reason why you mention the historically less important Kilby, without whom the development at Fairchild would have continued unimpeded. Talk about simplifications of history...
As for the software portion you attribute to MIT, those are constructs used in generic processing units. Apollo was one of the first projects to adopt ICs, but the transistor count was too small for a truly generic processor, so an RTOS and priority scheduling wouldn't have been issues: it would all have been interrupt driven, processing the interrupts in a given sequence, with no software scheduling at all. All of this would have taken the form of ASICs (application-specific integrated circuits), which, though they now often have a programmable element, are really pieces of hardware geared toward performing a single task or a small set of tasks.
What are you going on about? The AGC was a fully general, stored-program digital computer with a priority-driven scheduler implemented in software by Hal Laning, so what is this "no software scheduling at all" supposed to mean? And "processing the interrupts in a given sequence"? The design was asynchronous; that was precisely the novelty of the whole design compared to the state of the art.
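To make the "priority-driven scheduler in software" distinction concrete, here's a toy sketch in Python (purely illustrative; the class, job names, and priorities are all invented, not the real AGC Executive code). The point is that the dispatcher always runs the highest-priority pending job, rather than processing requests in the fixed sequence they arrived in:

```python
import heapq

class Executive:
    """Toy priority-driven job scheduler, in the spirit of (but not
    resembling) the AGC Executive. Names and structure are illustrative."""

    def __init__(self):
        self._jobs = []  # min-heap of (-priority, seq, fn)
        self._seq = 0

    def schedule(self, priority, fn):
        # Higher priority runs first; seq breaks ties in FIFO order.
        heapq.heappush(self._jobs, (-priority, self._seq, fn))
        self._seq += 1

    def run(self):
        # Dispatch the highest-priority pending job each time around,
        # regardless of the order in which jobs were requested.
        results = []
        while self._jobs:
            _, _, fn = heapq.heappop(self._jobs)
            results.append(fn())
        return results

ex = Executive()
ex.schedule(10, lambda: "housekeeping")
ex.schedule(30, lambda: "landing guidance")
ex.schedule(20, lambda: "display update")
print(ex.run())  # → ['landing guidance', 'display update', 'housekeeping']
```

Note the output order follows priority, not arrival order; with a purely interrupt-sequenced design you'd get the requests back in the order they came in.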
So, I ask you, how's your making up history lately? And how's your understanding a joke lately?
I'm not making up history, I'm simply fact-checking extensively. And despite being a schizoid, I actually enjoy humor, but I do have some standards for it.
How is that relevant to my claim that MIT pushed for a modern-style embedded system (an IC-fabricated stored-program digital computer with a priority-scheduling, event-processing RTOS) as the solution for the control, navigation, and guidance of Apollo?
How's your reading comprehension been lately?
I'm really sick of languages that are going to solve all our so-called problems.
Languages have already solved for us the problems of numerically addressed storage, register allocation, memory allocation, compound data types, structured control flow, genericity/polymorphism of pieces of code etc. etc. Virtually all - if not all - pesky problems in day-to-day programming have been solved by appropriate language design. Why shouldn't a language solve the problem of concurrency and distributed applications?
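For what it's worth, async/await is one example of concurrency being absorbed into language design. A minimal Python sketch (illustrative only; the task names and delays are made up) showing concurrent waits expressed directly in the language, with no manual thread or callback bookkeeping:

```python
import asyncio

async def fetch(name, delay):
    # Simulated I/O-bound task; real work would be a network call.
    await asyncio.sleep(delay)
    return name

async def main():
    # gather runs both tasks concurrently and returns their results
    # in argument order, even though "b" finishes first.
    return await asyncio.gather(fetch("a", 0.02), fetch("b", 0.01))

print(asyncio.run(main()))  # → ['a', 'b']
```

The total wait here is roughly the longest single delay, not the sum, which is exactly the kind of coordination the language construct handles for you.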
1943 to 1945 - women were about 95% of the computing workforce.
And by computing workforce, you mean this?
Furthermore, even quite some time after the advent of digital computers, there was a period in which the prevalent opinion in the field was that computer hardware design was the actually important (and perhaps prestigious) job, whereas programming said computers was lowly, clerical work... I don't think anyone should be surprised to whom these new jobs initially went. In other words, I find it plausible that the initially high involvement of women in early computing was actually a symptom of pre-existing sexism rather than of later-lost equality.
For every JFK there were several Richard Nixon REMFs sitting at a base playing poker.
So...who did the one guy play with?
Work without a vision is slavery; vision without work is a pipe dream; but vision with work is the hope of the world.