I got a TI-99/4A for my birthday when I was 8. I didn't know how to type yet, so my mom helped by typing in the examples from the Beginner's BASIC programming guide. Then I would edit them and figure out how to do things on my own. Eventually I was writing my own simple programs and trying to figure out how to make databases saved to cassette tapes.
A few years later, my dad bought an IBM PS/2 Model 80 that came with a BASIC interpreter. A couple of years after that, I was in junior high school and had to take a programming class (taught in BASIC). I had been programming for several years at that point and was already well beyond what the class covered. This was still the 80s, and the school didn't have a specialized computer teacher; since computers had keyboards, they recruited the typing teacher for the class. Because it was a computer class, at some point Borland sent her a Pascal compiler. She knew I was way more into computers than anyone else, so she gave it to me. I installed it at home and taught myself Pascal by writing programs in BASIC, then re-implementing them in Pascal.
When I got to college, I learned the "computer science" aspects of programming (e.g. data structures and algorithms) in C, then C++, and also learned things like COBOL, x86 assembly, and OpenGL. Along the way, I've taught myself other languages as needed: HTML, Perl, PHP, SQL, etc.
I think the idea of making programming a requirement is pretty stupid. I have taught many programming classes myself and have observed the same thing as every other programming instructor I've ever met: a few people pick it up instantly and advance far beyond the rest of the class in a very short time, a large number of people can do it at a basic level, and a few people just cannot get it at all. We should definitely have computer usage classes, especially for standard business applications, but what is far more important is teaching people how to think critically about information. We don't need any more anti-vaxxers or climate-change deniers.