I think a key part is simple: good story quality. The discussions are sometimes interesting - and sometimes not - but if the stories start out higher-quality, the follow-up discussions are more likely to be better too.
In the longer term, the system for entering text is... quirky. Has anyone considered using Markdown? Yes, Markdown processors vary, but lots of people already know Markdown (e.g., via GitHub), and specs like CommonMark and libraries like Redcarpet make supporting it fairly painless.
Good luck!
The point of the article How are students learning programming in a post-Basic world? isn't that we should all use Basic. The point is that there's a need for a single 'starter' language so that people with no experience can get started. That language should come with practically all computers, should be portable enough that programs written in it run on many different computers, should be immediately accessible so beginners can quickly learn the basics, and should be useful enough that beginners can create useful programs.
There are a number of reasonable contenders, including Python, Ruby, and Java. A version of Ruby comes with macOS, but none of these 'just comes' with the computer regardless of which OS you run - so in most cases, before you even get started, you have to explain how to download and install something. Not ideal. Java is what a lot of people use professionally, but it takes more time to get started when you know nothing. Python has many advantages for simplicity, but in many cases you still need to install it.
Perhaps the dark horse here is JavaScript (ES6). JavaScript is available almost everywhere, and people can get started quickly. As a first language, JavaScript's unusual approach to OO programming (prototype-based inheritance) has probably held it back, but ES6 adds standard class notation, and that might make it much easier to use as a starter language.
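For example, here's a minimal sketch of the ES6 class notation in question (the Greeter class is my own illustration, not from any article):

```javascript
// ES6 class syntax: the kind of notation that makes JavaScript
// friendlier as a first language than prototype manipulation.
class Greeter {
  constructor(name) {
    this.name = name;
  }
  greet() {
    return `Hello, ${this.name}!`;
  }
}

console.log(new Greeter("world").greet()); // prints "Hello, world!"
```

A beginner can read this without ever hearing the word "prototype", which is exactly the point.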
If you're worried about compromised CPUs being used to compile executables that are used by others, then reproducible builds are a great countermeasure. Just run the reproducible build on many different CPUs, and compare the results to ensure they are identical (for a given version of the source and tools). The more variation, the less likely that there is a subversion. If what you're compiling is itself a compiler, then use diverse double-compiling (DDC) on many CPUs.
If you're worried that an INDIVIDUAL may end up with a compromised CPU, then yes, that's much harder to counter. On some systems you can isolate the machine (no network traffic, etc.). Still, an adversary has to send packets to subvert a specific system, and every time they do so they risk being detected, so this is far less likely to be used for bulk surveillance... it would more likely be one well-resourced organization (e.g., a government) working against another well-resourced organization.
If scholar just means "one who studies", then obviously anyone who studies a religious text for a long time BECAUSE they're a believer is by definition a scholar. I don't think that's what you mean.
If we change "scholar" to "scientist", it's quite clear that scientist is not synonymous with atheist. Pew research found that "just over half of scientists (51%) believe in some form of deity or higher power; specifically, 33% of scientists say they believe in God, while 18% believe in a universal spirit or higher power". Besides, many would say that science requires repeatable experiments, and many truths simply aren't repeatable (e.g., history).
Most scholars don't think that the Talpiot Tomb has anything to do with Jesus. For example, Géza Vermes says the arguments for the Talpiot tomb are not "just unconvincing but insignificant" (see the Wikipedia page). Also, Christian theology does not depend on whether or not the Shroud of Turin is real.
I'm not Muslim, but even the summary notes a perfectly reasonable explanation - the parchment could be an old one. And frankly, I'm skeptical that the carbon dating is that precise; carbon dating depends on a lot of assumptions that can easily be false in specific circumstances. (Yes, radioactivity decreases at a fixed rate... but you have to make BIG assumptions about the starting value.) So while this makes for a good headline, the actual evidence is currently rather worthless.
What about the disastrous SwiftKey vulnerability? It makes Samsung Android systems vulnerable too. Samsung said they'd fix it back in June, but we still have no patch.
When buying an Android phone: Measure how many days it takes from a vulnerability's report (at least its public report) until it's patched on phones customers are already using. Focus on phones more than 2 years old, since your phone will be that age someday. Then: Don't buy from unresponsive makers. I suspect that if a few buying guides included those numbers, some manufacturers and service providers would start paying attention.
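The metric itself is trivial to compute - which is part of why buying guides could easily include it. A sketch (the function name and dates are purely illustrative):

```javascript
// Hypothetical patch-latency metric for a buying guide: days from a
// public vulnerability report to a patch reaching customers' phones.
const MS_PER_DAY = 24 * 60 * 60 * 1000;

function patchLatencyDays(reportedIso, patchedIso) {
  // ISO date strings parse as UTC midnight, so the difference is exact.
  return Math.round((new Date(patchedIso) - new Date(reportedIso)) / MS_PER_DAY);
}

console.log(patchLatencyDays("2015-06-16", "2015-08-13")); // prints 58
```

Averaged over a maker's recent vulnerabilities - and computed separately for their older phones - that one number would make responsiveness directly comparable across vendors.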
"How would an experienced developer get these problems in the first place?"
A lot of projects - even ones run by experienced developers - do not follow widely-accepted best practices... and that is a problem!
A remarkable number of OSS projects fail to have a public source control system (#2). That includes many established projects that everyone depends on. In fact, a number of OSS projects - and projects that people THINK are OSS but are not (because they have no license) - fail many of these points. It's not that Red Hat's internal processes are immature; Tom was trying to bring in software from someone else (Google in this case) and was fed up with the poor practices of people who should know better.
Yes, #7 refers to a best practice (let people pick their install directory) that's been around for at least 20 years and probably much longer, but it's still widely NOT followed.
Anyway, that's Tom's point; there are a lot of widely-accepted best practices that are NOT followed, and that needs to change.
"Money is the root of all money." -- the moving finger