That is Kurzweil's solution. His 2017 DVD set Singularity, which describes it, is oddly not listed at Amazon, but it is available on eBay: https://www.ebay.com/sch/i.html?_nkw=singularity+dvd+2017.
The idea has merit, and it may even be the best idea available, but I don't think it will prove sufficient. I'm an AI Doomer. Here's my 11-minute explanation from 2014 of why: https://www.youtube.com/watch?v=Tk-0nu4fg1w.
There is a conceptual similarity between dataflow programming, which LabVIEW supports, and functional programming. While it's understandable that the developers were unfamiliar with functional programming in the 1980s, there is no good reason why LabVIEW still hasn't adopted it. See https://forums.ni.com/t5/LabVIEW-Idea-Exchange/For-Each-Element-in-Map/idi-p/4219633.
Instead, LabVIEW is stuck with clunky while-loop frames with a stop-sign node to wire up in the middle; broadcast events, symbolized by satellite dishes, whose destinations are anyone's guess because there is no command-line grep; and no easy way to maintain and pass around global state (the way Scala implicits do).
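For comparison, the for-each-over-a-map operation requested in that LabVIEW Idea Exchange thread is a one-liner in a language with functional constructs. A sketch in TypeScript (the map contents here are made up for illustration):

```typescript
// A map of channel names to raw readings (illustrative data).
const readings = new Map<string, number>([
  ["ch0", 1.5],
  ["ch1", 2.5],
]);

// For-each over the map in one expression, producing a new map of
// scaled values — no loop frame, no stop node, no shift registers.
const scaled = new Map(
  [...readings].map(([channel, value]) => [channel, value * 2] as const)
);

console.log(scaled.get("ch1")); // → 5
```

The same shape works for filtering or reducing over the map; the dataflow is expressed by function composition rather than by wiring a loop structure.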
Inferred (yet strong) typing solves this another way (TypeScript, Scala, many others):
const processingComplete = true
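A minimal sketch of what inference buys, in TypeScript (the function and variable names are illustrative): the compiler deduces the types from the initializers, then enforces them exactly as if they had been declared explicitly.

```typescript
const processingComplete = true; // inferred as the literal type `true`
let retries = 0;                 // inferred as number

// The inferred types are checked like explicit ones:
// retries = "three";            // compile error: string is not assignable to number

function finish(done: boolean): string {
  return done ? "done" : "pending";
}

console.log(finish(processingComplete)); // → "done"
```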
1. Wikipedia handily has a table listing lightweight distributions. https://en.wikipedia.org/wiki/Light-weight_Linux_distribution
2. Within that table, the oldest processor supported is typically the 486. That's because the 486 introduced the atomic CMPXCHG instruction, which the Linux kernel relies upon. I assume the one 386 distribution listed, Tiny SliTaz, uses a workaround built on the XCHG instruction, such as https://lists.freebsd.org/pipermail/freebsd-current/2009-April/005533.html.
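The semantics CMPXCHG supplies is compare-and-swap: atomically replace a word only if it still holds an expected value, returning the old value either way. The same primitive is exposed by JavaScript's Atomics API, so it can be sketched in TypeScript (this illustrates the operation's contract, not the kernel's actual code):

```typescript
// A shared 32-bit slot, as CMPXCHG would operate on.
const shared = new Int32Array(new SharedArrayBuffer(4));
shared[0] = 5;

// CAS: store 9 only if the slot still holds 5; returns the prior value.
const before = Atomics.compareExchange(shared, 0, 5, 9);
console.log(before, shared[0]); // → 5 9   (succeeded)

// A second CAS still expecting 5 fails, because the slot now holds 9.
const before2 = Atomics.compareExchange(shared, 0, 5, 7);
console.log(before2, shared[0]); // → 9 9  (slot unchanged)
```

On a 386, which lacks CMPXCHG, the usual workaround is to emulate this with a lock built from the plain atomic XCHG, as the linked FreeBSD thread discusses.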
No amount of careful planning will ever replace dumb luck.