>We are at least 50 years off from strong AI as in human level common sense about the world.
There's no way to justify that number. Judging by your assessment of the status quo, you aren't keeping up with what's being researched on the frontier, so don't make statements like that. We might have strong AI in 10 years or less; we might not. We still need algorithms to tame the NP-complete problems of entailment, and better knowledge representations, but that's the gap, not a 50-year wall.
Common sense isn't spooky; it's a set of patterns and principles that are not readily codified, because the higher-level implications of our reality are hard to derive without actually seeing and sensing them. It's unlikely there is a database with an exhaustive set of facts such as "If A is next to B, then B is next to A," but this sort of thing is readily derivable from the senses: whenever we observe (A next to B) we also observe (B next to A), and so common sense principles come about by reinforcement. It's not inconceivable to devise a machine that learns billions of common sense rules about the world, given good algorithms and senses.
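To make the reinforcement idea concrete, here is a minimal sketch (toy data and names are my own invention, not any real system): each "scene" is a set of sensed relational facts, and a rule like "next to is symmetric" is induced when the mirrored fact co-occurs with the original often enough.

```python
# Hypothetical toy observations: each scene is a set of sensed facts,
# written as (relation, subject, object) triples.
scenes = [
    {("next_to", "cup", "plate"), ("next_to", "plate", "cup")},
    {("next_to", "car", "tree"), ("next_to", "tree", "car")},
    {("on", "cup", "table")},  # "on" should NOT come out symmetric
    {("next_to", "dog", "cat"), ("next_to", "cat", "dog")},
]

def symmetry_confidence(scenes, relation):
    """Fraction of observed (rel, a, b) facts whose mirror (rel, b, a)
    co-occurs in the same scene -- a crude reinforcement signal for
    inducing the rule 'relation is symmetric'."""
    hits = total = 0
    for scene in scenes:
        for (rel, a, b) in scene:
            if rel != relation:
                continue
            total += 1
            if (rel, b, a) in scene:
                hits += 1
    return hits / total if total else 0.0

print(symmetry_confidence(scenes, "next_to"))  # 1.0 -> induce "symmetric"
print(symmetry_confidence(scenes, "on"))       # 0.0 -> leave it alone
```

Scale the same counting idea over billions of sensed triples and many rule templates (symmetry, transitivity, type constraints) and you get the flavor of what "learning common sense by reinforcement" could mean, without anything spooky.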
>You want to make something out of (name any substance)? There are only a few special cases where any government approval is required, and patents are NEVER required.
If you want any capital to operate with, you need security in the profitability of producing it. Patents provide that security, so they are effectively always necessary unless you want to throw money away. So no, you don't want to skip steps that will quickly leave your company bankrupt. And every business sector has codes, standards, and regulations you need to abide by.
We don't know one millionth of one percent about anything.