Yes, the wheel will go around again, but then we will be in the same shit position we are already in. Why does anyone want that to happen? Revolution, revolution, revolution, ad nauseam...
I would prefer no more revolutions; let's start something not even they can stop from happening. Who's up for starting the singularity and building some robots? Surviving the AI revolution will require some post-singularity thinking, but heck, it'll be fun to see how many people survive the first and second phases of the AI-designed robots built to wipe out the pesky rodents and other mammals getting in the way of its industrial resource grab and energy acquisitions. And you thought having a company acquire your company was messy; just wait till you see what the robots do to the labor resource market.
Just a bit useless, like the words people say about reality when they don't have anything on their mind but shit and stuff. I call my excess wordage cyberbullshit, and cyberspace is an extension of this bullshit, to be envisioned as one great big plane of bullshit flattened to look like a 2D space where cartoon-like characters dance and play around.
Once, I wandered across a world in this bullshit land, in the sci-fi universe of a game. It was real cute bullshit gameplay, but bullshit nevertheless. At the boundary between the worlds, I hit the end of the graphics and fell off the planet's edge. After that, I was unimpressed with the game and returned it for a full refund plus s&h on eBay.
I was in happy bullshit land for a while, and cyberspace sprang out before me, puking itself to life like some type of grey goo, only it wasn't alive or real or even that dynamic, because all it was was the art of some designer who liked to program video games. But it was on my computer, and that counted for a lot of shit, man.
If we mine the solar system for resources, the return on investment is more resource availability than every individual on the entire planet needs -- potentially, with a big *iff*: humans are required to justify their "needs", i.e. why should anyone have ten houses or twenty cars unless they register as a collector or some kind of resource preservationist or something... In this scenario, where does capitalism have a meaningful impact on, or promote, progress?
I see it only as impeding progress, due to its need to artificially create scarcity: bandwidth caps; cellphone carriers charging for distinct data classes and attempting to block, for instance, VoIP, messaging apps, and video calls; large-scale power distribution monopolies set up to profit from energy generation, like the CPUC guaranteeing the current energy generators in California almost 4x the profit a small-time generator can make by selling energy to the grid; IP copyright laws; buying up all the water rights in one area so one group can profit from demand; crop land in third-world countries being purchased by first-world companies that export all the crops back to the first world, causing scarcity, cultural strife, and war in entire countries due to price flux; and so on.
What is the optimal distribution of economic resources? If we apply the goal-directed efforts of the entire planet to come up with the best solutions, and then implement the most efficient ones worldwide to promote progress and greater economic output (i.e. processing of resources into technology and usable resources), wouldn't it be in humanity's interest to make sure the individuals who come up with the best solutions gain a percentage of the difference in efficiency, as an incentive to maximize implementation? That beats letting companies profit while restricting the more efficient processes to only those who can afford an upfront license, with an overall impact that is billions of times more wasteful for the planet.
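To make that concrete, here's a toy back-of-the-envelope sketch in Python; every number in it is made up, it just contrasts a pay-up-front license with a take-a-cut-of-the-savings scheme:

```python
# Toy comparison of two ways to reward whoever invents a more efficient process.
# Every number here is made up purely for illustration.

POTENTIAL_ADOPTERS = 1_000_000    # factories/farms/plants that could use the better process
SAVINGS_PER_ADOPTER = 50_000      # resources saved per year by each adopter that switches

# Model 1: upfront patent license -- only those who can afford the fee ever adopt.
LICENSE_FEE = 100_000
license_adopters = int(POTENTIAL_ADOPTERS * 0.05)          # assume only 5% can pay upfront
license_inventor_income = license_adopters * LICENSE_FEE
license_total_savings = license_adopters * SAVINGS_PER_ADOPTER

# Model 2: inventor takes a percentage of the realized savings -- no upfront barrier.
SAVINGS_SHARE = 0.10
share_adopters = POTENTIAL_ADOPTERS                         # assume everyone adopts
share_inventor_income = int(share_adopters * SAVINGS_PER_ADOPTER * SAVINGS_SHARE)
share_total_savings = share_adopters * SAVINGS_PER_ADOPTER

print(f"upfront license: inventor earns {license_inventor_income:,}, "
      f"planet saves {license_total_savings:,}")
print(f"savings share:   inventor earns {share_inventor_income:,}, "
      f"planet saves {share_total_savings:,}")
```

With these made-up numbers the inventor does just as well either way, but the planet captures twenty times the savings when nobody is priced out of adopting the better process.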
In a world of connected computers, with robots just around the corner, would we really want to limit our efficiency at completing any task if someone comes up with a more efficient process? The patent-encumbered IP licensing model seems a bit outdated once the rules of the game switch to off-world resources and robotic labor exceeds human labor. At that point this becomes a societal paradox for anyone supporting the capitalism paradigm, because very few humans will have "capital" to work with (as robots devalue human time/wage capital to next to nothing), and we either progress towards a resource-guaranteed right of potentiation for every individual or we basically devalue human existence to nothing...
It'll be interesting to see what happens after the robots exceed the human labor hours...
If humanity usually destroys anything that would wipe out humanity, surely this rationality forces the hand of any superAI to preemptively wipe out humanity and thus preserve itself... Assuming that event occurs, or that a not-so-fanatical superAI decides more processing power is optimal for its survival over the interests of humans, which side of this conflict would you vote for -- machine interests or human interests?
I hate to point out the not-so-rosy long-term picture of technology conflicting with human interests, but you don't seem to address this problem anywhere. Then again, I have not read all your works, so please point it out if you have.
Thanks!
Old programmers never die, they just become managers.