Comment Alternatives to regulating AI -- recognizing irony (Score 1) 26

Lawrence Lessig wrote in Code 2.0 that human behavior can be shaped by at least four things:
* rules
* norms
* prices
* architecture

All are important in different ways and likely could be involved in shaping AI in a healthy way.

But that said, because AI is a technology produced through abundance and capable of producing more abundance, humans need a perspective (and an economics and politics) rooted in abundance to use it in healthy ways. Otherwise we risk creating all sorts of ironic situations, like the ones I discuss here:
https://pdfernhout.net/recogni...
"... Likewise, even United States three-letter agencies like the NSA and the CIA, as well as their foreign counterparts, are becoming ironic institutions in many ways. Despite probably having more computing power per square foot than any other place in the world, they seem not to have thought much about the implications of all that computer power and organized information to transform the world into a place of abundance for all. Cheap computing makes possible just about cheap everything else, as does the ability to make better designs through shared computing. ...
      There is a fundamental mismatch between 21st century reality and 20th century security thinking. Those "security" agencies [and also "financial" organizations] are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all.
      The big problem is that all these new war machines [and AI plans] and the surrounding infrastructure are created with the tools of abundance. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So, the scarcity-based political mindset driving the military uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by the mainstream."

See also James P. Hogan's 1982 sci-fi novel "Voyage from Yesteryear" for such a perspective shift, based on the implications of advanced technology producing abundance for all.
https://web.archive.org/web/20...

Or Theodore Sturgeon's 1956 short story "The Skills of Xanadu", which helped inspire hypertext and the web.
https://archive.org/details/pr...

Comment All IRONY (ftfy) (Score 1) 60

As I suggest here: https://pdfernhout.net/recogni...
"There is a fundamental mismatch between 21st century reality and 20th century security thinking. Those "security" agencies are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all."

Comment Will your career be founded on IRONY or not? (Score 2) 65

FTFY. Explanation: https://pdfernhout.net/recogni...
"The big problem is that all these new war machines and the surrounding infrastructure are created with the tools of abundance. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So, the scarcity-based political mindset driving the military uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by the mainstream.
        We the people need to redefine security in a sustainable and resilient way. Much current US military doctrine is based around unilateral security ("I'm safe because you are nervous") and extrinsic security ("I'm safe despite long supply lines because I have a bunch of soldiers to defend them"), which both lead to expensive arms races. We need as a society to move to other paradigms like Morton Deutsch's mutual security ("We're all looking out for each other's safety") and Amory Lovins' intrinsic security ("Our redundant decentralized local systems can take a lot of pounding whether from storm, earthquake, or bombs and would still keep working")."

Comment Thanks for your insights & modern war is ironic (Score 1) 194

To take your insights a step further, consider what I explain here: https://pdfernhout.net/recogni...
"Biological weapons like genetically-engineered plagues are ironic because they are about using advanced life-altering biotechnology to fight over which old-fashioned humans get to occupy the planet. Why not just use advanced biotech to let people pick their skin color, or to create living arkologies and agricultural abundance for everyone everywhere? ... The big problem is that all these new war machines and the surrounding infrastructure are created with the tools of abundance. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So, the scarcity-based political mindset driving the military uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by the mainstream. We the people need to redefine security in a sustainable and resilient way. Much current US military doctrine is based around unilateral security ("I'm safe because you are nervous") and extrinsic security ("I'm safe despite long supply lines because I have a bunch of soldiers to defend them"), which both lead to expensive arms races. We need as a society to move to other paradigms like Morton Deutsch's mutual security ("We're all looking out for each other's safety") and Amory Lovin's intrinsic security ("Our redundant decentralized local systems can take a lot of pounding whether from storm, earthquake, or bombs and would still would keep working")."

Comment Your best defense is to recognize irony (Score 1) 212

As I wrote a dozen years ago: https://pdfernhout.net/recogni...
"Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead? ...
      Likewise, even United States three-letter agencies like the NSA and the CIA, as well as their foreign counterparts, are becoming ironic institutions in many ways. Despite probably having more computing power per square foot than any other place in the world, they seem not to have thought much about the implications of all that computer power and organized information to transform the world into a place of abundance for all. Cheap computing makes possible just about cheap everything else, as does the ability to make better designs through shared computing. ...
        There is a fundamental mismatch between 21st century reality and 20th century security thinking. Those "security" agencies are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all. ...
      The big problem is that all these new war machines and the surrounding infrastructure are created with the tools of abundance. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So, the scarcity-based political mindset driving the military uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by the mainstream.
      We the people need to redefine security in a sustainable and resilient way. Much current US military doctrine is based around unilateral security ("I'm safe because you are nervous") and extrinsic security ("I'm safe despite long supply lines because I have a bunch of soldiers to defend them"), which both lead to expensive arms races. We need as a society to move to other paradigms like Morton Deutsch's mutual security ("We're all looking out for each other's safety") and Amory Lovins' intrinsic security ("Our redundant decentralized local systems can take a lot of pounding whether from storm, earthquake, or bombs and would still keep working"). ..."

Comment The problem is unrecognized irony, not AI (Score 2) 199

As I wrote a dozen years ago:
https://pdfernhout.net/recogni...
"The big problem is that all these new war machines and the surrounding infrastructure are created with the tools of abundance. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So, the scarcity-based political mindset driving the military [or commercial] uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by the mainstream."

This is similar to comments made on Slashdot years ago about 3D printing, suggesting that if we ever got food replicators there would still be mass starvation due to politics.

While it is always possible there will someday be a rogue AI that can't be stopped, the more immediate issue is humans wielding AI to concentrate wealth and political power -- trying to succeed in a social paradigm (including competing for access to the most desirable mates) that no longer makes much sense.

Contrast that with, say, James P. Hogan's novel "Voyage from Yesteryear", where people raised in a post-scarcity environment with a lot of AI and robotics move to a social model of earning respect through accomplishment:
https://web.archive.org/web/20...
"In the meantime, Earth went through a dodgy period, but managed in the end to muddle through. The fun begins when a generation ship housing a population of thousands arrives to "reclaim" the colony on behalf of the repressive, authoritarian regime that emerged following the crisis period. The Mayflower II brings with it all the tried and tested apparatus for bringing a recalcitrant population to heel: authority, with its power structure and symbolism, to impress; commercial institutions with the promise of wealth and possessions, to tempt and ensnare; a religious presence, to awe and instill duty and obedience; and if all else fails, armed military force to compel. But what happens when these methods encounter a population that has never been conditioned to respond?
      The book has an interesting corollary. Around about the mid eighties, I received a letter notifying me that the story had been serialized in an underground Polish s.f. magazine. They hadn't exactly "stolen" it, the publishers explained, but had credited zlotys to an account in my name there, so if I ever decided to take a holiday in Poland the expenses would be covered (there was no exchange mechanism with Western currencies at that time). Then the story started surfacing in other countries of Eastern Europe, by all accounts to an enthusiastic reception. What they liked there, apparently, was the updated "Ghandiesque" formula on how to bring down an oppressive regime when it's got all the guns. And a couple of years later, they were all doing it!"

Comment On transcending the irony of military AI (Score 1) 179

Something I wrote a dozen years ago: https://pdfernhout.net/recogni...
"The big problem is that all these new war machines and the surrounding infrastructure are created with the tools of abundance. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So, the scarcity-based political mindset driving the military uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by the mainstream.
      We the people need to redefine security in a sustainable and resilient way. Much current US military doctrine is based around unilateral security ("I'm safe because you are nervous") and extrinsic security ("I'm safe despite long supply lines because I have a bunch of soldiers to defend them"), which both lead to expensive arms races. We need as a society to move to other paradigms like Morton Deutsch's mutual security ("We're all looking out for each other's safety") and Amory Lovins' intrinsic security ("Our redundant decentralized local systems can take a lot of pounding whether from storm, earthquake, or bombs and would still keep working")."

Comment AI means the end of the free market as we know it (Score 1) 33

As with my sig: "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."

Yes, there may still be some exchange transactions. But the balance in the whole system will shift, especially towards more subsistence, gift, and planned transactions (since most human labor will no longer have much value given AI-powered robot slaves). And sadly there may also be more theft transactions if deeper issues of social equity are not addressed.

See also this document I put together over a decade ago:
https://pdfernhout.net/beyond-...
"This article explores the issue of a "Jobless Recovery" mainly from a heterodox economic perspective. It emphasizes the implications of ideas by Marshall Brain and others that improvements in robotics, automation, design, and voluntary social networks are fundamentally changing the structure of the economic landscape. It outlines towards the end four major alternatives to mainstream economic practice (a basic income, a gift economy, stronger local subsistence economies, and resource-based planning). These alternatives could be used in combination to address what, even as far back as 1964, has been described as a breaking "income-through-jobs link". This link between jobs and income is breaking because of the declining value of most paid human labor relative to capital investments in automation and better design. Or, as is now the case, the value of paid human labor like at some newspapers or universities is also declining relative to the output of voluntary social networks such as for digital content production (like represented by this document). It is suggested that we will need to fundamentally reevaluate our economic theories and practices to adjust to these new realities emerging from exponential trends in technology and society."

Comment Re:We need to rethink socio-econo paradigms for AI (Score 1) 113

What you speculated about is essentially a plot point of Marshall Brain's Manna story (previously linked), where most humans in the USA end up warehoused in "Terrafoam" public housing with the expectation they will soon be killed off.

And while not exactly what you outline, here is a video parable I made in 2011 on a related theme of the long-term perils of excessive wealth concentration (satirically taken to the ultimate extreme):
"The Richest Man in the World: A parable about structural unemployment and a basic income"
https://www.youtube.com/watch?...

Comment Two Faces of Tomorrow by James P. Hogan (Score 1) 113

https://web.archive.org/web/20...
"I set the story forty years into the next century, by which time an integrated global system is managing much of the world's affairs. However, proposals for a major upgrade involving new software that learns are causing serious questions to be asked about the degree of decision-making that it can be entrusted with. The trouble is that while the solutions that it comes up with are logically flawless, they are unconstrained by the kind of common sense that humans acquire through a lifetime of real-world experience--which on several occasions has almost resulted in catastrophe. One school of opinion argues that the only way to go is forward, accepting the risks and allowing the system to learn from experience in the same way that people did. "Besides, if anything really bad starts happening, we can always downgrade again or pull the plug." "How can you guarantee that it will always let you?" the opponents reply.
        The answer eventually agreed is to run a test on a world-in-miniature. One of the new space habitats is taken over for the experiment and equipped with a supersystem containing all the advanced capabilities proposed for incorporation into the global net. The system is programmed for self-preservation as its highest goal, introducing deliberately the faculty of a "survival instinct" that the critics have speculated could arise spontaneously. The scientists then begin "attacking" it in a series of escalating tests to find out what it's capable of. It's far from Earth, so anything unexpected will be isolated and contained. A strong military presence is included in the mini-world's population--just in case things should take a nasty turn. And if it gets out of hand, we can always evacuate the whole place and nuke it. But the System, of course, doesn't quite see things that way."

A deeper issue is the one discussed by Langdon Winner in his contemporaneous 1977 book "Autonomous Technology":
https://www.langdonwinner.com/...
"You [Winner] draw on Ellul to formulate what is at the same time a philosophical challenge and a profound anxiety: ÂThere can be no human autonomy in the face of technical autonomyÂ."

Related to that point, see also "The Skills of Xanadu" by Theodore Sturgeon (1956):
https://archive.org/details/pr...
"He remembered something Tanyne had said once, casually, about men and their devices: "Ever since there were human beings, there has been conflict between Man and his machines. They will run him or he them; it's hard to say which is the less disastrous way. But a culture which is composed primarily of men has to destroy one made mostly of machines, or be destroyed. It was always that way. We lost a culture once on Xanadu. Didn't you ever wonder, Bril, why there are so few of us here? And why almost all of us have red hair?" ... "We were billions once," said Tan surprisingly. "We were wiped out. Know how many were left? Three!"

That was the story that inspired Ted Nelson of Hypertext fame and others (and so indirectly the World Wide Web).

Comment We need to rethink socio-econo paradigms for AI (Score 2) 113

As with my sig, "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."

See also this document I put together a decade ago:
https://www.pdfernhout.net/bey...
"This article explores the issue of a "Jobless Recovery" mainly from a heterodox economic perspective. It emphasizes the implications of ideas by Marshall Brain and others that improvements in robotics, automation, design, and voluntary social networks are fundamentally changing the structure of the economic landscape. It outlines towards the end four major alternatives to mainstream economic practice (a basic income, a gift economy, stronger local subsistence economies, and resource-based planning). These alternatives could be used in combination to address what, even as far back as 1964, has been described as a breaking "income-through-jobs link". This link between jobs and income is breaking because of the declining value of most paid human labor relative to capital investments in automation and better design. Or, as is now the case, the value of paid human labor like at some newspapers or universities is also declining relative to the output of voluntary social networks such as for digital content production (like represented by this document). It is suggested that we will need to fundamentally reevaluate our economic theories and practices to adjust to these new realities emerging from exponential trends in technology and society."

That was inspired in part by Marshall Brain's writing:
https://marshallbrain.com/mann...
https://marshallbrain.com/robo...

Comment Is OpenAI engaging in "self-dealing"? (Score 1) 44

An essay I wrote twenty years ago (informed in part by Slashdot discussions long ago): https://pdfernhout.net/open-le...
"Foundations, other grantmaking agencies handling public tax-exempt dollars, and charitable donors need to consider the implications for their grantmaking or donation policies if they use a now obsolete charitable model of subsidizing proprietary publishing and proprietary research. In order to improve the effectiveness and collaborativeness of the non-profit sector overall, it is suggested these grantmaking organizations and donors move to requiring grantees to make any resulting copyrighted digital materials freely available on the internet, including free licenses granting the right for others to make and redistribute new derivative works without further permission. It is also suggested patents resulting from charitably subsidized research research also be made freely available for general use. The alternative of allowing charitable dollars to result in proprietary copyrights and proprietary patents is corrupting the non-profit sector as it results in a conflict of interest between a non-profit's primary mission of helping humanity through freely sharing knowledge (made possible at little cost by the internet) and a desire to maximize short term revenues through charging licensing fees for access to patents and copyrights. In essence, with the change of publishing and communication economics made possible by the wide spread use of the internet, tax-exempt non-profits have become, perhaps unwittingly, caught up in a new form of "self-dealing", and it is up to donors and grantmakers (and eventually lawmakers) to prevent this by requiring free licensing of results as a condition of their grants and donations."

https://en.wikipedia.org/wiki/...
"Self-dealing is the conduct of a trustee, attorney, corporate officer, or other fiduciary that consists of taking advantage of their position in a transaction and acting in their own interests rather than in the interests of the beneficiaries of the trust, corporate shareholders, or their clients. ... Where a fiduciary has engaged in self-dealing, this constitutes a breach of the fiduciary relationship. The principal of that fiduciary (the person to whom duties are owed) may sue and both recover the principal's lost profits and disgorge the fiduciary's wrongful profits. In the United States, repeated self-dealing by a private foundation can result in the involuntary termination of its tax-exempt status."

Comment Re:Why luxury safer electric cars should be free (Score 1) 148

Thanks. Great insight: "Having to use your military gets expensive fast. Having a military so powerful that nobody challenges it is less expensive than a cheaper military that gets challenged, thus having to be used."

Neat idea about the personal rapid transit system! Might work for packages too.

You're right about oil and electricity. Other than fuel for cars, oil vs. energy efficiency is (or was) a bigger issue for home heating. With enough insulation of the right kind (and air-to-air heat exchangers), you don't need a furnace at all.
"No Furnaces but Heat Aplenty in âPassive Housesâ(TM)
https://www.nytimes.com/2008/1...
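As a rough sketch of why no furnace is needed, here is a heat-balance guesstimate in Python. Every figure is a number I am assuming for illustration, not something taken from the article:

# Back-of-the-envelope heat balance for a superinsulated "passive" house.
# Every figure below is an assumption for illustration, not from the article.

envelope_area_m2 = 400.0      # walls + roof + floor + windows combined
avg_u_value = 0.15            # W per square meter per kelvin, a typical passive-house target
delta_t = 20.0 - (-5.0)       # 25 K between indoors and a cold winter day

transmission_loss_w = envelope_area_m2 * avg_u_value * delta_t    # about 1500 W

# An air-to-air heat exchanger recovers most ventilation heat; assume the
# leftover ventilation loss is about 10% of the transmission loss.
ventilation_loss_w = 0.10 * transmission_loss_w

internal_gains_w = 800.0      # people, cooking, appliances, sunlight (rough guess)

net_heating_need_w = transmission_loss_w + ventilation_loss_w - internal_gains_w
print("Net heating need: about %.0f W" % net_heating_need_w)      # roughly 850 W

With losses that small, body heat, appliances, sunlight, and the heat-recovery ventilator cover most of the load, which is the whole passive-house idea.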

Again, if people had paid the true cost of oil, they would have switched away from home oil heat a long time ago.
