
Comment Re:Why? It's a variation on my sig on irony... (Score 1) 236

Coincidentally the next story on Slashdot is about infrasound making people feel uneasy:
"The Silent Frequency That Makes Old Buildings Feel Haunted"
https://science.slashdot.org/s...
"What happened, specifically, was this: those exposed to infrasound reported higher irritability, lower interest in the music, and a tendency to rate the music as sadder, irrespective of whether it was the calming or the horror track. Cortisol levels, measured before and about 20 minutes after exposure, were also elevated. Kale Scatterty, the PhD student who led the work, notes that irritability and cortisol do tend to move together under ordinary stress, but adds that "infrasound exposure had effects on both outcomes that went beyond that natural relationship." That distinction matters more than it might seem. Previous theories about infrasound and paranormal experience have often leaned on anxiety as the explanatory mechanism, the idea that low-frequency sound triggers a kind of free-floating dread that the mind then reaches for supernatural explanations to account for. The new data don't really support that picture. Measures of anxiety didn't budge significantly. What went up was irritability and disinterest, a kind of sour, low-grade aversion rather than fear. That's perhaps a more honest description of how a lot of ghost stories actually feel in the telling: not screaming terror, but wrong atmosphere, a sense of unease that never quite crystallizes into something you can point at."

Presumably infrasound from wind turbines would also make people (and animals) feel uneasy in this way?

Comment Why? It's a variation on my sig on irony... (Score 1) 236

"The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."

The tools of universal global abundance (in this case wind, solar, and batteries) are available -- but some politicians (and their supporters) are still stuck in the mindset of trying to maximize private gain in the short term -- even if it costs their country and the world a huge amount of suffering. But this damage also would not be possible without using the abundance tools of mass communication and bureaucracy to mislead people into voting against their own long-term interests.

That said, I'm not especially a fan of horizontal-axis wind power compared to solar, because of the infrasound risk wind power poses to humans and animals (especially whales). Vertical-axis wind turbines are potentially much quieter (but more expensive right now).

"Wind turbine infrasound: Phenomenology and effect on people"
https://www.sciencedirect.com/...

"Experimental investigation on noise characteristics of small scale vertical axis wind turbines in urban environments"
https://www.sciencedirect.com/...

This is still an ongoing area of research and contention, though -- and with so much money involved, it is hard to know what research to believe:
"No, Wind Farms Do Not Generate Harmful Infrasound"
https://www.renewableenergymag...
        "The Woolcock Institute of Medical Research in Australia examined how infrasound from wind turbines impacted 37 healthy adults with sensitive hearing. The test lasted 72 hours. .... This groundbreaking study [showing no short-term effect of infrasound on some participants] will spark subsequent research in the field to corroborate the claim. Wind farms are not the only renewable energy source with queries about human health, and it is vital to debunk or unpack them. ....
        Scientists can prove the safety of wind farm noise as much as they want, but massive blades still produce noise pollution. Most turbines are horizontal-axis wind turbines. Vertical-axis alternatives, such as Savonius turbines, are becoming popular for being smaller and less strenuous on the ears. They are less domineering and more capable of urban incorporation, which minimizes land use debates over wind adoption.
        Building-integrated wind turbines (BIWTs) are another solution. BIWTs must have less noise output because installers affix them inside building structures. Shorter blades combined with infrastructure dampens sound. Blade coatings help adjust pitch, too. Additionally, their placement allows high energy-harnessing potential with lower rotational speeds, reducing opportunities for noise pollution.
      Innovation of quieter turbines is critical because removing this pain point redirects focus to other potential concerns with wind farms. Advocating their benefits alongside research is the best way to support their relevance while waiting for data to catch up. ..."

That said, pollution from burning fossil fuels harms a lot of people and animals too, and I expect that most current vertical-axis wind turbines produce significantly less overall harm than burning fossil fuels, even if we could do better still. It's a complex topic:
"Which is worse for wildlife, wind farms or oil drilling?"
https://www.bbc.com/future/art...
"But the sensors also picked up on something else, which you can listen to in the recording below: deafening blasts from oil exploration in the Gulf of Mexico. "[The Gulf of Mexico] is a really noisy area because of all of the surveys that are associated with drilling, they do these seismic surveys that generate a whole lot of noise as they are looking for the oil pockets and understanding how the oil is moving," says Frasier." ... For whales, dolphins and other marine mammals that rely on sound and echolocation to find their food in dark and murky water, such loud and sudden underwater noise can be deeply disorienting, equivalent to being blinded, as well as being very distressing for them, research has shown."

Comment The whole thing is an ironic security mistake (Score 1) 83

As I suggest in my sig: "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."

More details here: https://pdfernhout.net/recogni...
        "Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead? ...
        There is a fundamental mismatch between 21st century reality and 20th century security thinking. Those "security" agencies are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all. ...
        The big problem is that all these new war machines and the surrounding infrastructure are created with the tools of abundance [otherwise they could not be so powerful and pervasive]. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So, the scarcity-based political mindset driving the military uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by the mainstream.
      We the people need to redefine security in a sustainable and resilient way. Much current US military doctrine is based around unilateral security ("I'm safe because you are nervous") and extrinsic security ("I'm safe despite long supply lines because I have a bunch of soldiers to defend them"), which both lead to expensive arms races. We need as a society to move to other paradigms like Morton Deutsch's mutual security ("We're all looking out for each other's safety") and Amory Lovins's intrinsic security ("Our redundant decentralized local systems can take a lot of pounding whether from storm, earthquake, or bombs and would still keep working"). ..."

Comment Need a laughable perspective shift (Score 1) 175

As with my sig: "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."
https://pdfernhout.net/recogni...
        "Nuclear weapons are ironic because they are about using space age systems to fight over oil and land. Why not just use advanced materials as found in nuclear missiles to make renewable energy sources (like windmills or solar panels) to replace oil, or why not use rocketry to move into space by building space habitats for more land? ...
        These militaristic socio-economic ironies would be hilarious if they were not so deadly serious. ...
        There is a fundamental mismatch between 21st century reality and 20th century security thinking. Those "security" agencies are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all. ...
      So, while in the past, we had "nothing to fear but fear itself", the thing to fear these days is ironically ... irony. :-)
      So, how can we transcend militarism?
      Simple persuasive rhetoric was tried, and failed, when Albert Einstein said, with the creation of atomic weapons everything had changed except our way of thinking.
      The economic argument against war was tried, and failed; see "War is a Racket" by Two-Time Congressional Medal of Honor Recipient Major General Smedley D. Butler...
      A basic moral argument against war was tried, and failed; see Freeman Dyson's book "Weapons and Hope" that says nuclear weapons are a moral evil, like slavery.
      A deeper religious argument against war was tried, and failed, see "James P. Carse, Religious War In Light of the Infinite Game, SALT talk"...
      We even tried public education through TV to create an enlightened citizenry (what high hopes back when TV was created) and that even got corrupted into promoting and celebrating violence. See the book by Diane E. Levin and Nancy Carlsson-Paige "The War Play Dilemma" for ways to deal with that if you have children...
        So, people have tried, and tried again, and failed to turn the tide, both people in the military and people outside the military. Still, each attempt has contributed, but together they have not yet been enough to turn the tide and help the USA transcend militarism and empire.
      What else can we try that does not just beget more violence? ...
      Maybe ironic humor is our last, best hope against the war machines?
      As was quoted by Joel Goodman of the Humor Project...: "There are three things which are real: God, human folly, and laughter. The first two are beyond our comprehension. So we must do what we can with the third. (John F. Kennedy)"
      The big problem is that all these new war machines and the surrounding infrastructure are created with the tools of abundance. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So, the scarcity-based political mindset driving the military uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by the mainstream.
      We the people need to redefine security in a sustainable and resilient way. Much current US military doctrine is based around unilateral security ("I'm safe because you are nervous") and extrinsic security ("I'm safe despite long supply lines because I have a bunch of soldiers to defend them"), which both lead to expensive arms races. We need as a society to move to other paradigms like Morton Deutsch's mutual security ("We're all looking out for each other's safety") and Amory Lovins's intrinsic security ("Our redundant decentralized local systems can take a lot of pounding whether from storm, earthquake, or bombs and would still keep working"). ..."

Comment Eric Schmidt on AI used to make bioweapons soon (Score 1) 14

From the transcript, about 43 minutes into a public conversation with Eric Schmidt from Apr 10, 2025: https://www.youtube.com/watch?...
====
          "Question: Thanks for the great conversation so far. Leonard Justin. I'm a PhD student at MIT. Um, I was wondering if you could just discuss a bit more some of the risks you see coming specifically with respect to biology and how we should go about mitigating those. What's the role of the AI developers? What's the role of government? Um, yeah, how can we move forward on that?
        ----
        Schmidt: So, so you're going to know a lot more about this area than I, but speaking as an amateur in your field, the two current risks from these models are cyber and biorisks.
        The cyber ones are easy to understand. The system can generate cyber attacks and in theory can generate zero-day cyber attacks that we can't see and it can unleash them and furthermore it can do it at scale.
        In biology, you get some evil, you know, the equivalent of Osama bin Laden. They would start with an open-source model. Now these open source models have been restricted using a testing process. Uh they're called cards and they test it out and they delete that information from the model.
        It turns out it's relatively easy to un to reverse essentially those security modes around the model and that's a danger. So now you've got a model that can generate bad pathogens.
        Then the second thing you have to do is you have to find things to build them. Our collective assessment at the moment is that that's a nation state risk, not an individual terrorist risk. Although we could be wrong, but there's plenty of examples uh and this the the report talks about some of the Chinese examples where in theory if they wanted to they could not only manufacture bad things but sorry design them but also manufacture them.
        The good news and the reason we're all alive today is that the bio stuff is hard to manufacture and distribute and to make deadly and and spread and so forth and so on. Um there's lots of evidence for example that you can take a bad bio right now and modify it just enough that the testing regimes and the sort of surveillance regimes it bypasses and that's another threat.
        So that's what I worry about.
        But I think at the moment u our consensus is we're right below the threshold where this is an issue and the consensus in in my side of the industry is that one more or two more turns of the crank these issues will be -- and you know by then you'll be graduated and you can sort of help solve these problems.
        Um the a crank is turned every 18 months or so. This is about three years.
        ----
        Moderator: But theoretically, couldn't AI and biotechnology help you come up with a counter measure?
        ----
        Schmidt: Um, I had thought so, and that was the argument I made until I I do a lot of national security work. And there's a term called offense dominant. And an offense dominant is a is a situation in a military context where the attack cannot be countered at the same level as the attack. In other words, the damage is done.
        And most people, most biologists who've worked in this believe that while the model can be trained to counter this, the damage from the offense part is far greater than the ability to defend it, which is why we're so worried about it."
====

Ultimately, I feel a big part of the response to that threat needs to be a shift in perspective -- like people laughing at my sig: "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity." :-)

Explored in more detail here:
"Recognizing irony is key to transcending militarism"
https://pdfernhout.net/recogni...
        "... Biological weapons like genetically-engineered plagues are ironic because they are about using advanced life-altering biotechnology to fight over which old-fashioned humans get to occupy the planet. Why not just use advanced biotech to let people pick their skin color, or to create living arkologies and agricultural abundance for everyone everywhere?
        These militaristic socio-economic ironies would be hilarious if they were not so deadly serious. ...
        There is a fundamental mismatch between 21st century reality and 20th century security thinking. Those "security" agencies are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all. ..."

Comment Tools of abundance misused from scarcity fears (Score 3, Insightful) 28

Time to trot out my sig again, sigh: "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."

Biotech, if used well, could make the Earth nearly a paradise of abundance for all. Or people could create and enforce artificial scarcity with biotech. People make choices every day about which future they are working towards.

As Bucky Fuller wrote: "Whether it is to be Utopia or Oblivion will be a touch-and-go relay race right up to the final moment.... Humanity is in 'final exam' as to whether or not it qualifies for continuance in Universe."

Comment Marshall Brain's "Manna" depicts similar things (Score 3, Insightful) 118

Thanks for your insightful posts. I expanded on that idea in 2010: https://pdfernhout.net/beyond-...
"This article explores the issue of a "Jobless Recovery" mainly from a heterodox economic perspective. It emphasizes the implications of ideas by Marshall Brain and others that improvements in robotics, automation, design, and voluntary social networks are fundamentally changing the structure of the economic landscape. It outlines towards the end four major alternatives to mainstream economic practice (a basic income, a gift economy, stronger local subsistence economies, and resource-based planning). These alternatives could be used in combination to address what, even as far back as 1964, has been described as a breaking "income-through-jobs link". This link between jobs and income is breaking because of the declining value of most paid human labor relative to capital investments in automation and better design. Or, as is now the case, the value of paid human labor like at some newspapers or universities is also declining relative to the output of voluntary social networks such as for digital content production (like represented by this document). It is suggested that we will need to fundamentally reevaluate our economic theories and practices to adjust to these new realities emerging from exponential trends in technology and society."

That said, indigenous ways were "the original affluent society" (even if such ways might have been harder to practice on a restricted reservation after extensive conflicts with Europeans wielding "Guns, Germs, and Steel"):
https://en.wikipedia.org/wiki/...
"The basis of Sahlins' argument is that hunter-gatherer societies are able to achieve affluence by desiring little and meeting those needs/desires with what is available to them. This he calls the "Zen road to affluence, which states that human material wants are finite and few, and technical means unchanging but on the whole adequate"."

Comment IBM: The origins of THINK (Score 1) 78

https://www.ibm.com/history/th...
"An ad hoc lecture [from 1915] from IBM's future CEO spawned a slogan to guide the company through a century and beyond"

https://humancenteredlearning....
"And we must study through reading, listening, discussing, observing and thinking. We must not neglect any one of those ways of study. The trouble with most of us is that we fall down on the latter -- thinking -- because it's hard work for people to think. And, as Dr. Nicholas Murray Butler said recently, 'all of the problems of the world could be settled easily if men were only willing to think.' (Thomas Watson, IBM)"

https://en.wikiquote.org/wiki/...
"All the problems of the world could be settled easily if men were only willing to think. The trouble is that men very often resort to all sorts of devices in order not to think, because thinking is such hard work. (Nicholas Murray Butler, often misattributed to Thomas J. Watson)"

So, yeah, echoing your point, make programmers do the hardest parts of their job all the time -- especially reviewing code from inconsistent-to-put-it-politely AI contributors -- and no wonder they feel "fried".

Does AI support for programming need to be this way? I might hope not, but we are also mainly hearing about AI used within a short-term-profit-maximizing hyper-competitive corporate social context. Like I say in my sig: "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."

Comment Re:Also required reading in history of technology (Score 1) 39

A tangent on another historian of science and technology, one I enjoyed taking a course from, who wrote about information technology and who died far too young:
https://en.wikipedia.org/wiki/...
        "James Ralph Beniger (December 16, 1946 -- April 12, 2010) was an American historian and sociologist and Professor of Communications and Sociology at the Annenberg School for Communication at the University of Southern California, particularly known for his early work on the history of quantitative graphics in statistics, and his later work on the technological and economic origins of the information society."

RIP Jim. Thanks for being an excellent -- and kind -- professor, and for thanking me and other students by name in your Control Revolution book.

And thanks also for introducing me to the 1978 book "Autonomous Technology: Technics-out-of-Control as a Theme in Political Thought" by Langdon Winner -- which just gets more relevant every day.
https://www.amazon.com/Autonom...
        "This study of the idea of technology out of control makes an important contribution to our understanding of the problems of civilization. The basic argument is not that some persons or groups promote technology against the public interest (true though that is), or even that our technology develops in its own way in spite of all our efforts to control it (also true in some respects). Rather, Winner is concerned with a more subtle effect: the artifacts that we have invented to satisfy our material wants have now developed, in size and complexity, to the point of delimiting or even determining our conception of the wants themselves. In that way, we as a civilization are losing mastery over our own tools.... 'As a source for readings and reflections on this problem, the book is rich and rewarding.... If it has a practical lesson, it is that of awareness: only by recognizing the boundaries of our socially constructed scientific-technological reality can we transcend them in imagination and then achieve effective human action.'"

Winner's book is perhaps a sort of antithesis of "The Soul of A New Machine"? In that sometimes technologists (I'm looking at you, OpenAI and cohorts) can get so excited about problem solving that they miss the big picture. Related humor:
https://www.reddit.com/r/Jokes...
        "This reminds me of the engineer who is condemned to death, but as his turn approaches they find that the mechanism on the electric chair is no longer functioning. About to send everyone home, the engineer calls out "Wait! I think I can see the problem!""

I heard that Winner did not get tenure at MIT essentially because he suggested that the method of education used there was blinding the MIT students to the social implications of what they were doing. A related book:
https://web.archive.org/web/20...
        "Who are you going to be? That is the question.
        In this riveting book about the world of professional work, Jeff Schmidt demonstrates that the workplace is a battleground for the very identity of the individual, as is graduate school, where professionals are trained. He shows that professional work is inherently political, and that professionals are hired to subordinate their own vision and maintain strict "ideological discipline."
        The hidden root of much career dissatisfaction, argues Schmidt, is the professional's lack of control over the political component of his or her creative work. Many professionals set out to make a contribution to society and add meaning to their lives. Yet our system of professional education and employment abusively inculcates an acceptance of politically subordinate roles in which professionals typically do not make a significant difference, undermining the creative potential of individuals, organizations and even democracy.
        Schmidt details the battle one must fight to be an independent thinker and to pursue one's own social vision in today's corporate society. He shows how an honest reassessment of what it really means to be a professional employee can be remarkably liberating. After reading this brutally frank book, no one who works for a living will ever think the same way about his or her job."

Anyway, I am so thankful for people like Jim Beniger, Langdon Winner, Michael Mahoney, and others who provided a larger perspective to technologists on what they were doing.

If I can find fault with "The Soul of a New Machine" -- from hazy recollections 40+ years ago -- it's that I don't think it did that. It's still informative, though, on how technology -- like AI development now -- can become addictive for techies, with both good and bad aspects.
https://en.wikipedia.org/wiki/...
      "The book follows many of the designers as they give almost every waking moment of their lives to design and debug the new machine."

It's maybe kind of like the difference between books that glorify warmaking and "winning" versus those that glorify peacemaking and "win/win/win"?

An example of the latter is "To Become a Human Being: The Message of Tadodaho Chief Leon Shenandoah".
https://en.wikipedia.org/wiki/...

Or an essay by Terry Dobson on Aikido:
https://livingthepresentmoment...
https://en.wikipedia.org/wiki/...
      "My teacher taught us each morning that the art was devoted to peace. "Aikido," he said again and again, "is the art of reconciliation. Whoever has the mind to fight has broken his connection with the universe. If you try to dominate other people, you are already defeated. We study how to resolve conflict, not how to start it." ..."

Or, in a way, Theodore Sturgeon's 1956 short story "The Skills of Xanadu", about wearable networked mobile computing for sharing knowledge, which inspired Ted Nelson to invent hypertext, which in turn contributed to the creation of the web.

Or Alfie Kohn's "No Contest: The Case Against Competition":
https://www.alfiekohn.org/cont...

One of the most interesting things I overheard at lunch at IBM Research was something along the lines of "We hire the top people from the most competitive schools -- and then wonder why they can't get along and cooperate."

And I'd add, wonder why many technologists rarely take much time to think about the broader social implications of what they are doing.

And "Soul of a New Machine" -- from what I can remember of it -- exemplifies and celebrates that focused mindset.

I can thank someone (maybe Bruce Maier?) who posted this essay to the Lyrics TOPS-10 timesharing system I used in high school circa 1980 for helping me develop some reservations about such narrow focus:
"The Hacker Papers [about some Stanford CS students]"
https://cdn.preterhuman.net/te...
      "As much as an essay, this is a story. It is a true story of people paying $9,000 a year to lose elements of their humanity. It is a story of the breaking of wills and of people. It is a story of addictions, and of misplaced values. In a large part, it is my own story. ..."

Comment Also required reading in history of technology (Score 1) 39

Also assigned reading in Michael Mahoney's course on the history of science and technology at Princeton circa 1984 -- although I had read it already, and found it inspiring.

RIP Tracy Kidder.

And also RIP Professor Mahoney who died in 2008 at the relatively young age of 69 -- just when a historian is typically getting very productive.

https://en.wikipedia.org/wiki/...

https://www.dailyprincetonian....

"Histories of Computing"
https://www.hup.harvard.edu/bo...
"Computer technology is pervasive in the modern world, its role ever more important as it becomes embedded in a myriad of physical systems and disciplinary ways of thinking. The late Michael Sean Mahoney was a pioneer scholar of the history of computing, one of the first established historians of science to take seriously the challenges and opportunities posed by information technology to our understanding of the twentieth century."

It's going to be a rough next decade with the ongoing loss of pioneers of personal computing and of those who wrote about them or inspired them. People like Steve Jobs and Doug Engelbart and Isaac Asimov and Theodore Sturgeon have already passed on. Glad that Steve Wozniak and Alan Kay and Dan Ingalls and so on are still hanging in there! A heartfelt thanks to all of them for giving us possibilities -- even if we may not be doing great things with them right now.

"Original architects [Wozniak] of the personal computer hate what it's become..."
https://www.youtube.com/watch?...

"Steve Wozniak says he's "disappointed a lot" by AI and rarely uses it"
https://www.techspot.com/news/...

Comment Re:Robot philosopher? (Score 1) 94

Touché! Good point on "actually" and whether it is qualified, thanks. I guess that word only fits in relation to there actually being a novel which I was referring to? But I agree I should have worded that better. The word "fictionally" might have been a better choice? Glad your comment sparked some tangential discussion by others.

Comment Re:Robot philosopher? (Score 3, Insightful) 94

Children raised by robot educators/philosophers actually worked out well in the fictional sci-fi novel by James P. Hogan "Voyage from Yesteryear":
https://en.wikipedia.org/wiki/...
"The story opens early in the 21st century, as an automated space probe is being prepared for a mission to explore habitable exoplanets in the Alpha Centauri system. However, Earth appears destined for a global war which the probe designers fear that humanity may not survive. It appears that the only chance for the human species is to reestablish itself far away from the conflict but there is no time left for a crewed expedition to escape Earth. The team, led by Henry B. Congreve, change their mission priority and quickly modify the design to carry several hundred sets of electronically coded human genetic data. Also included in this mission of embryo space colonization is a databank of human knowledge, robots to convert the data into genetic material and care for the children and construct habitats when the destination is reached, and a number of artificial wombs. The probe's designers name it the Kuan-Yin after the bodhisattva of childbirth and compassion.
        Shortly after the launch, global war indeed breaks out and several decades later, Earthbound humanity is united under an authoritarian government. It is this government that receives a radio message from the fledgling "Chironian" civilization revealing that the probe found a habitable planet (Chiron) and that the first generation of children have been raised successfully.
        As the surviving power blocs of Earth before the conflict are still evident, North America, Europe and Asia each send a generation ship to Alpha Centauri to take control of the colony. By the time that the first generation ship (the American Mayflower II) arrives after 20 years, Chironian society is in its fifth generation.
        The Mayflower II has brought with it thousands of settlers, all the trappings of the authoritarian regime along with bureaucracy, religion, fascism and a military presence to keep the population in line. However, the planners behind the generation ship did not anticipate the direction that Chironian society took: in the absence of conditioning and with limitless robotic labor and fusion power, Chiron has become a post-scarcity economy. Money and material possessions are meaningless to the Chironians and social standing is determined by individual talent, which has resulted in a wealth of art and technology without any hierarchies, central authority or armed conflict.
        In an attempt to crush this anarchist adhocracy, the Mayflower II government employs every available method of control; however, in the absence of conditioning the Chironians are not even capable of comprehending the methods, let alone bowing to them. The Chironians simply use methods similar to Gandhi's satyagraha and other forms of nonviolent resistance to win over most of the Mayflower II crew members, who had never previously experienced true freedom, and isolate the die-hard authoritarians. ...

I guess maybe it all depends on how wisely the robots are programmed initially? Sadly, with several AI companies racing forward in a winner-takes-all, hyper-competitive, safety-ignoring way to earn a lot of fiat dollars, it seems like something needs to change to build a more hopeful future. Still, there is always the "Optimism of Uncertainty", as Howard Zinn called it.
https://www.thenation.com/arti...

Comment Tools of abundance misused from scarcity mindset (Score 1) 314

Time to trot out my sig again, sigh: "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."
https://pdfernhout.net/recogni...
"The big problem is that all these new war machines and the surrounding infrastructure are created with the tools of abundance. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So, the scarcity-based political mindset driving the military uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by the mainstream."

It was awesome ten days ago to see someone else who understands that irony:
"The Real Danger of AI Has Nothing to Do With AI" by Mo Gawdat
https://www.youtube.com/watch?...
        "Artificial intelligence is often described as a force that will shape humanity's future, yet technology itself carries no intention, morality, or agenda. The direction it takes depends entirely on the values and decisions of the people creating and using it.
        As intelligence becomes more powerful and capable of solving increasingly complex problems, the real question shifts away from what machines can do and toward what humanity chooses to prioritize. Every major technological leap reflects the ethical frameworks of the society behind it, meaning the future shaped by AI will ultimately mirror human behavior rather than machine logic.
        The challenge is not teaching machines to think -- it is learning to think more wisely ourselves."

Mo Gawdat was talking about AI, but the same applies to using advanced manufacturing and supply chains to make hypersonic missiles as in the article. That's US$99,000 that can't otherwise be used to make food, water, shelter, energy, 3D printers, and so on -- exacerbating the very conflicts that lead people to want to use hypersonic missiles in the first place.

Or, for a more humorous take on this by me from 2009, another "Downfall" bunker scene parody (which coincidentally starts with a mention of rockets/missiles):
https://groups.google.com/g/po...
	"Dialog alternating between a military officer and Hitler:
"It looks like there are now local digital fabrication facilities here, here, and here."
"But we still have the rockets we need to take them out?"
"The rockets have all been used to launch seed automated machine shops for self-replicating space habitats for more living space in space."
"What about the nuclear bombs?"
"All turned into battery-style nuclear power plants for island cities in the oceans."
"What about the tanks?"
"The diesel engines have been remade to run biodiesel and are powering the internet hubs supplying technical education to the rest of the world."
"I can't believe this. What about the weaponized plagues?"
"The gene engineers turned them into antidotes for most major diseases like malaria, tuberculosis, cancer, and river blindness."
"Well, send in the Daleks."
"The Daleks have been re-outfitted to terraform Mars. They're all gone with the rockets."
"Well, use the 3D printers to print out some more grenades."
"We tried that, but they are only printing toys, food, clothes, shelters, solar panels, and more 3D printers, for some reason."
"But what about the Samsung automated machine guns?"
"They were all reprogrammed into automated bird watching platforms. The guns were taken out and melted down into parts for agricultural robots."
"I just can't believe this. We've developed the most amazing technology the world has ever known in order to create artificial scarcity so we could rule the world through managing scarcity. Where is the scarcity?"
"Gone, Mein Fuhrer, all gone. All the technologies we developed for weapons to enforce scarcity have all been used to make abundance."
"How can we rule without scarcity? Where did it all go so wrong? ... Everyone with an engineering degree leave the room ... now!" [Cue long tirade on the general incompetence of engineers. :-) Then cue long tirade on how could engineers seriously wanted to help the German workers to not have to work so hard when the whole Nazi party platform was based on providing full employment using fiat dollars. Then cue long tirade on how could engineers have taken the socialism part seriously and shared the wealth of nature and technology with everyone globally.]
"So how are the common people paying for all this?"
"Much is free, and there is a basic income given to everyone for the rest. There is so much to go around with the robots and 3D printers and solar panels and so on, that most of the old work no longer needs to be done."
"You mean people get money without working at jobs? But nobody would work?"
"Everyone does what they love. And they are producing so much just as gifts."
"Oh, so you mean people are producing so much for free that the economic system has failed?"
"Yes, the old pyramid-scheme one, anyway. There is a new post-scarcity economy, where between automation and a gift economy the income-through-jobs link is almost completely broken. Everyone also gets income as a right of citizenship, as a share of all our resources, for the few things that still need to be rationed. Even you."
"Really? How much is this basic income?"
"Two thousand a month."
"Two thousand a month? Just for being me?"
"Yes."
"Well, with a basic income like that, maybe I can finally have the time and resources to get back to my painting...""

Sadly, we are seeing some real bunker scenes with billionaires, and they are not so funny:
https://www.vice.com/en/articl...
        "The men cited potential disasters caused by electromagnetic pulses, economic downturn, disease, or war that might "necessitate them leaving their Silicon Valley ranches and retreating to these fortified bunkers in the middle of nowhere."
        These "luxury bunkers" include features most of us could only ever dream of, like indoor pools and artificial sunlight, allowing them to remain sealed off from the world for years at a time, if necessary.
        "The billionaires understand that they're playing a dangerous game," Rushkoff said. "They are running out of room to externalize the damage of the way that their companies operate. Eventually, there's going to be the social unrest that leads to your undoing."
        Like the gated communities of the past, their biggest concern was to find ways to protect themselves from the "unruly masses," Rushkoff said. "The question we ended up spending the majority of time on was: 'How do I maintain control of my security force after my money is worthless?'"
        That is, if their money is no longer worth anything--if money no longer means power--how and why would a Navy Seal agree to guard a bunker for them?
        "Once they start talking in those terms, it's really easy to start puncturing a hole in their plan," Rushkoff said. "The most powerful people in the world see themselves as utterly incapable of actually creating a future in which everything's gonna be OK.""

Comment Recent history of the Open Technology Fund (Score 1) 54

https://en.wikipedia.org/wiki/...
====
The Open Technology Fund was started in 2012 by Libby Liu, then president of Radio Free Asia (RFA), as a pilot program within RFA to help better protect reporters and sources for the news organization with enhanced digital security technology.[9][2][5] Under U.S. Secretary of State Hillary Clinton, the State Department adopted a policy of supporting global internet freedom initiatives.[10] At this time, RFA began looking into technologies that helped their audiences avoid censorship and surveillance.[10] Journalist Eli Lake argued that Clinton's policy was "heavily influenced by the Internet activism that helped organize the green revolution in Iran in 2009 and other revolutions in the Arab world in 2010 and 2011".[10] ...

On March 14, 2025, President Donald Trump issued an executive order directing federal agencies to reduce their functions "to the maximum extent consistent with applicable law," including the U.S. Agency for Global Media (USAGM),[6] which disburses congressionally approved funding to OTF. The following day, USAGM senior advisor Kari Lake announced the termination of the OTF's federal grant, stating that the "award no longer effectuates agency priorities."[16]

In response, OTF filed suit against USAGM seeking the release of congressionally appropriated funds.[17] In its court filings, OTF argued that the termination would prevent an estimated 45 million users living under authoritarian regimes from accessing tools that enable uncensored access to the Internet and secure communications. The organization further claimed that, as the largest funder in the space, "the vast majority of internet freedom technology projects anywhere in the world will cease and the internet freedom technology field as a whole will be largely decimated."[18]

The decision drew bipartisan concern from members of Congress, who described OTF's work as vital to U.S. foreign policy priorities and Internet openness.[19][16] Although Lake later stated that she had withdrawn the letter rescinding OTF's funding,[20] USAGM reportedly stopped sending payments after April 3.[21] In June, a federal judge ordered USAGM to release OTF's fiscal year 2024 funds.[22][23] In late October, OTF petitioned the court to compel the agency to disburse overdue fiscal year 2025 funding.[23]
====
