Comment On leadership through artificial scarcity (Score 1) 255

Thanks for the great insight on the roots of much artificial scarcity!

You might find this Trackdown episode from 1959 of interest -- as maybe life imitating art?
"A Conman Named [Walter] Trump"
https://www.youtube.com/watch?...
"The episode concerned a conman named Walter Trump, who attempted to scam a town out of their hard-earned cash by saying that a meteor storm would destroy the town that evening, and only he could save them. He would do this by building a wall. The name of the episode was "The End of the World.""

Have to wonder if Trump saw it as a teenager?

But, as with "The Two Santa Clauses Theory", it's hard to argue with electoral success (whatever the consequences for the nation or the world)?
https://www.salon.com/2018/02/...
      "The Republican Party has been running a long con on America since Reagan's inauguration, and somehow our nation's media has missed it - even though it was announced in The Wall Street Journal in the 1970s and the GOP has clung tenaciously to it ever since.
        In fact, Republican strategist Jude Wanniski's 1974 "Two Santa Clauses Theory" has been the main reason why the GOP has succeeded in producing our last two Republican presidents, Bush and Trump (despite losing the popular vote both times). It's also why Reagan's economy seemed to be "good."
        Here's how it works, laid out in simple summary:
        First, when Republicans control the federal government, and particularly the White House, spend money like a drunken sailor and run up the US debt as far and as fast as possible. This produces three results - it stimulates the economy thus making people think that the GOP can produce a good economy, it raises the debt dramatically, and it makes people think that Republicans are the "tax-cut Santa Claus."
        Second, when a Democrat is in the White House, scream about the national debt as loudly and frantically as possible, freaking out about how "our children will have to pay for it!" and "we have to cut spending to solve the crisis!" This will force the Democrats in power to cut their own social safety net programs, thus shooting their welfare-of-the-American-people Santa Claus. ...."

Comment Re:Why? It's a variation on my sig on irony... (Score 1) 255

Coincidentally the next story on Slashdot is about infrasound making people feel uneasy:
"The Silent Frequency That Makes Old Buildings Feel Haunted"
https://science.slashdot.org/s...
"What happened, specifically, was this: those exposed to infrasound reported higher irritability, lower interest in the music, and a tendency to rate the music as sadder, irrespective of whether it was the calming or the horror track. Cortisol levels, measured before and about 20 minutes after exposure, were also elevated. Kale Scatterty, the PhD student who led the work, notes that irritability and cortisol do tend to move together under ordinary stress, but adds that "infrasound exposure had effects on both outcomes that went beyond that natural relationship." That distinction matters more than it might seem. Previous theories about infrasound and paranormal experience have often leaned on anxiety as the explanatory mechanism, the idea that low-frequency sound triggers a kind of free-floating dread that the mind then reaches for supernatural explanations to account for. The new data don't really support that picture. Measures of anxiety didn't budge significantly. What went up was irritability and disinterest, a kind of sour, low-grade aversion rather than fear. That's perhaps a more honest description of how a lot of ghost stories actually feel in the telling: not screaming terror, but wrong atmosphere, a sense of unease that never quite crystallizes into something you can point at."

Presumably infrasound from wind turbines would also make people (and animals) feel uneasy in this way?
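As a side note for readers unfamiliar with the term: infrasound is conventionally sound below roughly 20 Hz, at or under the lower limit of human hearing. Here is a minimal Python sketch (the `tone` helper and the 17 Hz figure are illustrative choices on my part, not taken from the study) comparing such a waveform with an ordinary audible tone:

```python
import math

def tone(freq_hz, duration_s=1.0, sample_rate=44100):
    """Generate a pure sine tone as a list of floats in [-1, 1]."""
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * t / sample_rate) for t in range(n)]

def cycles(samples):
    """Estimate full cycles by counting zero crossings (two per cycle)."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:])
        if (a < 0.0 <= b) or (b < 0.0 <= a)
    )
    return crossings / 2

# Infrasound: below ~20 Hz, often felt as pressure rather than heard as pitch.
infrasound = tone(17.0)   # a frequency sometimes mentioned in "haunted room" reports
audible = tone(440.0)     # concert A, for comparison

print(cycles(infrasound), cycles(audible))
```

At 17 Hz the waveform completes only about 17 cycles per second versus 440 for concert A, which is part of why such frequencies register as vibration or unease rather than as a tone.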

Comment Why? It's a variation on my sig on irony... (Score 1) 255

"The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."

The tools of universal global abundance (in this case wind, solar, and batteries) are available -- but some politicians (and their supporters) are still stuck in the mindset of trying to maximize private gain in the short term -- even if it costs their country and the world a huge amount of suffering. But, this damage also would not be possible without using the abundance tools of mass communication and bureaucracy to mislead people to vote against their own long-term interests.

That said, I'm not especially a fan of horizontal-axis wind power compared to solar, because of the infrasound risk to humans and animals (especially whales). Vertical-axis wind turbines are potentially much quieter (but more expensive right now).

"Wind turbine infrasound: Phenomenology and effect on people"
https://www.sciencedirect.com/...

"Experimental investigation on noise characteristics of small scale vertical axis wind turbines in urban environments"
https://www.sciencedirect.com/...

This is still an ongoing area of research and contention though -- and with so much money involved it is hard to know what research to believe:
"No, Wind Farms Do Not Generate Harmful Infrasound"
https://www.renewableenergymag...
        "The Woolcock Institute of Medical Research in Australia examined how infrasound from wind turbines impacted 37 healthy adults with sensitive hearing. The test lasted 72 hours. .... This groundbreaking study [showing no short-term effect of infrasound on some participants] will spark subsequent research in the field to corroborate the claim. Wind farms are not the only renewable energy source with queries about human health, and it is vital to debunk or unpack them. ....
        Scientists can prove the safety of wind farm noise as much as they want, but massive blades still produce noise pollution. Most turbines are horizontal-axis wind turbines. Vertical-axis alternatives, such as Savonius turbines, are becoming popular for being smaller and less strenuous on the ears. They are less domineering and more capable of urban incorporation, which minimizes land use debates over wind adoption.
        Building-integrated wind turbines (BIWTs) are another solution. BIWTs must have less noise output because installers affix them inside building structures. Shorter blades combined with infrastructure dampens sound. Blade coatings help adjust pitch, too. Additionally, their placement allows high energy-harnessing potential with lower rotational speeds, reducing opportunities for noise pollution.
      Innovation of quieter turbines is critical because removing this pain point redirects focus to other potential concerns with wind farms. Advocating their benefits alongside research is the best way to support their relevance while waiting for data to catch up. ..."

That said, pollution from burning fossil fuels harms a lot of people and animals too, and I expect that most current vertical-axis wind turbines produce significantly less overall harm than burning fossil fuels, even if we could do better still. It's a complex topic:
"Which is worse for wildlife, wind farms or oil drilling?"
https://www.bbc.com/future/art...
"But the sensors also picked up on something else, which you can listen to in the recording below: deafening blasts from oil exploration in the Gulf of Mexico. "[The Gulf of Mexico] is a really noisy area because of all of the surveys that are associated with drilling, they do these seismic surveys that generate a whole lot of noise as they are looking for the oil pockets and understanding how the oil is moving," says Frasier." ... For whales, dolphins and other marine mammals that rely on sound and echolocation to find their food in dark and murky water, such loud and sudden underwater noise can be deeply disorienting, equivalent to being blinded, as well as being very distressing for them, research has shown."

Comment The whole thing is an ironic security mistake (Score 1) 83

As I suggest in my sig: "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."

More details here: https://pdfernhout.net/recogni...
        "Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead? ...
        There is a fundamental mismatch between 21st century reality and 20th century security thinking. Those "security" agencies are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all. ...
        The big problem is that all these new war machines and the surrounding infrastructure are created with the tools of abundance [otherwise they could not be so powerful and pervasive]. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So, the scarcity-based political mindset driving the military uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by the mainstream.
      We the people need to redefine security in a sustainable and resilient way. Much current US military doctrine is based around unilateral security ("I'm safe because you are nervous") and extrinsic security ("I'm safe despite long supply lines because I have a bunch of soldiers to defend them"), which both lead to expensive arms races. We need as a society to move to other paradigms like Morton Deutsch's mutual security ("We're all looking out for each other's safety") and Amory Lovins's intrinsic security ("Our redundant decentralized local systems can take a lot of pounding whether from storm, earthquake, or bombs and would still keep working"). ..."

Comment Some of those companies are gone (Score 1) 169

One company that comes to mind is Light & Motion. They made premium bicycle lights in the US.

The problem is that bicycle lights were exempt from the tariff, while components to make bicycle lights were not. This meant that they not only had to compete with imported knock-offs, but they also paid more for the parts they needed. The result is that an established American company went out of business.

It's all part of the "Making America Second Rate Again."

Comment Need a laughable perspective shift (Score 1) 175

As with my sig: "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."
https://pdfernhout.net/recogni...
        "Nuclear weapons are ironic because they are about using space age systems to fight over oil and land. Why not just use advanced materials as found in nuclear missiles to make renewable energy sources (like windmills or solar panels) to replace oil, or why not use rocketry to move into space by building space habitats for more land? ...
        These militaristic socio-economic ironies would be hilarious if they were not so deadly serious. ...
        There is a fundamental mismatch between 21st century reality and 20th century security thinking. Those "security" agencies are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all. ...
      So, while in the past, we had "nothing to fear but fear itself", the thing to fear these days is ironically ... irony. :-)
      So, how can we transcend militarism?
      Simple persuasive rhetoric was tried, and failed, when Albert Einstein said, with the creation of atomic weapons everything had changed except our way of thinking.
      The economic argument against war was tried, and failed; see "War is a Racket" by Two-Time Congressional Medal of Honor Recipient Major General Smedley D. Butler...
      A basic moral argument against war was tried, and failed; see Freeman Dyson's book "Weapons and Hope" that says nuclear weapons are a moral evil, like slavery.
      A deeper religious argument against war was tried, and failed, see "James P. Carse, Religious War In Light of the Infinite Game, SALT talk"...
      We even tried public education through TV to create an enlightened citizenry (what high hopes back when TV was created) and that even got corrupted into promoting and celebrating violence. See the book by Diane E. Levin and Nancy Carlsson-Paige "The War Play Dilemma" for ways to deal with that if you have children...
        So, people have tried, and tried again, and failed to turn the tide, both people in the military and people outside the military. Still, each attempt has contributed, but together they have not yet been enough to turn the tide and help the USA transcend militarism and empire.
      What else can we try that does not just beget more violence? ...
      Maybe ironic humor is our last, best hope against the war machines?
      As was quoted by Joel Goodman of the Humor Project...: "There are three things which are real: God, human folly, and laughter. The first two are beyond our comprehension. So we must do what we can with the third. (John F. Kennedy)"
      The big problem is that all these new war machines and the surrounding infrastructure are created with the tools of abundance. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So, the scarcity-based political mindset driving the military uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by the mainstream.
      We the people need to redefine security in a sustainable and resilient way. Much current US military doctrine is based around unilateral security ("I'm safe because you are nervous") and extrinsic security ("I'm safe despite long supply lines because I have a bunch of soldiers to defend them"), which both lead to expensive arms races. We need as a society to move to other paradigms like Morton Deutsch's mutual security ("We're all looking out for each other's safety") and Amory Lovins's intrinsic security ("Our redundant decentralized local systems can take a lot of pounding whether from storm, earthquake, or bombs and would still keep working"). ..."

Comment $43? Not me! (Score 1) 52

Last night, my daughter and I went to see Hail Mary. The ticket price for both of us was $12 (1-adult and 1-senior), and adding a $4.50 big bucket of popcorn, we were still under $20.

It would have to be extremely exotic for me to even consider $43. . . Possibly first-class dining service, but I don't think that would even pull me in for two tickets.

Comment Eric Schmidt on AI used to make bioweapons soon (Score 1) 14

From the transcript, about 43 minutes into a public conversation with Eric Schmidt from Apr 10, 2025: https://www.youtube.com/watch?...
====
          "Question: Thanks for the great conversation so far. Leonard Justin. I'm a PhD student at MIT. Um, I was wondering if you could just discuss a bit more some of the risks you see coming specifically with respect to biology and how we should go about mitigating those. What's the role of the AI developers? What's the role of government? Um, yeah, how can we move forward on that?
        ----
        Schmidt: So, so you're going to know a lot more about this area than I, but speaking as an amateur in your field, the two current risks from these models are cyber and biorisks.
        The cyber ones are easy to understand. The system can generate cyber attacks and in theory can generate zero-day cyber attacks that we can't see and it can unleash them and furthermore it can do it at scale.
        In biology, you get some evil, you know, the equivalent of Osama bin Laden. They would start with an open-source model. Now these open source models have been restricted using a testing process. Uh they're called cards and they test it out and they delete that information from the model.
        It turns out it's relatively easy to reverse essentially those security modes around the model, and that's a danger. So now you've got a model that can generate bad pathogens.
        Then the second thing you have to do is you have to find things to build them. Our collective assessment at the moment is that that's a nation state risk, not an individual terrorist risk. Although we could be wrong, but there's plenty of examples, and the report talks about some of the Chinese examples where in theory, if they wanted to, they could not only design bad things but also manufacture them.
        The good news and the reason we're all alive today is that the bio stuff is hard to manufacture and distribute and to make deadly and and spread and so forth and so on. Um there's lots of evidence for example that you can take a bad bio right now and modify it just enough that the testing regimes and the sort of surveillance regimes it bypasses and that's another threat.
        So that's what I worry about.
        But I think at the moment our consensus is we're right below the threshold where this is an issue, and the consensus in my side of the industry is that one more or two more turns of the crank these issues will be -- and you know by then you'll be graduated and you can sort of help solve these problems.
        Um, the crank is turned every 18 months or so. This is about three years.
        ----
        Moderator: But theoretically, couldn't AI and biotechnology help you come up with a counter measure?
        ----
        Schmidt: Um, I had thought so, and that was the argument I made until -- I do a lot of national security work. And there's a term called offense dominant. And offense dominant is a situation in a military context where the attack cannot be countered at the same level as the attack. In other words, the damage is done.
        And most people, most biologists who've worked in this believe that while the model can be trained to counter this, the damage from the offense part is far greater than the ability to defend it, which is why we're so worried about it."
====

Ultimately, I feel a big part of the response to that threat needs to be a shift in perspective like through people laughing at my sig: "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity." :-)

Explored in more detail here:
"Recognizing irony is key to transcending militarism"
https://pdfernhout.net/recogni...
        "... Biological weapons like genetically-engineered plagues are ironic because they are about using advanced life-altering biotechnology to fight over which old-fashioned humans get to occupy the planet. Why not just use advanced biotech to let people pick their skin color, or to create living arkologies and agricultural abundance for everyone everywhere?
        These militaristic socio-economic ironies would be hilarious if they were not so deadly serious. ...
        There is a fundamental mismatch between 21st century reality and 20th century security thinking. Those "security" agencies are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all. ..."

Comment Tools of abundance misused from scarcity fears (Score 3, Insightful) 28

Time to trot out my sig again, sigh: "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."

Biotech if used well could make the Earth near a paradise of abundance for all. Or people could create and enforce artificial scarcity with biotech. People make choices every day about which future they are working towards.

As Bucky Fuller wrote: "Whether it is to be Utopia or Oblivion will be a touch-and-go relay race right up to the final moment.... Humanity is in 'final exam' as to whether or not it qualifies for continuance in Universe."

Comment Poe's Law (Score 1) 148

Poe's Law is the internet adage stating that without a clear, explicit indicator of intent (such as a winking smiley face or "/s" tag), it is impossible to create a parody of extreme views that someone won't mistake for a sincere expression of those views. This isn't to say that the OP intended satire. As Poe's Law says, I don't know.

Comment Marshall Brain's "Manna" depicts similar things (Score 3, Insightful) 118

Thanks for your insightful posts. I expanded on that idea in 2010: https://pdfernhout.net/beyond-...
"This article explores the issue of a "Jobless Recovery" mainly from a heterodox economic perspective. It emphasizes the implications of ideas by Marshall Brain and others that improvements in robotics, automation, design, and voluntary social networks are fundamentally changing the structure of the economic landscape. It outlines towards the end four major alternatives to mainstream economic practice (a basic income, a gift economy, stronger local subsistence economies, and resource-based planning). These alternatives could be used in combination to address what, even as far back as 1964, has been described as a breaking "income-through-jobs link". This link between jobs and income is breaking because of the declining value of most paid human labor relative to capital investments in automation and better design. Or, as is now the case, the value of paid human labor like at some newspapers or universities is also declining relative to the output of voluntary social networks such as for digital content production (like represented by this document). It is suggested that we will need to fundamentally reevaluate our economic theories and practices to adjust to these new realities emerging from exponential trends in technology and society."

That said, indigenous ways were "the original affluent society" (even if such ways might have been harder to practice on a restricted reservation after extensive conflicts with Europeans wielding "Guns, Germs, and Steel"):
https://en.wikipedia.org/wiki/...
"The basis of Sahlins' argument is that hunter-gatherer societies are able to achieve affluence by desiring little and meeting those needs/desires with what is available to them. This he calls the "Zen road to affluence, which states that human material wants are finite and few, and technical means unchanging but on the whole adequate"."

Comment IBM: The origins of THINK (Score 1) 78

https://www.ibm.com/history/th...
"An ad hoc lecture [from 1915] from IBM's future CEO spawned a slogan to guide the company through a century and beyond"

https://humancenteredlearning....
"And we must study through reading, listening, discussing, observing and thinking. We must not neglect any one of those ways of study. The trouble with most of us is that we fall down on the latter -- thinking -- because it's hard work for people to think. And, as Dr. Nicholas Murray Butler said recently, 'all of the problems of the world could be settled easily if men were only willing to think.' (Thomas Watson, IBM)"

https://en.wikiquote.org/wiki/...
"All the problems of the world could be settled easily if men were only willing to think. The trouble is that men very often resort to all sorts of devices in order not to think, because thinking is such hard work. (Nicholas Murray Butler, often misattributed to Thomas J. Watson)"

So, yeah, echoing your point, make programmers do the hardest parts of their job all the time -- especially reviewing code from inconsistent-to-put-it-politely AI contributors -- and no wonder they feel "fried".

Does AI support for programming need to be this way? I might hope not, but we are also mainly hearing about AI used within a short-term-profit-maximizing hyper-competitive corporate social context. Like I say in my sig: "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."

Comment Re:Also required reading in history of technology (Score 1) 39

A tangent on another historian of science and technology, whose course I enjoyed taking, who wrote about information technology, and who died far too young:
https://en.wikipedia.org/wiki/...
        "James Ralph Beniger (December 16, 1946 -- April 12, 2010) was an American historian and sociologist and Professor of Communications and Sociology at the Annenberg School for Communication at the University of Southern California, particularly known for his early work on the history of quantitative graphics in statistics, and his later work on the technological and economic origins of the information society."

RIP Jim. Thanks for being an excellent -- and kind -- professor. And for mentioning me and other students with thanks in your Control Revolution book.

And thanks also for introducing me to the 1978 book "Autonomous Technology: Technics-out-of-Control as a Theme in Political Thought" by Langdon Winner -- which just gets more relevant every day.
https://www.amazon.com/Autonom...
        "This study of the idea of technology out of control makes an important contribution to our understanding of the problems of civilization. The basic argument is not that some persons or groups promote technology against the public interest (true though that is), or even that our technology develops in its own way in spite of all our efforts to control it (also true in some respects). Rather, Winner is concerned with a more subtle effect: the artifacts that we have invented to satisfy our material wants have now developed, in size and complexity, to the point of delimiting or even determining our conception of the wants themselves. In that way, we as a civilization are losing mastery over our own tools.... 'As a source for readings and reflections on this problem, the book is rich and rewarding.... If it has a practical lesson, it is that of awareness: only by recognizing the boundaries of our socially constructed scientific-technological reality can we transcend them in imagination and then achieve effective human action.'"

Winner's book is perhaps a sort of antithesis of "The Soul of A New Machine"? In that sometimes technologists (I'm looking at you, OpenAI and cohorts) can get so excited about problem solving that they miss the big picture. Related humor:
https://www.reddit.com/r/Jokes...
        "This reminds me of the engineer who is condemned to death, but as his turn approaches they find that the mechanism on the electric chair is no longer functioning. About to send everyone home, the engineer calls out "Wait! I think I can see the problem!""

I heard that Winner did not get tenure at MIT essentially because he suggested that the method of education used there was essentially blinding the MIT students to the social implications of what they were doing. A related book:
https://web.archive.org/web/20...
        "Who are you going to be? That is the question.
        In this riveting book about the world of professional work, Jeff Schmidt demonstrates that the workplace is a battleground for the very identity of the individual, as is graduate school, where professionals are trained. He shows that professional work is inherently political, and that professionals are hired to subordinate their own vision and maintain strict "ideological discipline."
        The hidden root of much career dissatisfaction, argues Schmidt, is the professional's lack of control over the political component of his or her creative work. Many professionals set out to make a contribution to society and add meaning to their lives. Yet our system of professional education and employment abusively inculcates an acceptance of politically subordinate roles in which professionals typically do not make a significant difference, undermining the creative potential of individuals, organizations and even democracy.
        Schmidt details the battle one must fight to be an independent thinker and to pursue one's own social vision in today's corporate society. He shows how an honest reassessment of what it really means to be a professional employee can be remarkably liberating. After reading this brutally frank book, no one who works for a living will ever think the same way about his or her job."

Anyway, I am so thankful for people like Jim Beniger, Langdon Winner, Michael Mahoney, and others who provided a larger perspective to technologists on what they were doing.

If I can find fault with "The Soul of a New Machine" -- from hazy recollections 40+ years ago -- it's that I don't think it did that. It's still informative though on how technology -- like now AI development -- can become addictive for techies, with both good and bad aspects.
https://en.wikipedia.org/wiki/...
      "The book follows many of the designers as they give almost every waking moment of their lives to design and debug the new machine."

It's maybe kind of like the difference between books that glorify warmaking and "winning" versus those that glorify peacemaking and "win/win/win"?

An example of the latter is "To Become a Human Being: The Message of Tadodaho Chief Leon Shenandoah".
https://en.wikipedia.org/wiki/...

Or an essay by Terry Dobson on Aikido:
https://livingthepresentmoment...
https://en.wikipedia.org/wiki/...
      "My teacher taught us each morning that the art was devoted to peace. "Aikido," he said again and again, "is the art of reconciliation. Whoever has the mind to fight has broken his connection with the universe. If you try to dominate other people, you are already defeated. We study how to resolve conflict, not how to start it." ..."

Or, in a way, Theodore Sturgeon's 1956 short story "The Skills of Xanadu", about wearable networked mobile computing for sharing knowledge, which inspired Ted Nelson to invent hypertext, which in turn contributed to the creation of the web.

Or Alfie Kohn's "No Contest: The Case Against Competition":
https://www.alfiekohn.org/cont...

One of the most interesting things I overheard at lunch at IBM Research was something along the lines of "We hire the top people from the most competitive schools -- and then wonder why they can't get along and cooperate."

And I'd add, wonder why many technologists rarely take much time to think about the broader social implications of what they are doing.

And "Soul of a New Machine" -- from what I can remember of it -- exemplifies and celebrates that focused mindset.

I can thank someone (maybe Bruce Maier?) who posted this essay to the Lyrics TOPS-10 timesharing system I used in high school circa 1980 -- for helping me have some reservations about such narrow focus:
"The Hacker Papers [about some Stanford CS students]"
https://cdn.preterhuman.net/te...
      "As much as an essay, this is a story. It is a true story of people paying $9,000 a year to lose elements of their humanity. It is a story of the breaking of wills and of people. It is a story of addictions, and of misplaced values. In a large part, it is my own story. ..."

Comment Also required reading in history of technology (Score 1) 39

Also assigned reading in Michael Mahoney's course on the history of science and technology at Princeton circa 1984. Although I had read it already -- and found it inspiring.

RIP Tracy Kidder.

And also RIP Professor Mahoney who died in 2008 at the relatively young age of 69 -- just when a historian is typically getting very productive.

https://en.wikipedia.org/wiki/...

https://www.dailyprincetonian....

"Histories of Computing"
https://www.hup.harvard.edu/bo...
"Computer technology is pervasive in the modern world, its role ever more important as it becomes embedded in a myriad of physical systems and disciplinary ways of thinking. The late Michael Sean Mahoney was a pioneer scholar of the history of computing, one of the first established historians of science to take seriously the challenges and opportunities posed by information technology to our understanding of the twentieth century."

It's going to be a rough next decade, with the ongoing loss of pioneers of personal computing and of those who wrote about them or inspired them -- people like Steve Jobs and Doug Engelbart and Isaac Asimov and Theodore Sturgeon, who passed on years ago. Glad that Steve Wozniak and Alan Kay and Dan Ingalls and so on are still hanging in there! A heartfelt thanks to all of them for giving us possibilities -- even if we may not be doing great things with them right now.

"Original architects [Wozniak] of the personal computer hate what it's become..."
https://www.youtube.com/watch?...

"Steve Wozniak says he's "disappointed a lot" by AI and rarely uses it"
https://www.techspot.com/news/...
