Comment Re:Slashdot has no memory? (Score 1)
It's too expensive now!
Much of the critique seems irrelevant to AI other than LLMs, such as self-driving cars, which map visual input to actions.
Well I shared the thread including your thought with Grok, and it said this:
-----
Here’s my actual, unbiased take on the technical debate (no Elon worship, no green hype, just physics and engineering reality as of 2025).
The original Slashdot skeptic is repeating a criticism that was 100% correct for the classic 1970s–2000s “hot dry rock” (HDR) projects (Los Alamos, Rosemanowes, the 2000s-era Soultz project, etc.). Those really were conduction-limited. You drilled two wells a few hundred meters apart into 200–250 C granite, fracked a small volume, and watched the reservoir cool in months because heat was only being replenished by slow conduction through intact rock (~2–3 W/m·K). The recoverable fraction was tiny, and the impedance kept rising. That’s why HDR basically died commercially for 20 years.
Superhot / supercritical geothermal (400–450 C target, aiming for >500 C) is genuinely trying to escape that trap, and the physics does allow it—if everything works. Here’s why the critique no longer fully applies:
Energy density jumps by ~10× once you cross the critical point (374 C, 221 bar).
A kilogram of supercritical water at 500 C and 300 bar carries roughly 2,800–3,000 kJ/kg of usable enthalpy, versus ~300 kJ/kg for a 250 C liquid-water-to-steam conversion in conventional geothermal. That alone reduces the mass flow (and therefore the parasitic pumping power and the cooling load on the rock) by an order of magnitude for the same electrical output.
Heat transfer switches from conduction-dominated to convection-dominated.
In a well-designed, high-permeability fracture network, the supercritical fluid is buoyant and mobile. You get natural or strongly enhanced convection cells that sweep heat from a much larger volume of rock than the old HDR experiments ever touched. Surface area is now measured in square kilometers instead of hundreds of square meters.
Heat recharge rate in volcanic settings is completely different.
At Newberry, the background heat flux near the magma body is 300–1,000 mW/m² (vs. ~80 mW/m² continental average). Over a 1 km³ reservoir volume that’s hundreds of megawatts of continuous heat inflow. If your fracture cloud is big enough and permeable enough, you can pull 50–200 MW thermal more or less indefinitely without dropping the average rock temperature very much. (Fervo’s Project Red in Nevada—conventional temp, but still EGS—has already demonstrated ~3.5 MW from one well pair with negligible cooling over 18 months, and they’re only at ~190 C.)
Real remaining risks that the skeptic would be right to highlight:
Can you actually create and maintain a large, high-permeability fracture network at 500+ C and 10–20 km depth (or even 4–6 km near volcanoes) without it short-circuiting or sealing up with mineral precipitation? We don’t have field proof yet at those conditions.
Materials: drill bits, casings, pumps, and well cements that survive 500–600 C for 20–30 years do not fully exist off the shelf today. They’re being developed (high-nickel alloys, graphene-infused cements, etc.), but it’s expensive and unproven at scale.
Induced seismicity near an active volcano is non-trivial. Newberry is a shield volcano with low seismicity risk compared to, say, a subduction-zone stratovolcano, but still.
Economics: even if the resource works, LCOE has to beat new combined-cycle gas (~$40/MWh) and new nuclear (~$70–90/MWh) in the 2030s. The IEA’s “8 % of global electricity by 2050” scenario assumes superhot gets down to ~$30–50/MWh. That’s plausible only if wells become cheaper (thanks to oil & gas tech transfer) and capacity factors stay >90 %.
Bottom line
The old “rock has terrible thermal conductivity, therefore geothermal is dead” argument is no longer a universal show-stopper for superhot volcanic/near-magmatic systems with modern EGS techniques. The physics really does flip in your favor above ~400–450 C. But turning that into thousands of gigawatts is still an engineering and materials challenge, not a solved problem. Mazama, Fervo, Eavor, Quaise, Sage Geosystems, etc., all have to demonstrate multi-decade performance at scale. The next 5–10 years of pilots will tell us whether the skeptics were merely early, or fundamentally right.
So: the ChatGPT answer you got was essentially correct on the physics and the generational shift. The Slashdot commenter is stuck in 1995 thinking, which is understandable—most of the industry was too until about 2018.
-----
If you think that this is incorrect information, a hallucination, or unduly sycophantic, I would like to hear your genuine case.
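For what it's worth, here's a quick back-of-envelope check I ran on the two headline numbers in that reply (plain Python; the enthalpy and conductivity figures are the ones quoted above, while the plant size, temperature gradient, and fracture area are my own illustrative assumptions, not from Grok's text):

# Sanity checks on the figures quoted above; nothing here is a simulation,
# just order-of-magnitude arithmetic.

# 1) Mass flow needed to extract 100 MW thermal at the two quoted usable
#    enthalpies (supercritical vs. conventional ~250 C geothermal).
q_thermal = 100e6            # W; assumed plant-scale thermal draw
h_super = 2.9e6              # J/kg, ~2,900 kJ/kg (midpoint of quoted range)
h_conv = 0.3e6               # J/kg, ~300 kJ/kg (quoted)
print(q_thermal / h_super)   # ~34 kg/s
print(q_thermal / h_conv)    # ~333 kg/s, roughly the claimed 10x difference

# 2) Conduction-only recharge through intact granite (Fourier's law),
#    i.e. what starved the old hot-dry-rock reservoirs.
k = 2.5                      # W/(m*K), within the quoted ~2-3 range
grad = 50.0 / 1000.0         # K/m; assumed 50 C drop across 1 km of rock
area = 1e6                   # m^2; assumed 1 km^2 of fracture surface
print(k * grad * area / 1e6) # ~0.13 MW, negligible next to a 50+ MW draw

Not proof of anything, but the ~10x mass-flow advantage and the "conduction alone can't keep up" premise both hold up at the order-of-magnitude level.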
Same with my 12 mini. I want a small-sized iPhone due to my disabilities and small hands. Even an SE would be OK, but it has to be faster than the 13 mini!
And then, MS removes Notepad.
I wonder when Apple will increase their RAM prices.
Just a reminder! It's "blog post," not "blog"!
To address the contents of the post: yeah, this sucks, a textbook example of enshittification.
Fortunately, we have you -- a knowledgeable human who can raise the right points!
So, I asked ChatGPT about what you said, and asked it to formulate a response that directly addresses your key points.
Let me know if it reads like randomly extruded text, or if it has relevance to your understanding and argument:
---
The Campi Flegrei project you’re describing was a natural hydrothermal system, not an engineered superhot-rock (SHR) system, and that distinction matters for both the chemistry and the physics.
A hydrothermal reservoir taps naturally circulating volcanic brine. Its fluid chemistry is whatever the rock has been stewing in for centuries: arsenic, boron, mercury, dissolved metals, H2S, etc. The industry learned long ago that natural brines often destroy turbines, foul condensers, and require scrubbing that wipes out the thermodynamic efficiency. Those problems are real, and your experience confirms them.
Superhot-rock geothermal is a different class of project. It does not rely on natural brines, natural permeability, or natural aquifers. SHR / modern EGS systems:
Bring their own working fluid (usually treated water in a closed or semi-closed loop),
Create engineered fracture networks rather than using natural ones,
Operate in a supercritical regime (>374 C, high pressure) where heat transport is dominated by convective sweep rather than slow conduction, and
Are sited where mantle heat flux is extremely high (e.g., Newberry), not just where water has accumulated in shallow formations.
The key point is that the problems you encountered at Campi Flegrei — toxic brine chemistry, turbine contamination, and power losses from scrubbing — are specific to hydrothermal geology, not to engineered SHR systems. SHR avoids most of that simply because it doesn’t use the volcanic soup; it uses injected water circulating through a designed heat-exchange zone.
Your numbers actually underline the potential: you were getting ~50 MW per well from a shallow (~350–400 C) hydrothermal system with awful chemistry. Modern SHR aims for rock in the 400–500 C+ range, with supercritical water carrying far more enthalpy per kilogram and without the brine-chemistry penalty.
Whether SHR proves economical at scale is still an open engineering question. But the Campi Flegrei outcome doesn’t generalize to SHR any more than the problems of early natural-steam geothermal plants generalized to modern binary-cycle systems.
Does this distinction make sense from your point of view?
---
But as for MIT, or any other research institution with any prestige, my prediction is that he is done.
As for Sam Altman, maybe you can point us to some example when he or OpenAI violated academic integrity by fabricating data like this?