
Comment Re:Energy in perspective (Score 1) 116

...and it will be depleted if we continue to pump it out of the ground. We need a sustainable and carbon-neutral replacement, and synthetic carbon-neutral fuels can be created with nuclear heat. Today, ammonia is produced using natural gas as a feedstock, but it can also be created with nuclear heat. With abundant energy, most anything can be produced, and LNG is no exception.

Either way it would be better to look for alternatives, rather than heating our homes and producing fertilizer with something that is both running out and increasing in price. That is idiotic.

Comment Energy in perspective (Score 1) 116

This ship is a marvel, and showcases the truly impressive capabilities of modern shipbuilding industry. What isn’t mentioned, but is equally impressive, is the rate at which such shipyards can turn out new ships, and the surprisingly low cost. However, one can’t help but lament that this capability isn’t being used to produce ThorCon reactors, instead of draining resources for a quick profit. (Do have a look at the white paper, it provides fascinating perspective.)

This LNG ship extracts and condenses 3.6 million tonnes of natural gas per year, with an energy density of 55.5 MJ/kg, giving:

        3.6e6 tonnes/year * 1e3 kg/tonne * 55.5 MJ/kg = 199.8e9 MJ/year
        or 199.8e9 MJ/year * MWh/3600 MJ * TWh/1e6MWh = 55.5 TWh/year

This yearly energy content represents a continuous power output of:

        199.8e9 MJ/year * GJ/1e3 MJ * year/(365.2425*24*3600)s = 6.33 GW

That is the equivalent of a few large power plants. In the scheme of global energy requirements though, it barely registers: world energy consumption in 2008 was 143,851 TWh.

Now, given the energy density of Uranium/Thorium at 80e6 MJ/kg, the energy contained within that 3.6 million tonnes of LNG could instead be derived from:

        199.8e9 MJ/year * kg/80e6 MJ * tonne/1e3 kg = 2.5 tonnes (of U or Th)

That is a rather small number, but let's put it in terms of volume. With Uranium at 1.5e9 MJ/L, or Thorium at 9.3e8 MJ/L, that amounts to roughly the size of a yoga ball:

         U: 199.8e9 MJ * L/1.5e9 MJ = 133 L (sphere of radius 32 cm)
        Th: 199.8e9 MJ * L/9.3e8 MJ = 215 L (sphere of radius 37 cm)

The fun part happens when you scale it back up to the global energy consumption of 143,851 TWh, and it translates to a meager 6500 tonnes per year, capable of replacing the billions of tons of fossil fuels we consume today. Even with projected growth, global energy demands could still be satisfied by a single mine, to say nothing of the billions of tons of uranium available in seawater. Before that is necessary, the tens of thousands of tons of so-called “nuclear waste” can be consumed, as they still contain ~95% of the original energy content.
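The chain of conversions above can be checked end to end with a short script. This is just a sketch reproducing the arithmetic from the figures quoted in the text (ship output, energy densities, 2008 world consumption); the constant names are my own.

```python
from math import pi

# All input values are taken from the text above.
LNG_TONNES_PER_YEAR = 3.6e6        # annual LNG output of the ship
LNG_MJ_PER_KG = 55.5               # energy density of natural gas
FISSILE_MJ_PER_KG = 80e6           # energy density of U/Th
SECONDS_PER_YEAR = 365.2425 * 24 * 3600

energy_mj = LNG_TONNES_PER_YEAR * 1e3 * LNG_MJ_PER_KG  # 199.8e9 MJ/year
energy_twh = energy_mj / 3600 / 1e6                    # MJ -> MWh -> TWh
power_gw = energy_mj / 1e3 / SECONDS_PER_YEAR          # GJ/s = GW
mass_tonnes = energy_mj / FISSILE_MJ_PER_KG / 1e3      # equivalent U/Th mass

def sphere_radius_cm(volume_l):
    # Invert V = (4/3) * pi * r^3, with 1 L = 1000 cm^3.
    return (volume_l * 1000 * 3 / (4 * pi)) ** (1 / 3)

u_volume_l = energy_mj / 1.5e9     # uranium at 1.5e9 MJ/L
th_volume_l = energy_mj / 9.3e8    # thorium at 9.3e8 MJ/L

# Scale the fissile mass up to 2008 world consumption of 143,851 TWh.
world_tonnes = 143851 / energy_twh * mass_tonnes

print(f"{energy_twh:.1f} TWh/year, {power_gw:.2f} GW continuous")
print(f"{mass_tonnes:.1f} t of U/Th; U sphere r = {sphere_radius_cm(u_volume_l):.0f} cm, "
      f"Th sphere r = {sphere_radius_cm(th_volume_l):.0f} cm")
print(f"World equivalent: {world_tonnes:.0f} tonnes/year")
```

Running it reproduces the 55.5 TWh/year, 6.33 GW, 2.5 tonne, and ~6500 tonne/year figures, so the unit conversions hold up.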

Comment Should have been 64-bit from the start... (Score 3, Interesting) 67

If only Apple had postponed the Intel transition for about 6 months, their machines and software could have been 64-bit across the line, and this mess would have been avoided completely. Instead, we are eight years into yet another transition, with plenty of legacy 32-bit software out there, each of which requires an entire duplicate set of shared libraries to be loaded.

Comment Re:I thought rare earths were not that rare (Score 1) 62

They are producing ore, which is then shipped to their facilities in China for processing. Is that really progress?

Molycorp reopened the mine, and then bought Neo Material Technologies for its processing capabilities:

But the deal also paves the way for Molycorp to ship minerals from its California mine to the Chinese operations of a Neo Material arm called Magnequench, in a reminder of how much technological rare-earth capability resides in China.

Comment Of course, when biomass is considered "renewable" (Score 0) 169

When renewable advocates boast about energy production, the numbers are inevitably inflated by huge amounts of biomass, or by meaningless capacity numbers which do not represent actual energy delivered. Leveling forests to burn for fuel is not environmentally friendly, and not even carbon neutral on the time scales that matter. In some cases, forests are pelletized and shipped overseas, making the carbon impact even worse than burning coal. Unfortunately, aside from hydro, it is the only renewable that is reliable, and thus forms an integral part of "renewable" plans. More typically though, coal and gas take up the slack.

Read more about Australia specifically, or bioenergy in general. Sadly there is only one form of clean energy that is environmentally friendly and scalable, and the renewable fanatics will have nothing to do with it, instead promoting a world of poverty and mass environmental devastation. While solar and wind have their place, it would be much more effective to complement them with nuclear instead.

Comment Re:Expert?? (Score 1) 442

Not at all. It just requires enough smart equipment to cope with whatever the variation in supply is. Even on an entirely renewable grid there will still be a lot of base load available, from non-intermittent sources like hydro and from the minimum output of variable sources like wind. If you have enough turbines the wind is always blowing somewhere, and the overall output of the entire fleet never drops below some predictable level.

There is a lot of industrial equipment and processes that requires a constant source of power. Moreover, even if some can cope with the variability, the economics often fail when capital intensive facilities are sitting idle 2/3rds of the time. The wind is also calm for weeks at a time over large regions, requiring either 100% backup or storage; the idea that we can satisfy our needs by shuffling renewable energy around that isn't available through a grid that doesn't exist is pure fantasy.

Also note that he isn't saying "no storage", just no grid-level storage. House battery packs and EVs, even small local pumped storage, will be available.

I'm not saying this is a desirable state of affairs, merely possible. In practice it would make a lot of sense to have grid level storage.

How many people are likely to use their hideously expensive vehicle battery in this way when it severely shortens its life? It also suffers the same problem as "smart equipment"; we can't afford to toss out and replace every last piece of technology.

The only realistic way of curbing our CO2 and other pollution is to produce carbon neutral fuels for existing engines, and clean energy for our existing grid. Only nuclear is capable of providing the reliable and affordable energy necessary to cleanly power modern civilization. Otherwise we will be lucky to keep the lights on at night while the remainder of our industry departs for China. More likely though, we will continue leveling mountains for coal as the true believers refuse to let go of their fantasy.

Comment Re:The question should be, what is causing delays? (Score 1) 142

Do you have a citation for this "environmental damage"? Real damage, not caused by nuclear weapons manufacturing, and not the "OMG, three atoms of tritium escaped, we're all going to die!" sort of "damage".

The costs of the plants are a matter of record, so have a look. The NRC opened the door for litigation, and otherwise mired the nuclear industry. The AEC was an effective regulatory agency with an excellent safety record and reasonable costs. Under the NRC, costs skyrocketed and a number of reactors were even partially built yet never operated. Abundant examples are no further than your nearest search engine.

Comment Re:The question should be, what is causing delays? (Score 1) 142

True, but the idea behind the combined operating license was to allow construction and operation to continue while license issues are litigated. The delays at Plant Vogtle and in SC stem from the challenges of actually building the plant: much of the equipment has never been built before, so they must build, test, and construct all at once while trying to create a commercial plant on a tight schedule.

While there are very real concerns about the lack of construction experience, as well as longer-term engineering and operational support, these delays seem to be self-inflicted, from issues with concrete pours to assuming brand-new designs can be built on a very tight schedule when many of the components have never been built or used before.

Read more about the Vogtle rebar issue. It is not fair to dismiss it as self-inflicted when the regulator insists upon perfection and is unresponsive to circumstances. The rebar was installed to current building standards, rather than those in place when the design was approved. It was a small deviation, and eventually the NRC allowed it with minor modifications. The problem is that such a minor issue can introduce a 6+ month delay when interactions with the NRC are required.

Regulations should be focused on safe designs, not on libraries of paperwork certifying safety. It is silly to require an N-stamp on every last nut and bolt (even in non-safety-related systems) rather than using off-the-shelf parts where suitable. Certificates can be forged, and even if they are genuine, nothing is perfect. Safe designs make allowances for imperfect materials. Such a “cost is no object” approach is not useful in the real world. The oppressive regulatory regime only mires any progress and ensures that we are burdened with ancient, yet "approved", designs.

Comment The question should be, what is causing delays? (Score 2, Insightful) 142

Typically the endless lawsuits and anti-nuclear activism are the source of delays for nuclear construction. Even if not directly, then by proxy of the NRC, which is ineffective thanks to regulations based on ALARA and pseudo-science (LNT). If the NRC regulated based on solid science and legitimate safety concerns, it would be tremendously less expensive to meet nuclear safety standards. Unfortunately, our presidents have had a habit of appointing unqualified and nuclear-hostile people like Gregory Jaczko to lead the NRC, so the result is no surprise.

Another source of delay is the lack of nuclear construction for decades, leaving the construction industry and supply chains to languish. Neither cost is inherent in nuclear construction, and both can be corrected. Delays of any large construction project are very expensive, and this is the primary means employed by anti-nuclear ideologues to drive up the cost. The submitter (mdsolar) may or may not have participated, but clearly has an axe to grind and the willingness to exploit the situation to peddle his ideology.

Comment Re:Static scheduling always performs poorly (Score 1) 125

One might expect that, but the Mill is exceptionally flexible when it comes to flow control. It can speculate through branches and keep loads in flight through function calls. The speculation capabilities are far more powerful, and there are a lot of functional units to throw at it. There will be corner cases where an OOO might do slightly better, but in general the scales should be tipped in the other direction. If anything, the instruction window on conventional hardware is more limiting.

Papers would be great, but peer-reviewed venues also tend to reject things which are truly novel. The group has written papers only to have them rejected for absurd reasons, one being a lack of novelty and hard numbers or something like that. They are clearly not interested in writing papers or even filing patents, and we are fortunate (or not) that their hand was forced on the patents with changes in law.

Comment Re:Static scheduling always performs poorly (Score 1) 125

I think your generalization of static scheduling performs poorly on a Mill. :) The Mill architecture uses techniques which essentially eliminate stalls even with static scheduling, at least to about the same extent that an OOO can. Obviously, there will be cases where it will stall on main memory, but those are unavoidable on either. See the Memory talk in particular for how the Mill achieves this, and other improvements possible over OOO. The entire series of videos is fascinating if you have time, but there is also a short introduction. Beyond that, there is a considerable amount of detail scattered in the forums which the videos don't cover.

The Mill aims to provide OOO performance with DSP level efficiency, and offers a defensible means of doing so. Ultimately, any complex schemes aimed at keeping functional units busy are a waste of power if that result can be achieved with simple static hardware.

Comment Re:Sounds smart, but is it? (Score 1) 125

There is a lot of information available on the Mill architecture at this point, and very little reason to doubt its feasibility. Essentially all of the parts have been demonstrated in existing architectures, and the genius is in how they are combined in such a simple and elegant manner. Implementation issues aside, the idea is rock solid, and has too much potential to ignore. Perhaps the layman can not appreciate it, but the architecture has a profound ability to simplify and secure the entire stack of software on top of it. Even without silicon, that much is clear, and no doubt why there is so much excitement.

People look at architectures like a black box and fail to appreciate that the quality of the applications they use is heavily dependent on what systems and language programmers can provide using that box. Not many have experience with such low-level code, but it is a nightmare to produce and maintain on conventional architectures. It is fragile and full of cruft, requiring tremendous effort to optimize compilers/languages/libraries/etc. The Mill wipes away the need for such effort and enables trivial yet superior compilers, systems, software optimization and security. Those costs are worth arbitraging, even if the Mill itself offered no performance advantage. However, a Mill is actually very similar to a DSP from the hardware perspective, so it is easy to extrapolate.

(Incidentally, the Mill is also expected to be an excellent platform for micro-kernels. The value of micro-kernels has never been in question, but there is a significant performance trade off on conventional architectures. L4 has done well in minimizing it, but on the Mill there will be no contest.)

Comment CPUs should be replaced upon request, or... (Score 1) 131

Alternatively, Intel should stop artificially segmenting their product line on every last instruction set extension or feature. ECC and VT-D should be standard features, yet are intentionally crippled on other Intel chips. If I paid extra for a Xeon, then I expect those to work and TSX is no different.

It is infuriating that developers and users alike must face such a mishmash of arbitrarily enabled functionality just so Intel can extract further profit, even while bragging about their low defect rate on the 22nm process. I'm not saying that processors shouldn't be binned, only that it should be done on the basis of defects. It is criminal to arbitrarily destroy value in the pursuit of profit, and maybe the law should reflect that.

Comment Re:Solution! (Score 1) 151

The problem is that there aren't enough of them. ;) More seriously, that is basically what geothermal does, and it is useful where available. Exploration and drilling are expensive though, and suitable sites are limited. Interestingly, while the environmental effects are minimal compared to fossil fuels, they are still not as benign as with nuclear. Drilling releases greenhouse gasses trapped deep in the earth, among other things including radon. (Hence both geothermal and fracking put out more radioactivity than nuclear plants, though still nothing to panic over.) Fundamentally though, geothermal is merely indirect nuclear, taking advantage of the decay of thorium and uranium within the earth itself.
