You missed the part about minimum; it is meaningless if Intel offers some hideously expensive outlier part. Granted, there are some six core variants which cost less, but they are still expensive 130-140W LGA2011 parts. The mainstream is still stuck squarely with 2-4 cores. Two cores in 2015 is pitiful.
More importantly, it is byte addressable and doesn't require any of the block erase nonsense of NAND. There is no window during which some (possibly old) data or even the entire device becomes corrupted because of a power loss during a read-modify-(erase)-write cycle. It would be genuinely good if such reliability became a standard feature.
While an apology is due, this sort of problem is inevitable given the nature of the technology. TRIM on NAND is a crutch for a technology that is poorly suited to data storage. Transforming NAND into a usable storage device requires heroic efforts on the part of the vendor, and it is hard to blame them for the bugs. Likewise, it is hard to blame Linux developers for their heroic efforts to work around the extensive deficiencies of NAND flash. Trusting in cheap commodity devices that don't even claim to protect against power loss is ill-advised.
Using TRIM as a band-aid for the performance woes of over-filled NAND devices is just asking for trouble. It has long been known that filling up filesystems leads to terrible performance, and the same applies to NAND drives. It is irresponsible of the vendors to ship drives with insufficient reserved space, but one can compensate by setting aside an empty, never-written partition covering 5% of the space. It is much safer to disable TRIM and over-provision the drive this way; it achieves the same effect of limiting write amplification, without having to worry about bugs trimming away live data.
The only place where TRIM really makes sense is in the context of virtualization. Recovering space in sparse virtual disk images has real benefit, and operating system vendors have far more incentive and ability to make it work properly.
2. is the renewable option, which is worse than doing nothing, as it has a large ecological and economic impact for virtually no benefit.
3. may be necessary at some point for things like ocean acidification, but doesn't solve the fundamental energy problem.
However, limiting oneself to three unworkable options isn't productive, so let's introduce another:
4. the nuclear option; i.e., doing something that actually works. The BRIC countries are already embracing this one.
I prefer 4, as it provides reliable carbon neutral energy with minimal environmental footprint. Density is key, in energy as well as other human endeavors. I refer people to An Ecomodernist Manifesto for the motivations. Those who truly value the environment and prosperity of humans should read that. The end goal is well within reach, but indulging in the "green" fantasy won't lead us there.
...and it will be depleted if we continue to pump it out of the ground. We need a sustainable, carbon-neutral replacement, and synthetic carbon-neutral fuels can be created with nuclear heat. Ammonia, for example, is produced today using natural gas as a feedstock, but it can equally be made with nuclear process heat. With abundant energy, most anything can be produced, and LNG is no exception.
Either way it would be better to look for alternatives, rather than heating our homes and producing fertilizer with something that is both running out and increasing in price. That is idiotic.
This ship is a marvel, and showcases the truly impressive capabilities of the modern shipbuilding industry. What isn’t mentioned, but is equally impressive, is the rate at which such shipyards can turn out new ships, and the surprisingly low cost. However, one can’t help but lament that this capability isn’t being used to produce ThorCon reactors, instead of draining resources for a quick profit. (Do have a look at the white paper; it provides a fascinating perspective.)
This LNG ship extracts and condenses 3.6 million tonnes of natural gas per year, with an energy density of 55.5 MJ/kg, giving:
3.6e6 tonnes/year * 1e3 kg/tonne * 55.5 MJ/kg = 199.8e9 MJ/year
or 199.8e9 MJ/year * MWh/3600 MJ * TWh/1e6 MWh = 55.5 TWh/year
This yearly energy content represents a continuous power output of:
199.8e9 MJ/year * GJ/1e3 MJ * year/(365.2425*24*3600)s = 6.33 GW
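The unit conversions above are easy to double-check with a few lines of Python (the constants are taken directly from the figures in this post):

```python
# Recompute the LNG energy figures quoted above.
MJ_PER_KG = 55.5                      # energy density of natural gas (MJ/kg)
TONNES_PER_YEAR = 3.6e6               # yearly LNG output of the facility
SECONDS_PER_YEAR = 365.2425 * 24 * 3600

energy_mj = TONNES_PER_YEAR * 1e3 * MJ_PER_KG   # total MJ per year
energy_twh = energy_mj / 3600 / 1e6             # 1 MWh = 3600 MJ, 1 TWh = 1e6 MWh
power_gw = energy_mj / SECONDS_PER_YEAR / 1e3   # MJ/s = MW; divide by 1e3 for GW

print(f"{energy_twh:.1f} TWh/year, {power_gw:.2f} GW continuous")
# prints: 55.5 TWh/year, 6.33 GW continuous
```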
That is the equivalent of a few large power plants. In the scheme of global energy requirements though, it barely registers: world energy consumption in 2008 was 143,851 TWh.
Now, given the energy density of Uranium/Thorium at 80e6 MJ/kg, the energy contained within that 3.6 million tonnes of LNG could instead be derived from:
199.8e9 MJ/year * kg/80e6 MJ * tonne/1e3 kg = 2.5 tonnes (of U or Th)
That is a rather small number, but let's put it in terms of volume. With Uranium at 1.5e9 MJ/L, or Thorium at 9.3e8 MJ/L, that amounts to roughly the size of a yoga ball:
U: 199.8e9 MJ * L/1.5e9 MJ = 133L (sphere of radius 32cm)
Th: 199.8e9 MJ * L/9.3e8 MJ = 215L (sphere of radius 37cm)
The fun part happens when you scale it back up to the global energy consumption of 143,851 TWh, and it translates to a meager 6500 tonnes per year, capable of replacing the billions of tons of fossil fuels we consume today. Even with projected growth, global energy demands could still be satisfied by a single mine, to say nothing of the billions of tons of uranium available in seawater. Before that is necessary, the tens of thousands of tons of so-called “nuclear waste” can be consumed, as they still contain ~95% of the original energy content.
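As a sanity check on the arithmetic, here is a short Python sketch that recomputes the fuel mass, the sphere sizes, and the global scale-up (all constants are the ones quoted in this post):

```python
import math

ENERGY_MJ = 199.8e9          # yearly energy content of the LNG, from above
U_TH_MJ_PER_KG = 80e6        # fission energy density of U/Th (MJ/kg)
U_MJ_PER_L = 1.5e9           # volumetric energy density of Uranium
TH_MJ_PER_L = 9.3e8          # volumetric energy density of Thorium

fuel_tonnes = ENERGY_MJ / U_TH_MJ_PER_KG / 1e3        # ~2.5 tonnes

def sphere_radius_cm(volume_l):
    # V = 4/3 * pi * r^3, with 1 L = 1000 cm^3
    return (3 * volume_l * 1000 / (4 * math.pi)) ** (1 / 3)

u_litres = ENERGY_MJ / U_MJ_PER_L                     # ~133 L
th_litres = ENERGY_MJ / TH_MJ_PER_L                   # ~215 L

# Scale up to world energy consumption (143,851 TWh in 2008):
world_mj = 143_851 * 1e6 * 3600                       # TWh -> MWh -> MJ
world_fuel_tonnes = world_mj / U_TH_MJ_PER_KG / 1e3   # ~6,500 tonnes/year

print(f"{fuel_tonnes:.1f} t of U/Th replaces the ship's yearly LNG output")
print(f"U sphere radius: {sphere_radius_cm(u_litres):.0f} cm, "
      f"Th sphere radius: {sphere_radius_cm(th_litres):.0f} cm")
print(f"World demand: {world_fuel_tonnes:.0f} t/year of U/Th")
```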
If only Apple had postponed the Intel transition by about six months, their machines and software could have been 64-bit across the line, and this mess would have been avoided completely. Instead, we are eight years into yet another transition, with plenty of legacy 32-bit software still out there, any of which requires an entire duplicate set of shared libraries to be loaded.
After all the trouble and expense of sending a probe or lander out into the unknown, it seems a waste not to provide them with an RTG for reliable power. Solar panels have hobbled Mars rovers as well as other spacecraft.
They are producing ore, which is then shipped to their facilities in China for processing. Is that really progress?
Molycorp reopened the mine, and then bought Neo Material Technologies for its processing capabilities:
But the deal also paves the way for Molycorp to ship minerals from its California mine to the Chinese operations of a Neo Material arm called Magnequench, in a reminder of how much technological rare-earth capability resides in China.
When renewable advocates boast about energy production, the numbers are inevitably inflated by huge amounts of biomass, or by meaningless capacity figures that do not represent actual energy delivered. Leveling forests to burn for fuel is not environmentally friendly, and not even carbon neutral on the time scales that matter. In some cases, forests are pelletized and shipped overseas, making the carbon impact even worse than burning coal. Unfortunately, aside from hydro, biomass is the only renewable that is reliable, and thus it forms an integral part of "renewable" plans. More typically though, coal and gas take up the slack.
Read more about Australia specifically, or bioenergy in general. Sadly there is only one form of clean energy that is environmentally friendly and scalable, and the renewable fanatics will have nothing to do with it, instead promoting a world of poverty and mass environmental devastation. While solar and wind have their place, it would be much more effective to complement them with nuclear instead.
Not at all. It just requires enough smart equipment to cope with whatever the variation in supply is. Even on an entirely renewable grid there will still be a lot of base load available, from non-intermittent sources like hydro and from the minimum output of variable sources like wind. If you have enough turbines the wind is always blowing somewhere, and the overall output of the entire fleet never drops below some predictable level.
A lot of industrial equipment and processes require a constant source of power. Moreover, even where some can cope with the variability, the economics often fail when capital-intensive facilities sit idle two-thirds of the time. The wind can also be calm for weeks at a time over large regions, requiring either 100% backup or storage; the idea that we can satisfy our needs by shuffling around renewable energy that isn't available, through a grid that doesn't exist, is pure fantasy.
Also note that he isn't saying "no storage", just no grid-level storage. Home battery packs, EVs, and even small local pumped storage will be available.
I'm not saying this is a desirable state of affairs, merely possible. In practice it would make a lot of sense to have grid level storage.
How many people are likely to use their hideously expensive vehicle battery in this way when it severely shortens its life? It also suffers the same problem as "smart equipment"; we can't afford to toss out and replace every last piece of technology.
The only realistic way of curbing our CO2 and other pollution is to produce carbon neutral fuels for existing engines, and clean energy for our existing grid. Only nuclear is capable of providing the reliable and affordable energy necessary to cleanly power modern civilization. Otherwise we will be lucky to keep the lights on at night while the remainder of our industry departs for China. More likely though, we will continue leveling mountains for coal as the true believers refuse to let go of their fantasy.
Do you have a citation for this "environmental damage"? Real damage, not caused by nuclear weapons manufacturing, and not the "OMG, three atoms of tritium escaped, we're all going to die!" sort of "damage".
The costs of the plants are a matter of record, so have a look. The NRC opened the door for litigation, and otherwise mired the nuclear industry. The AEC was an effective regulatory agency with an excellent safety record and reasonable costs. Under the NRC, costs skyrocketed and a number of reactors were even partially built yet never operated. Abundant examples are no further than your nearest search engine.
True, but the idea behind the combined operating license was to allow construction and operation to continue while licensing issues are litigated. The delays at Plant Vogtle and in South Carolina stem from the challenges of actually building the plant: much of the equipment has never been built before, so the builders must design, test, and construct simultaneously while trying to deliver a commercial plant on a tight schedule.
While there are very real concerns about the lack of construction experience, as well as longer-term engineering and operational support, these delays seem to be self-inflicted, stemming from issues with concrete pours and from assuming that brand-new designs could be built on a very tight schedule even though many of the components had never been built or used before.
Read more about the Vogtle rebar issue. It is not fair to dismiss it as self-inflicted when the regulator insists upon perfection and is unresponsive to circumstances. The rebar was installed to current building standards, rather than those in place when the design was approved. It was a small deviation, and the NRC eventually allowed it with minor modifications. The problem is that such a minor issue can introduce a delay of six months or more when interactions with the NRC are required.
Regulations should be focused on safe designs, not on libraries of paperwork certifying safety. It is silly to require an N-stamp on every last nut and bolt (even in non-safety-related systems) rather than using off-the-shelf parts where suitable. Certificates can be forged, and even if they are genuine, nothing is perfect. Safe designs make allowances for imperfect materials. Such a "cost is no object" approach is not useful in the real world. The oppressive regulatory regime only mires any progress and ensures that we are burdened with ancient, yet "approved", designs.
Typically the endless lawsuits and anti-nuclear activism are the source of delays in nuclear construction. Even if not directly, then by proxy of the NRC, which is ineffective thanks to regulations based on ALARA and pseudo-science (LNT). If the NRC regulated based on solid science and legitimate safety concerns, it would be tremendously less expensive to meet nuclear safety standards. Unfortunately, our presidents have had a habit of appointing unqualified and nuclear-hostile people like Gregory Jaczko to lead the NRC, so the result is no surprise.
Another source of delay is the lack of nuclear construction for decades, which has left the construction industry and supply chains to languish. Neither cost is inherent in nuclear construction, and both can be corrected. Delays in any large construction project are very expensive, and delay is the primary means employed by anti-nuclear ideologues to drive up costs. The submitter (mdsolar) may or may not have participated, but clearly has an axe to grind and the willingness to exploit the situation to peddle his ideology.
One might expect that, but the Mill is exceptionally flexible when it comes to flow control. It can speculate through branches and keep loads in flight across function calls. The speculation capabilities are far more powerful, and there are a lot of functional units to throw at the problem. There will be corner cases where an OOO design might do slightly better, but in general the scales should tip in the other direction. If anything, the instruction window on conventional hardware is more limiting.
Papers would be great, but peer-reviewed venues also tend to reject things that are truly novel. The group has written papers, only to have them rejected for absurd reasons, one being an alleged lack of novelty and hard numbers. They are clearly no longer interested in writing papers, or even in filing patents, and we are fortunate (or not) that their hand was forced on the patents by changes in the law.