Link to Original Source
4GW is the peak output with clear skies at noon. The 6.4 TWh/y is the expected yearly output, as quoted from the article. That yields an anticipated capacity factor of 0.18, after taking into account that the earth rotates and has clouds and such. Wind and solar look great if you compare nameplate capacity and ignore the variability. In reality though, getting useful power out of them is pure fantasy unless you have pumped hydro available nearby, and even then it is not competitive.
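The capacity-factor figure above follows directly from the quoted numbers. A minimal sketch (the constants are the figures from the comment; the formula is the standard definition):

```python
# Capacity factor = actual annual output / output at full nameplate all year.
HOURS_PER_YEAR = 8760

def capacity_factor(nameplate_gw: float, annual_twh: float) -> float:
    """Ratio of actual annual output to the theoretical maximum."""
    max_twh = nameplate_gw * HOURS_PER_YEAR / 1000.0  # GWh -> TWh
    return annual_twh / max_twh

# 4 GW nameplate, 6.4 TWh/y expected output, as quoted above.
print(round(capacity_factor(4.0, 6.4), 2))  # 0.18
```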
"The solar photovoltaic power plant will have an estimated life of 25 years and is expected to supply 6.4 billion kilowatt-hours per year, according to official figures."
For reference, a single 1GWe nuclear plant operating at (a conservative) 0.85 capacity factor will produce 7.45 TW-hours/year of reliable power. So this solar plant isn't the equivalent of one reactor, much less four. Considering that nuclear plants typically last 60 years and AP1000s are near $2/W in China, the solar option costs five times as much over that time frame.
While this solar farm is idle at night and unreliable by day, the transmission infrastructure must be built to handle the full capacity of the equivalent four nuclear plants, and it will sit idle most of the time. The solar option makes no economic sense, when instead they could purchase two actual 1GWe nuclear plants, and have 15 TW-hours/year of reliable power for more than twice as long.
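The nuclear figures above work out the same way. A quick check, using the comment's own assumptions (1 GWe per reactor at a 0.85 capacity factor; the cost and lifetime figures remain the comment's claims, not computed here):

```python
HOURS_PER_YEAR = 8760

def annual_twh(gwe: float, capacity_factor: float) -> float:
    """Annual output in TWh for a plant of the given capacity and capacity factor."""
    return gwe * capacity_factor * HOURS_PER_YEAR / 1000.0

one_reactor = annual_twh(1.0, 0.85)   # ~7.45 TWh/y, vs the solar plant's 6.4
two_reactors = annual_twh(2.0, 0.85)  # ~14.9 TWh/y, the "15 TW-hours/year" above
print(round(one_reactor, 2), round(two_reactors, 1))
```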
These are usually a source of last resort, avoided even for peaking demand. They are loud and suck fuel like crazy.
They exist for precisely this type of emergency: fuel shortages, scheduled downtime of gas-fired plants, or any grid failure.
That may have been the case once, but combustion turbines are now the preferred complement to highly variable wind, as they spin up fast. Ironically, this "green" solution uses considerably more fuel than combined cycle gas turbines alone to produce the same amount of energy. (30% efficiency for 70% of the time while wind produces no energy, versus 60% efficiency 100% of the time with CCGTs alone.)
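The parenthetical above can be made concrete with a back-of-envelope calculation: fuel burned per unit of delivered energy is energy divided by thermal efficiency. Using the comment's own assumptions (open-cycle turbines at 30% efficiency covering the 70% of the time wind delivers nothing, vs combined-cycle turbines at 60% covering everything):

```python
def fuel_per_unit_energy(fraction_from_gas: float, efficiency: float) -> float:
    """Relative fuel burned to deliver one unit of energy from gas."""
    return fraction_from_gas / efficiency

wind_plus_ocgt = fuel_per_unit_energy(0.7, 0.30)  # ~2.33
ccgt_only = fuel_per_unit_energy(1.0, 0.60)       # ~1.67
print(round(wind_plus_ocgt / ccgt_only, 2))       # wind+OCGT burns ~40% more fuel
```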
One of the comments points to DPF, which uses dynamic code generation to demultiplex packets. This is a very promising and surprisingly old idea. A dynamically generated classifier/filter could replace the entire network input path, and interface well with Van Jacobson's net channels. In addition to providing superior performance, it would afford far greater flexibility and modularity of code.
The refinement of process has postponed this for a long while, but the time has come to explore new architectures and technologies. The Mill architecture is one such example, and aims to bridge the enormous chasm of inefficiency between general purpose CPUs and DSPs. Conservatively, they are expecting a tenfold improvement in performance/W/$ on general purpose code, but the architecture is also well suited to wide MIMD and SIMD.
Another area ripe for innovation is memory technologies, which have suffered a similar stagnation limited to refinement of an ancient technology. The density of both cache and main memory can be significantly improved on the same process with Thyristor-RAM or Z-RAM. Considering the potential benefits and huge markets, it is vexing that more resources aren't expended toward commercializing better technologies. Some of the newer technologies also scale down better.
Something to replace the garbage which is NAND flash would also be welcome, yet sadly there appears to be no hurry there either. One point is certain: there is a desperate need to find a way to commercialize better technologies rather than perpetually refining inferior ones. Though examples abound, perhaps none is more urgent than the liquid fluoride thorium reactor. Molten salt reactors could rapidly replace fossil fuels with clean and abundant energy while minimizing environmental impact, and affordable energy is the basis for all prosperity.
Some means of distributing the wealth available from productivity gains is necessary, whatever you want to call it. In the end, there is nothing a human does which cannot be replaced by a machine. It is not right that an increasingly automated system of production should serve a handful of owners while the remaining population starves. That is especially true given that those owners have contributed virtually none of the real work that has brought us the benefits of modern civilization. (That we owe to the hard work of many people over thousands of years...)
1) Wealth shift is a distraction; the fundamental point is that wealth needs to be created rather than merely concentrated for the benefit of a few. To create wealth requires energy, and the developing world recognizes this simple fact. Most of that energy is coming from a sharp increase in coal plants today, but they are aggressively pursuing nuclear and will reap the benefits that we are forfeiting. While the west lets its energy, manufacturing, and other infrastructure decay, the real mechanisms for new wealth creation are growing overseas while ours recede. Sooner or later we are going to discover that exporting our monopoly on ideas is not a substitute.
2) We are nowhere near overpopulated, and the universe is a big place should it ever become an issue. Regardless, the best way to curb population growth is to lift people out of poverty. You have this backwards; people who can afford to relax and enjoy a bit of life will not be busy popping out kids to help with chopping down trees for fires, fetching water, farming, washing clothes by hand, etc. When people are no longer burdened by such tasks, they also have time for education and innovation. Sure, some will watch TV, but even that is better than investment banking; those are the people really digging the hole for us all.
The recently revealed Mill architecture is far more interesting, and also offers a much more attractive programming model. It is a highly orthogonal architecture naturally capable of wide MIMD and SIMD. Vectorization and software pipelining of loops is discussed in the "metadata" talk, and is very clever and elegant. Those who have personally experienced the tedium of typical vector extensions will appreciate it all the more.
Based on simulation, the creators expect an order of magnitude improvement in performance/W/$ over conventional architectures. (That being a very conservative estimate, with the goal being DSP-like efficiency on general purpose code.) How they propose to achieve that is fascinating, but I'm even more excited about the potential impact on software development. The architecture described would vastly simplify the OS and compilers, and remove or greatly reduce a number of typical inefficiencies.
Knight's Landing leaves me with the usual impression of Intel using brute force and process superiority to retain the edge, and the Mill may offer enough of an architectural improvement to finally put an end to that. It would still be a long road, but it is a nice thought.
The filesystem work on HAMMER2 (the filesystem successor to HAMMER1) continues to progress but it wasn't ready for even an early alpha release this release. The core media formats are stable but the freemap and the higher level abstraction layers still have a lot of work ahead of them.
Have you considered space maps for tracking free space? I thought that was one of the more interesting ideas in ZFS.
Anyway, great work on the SMP scalability. It is refreshing to see a concerted effort in reworking the system to be more SMP friendly, rather than the profuse and convoluted locking that most others have adopted.
Fused glass display is fine. It is the soldered-in RAM, proprietary SSD, and glued-in battery that are totally unacceptable. Ordinarily, I'd double the RAM in a year for a pittance, but now Apple forces you to pay a hefty premium for a limited amount of RAM up front, obsoleting the machine that much sooner. Replacement SSDs eventually become available, but with few options at high cost. Finally, who wants to take or send in their machine for battery service every two years? Batteries are consumables, and shouldn't be glued in any more than a toner cartridge.
While the NIST curves are suspect, slow, and problematic in a number of other ways, there are fast and safe elliptic curves.
I'm sorry you took it that way; I was rather hoping that you would read it, develop an appreciation of how fluid-fueled reactors are utterly different and fundamentally superior, and then, hopefully, stop suggesting that people "educate" themselves with the typical anti-nuclear/thorium propaganda. While your links do not have tailored rebuttals yet, I expect that all of the various specious arguments are addressed within, repeatedly.
For an author of typically insightful comments, you would do well to educate yourself rather than citing nonsense and propagating FUD. Molten salt reactors are a silver bullet capable of ending our dependence on fossil fuels. Working to tarnish the singular available option with that potential is not helpful.
Molten salt reactors are by definition meltdown proof, as the working state is already molten. The fuel salts are impervious to radiation damage, and the vessel will melt long before the salts boil, at which point the salts will drain onto the floor and ultimately still end up in the drain tank. The fuel is the coolant, and it has excellent thermal conductivity. At those temperatures heat dissipates rapidly, minimizing the difficulty of passive cooling. Even if the plant were turned into rubble, the heat would still dissipate into the surrounding environment and the salt would eventually freeze, all the while trapping the dissolved fuel and fission products. The freeze plug is a convenience to minimize damage to the reactor, but is not necessary for avoiding a large scale release of radiation--that is virtually impossible by any means. In the absurdly improbable event that some of the salt did boil away, that process itself would rapidly cool the bulk of the remaining salt, minimizing the release into the environment.
Unlike molten salts, the ceramic fuel elements of solid-fueled reactors have very poor thermal conductivity and much higher melting temperatures. Worse yet, the rods contain more than a year's worth of fuel, and trap all of the fission products over that period in a thermal insulator, with the volatiles inevitably released when cooling fails and the ceramic melts. That is a meltdown, and the escape of years' worth of volatile fission products is indeed a very serious problem which simply doesn't happen with salts.
A thorium-fueled molten salt reactor is continuously replenished, and contains no excess fuel. The magic of thorium is that it breeds in a thermal spectrum, and offers a simple chemical mechanism for reprocessing, not available in other fuel cycles. The thermal spectrum also requires much less fuel than the fast spectrum. Thanks to the fluid fuel, some volatile fission products like xenon simply bubble out, and are continuously removed and sequestered. Others form stable salts with fluorine. Altogether, there is a minimal amount of fissile material and decay heat present in any accident scenario, and the most dangerous long-term hazards like cesium and strontium remain dissolved. The fluoride salts are among the most chemically stable compounds, and do not react violently with air or water. They are by far the safest place for nuclear fuel and fission products, where the fuel can be fissioned thoroughly, leaving virtually no waste.
With molten salt reactors, one has to be extremely creative to imagine disastrous accident scenarios. As a bonus, molten salt reactors were extensively researched, and about 10 years away in the 1970s. It might take a little longer today, but with a concerted effort, we could be mass producing reactors within 20 years, and well on our way to replacing all fossil fuel consumption later this century. It is the one and only proven technology capable of that, so we ought to pick up where we left off without delay.
Is NFTables suitable as a generic packet classifier, or is it strictly limited to packet filtering? Van Jacobson's net channels offer the possibility of extraordinary improvements in efficiency and performance, great simplification of drivers, ease of development, and much improved flexibility. The one missing piece is a flexible packet classifier. While NFTables looks like it incorporates many of the essential ideas, it isn't clear whether it is built with this in mind. If not, I'd like to see this fixed before it is integrated.
I've long thought that we should replace the whole mess of statically #ifdef'd protocol switch statements and filtering mechanisms tacked on as an afterthought. Instead, we should build all of that upon something like BPF at the very lowest level, and have it dynamically compiled to native code. Protocol classification and filtering rules would be translated to BPF-like code fragments to be assembled by the system, and it would be high performance and truly modular. By periodically analyzing the statistics of actual protocol and traffic flows, the system could also recombine the fragments in the most efficient way possible.
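To make the idea concrete, here is a toy sketch (all names are illustrative, not any real kernel API): classification rules are small predicate fragments that the system composes into one classifier, then reorders by observed hit counts, standing in for recompiling the fragments to native code in the most efficient order.

```python
from collections import Counter

# Fragment constructors: each returns a small predicate over a packet dict.
def match_proto(proto):
    return lambda pkt: pkt.get("proto") == proto

def match_port(port):
    return lambda pkt: pkt.get("dport") == port

class Classifier:
    def __init__(self):
        self.rules = []        # (name, composed predicate) pairs
        self.hits = Counter()  # per-rule match statistics

    def add(self, name, *preds):
        # Compose fragments into a single rule (stand-in for code generation).
        self.rules.append((name, lambda pkt: all(p(pkt) for p in preds)))

    def classify(self, pkt):
        for name, pred in self.rules:
            if pred(pkt):
                self.hits[name] += 1
                return name
        return "default"

    def recombine(self):
        # Reorder so frequently matched flows are tested first.
        self.rules.sort(key=lambda r: -self.hits[r[0]])

clf = Classifier()
clf.add("ssh", match_proto("tcp"), match_port(22))
clf.add("dns", match_proto("udp"), match_port(53))
```

A real implementation would emit and JIT-compile the composed fragments rather than chain Python closures, but the structure (fragments, assembly, statistics-driven recombination) is the same.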
The digital revolution is largely irrelevant--it is dwindling wealth which will kill jobs and inflame social unrest. As long as the pie is shrinking, and the fools making policy treat our economy as a zero sum game, there will be no end of problems. Certainly the distribution of wealth must be addressed, but it is also absolutely essential that we grow the pie.
Jobs and natural resources are necessary inputs to wealth creation, though by no means sufficient. Limiting the scope of the discussion to jobs is a futile exercise and misses the key point: our collective prosperity rests entirely on energy. Both jobs and natural resources are directly dependent on access to affordable and abundant energy. While automation will continue to reduce the need for human input, energy is not optional. It is critical that we secure a reliable and economical source of energy now, or the suffering will be inevitable as fossil fuels become ever scarcer.
Without access to energy, we won't even be able to feed ourselves, much less maintain our existing infrastructure, and civilization will decay and crumble. There is no shortage of work to be done, but we are increasingly constrained by our resource inputs. The sooner people develop an appreciation for just how instrumental cheap fossil fuels have been in supporting our present quality of life, the sooner we can seriously work toward replacing them. Unfortunately, that requires embracing nuclear energy, as it is the only viable replacement we have for fossil fuels. Needless to say, I'm not very hopeful about our future.