Comment Re:Numbers in summary contradict headline (Score 2) 235

If you drill down into the source numbers (Scotland Qtr tab), it breaks down as:

31.0% - 7.8 TWh - Nuclear
25.1% - 6.325 TWh - * Wind
22.2% - 5.6 TWh - Coal
12.4% - 3.108 TWh - * Hydro
5.6% - 1.4 TWh - Gas
2.3% - 0.585 TWh - * Other biomass including co-firing (this usually means wood burning)
1.1% - 0.277 TWh - * Landfill gas
0.2% - 0.054 TWh - * Solar
0.06% - 0.014 TWh - * Sewage sludge

Sources preceded by a * are classified as renewable.
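
Summing the starred entries gives the renewable share directly; as a minimal Python sketch using the figures from the table above:

    # Quarterly Scottish generation figures from the table above, in TWh.
    renewables = {"Wind": 6.325, "Hydro": 3.108, "Other biomass": 0.585,
                  "Landfill gas": 0.277, "Solar": 0.054, "Sewage sludge": 0.014}
    non_renewables = {"Nuclear": 7.8, "Coal": 5.6, "Gas": 1.4}

    total = sum(renewables.values()) + sum(non_renewables.values())
    print(f"Renewable share: {sum(renewables.values()) / total:.1%}")  # about 41%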

Comment Re:Don't hear that it's just the Republicans at th (Score 2) 413

Senators can't be gerrymandered because they represent the entire state, a pre-set geographic boundary which (usually) can't be changed. Gerrymandering happens after each Census (2010, 2000, 1990, 1980, etc.) when the House seats are reapportioned and the districts are redrawn to be relatively equal in population.

If you want a recent Democrat example, just look at California. In the 2014 House elections, Democrat candidates got 57.7% of the votes relative to Republicans (4.06m vs 2.98m). Yet they won 73.6% of the races (39 of 53). Of the 9 races where the winner got fewer than 57.7% of the votes, Democrats won 8, Republicans just 1.

Anyway, this is nothing new. The term Gerrymandering dates back to 1812. Letting the State legislatures draw the election districts is literally letting the foxes guard the henhouse (gerrymandering isn't just about helping your own party, it's also about making "safe" districts so incumbents have an easier time getting re-elected). In the 1990 election, California ended up with a Democrat-controlled legislature and a Republican governor. The Democrats gerrymandered the districts, and the governor vetoed it. The boundaries ended up being drawn by the State Supreme Court, and for the next 8 years California had probably the fairest elections in its history.

There were two California ballot initiatives in 1990 for taking control of redrawing the districts away from the legislature. They were both winning until about a month before the election. Basically every special interest out there realized fairer districts would add unpredictability by increasing the chances of incumbents losing. So they all ran ads against them (including several groups I had previously thought were "honest" like the Sierra Club and NOW). And both initiatives were defeated.

Comment Re:Spoofing (Score 1) 234

As others have pointed out, XPrivacy does exactly that. Unfortunately, it runs on top of the Xposed Framework, which was coded assuming Dalvik. ART breaks it, and Lollipop has switched over entirely to ART. According to the developer it's going to be a monumental task rewriting it to work under ART, and not to expect anything until 2015.

I've fallen back to my old standby - Droidwall. It's an iptables firewall. Doesn't help with the apps which need Internet access to function, but works on everything else. They can collect all the information they want. It's useless to them if the app can't phone home.
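
For the curious, a per-app firewall like this boils down to an iptables rule keyed to the app's Linux UID (each app normally gets its own). A rough Python sketch of the idea - the UID is hypothetical, and a real device needs root:

    import subprocess

    # Hypothetical UID of the app to block; each installed Android app normally runs
    # under its own Linux UID, so the kernel's owner match can single it out.
    APP_UID = "10123"

    # Reject outbound packets originating from that UID - the kind of rule a
    # Droidwall-style firewall installs. Requires root and iptables with owner match.
    subprocess.run(["iptables", "-A", "OUTPUT",
                    "-m", "owner", "--uid-owner", APP_UID,
                    "-j", "REJECT"], check=True)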

Comment Re:Actually doubles in 60 days (Score 2) 244

Your graph shows Sierra Leone and Guinea cases growing at about the same exponential rate as in the past. Most of the "slowdown" is due to Liberia cases tapering off, but there's a huge note in the middle of the graphic saying this is likely due to a breakdown in Liberia's ability to accurately track the number of cases, rather than an actual slowdown.

Comment Re:How about transfer rate and reliability? (Score 1) 215

Sequential transfer rate scales with the linear bit density along each track, which rises roughly with the square root of areal density. If areal density increases 10x, each track holds about 3x as many bits, so for each rotation of the platter about 3x as many bits pass under the read/write heads and the (sequential) transfer rate increases by about 3x.

The problem for HDDs is and always has been random seek times - the time it takes to move the read/write heads to a new location and wait for the proper part of the platter to spin underneath. Look at this 7200 RPM HDD review from 2003. The sequential read/write speeds (45/27 MB/sec) are about a third of what a modern drive gets. But the IOPS is exactly the same, because it depends on platter rotation speed and head seek time.
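
A back-of-the-envelope check in Python (illustrative numbers, not figures from the linked review):

    def hdd_random_iops(rpm, avg_seek_ms):
        """Rough random-read IOPS: average seek plus half a rotation of latency."""
        rotational_latency_ms = (60_000 / rpm) / 2  # half a revolution, in ms
        return 1000 / (avg_seek_ms + rotational_latency_ms)

    # A 7200 RPM drive with ~8.5 ms average seek manages only ~80 random IOPS,
    # and that number barely moves no matter how high areal density climbs.
    print(round(hdd_random_iops(7200, 8.5)))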

Comment Re:Niche energy (Score 2) 90

I am wondering why you think the energy density is low? I think it is huge, a lot more kinetic energy per square meter than faster moving, but lighter, air (wind).

That's the problem - you can't really tap into the kinetic energy of the wave from the surface. The up-down motion of the wave is just a boundary layer height change due to a transient lateral pressure differential in the water. i.e. the water pressure is higher at this point than at a point 1 meter away, while the air pressure is the same in both spots. So the water is higher at this point, creating the height differential we call a wave. The vast majority of the energy is transmitted under the surface - even if you covered the ocean surface with a solid 100% energy-absorbing material, the wave would still propagate. The amount that'd be lost to the surface (due to harvesting) is just the difference in cross-sectional area of the wave front from one end of the harvesting device to the other if the wavefront were allowed to expand upwards. Unless you're in very shallow water, the vast majority of the energy simply passes underneath your device.

So a floating structure is a terribly inefficient way to extract energy from the wave. It'd be like trying to extract wind energy using balloons which flop around in the wind. A turbine is a much more efficient way to harvest the kinetic energy, except underwater turbines tend not to last very long due to corrosion and biological fouling, and they experience higher wear due to the incompressibility of water.

If you don't believe me, ask yourself why sailing ships were designed to use wind energy instead of wave energy. Waves are more consistent than wind - even when there is no wind there are frequently ocean swells which could've provided energy to propel ships. It's because average wind energy is denser than the fraction of wave energy you can extract from something bobbing on the surface.
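
For a rough sense of scale, here's a Python sketch using standard textbook formulas and assumed sea/wind conditions (not numbers from any actual device). Note the wave figure is the flux carried through the entire water column per meter of crest - most of which passes beneath a surface float - while the wind figure is per square meter of turbine swept area:

    import math

    RHO_WATER = 1025.0  # kg/m^3, seawater
    RHO_AIR = 1.225     # kg/m^3
    G = 9.81            # m/s^2

    def deep_water_wave_flux_per_m(height_m, period_s):
        """Deep-water wave energy flux per meter of crest (W/m):
        P = rho * g^2 * H^2 * T / (64 * pi), carried through the whole water column."""
        return RHO_WATER * G**2 * height_m**2 * period_s / (64 * math.pi)

    def wind_power_per_m2(speed_m_s):
        """Kinetic power in the wind per square meter of swept area (W/m^2): 0.5 * rho * v^3."""
        return 0.5 * RHO_AIR * speed_m_s**3

    # Assumed conditions: a 2 m swell with an 8 s period vs. a 10 m/s breeze.
    print(f"Wave flux:  {deep_water_wave_flux_per_m(2.0, 8.0) / 1000:.1f} kW per meter of crest")
    print(f"Wind power: {wind_power_per_m2(10.0):.0f} W per square meter swept")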

Comment Re:In Finland (Score 1) 516

And we have underground wiring. Areas with above ground wiring see more outages.

This is also what annoys me whenever I have been visiting the US - the air is filled with wires high and low, which definitely destroys the scenery of the otherwise picturesque towns that are common in New England among other places.

It boils down to average population density and cost. It's worth paying to put wires underground if they're going to be servicing a thousand homes on one block. But it's not worth it if you're going to be servicing just a few dozen homes spread out over a square mile. That's why the cities tend to have underground wiring, and suburban and rural areas tend to have above ground wiring. Europe tends to have more of the former, the U.S. more of the latter.

As for home building materials, that again boils down to cost. The U.S. and Canada had (and still have) vast tracts of forests, and lumber is relatively cheap. Europeans cut down most of their forests centuries ago, so brick and stone tend to be cheaper.

Comment Re:In Finland (Score 5, Informative) 516

This brings me to my curiosity over why Americans keep building houses out of wood in these regions. In California, for example, much of the earthquake damage seems to be to wooden houses. Although they have noticeably strengthened building codes, Californians are still stuck with a whole lot of vulnerable older houses.

Structural engineer here. Wooden structures survive earthquakes best because they flex. Contrary to the story of the three little pigs, stone masonry is the worst because it has no lateral strength. They're fine in static loading when all the forces are pointing straight down; but the moment the force vector tilts a bit sideways, they collapse. The huge death tolls you hear about from earthquakes in developing countries are almost always from collapsed masonry or concrete structures. Mud huts simply don't have the mass to kill residents, and wood homes survive most earthquakes relatively intact.

In the 1933 Long Beach earthquake most of the brick school buildings collapsed. Fortunately the earthquake happened in the evening when the kids were home from school, or it could've been a disaster rivaling the 1906 San Francisco quake. But that's the quake which made California realize brick buildings in earthquake country were just plain stupid. If you drive around Los Angeles or San Francisco and look at the older brick buildings, you'll often see a regular pattern of square metal plates on the outside. These are the end ties for steel rods which were retrofitted to masonry buildings. They run through the entire length of the building and connect all four sides together into a rigid box. Without them the walls simply fall over in an earthquake.

Metal would be better, but it is much more expensive, and its strength is not needed for static loads in smaller structures. Static loading is the reason skyscrapers are made of metal, not earthquake resistance. Skyscrapers are naturally resistant to earthquakes because their height gives them a much lower natural resonance frequency than most earthquake shaking, and they just kind of shimmy in place during a quake. The highest-risk structures are about 3 stories tall - that's where the building's resonance frequency matches that of a typical earthquake. If you look at the buildings which collapsed in the Loma Prieta quake and the Northridge quake, the vast majority were 3 stories. Both were relatively moderate quakes, so they give you an idea of which buildings are the first to collapse, unlike larger quakes which destroy a wider variety of buildings.
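
For a rough sense of why ~3 stories is the sweet spot, a common rule of thumb puts a framed building's fundamental period at very roughly 0.1 seconds per story. A crude Python sketch (the coefficient varies with construction type):

    def natural_period_s(stories, secs_per_story=0.1):
        """Crude fundamental-period estimate: about 0.1 s per story for typical framing."""
        return secs_per_story * stories

    # Strong ground shaking carries most of its energy at periods of very roughly 0.1-1 s,
    # so a ~3 story building (T ~ 0.3 s) is far more likely to resonate than a 50 story
    # tower (T ~ 5 s), which mostly just sways slowly through the shaking.
    for stories in (1, 3, 10, 50):
        print(f"{stories:>2} stories -> T ~ {natural_period_s(stories):.1f} s")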

In earthquake country like California, the two places I would never live in are masonry buildings, and 3 story tall buildings.

Comment Re:Except... (Score 3, Informative) 126

Before you young'uns turn this into a "those silly Americans" thread, Colossus was absolutely essential to breaking the Nazi Lorenz (Tunny) cipher and was classified during and after WWII. ENIAC was therefore regarded worldwide as the world's first general purpose computer. Everyone who went to school before 1996 was taught that ENIAC was the world's first GP computer.

Information about Colossus was first declassified in 1975, but it wasn't until 1996 (not coincidentally 50 years after WWII ended) that enough about it was declassified for the general public to realize it was in fact the first GP computer.

Comment Re:LMFTFY (Score 1) 652

"Renewable energy technologies, as they exist today, simply won't work."
So, what? We should stop pursuing them altogether?

You really should read the IEEE article. It does a really good job explaining why their reasoning and conclusions have nothing to do with your knee-jerk reaction.

In a nutshell, they calculated what the best-case reduction in carbon emissions would be due to widescale adoption of renewables based on their economic feasibility and expected technological improvements. Then they used that to figure out what atmospheric carbon levels would be under this best-case scenario. CO2 levels would still be increasing. And since we already blew past the danger point of 350 ppm around 1990, we'd still be at risk of adverse climate change due to warming.

Basically, the number that climate change hinges upon is an amount. CO2 emissions are a rate - the first derivative of the amount (on the emissions side). Simply adopting renewables isn't enough. We have to adopt them quickly enough for the rate change to affect the amount in the desired direction. Don't do it quickly enough and things get worse (much worse) before they get better.
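
A toy Python sketch of that stock-versus-flow distinction (the numbers are invented; only the shape of the result matters):

    # An emission *rate* that falls 2% a year still adds to the atmospheric *amount*
    # every year it stays above what natural sinks absorb.
    emission_rate = 10.0   # arbitrary units per year
    natural_uptake = 5.0   # arbitrary units per year removed by sinks
    amount = 100.0         # arbitrary starting stock

    for year in range(1, 60):
        emission_rate *= 0.98
        amount += emission_rate - natural_uptake
        if emission_rate <= natural_uptake:
            print(f"Rate finally drops below uptake in year {year}; the stock kept growing until then, peaking near {amount:.0f}")
            break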

They gave up on an all-renewables plan because the economics of renewables simply aren't improving quickly enough to flip us from increasing the amount of CO2 in the atmosphere to reducing it before we arrive at a disastrous scenario. It needs to be flipped faster, and the only way to do that without wreaking economic havoc on the energy sector is to rapidly adopt other carbon-neutral energy sources like nuclear.

Comment Re:If and only if (Score 1) 652

You assume that economies can't lose any money in transition.

This is a flawed idea in that it just refuses to consider political action in response.

This is a Tragedy of the Commons situation. If a country adopts policies which lose money in the transition, its economy shrinks and the economies of countries which don't lose money in the transition grow. Basically all that happens is the CO2-generating activity gets shifted from countries who decide it's worth sacrificing their economy to save the planet to countries who decide they'd rather grow their economy. e.g. The U.S. decides gasoline, oil, and coal should be taxed so gasoline is now $10/gal. Manufacturing and production then flees to (say) India, where they've decided not to tax fossil fuels. And the net result is that there's very little reduction in CO2 emissions.

The only way "political action" gets you out of this quandary is if you can get the vast majority of the world's population to follow your economic austerity measures. Not 50%, not 75%, probably closer to 90%-95%. Good luck with that. Basically for the economic austerity plan to work, everyone has to be on board. If a major player isn't or enough people secretly go against it, it fails.

Comment Aren't GRBs tightly focused? (Score 1) 307

I only had time to skim TFA, but it sounded like they applied a simple radial cutoff from a GRB source to a potential life-harboring planet. i.e. if there's a GRB of magnitude x and it's within y parsecs of a planet, assume life on that planet is wiped out.

If I remember correctly, GRBs are focused so most of their energy exits out the rotational poles. If you assume galaxies formed from a cloud of matter spiraling down and clumping together, then the stars and planets in a galaxy will tend to have the same angular momentum vector - that is, they rotate in pretty much the same plane as the galaxy. That means the rotational poles are oriented along the thin dimension of the galactic disk, and thus most of the energy from a GRB only impacts the star systems "above" and "below", not the huge bulk of systems "beside" the GRB star. That would drastically reduce the number of star systems "in the crosshairs" of a GRB. Obviously there are exceptions (Uranus' rotational axis is tilted 98 degrees), but without knowing the frequency of such exceptions it would seem impossible to accurately estimate how much life GRBs could potentially wipe out.
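
As a rough illustration of how much beaming matters: two opposite cones of half-angle theta cover a fraction 1 - cos(theta) of the full sky. A Python sketch with assumed opening angles (typical inferred GRB jet half-angles are on the order of a few degrees):

    import math

    def beamed_sky_fraction(half_angle_deg):
        """Fraction of the sky covered by two opposite cones of the given half-angle:
        2 * 2*pi*(1 - cos(theta)) / (4*pi) = 1 - cos(theta)."""
        return 1 - math.cos(math.radians(half_angle_deg))

    for theta in (2, 5, 10):
        print(f"{theta:>2} deg half-angle -> {beamed_sky_fraction(theta):.4f} of surrounding systems illuminated")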

Comment Re:Marketspeak (Score 1) 125

"Our normal advertising is so annoying and offensive (because all advertising is, these days) that we have to find other ways to force it on to people because if advertising doesn't actually work, we'll all lose our jobs had have to actually work for a living."

Actually, I see product placement (in movies) as the solution to the piracy problem. Newspapers and broadcast TV made money from the embedded ads, not subscriptions. If movies made money from product placement ads, you could give them away for free and still make money. A new Nielsen-like company would have to emerge to track and report how widely viewed a movie is. The advertising companies would then pay the movie producer based on those ratings. There's the larger problem of automated spoofing (a computer pretending someone has viewed the movie), but the search engine companies have managed to become successful despite it.

If you can get that mechanism to work as it has for broadcast TV, then it's in the movie maker's best interest for people to share copies of the movie amongst each other. The studios who insist on DRM and anti-piracy wither and die, while the studios who encourage viewers to stream, copy and share their movies prosper. And the piracy problem (as well as the ad infinitum copyright extension problem) collapses into nothingness. So I strongly encourage marketers to experiment with product placement and push the boundaries of how it works.

Comment Re:I don't think hydrogen makes sense (Score 1) 293

Besides the well-known problems associated with containing hydrogen, I'm skeptical that it makes sense to build out a whole new distribution system. We have an extensive network in place for distributing gasoline and smaller ones for distributing compressed natural gas (CNG) and liquid propane (LP), but hydrogen gas is very different from any of those three. We also have a network in place for distributing electricity.

If you run down the list of known chemical and electrical means to store energy, you find that the storage media with the best energy density (both by weight and by volume) which are easiest and safest to store, transport, and use are... diesel, gasoline, and kerosene. There's a very good reason those have become the fuels which dominate transportation. They're hardly the cheapest (coal is an order of magnitude cheaper per Joule, which is why EVs are able to operate more cheaply than ICE vehicles).

There's a bad tendency for people who dislike one aspect of a fuel (e.g. it pollutes) to downplay any advantages of that same fuel. Unfortunately that results in said people ignoring common sense qualities which affect the economic viability of alternative fuels. Such is the case with hydrogen. It doesn't pollute at the combustion stage, so supporters flock to it while ignoring everything else. It's damn hard to store (it needs to be compressed to around 700 atmospheres, and even then its volumetric energy density isn't competitive with gasoline). It's difficult to distribute (H2 is a tiny molecule, and will leak from hoses and fittings which are otherwise airtight and watertight).

And does it really not pollute? Well how do you make the hydrogen in the first place? If you use electrolysis to crack water, first you need to generate the electricity. That's usually from a coal plant operating at 45% efficiency at best. Then the electrolysis is about 65% efficient at best. Then you put the hydrogen through a fuel cell which can be 90% efficient in the lab, but peaks at about 70% efficient in commercial applications. Multiply these efficiencies and you get 20.5% efficient - worse than gasoline ICEs, which are currently about 25%-30% efficient. Since a smaller percentage of the energy in the fuel gets sent to wheels on the ground, it can potentially pollute more than gasoline. The story changes if we can convert most of electrical production to a clean source like nuclear, but the recent trend has been anti-nuclear and pro-renewables, which ironically results in the shortfall being taken up by more coal and gas.
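
Chaining those losses is just multiplication; a quick Python sketch using the figures above:

    import math

    # Stage efficiencies quoted above: coal plant -> electrolysis -> fuel cell.
    h2_chain = [0.45, 0.65, 0.70]
    print(f"Hydrogen via coal-fired electrolysis: {math.prod(h2_chain):.1%}")  # ~20.5%
    print("Gasoline ICE, for comparison:          25% - 30%")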

The most promising conversion method is actually to process natural gas to strip the hydrogen. But why do you want to do that when natural gas in itself is nearly as good a fuel? Is it so you can brag there's no carbon coming out the tailpipe? When all you've done is secreted away the carbon emissions in the early processing stages?

My money is on electric vehicles.

The same is true for EVs - the EPA MPGe figures assume 100% efficiency for the production of electricity and charging. Factor in 45% production efficiency and 70% charging efficiency (what an engineer recorded after years of charging his Prius converted to a plug-in hybrid) and they're only slightly more energy efficient than ICEs. I'm not against EVs per se - one of the places I used to work at used electric golf carts to move equipment around so I completely agree there are applications where they are superior. And as you point out there are several potential breakthrough technologies (tied to battery energy density and charging speeds/efficiency) which could make them completely viable. But don't ignore other advantages of hydrocarbon fuels simply because you don't like the idea of spewing carbon into the atmosphere.

My money is on alcohol. The U.S. is mired in inefficient corn ethanol because of the manipulation of the corn industry. But elsewhere, where ethanol is produced from crops which make sense (e.g. sugar cane), it's fairly successful. Ethanol has only 2/3rds the energy density of gasoline, but all of the same storage, transportation, and use advantages. Yes, it pollutes out the tailpipe, but it's a closed cycle so there's no net carbon added to the atmosphere. And unless there's some miraculous breakthrough in batteries, you're not going to be able to power a jet airliner with them. Whereas an ethanol-powered airliner is feasible. And if we can figure out a way to break down generic plant matter (cellulose is just long sugar molecules, and fermented sugar is what's used to create alcohol), then basically every plant in the world becomes a self-replicating solar panel collecting energy from the sun for us to convert to alcohol to use as fuel. No need to manufacture and maintain solar panels, or deal with messy and in some cases toxic battery technologies. Sunlight + CO2 from atmosphere -> plants -> sugar -> alcohol. That's the ultimate renewable IMHO.

Comment Re:So good that the proxy battle is over (Score 1) 69

Sounds like it. Apple and 5 publishers tried to raise the price of new "e-books from the $9.99 price that Amazon had made standard".

So why does Amazon get to set the price, and not Apple or the publishers?

This is so simple I'm amazed you got voted up. The fundamental market mechanic is that sellers try to raise the price and buyers try to lower it. Everything from someone haggling over an item at a flea market to a multi-billion dollar corporate buyout operates this way. Both buyer and seller are acting in their own interests. However, the counterbalance to sellers having the power to raise the price is that if they raise it too much, buyers can go to a different seller to get the same or similar item. That natural balance between sellers trying to get as high a price as they can without driving buyers to competitors is what sets the market price.

Apple and the publishers were sellers who tried to raise the price. If they'd arrived at that price individually, then there's no problem. But they colluded to set it at that price, which is absolutely illegal since it breaks this fundamental market mechanic.

Amazon was a seller who tried to lower the price. That's not a problem since it benefits the buyer. It's just like a store deciding to hold a sale. (There's an anti-trust argument that Amazon shouldn't be selling ebooks at a loss, using profits from other markets to undercut competitors in the ebook market. But that wasn't the focus of this particular case, and it's disingenuous to try to argue Apple and the publishers aren't guilty because of this. Both can be illegal. If Amazon's ebook pricing is driving competitors to bankruptcy, then that's a separate issue that needs to be decided in a separate case.)
