
Comment Re:Did costs account for administrative sabotage? (Score 1) 214

>> the level of NIMBY pushback per plant

Multiple cities originally signed up for the NuScale SMRs and bailed out when the cost jumped. Now the entire project appears to be a bust.

"The estimated costs of the project rose to $4.2 billion in 2018, then $6.1 billion in 2020, and finally $9.3 billion in 2023, after it was scaled down to 462 MW in 2021. In the end, the costs were clearly too high for UAMPS members to bear." https://www.utilitydive.com/ne...

Bear in mind, though, that the post you replied to never said or implied that there wouldn't be other reasons for objecting to SMRs, like them turning out to be massively more expensive than originally projected, and not enough electrical systems signing up to buy the power. The NIMBY lawsuits probably can't even start until they apply for an operating license that declares where they plan to put the plant, because until that stage, nobody even has standing to sue. UAMPS got canceled before they reached that stage. So it would be more accurate to say that they didn't have any NIMBY lawsuits *yet*.

Comment Re:There is no "higher production rate"... (Score 1) 214

Not only Brayton cycle turbines; anything would be better than the massive nuclear steam turbines that are needed to eke out even a marginally acceptable amount of power from the low-temperature steam. Also, MSRs are much more responsive than solid-fueled reactors, and naturally follow load.

It's too bad that embrittlement causes them to fail too quickly. If MSRs were actually feasible as a real-world power source, we would already be using them; as I understand it, the xenon poison is removed from the pile more readily. Unfortunately, they aren't feasible yet, so absent an improvement in metallurgy, I don't think that's going to happen any time soon.

Now using molten salt for solar, that's a possibility. It might even be possible to use boiling water from nuclear power to heat molten salt for thermal storage, so that you're not dealing with molten salt and neutron embrittlement on the same surfaces. Maybe. No idea, honestly.

Comment Re:There is no "higher production rate"... (Score 1) 214

Because when you produce more power than is consumed, you end up with unloaded turbines running faster and faster, which causes the line frequency to increase, at which point you have to start shedding generation capacity, both to bring the frequency back to where it should be and, at least in some cases, to prevent damage to the turbines themselves. In the interim before that generating capacity is shed, prices can fall below zero to encourage increased consumption to bring the grid frequency back under control.

If that were true then we'd have seen this before wind and solar subsidies were a thing.

Wind and solar subsidies drove adoption of wind and solar. Before that, load following on a minute-by-minute basis wasn't a thing to the same degree. But AFAIK, California does *not* pay solar or wind generation when it curtails their power production, and still has negative power prices. For most countries this is new, but the reason it is new is that there wasn't a regulatory framework for negative billing previously, not because it wouldn't have been useful previously.

It's trivial to shed generating capacity, no need to allow the spot price to go negative first. If there's too much solar power then just disconnect a few panels. If there's too much wind power then feather the blades and apply the brakes. That shedding of production is going to be fast enough that there's not any need to encourage demand with negative pricing.

That's exactly what they do when they curtail solar and wind. But bear in mind that it is different people making the decisions. The people deciding to sell power at a negative price aren't concerned so much with having to curtail, because they can't do so easily anyway. Rather, they're seeing offer prices that are low because of ample renewable power, and they need to get rid of the power that they're producing, because by definition, they have to find buyers. Renewables can get by with not finding buyers by curtailing, but won't curtail if there are buyers. But nuclear power has to lower their price until they can find a buyer, or else they're in trouble. So when the price gets cheap enough, it goes negative.

That has jack to do with subsidies of wind and solar power, except insofar as those subsidies resulted in more wind and solar power production.

So, in other words it has everything to do with wind and solar subsidies.

Very indirectly, perhaps. Wind and solar subsidies caused people to build more solar and wind production, which drives the cost of power down significantly at times. But even without subsidies, you would eventually have ended up with enough solar and wind on the grid to drive power prices negative. At best, subsidies just made that happen sooner.

Different kinds of turbines? What you're talking about is a turbine bypass valve.

No, I believe they are called Brayton cycle turbines. They operate on the same principles as natural gas turbines, the kinds of turbines currently being used for load following. Rather than boiling water to run a steam turbine, we could have an air-breathing turbine that heats the air with a heat exchanger rather than by burning fuel. Most any nuclear reactor can adjust power output very quickly, but the slow reaction of steam turbines is the weak link in this chain.

It has been done (though with CO2, not air) in small reactors, though it is a closed loop for the same reason that the coolant water is. Unfortunately, using CO2 as the coolant tends to result in using graphite as the moderator, which puts you at risk of another Chernobyl.

A more realistic approach would be to use excess electrical energy to compress air. However, it would require enormous hardware to sink that much energy, and I'm not sure how practical that would be.

Anyway, I'd argue that it isn't the steam that's the critical problem with nuclear reactions. It's the xenon poisoning. When you shut down a reactor quickly, it takes an enormous amount of energy to bring it back up before enough of those neutron-poison fuel byproducts break down, and if the fuel is towards the end of its life, you won't be able to do that for potentially multiple days. Solve that problem, and you'll be able to do much faster load following. Otherwise, you'll still need the reactor to produce roughly the average load for the time interval that your storage can cover, and if the average load changes downwards too quickly, you'll still be in trouble, unless I'm missing something.
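To make the xenon transient concrete, here is a rough sketch of the I-135 → Xe-135 decay chain after a scram. The half-lives are the standard published values; the initial inventories are arbitrary equilibrium-like numbers chosen purely for illustration, and neutron burn-off of xenon is ignored since the reactor is shut down:

```python
import math

# Sketch of the Xe-135 "poison peak" after shutdown: iodine-135 keeps
# decaying into xenon-135 faster than the xenon decays away, so the
# poison concentration *rises* for several hours before falling.
LAMBDA_I  = math.log(2) / 6.57   # I-135 decay constant (1/h, 6.57 h half-life)
LAMBDA_XE = math.log(2) / 9.14   # Xe-135 decay constant (1/h, 9.14 h half-life)

def xenon_after_shutdown(i0, xe0, hours, dt=0.01):
    """Euler-integrate the decay chain with no production or burn-off."""
    i, xe, t = i0, xe0, 0.0
    history = []
    while t <= hours:
        history.append((t, xe))
        # Simultaneous update: old i feeds the xenon term on this step.
        i, xe = i - LAMBDA_I * i * dt, xe + (LAMBDA_I * i - LAMBDA_XE * xe) * dt
        t += dt
    return history

hist = xenon_after_shutdown(i0=3.0, xe0=1.0, hours=48)
peak_t, peak_xe = max(hist, key=lambda p: p[1])
# The poison peaks roughly 8-9 hours after shutdown, well above its
# at-shutdown level -- hence the multi-hour (or multi-day) restart window.
```

The peak timing is what bites operators: the reactor is hardest to restart many hours after the scram, not immediately.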

Comment Re:No. (Score 1) 157

TL;DW induced demand means you'll just see increased traffic until the roads are clogged again. Cars are terrible people movers.

That seems unlikely to be the case. Roads clog because too many people need to get from point A to point B at the same time of day. There does come a point at which the roads are adequate for the population, and they won't clog except when there are accidents. And with adequate numbers of self-driving cars, the number of accidents should fall off a cliff, which means they won't clog at all.

The real problem with road systems, in my experience, is that pretty much everything they do to try to "improve" the roads actively makes traffic worse:

  • Timing traffic lights in ways that encourage people to take freeways means that people take freeways even if they're only going one exit and should have stayed off.
  • Metering lights that limit the flow of traffic also limit the ability of traffic to accelerate up to the speed of the road before merging, thus creating the very problem that they're supposed to solve.
  • All-day express lanes that cost money artificially reduce the flow of traffic during early parts of rush hour, causing the backup to be earlier and worse.
  • Zoning laws cause people to have to live far away from their workplaces, and encourage businesses to put all of their offices in one spot, creating a huge hot spot at nearby onramps and offramps.
  • Traffic lights that allow pedestrians to walk before allowing cars to go (ostensibly to reduce the rate of pedestrian injuries) mean that no cars can get through the intersection when the light first turns green.

And so on.

To massively improve things, just do the opposite. Time traffic lights to maximize traffic flow on primary arterial roads, and use long cycles to encourage people to turn right and make a mid-block U-turn rather than wait for a left turn out onto those roads. Move metering lights halfway up the ramp so that cars can accelerate to the speed of traffic, even if that causes backups on nearby roads. Limit toll lanes (by law) to rush hour periods. Remove barriers to putting housing near businesses, and create tax incentives for businesses to spread out. Give pedestrians a separate walk cycle at every light, nationwide, that lets them walk in any direction, so that they don't cause backups for cars and don't get hit by cars. And so on.

Comment Re: We know thw answer (Score 1) 27

So the answer is: not like this. The tech allows for increased density, but at extreme cost and unproven in production. Also, due to the scale, the electrical potential must be very small, so it is probably great for electronics at extremely low voltages, like CPUs, not for batteries. So your phone will consume less power and last longer with the same LiPo battery.

The lithium polymer batteries in a phone operate at about the same nominal voltage (3.7 V per cell, roughly 4.2 V at full charge) as the lithium-ion batteries in most cars (3.7 V nominal per 18650 or 2170 cell, or 3.2 V nominal per LiFePO4 cell). So I don't think per-cell voltage is necessarily a significant limitation if you have a hundred of them in series.
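As a sanity check on the series-string point, here's the arithmetic with typical nominal chemistry voltages (rounded textbook figures, not from any particular datasheet):

```python
# Nominal pack voltage scales linearly with cells in series, so per-cell
# chemistry voltage differences matter little for a long series string.
NOMINAL_V = {"lipo": 3.7, "li_ion_18650": 3.7, "lifepo4": 3.2}  # volts, approximate

def pack_voltage(chemistry, cells_in_series):
    return NOMINAL_V[chemistry] * cells_in_series

ev_pack  = pack_voltage("li_ion_18650", 100)   # ~370 V nominal
lfp_pack = pack_voltage("lifepo4", 100)        # ~320 V nominal
```

Even the "low-voltage" LiFePO4 chemistry lands in the same ballpark once you stack a hundred cells.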

Comment Re:Did costs account for administrative sabotage? (Score 1) 214

>> intentional interference by people and organizations opposed to the projects

Show evidence for that tired chestnut. Vogtle Unit 3 was built at a site where there are already 2 reactors and it was way late and over budget.

To be fair, the grandparent poster did say "in part". Mistakes during design or construction that have to be fixed later can cause cost overruns and operational delays in pretty much anything, and aren't specific to nuclear power.

Also note that SMRs would likely have lower risk of delays caused by design or construction mistakes, because they would presumably be making tens of thousands of them, all alike.

But SMRs would also likely have much higher risk of NIMBY-related delays per gigawatt of output, because each power plant would produce less output, and the level of NIMBY pushback per plant would probably not be reduced proportionally. :-)

Comment Re:There is no "higher production rate"... (Score 2) 214

On a sunny and windy day the cost of solar and wind power is below zero

How can wind and solar power get to be priced below zero?

Because when you produce more power than is consumed, you end up with unloaded turbines running faster and faster, which causes the line frequency to increase, at which point you have to start shedding generation capacity, both to bring the frequency back to where it should be and, at least in some cases, to prevent damage to the turbines themselves. In the interim before that generating capacity is shed, prices can fall below zero to encourage increased consumption to bring the grid frequency back under control.

And note that "increased consumption" also potentially includes convincing nearby grid segments to buy more power from your part of the grid instead of some other part, resulting in temporarily shutting generating capacity somewhere far removed from the region that has excess output.
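For the curious, the mechanism can be sketched with a toy swing-equation model. All of the numbers below (the inertia constant, the shedding threshold, the 5% oversupply) are illustrative placeholders, not real grid parameters:

```python
# Toy sketch of why oversupply raises grid frequency: unloaded rotating
# machines speed up until generation is shed. Illustrative numbers only.
F0 = 50.0   # nominal frequency (Hz)
H = 5.0     # system inertia constant (s) -- assumed
S = 1.0     # system base power (per-unit)

def step_frequency(f, p_gen, p_load, dt=0.1):
    """One Euler step of the swing equation df/dt = (P_gen - P_load) * F0 / (2*H*S)."""
    return f + (p_gen - p_load) * F0 / (2 * H * S) * dt

f = F0
p_gen, p_load = 1.05, 1.00          # 5% more generation than load
for _ in range(100):                # 10 simulated seconds
    if f > 50.2:                    # over-frequency threshold: shed generation
        p_gen = p_load
    f = step_frequency(f, p_gen, p_load)
# Frequency climbs above nominal, then levels off once generation is shed.
```

In the window before generation is shed (the `if` branch firing), negative prices are one lever for pulling load up to meet the surplus.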

Oh, right, because we have governments that give money to those producing wind and solar power even if nobody is willing to buy it. To get their government money they need to ship out electrons, which means splitting some of that government money with those who are willing to open up their windows and run the air conditioner.

Most countries don't do that, yet they still have negative energy prices at times. Yes, government intervention that pays operators during periods of curtailment can probably exacerbate negative energy prices, but that is not the root cause. The root cause is oversupply and insufficient demand.

I've seen nuclear power plants offer their electricity at below zero costs, meaning they also pay people to take their excess electricity.

That's because it takes hours to ramp down a nuclear power plant. If they don't pay people to take the excess electricity, see above.

Much of this is driven by government subsidies of wind and solar power, they have to sell at the same price as anyone else but, again, selling at a negative price is just paying people to take their electricity.

That has jack to do with subsidies of wind and solar power, except insofar as those subsidies resulted in more wind and solar power production.

Nuclear power plants produce power with big steam turbines that don't react well to large shifts in load in short periods of time, and to mitigate against this nuclear power plants will sometimes pay people to take power for short periods so they don't have to put their turbines through the added wear of powering down and then back up again as load returns. With different kinds of turbines, and/or some kind of energy storage, they won't have to pay people to take their electricity since they can instead adjust their power output to match the load.

Different kinds of turbines? What you're talking about is a turbine bypass valve. And most (if not all) nuclear plants have those. But that only gets you so much reduction in output. Below that, your only option is a complete shutdown, and that's expensive, because depending on the type of reactor and how close to the end of the fuel's life you are, it can take anywhere from hours to days before you can restart the reactor.

Besides, reducing the power output by bypassing turbines isn't reducing the amount of fuel the reactor consumes, so you're losing money either way. By partially bypassing the turbine, sure, you might be losing less money momentarily, but you'll also be making less money momentarily when consumption swings back the other way. In the grand scheme of things, it usually makes more financial sense to let a bunch of peaking natural gas plants shut down and know that the losses on the downswing will be balanced out on the upswing, or at least that seems to be the conclusion that power plant operators have reached, judging by their behavior. :-)

If there's no profit in energy storage then people will just turn to opening windows and running the air conditioner for a profit.

But there's huge profit in energy storage. Where did you get the idea that there's no profit in it?

Comment Re:There is no "higher production rate"... (Score 2) 214

And you think that the public would want a nuclear reactor in their neighborhood?

Sure. Why not? The only reason the public is scared of nuclear power is because of safety, and ironically, the only reason nuclear plants are unsafe is because the public is scared of nuclear power. Were it not for NIMBY behavior decades ago, we would have replaced all of the old nuclear plants with newer designs by now, and nuclear power would be massively safer.

SMRs, because of their small size, at least have the potential to be designed in such a way that the worst-case failure mode would involve someone picking up the failed reactor with a forklift, loading it on a truck, and burying it in cement out in the desert somewhere.

Even with existing nuclear plants, the fear of nuclear power is quite irrational. The number of people killed by nuclear power plants worldwide is about 101, and that includes freak accidents like people getting electrocuted, falling, divers getting sucked into water intakes, drowning, etc. The number killed by actual nuclear-specific risks is just 52, 50 of which were at Chernobyl.

In 1937, the New London School natural gas explosion in Texas killed about 300 people. The San Bruno pipeline explosion killed 8, put 51 more in the hospital, destroyed 38 homes, and damaged 70 more. In total, 4200 houses are destroyed by natural gas leaks in the U.S. every year. And assuming the odds of dying in a natural-gas-induced fire are similar to other house fires (I couldn't find stats on natural gas fire deaths specifically), that would mean natural gas fires probably kill about 30 people every year.
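For what it's worth, here's the arithmetic behind that ~30/year estimate. The per-fire fatality rate below is an assumed stand-in (roughly in line with overall U.S. home-fire statistics), not a measured gas-specific figure:

```python
# Back-of-envelope: homes destroyed by gas leaks x assumed fatality rate.
homes_destroyed_per_year = 4200     # figure cited above
fatality_rate_per_fire = 0.0075     # ASSUMPTION: ~0.75% of home fires are fatal
est_deaths = homes_destroyed_per_year * fatality_rate_per_fire  # ~31.5/year
```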

Yet nobody raises an eyebrow when you tell them that there is a natural gas pipeline running through their neighborhood, despite natural gas fires killing as many people as Chernobyl about every two years.

From a public policy perspective, competent leaders shouldn't bow to irrational fears about nuclear power. Part of a leader's job is to educate the public about why they're making the decisions that they make, and the only way you're going to cure the public's irrational fear of nuclear power is with a multi-decade solid safety record, coupled with education campaigns about how much safer nuclear power is than what's in their neighborhood right now.

I can't see any problems with these huge military and terrorist targets all over the place.

What foreign military is realistically going to be stupid enough to attack the U.S.?

Terrorists? Sure, but blowing up a school or a skyscraper would likely cause far more damage and death than blowing up a small nuclear pile, and would be way, way easier to pull off.

I don't know about the security of your nuclear plants, but in my country they are quite strict. And having that security around several thousand more is not feasible.

There's security around major substations already, so adding a small nuclear pile inside those locations might not increase the amount of security needed by nearly as much as you think. There are, of course, a lot of unknowns in that calculation.

Also, bear in mind that pretty much every inch of a traditional nuclear plant represents a potential attack vector. Closing the wrong manual valve or blowing up a bomb that takes out a bunch of coolant lines could potentially be just as catastrophic as blowing up the reactor vessel itself. With small reactor designs, because of the lower thermal output, this is potentially not the case. Having a smaller overall critical attack surface means less need for on-site security, and that is doubly true if you have adequate site design standards to slow down would-be attackers.

Also, what about the fuel and waste problems? Right now one of the major producers of nuclear fuel is our friendly Russia. Yes, the same one that is under a shit-ton of sanctions so they can't sell their oil to fund their imperialistic dreams. You want to turn to them for more fuel? And in other countries with uranium deposits, people are protesting mines. The uranium is low grade in many places, which means you have to mine a lot to get enough to enrich. Again, who wants their precious nature turned into a huge quarry?

Russia is actually a very minor producer of nuclear fuel. 75% of uranium comes from Kazakhstan, Canada, Australia, and Namibia. Russia (#6 in the world) produces barely a tenth as much as Kazakhstan alone, a third as much as Canada, less than half as much as Namibia, and barely over half as much as Australia (source).

Mind you, this might give Russia more of a reason to invade Kazakhstan, but that's kind of orthogonal.

With more reactors we have more waste. Not only spent fuel, but also the construction material when the reactor eventually has to be scrapped.

One of the reasons reactors have to be replaced is because they run so hot. Smaller reactors likely don't run as hot, so they may not deteriorate as quickly. This is not a given, of course, and some studies have suggested that some SMR designs may produce larger amounts of neutron-embrittled steel than traditional power plants. So that's a possible concern, but not necessarily a given, and whether it turns out to be a problem may depend in part on how long the power plants last and whether those independent analyses (often done without access to complete documentation) are, in fact, correct.

Also, SMRs can be up to half again more thermally efficient than traditional BWRs, so per unit of energy, you're using less fuel. And they're often designed to use more enriched fuel, which means a higher percentage of the spent fuel is actually depleted, both of which result in less fuel waste per unit of power.

As for the waste problem, my entire life I have heard "we have the solution". Funny enough, it is the same solution now as when I was a kid, decades ago. Just that no-one has implemented it.

We absolutely do have a solution for the waste. What we don't have is the political will to make it happen. Everybody wants the benefits of nuclear power, but nobody wants to pay the price. And until that changes...

Perhaps because no scientist is willing to sign off that the solution will work for 10,000 years, let alone 100,000 years. Just look at Egyptian hieroglyphics: they are not 10,000 years old and still we have trouble understanding them.

IMO, that isn't a particularly realistic concern. It's not like the tools that Egyptians used have become unknown to modern science. In 100,000 years, it is safe to assume that humans will still know what a Geiger counter is, even if they have a different name for it. It is also reasonably safe to assume that in much the same way as maps have been kept up to date since the advent of maps, the maps showing locations of dangerous storage sites will also be kept up to date even if civilizations fall and are replaced. There are simply too many people who have reasons to ensure that this knowledge doesn't get lost, unlike the story of some long-forgotten battle or King Tutankhamun's grocery list or whatever a particular set of hieroglyphs happened to be about.

Also, it is worth noting that in spite of millennia of language drift, the more modern, phonetic form of hieroglyphics is readily interpretable by way of Coptic language (which is still in use today, albeit in much the same way that Ecclesiastical Latin is still used).

And at this point, we've pretty much reached a point where a sizable percentage of the planet understands one of a single-digit number of languages, and that is becoming more and more true over time. If you take just the top 5 (English, Mandarin Chinese, Hindi, Spanish, and French), were it not for some amount of overlap, those five languages alone would cover over half the world population. English by itself would be understandable by one in five.

And the rate of language drift is decreasing over time. For example, an average person in Shakespeare's time would probably not have been able to readily understand Beowulf at all, because the differences between Old English (pre-1066) and early modern English are so huge. But today, someone can read Shakespeare with minimal difficulty despite it having been almost as long since his plays were written. From 1066 through the 1300s (Chaucer's era), English and French blended, and the resulting language change was radical. From there to about 1500 or so, the written English language continued to change pretty rapidly. Then it suddenly stopped.

What happened? The Gutenberg printing press. Mass-produced printed writing slowed language change, because regional variations that would otherwise lead to language drift were diminished, literacy increased, and access to the printed word increased. As a result, in the 500+ years since, in spite of pronunciation changes, the written vocabulary of English has barely changed at all (though things were often phrased differently back then).

Whether that will still be the case in 100,000 years is another question. We don't even have language from back that far to use as a metric. The oldest cave paintings are less than half that old. But given the rate at which basic English words have stayed constant, barring some sort of large-scale nuclear war, it seems rather likely that written English will still be understandable by a large number of people in 5,000 years or more, and that by then, surely someone will have updated the signs. And in the event of a large-scale nuclear war that wipes out most of human civilization, a small number of deeply buried nuclear waste sites will be a minor risk compared with other radiation risks, so that's probably not an interesting edge case to optimize for.

Comment Re:Math (Score 1) 214

While all of that is true, how was that ever going to overcome the fact that the amount of work which must be done was multiplied by using it?

Why do you assume that the amount of work required would increase faster than the power production increase?

I could easily imagine an SMR design in which the actual operation is 100% automated and remotely monitored, reducing the personnel costs to not much more than the number of people who work for nuclear plants right now, while producing vastly more energy.

I could also imagine the nuclear fuel being compartmentalized for safe transportation, and consolidating the cooling process, the short-term and long-term spent fuel storage, and the reprocessing into a small number of centralized locations, thus not increasing the workforce in that area significantly, either, again while producing vastly more energy.

I'm not saying it necessarily *will* work that way, but I certainly wouldn't assume that it won't.

Comment Re:There is no "higher production rate"... (Score 1) 214

Maybe because higher production rates have had the effect on pretty much every complex technological thing ever?

Not when the "higher production rate" is, even theoretically, from single digits to maybe dozens.

Why do you think the number would be that small? The U.S. has O(780,000) megawatts of coal and natural gas power capacity. In an ideal world, both of those numbers would be zero. SMRs can replace not only base load but also peaking load, so that entire number is potentially in play.

An SMR, by definition, produces 300 MW or less. So that means you would need 2,600 of them in the U.S. alone. Some SMRs produce more like 20 megawatts, so you'd need 39,000 at that size to produce the same amount of power.

Worldwide, we have about 4,570 GW of fossil fuel power production capacity, so you would need 15,230 to 228,500 units worldwide. At those levels, it wouldn't even be classified as low-volume manufacturing anymore (ignoring the possibility that multiple companies come up with viable designs simultaneously and split the market).
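Spelling the unit counts out:

```python
# The unit counts above, as straight division. Capacity figures are the
# order-of-magnitude numbers cited in the text.
US_FOSSIL_MW = 780_000        # U.S. coal + natural gas capacity
WORLD_FOSSIL_MW = 4_570_000   # ~4,570 GW worldwide

def units_needed(capacity_mw, smr_mw):
    return capacity_mw / smr_mw

us_large = units_needed(US_FOSSIL_MW, 300)     # 2,600 units at 300 MW each
us_small = units_needed(US_FOSSIL_MW, 20)      # 39,000 units at 20 MW each
world_lo = units_needed(WORLD_FOSSIL_MW, 300)  # ~15,230 units
world_hi = units_needed(WORLD_FOSSIL_MW, 20)   # 228,500 units
```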

Comment Re:A start, but (Score 1) 157

Not the original poster, but this NPR story mentions a 2009 study in Arizona that showed cyclists were the at-fault party for 44% of crashes. A study in Minnesota showed cyclists were at fault for 49% of cycling crashes. A 2004 study in D.C. showed cyclists were more likely to be at fault. A 1991 study in Hawaii showed cyclists were at fault just 16.5% of the time.

So different studies come to drastically different conclusions, but most of the studies seem to point towards cyclists and drivers being almost equally likely to be at fault. This is rather shocking, because one would normally expect cyclists to realize that cycling in a city is dangerous and that they would therefore be constantly vigilant. But in practice, it sounds like at least on average, they're only slightly more careful than drivers.

So 40% sounds plausible.

Comment Re:For anyone who cares about how it actually work (Score 1) 176

"Refrigeration in space" is otherwise known as "painting your spacecraft white" :)

Depends on how significant your internal heat sources are (e.g. computers waiting to fire up the engines). And if you're transporting people on the same flight as the frozen food, remember that people are a rather significant internal heat source. :-)

Comment Re:Many thoughts, most unfavorable (Score 1) 157

"...I could cause havoc on a busy highway or bridge by discreetly placing one in the traffic lanes and causing false alarms and maybe even uncommanded braking events at high speed. Not good."
And you would be a sociopath abusing safety equipment to cause that havoc. Wonder if there could be laws against that?

It's also illegal to crack into someone's system even if the root password is "root". It's still a terrible idea to set your root password to "root" and enable remote root logins. If you design a system in a way that is fundamentally insecure, you should always assume that someone will come along and abuse it.

Just because something is illegal doesn't mean it won't be done, nor does it mean that the police will have the technical capability to figure out who did it, particularly if its operation is delayed beyond the retention period of traffic camera footage.

And it need not even be malicious. A cyclist riding over a bridge that crosses above a freeway could cause panic braking at high speed. Are you going to make it illegal for cyclists to ride over freeway bridges?

Comment Re:A hack and accident waiting to happen (Score 1) 157

And that's just the *immediate* thought that comes to mind seeing this. It's not a matter of "saving more than it hurts", it's a matter of creating something that can very easily get abused for cheap. And bad actors are a reality.

Agreed. All of these ideas that folks come up with where cars are supposed to trust external signals that they can't verify are fundamentally flawed by design. Cyclists should not be blindly riding across roads without looking to see if a car is coming. That's what stop signs and traffic lights are for. And although it would be nice to reduce the odds of cyclists dying when they make reckless mistakes, it isn't worth having the traffic grid constantly being brought to a halt by threat actors.

Cars should trust what they see and can detect by vision or other similar mechanisms, period. There is very little point to having any additional information, because no external data can be considered trustworthy.

For things *far* beyond what they can see, e.g. navigation decisions, trusting external sources (map data, traffic data) is unavoidable, of course, but bad map data or traffic data is unlikely to be a safety issue as long as the car doesn't trust it for safety-critical decisions (deciding when, precisely, to steer, etc.).

But for things that they should be able to see, trusting external data over what their internal sensors can detect creates the potential for causing the cars to make safety-critical decisions that are wrong, which can actually diminish safety.

The example of a cyclist 300 feet ahead, just over the next rise, is rather nonsensical. If the cyclist is moving in the correct direction, there's never a problem, and even if the cyclist had a wreck and is stopped over the next rise, you should see the cyclist in time to stop. If not, then unless the cyclist is riding the wrong direction, chances are either the driver is going too fast or the road's speed limit is high enough that cyclists should not be allowed on the road in the first place. I doubt this happens very many times in a year without one of those three contributing factors. Cars pulling out of driveways with limited sidewalk visibility would be the exception, but I think those collisions are also more likely to cause injuries than fatalities.
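A quick stopping-distance sanity check supports the "going too fast" caveat. The deceleration and reaction time below are generic textbook assumptions, not measured values:

```python
# Total stopping distance = reaction distance + braking distance (v^2 / 2a).
G = 9.81           # m/s^2
MU = 0.7           # assumed braking deceleration, as a fraction of g
REACTION_S = 1.5   # assumed perception-reaction time (s)
FT_PER_M = 3.281

def stopping_distance_ft(speed_mph):
    v = speed_mph * 0.447                  # convert mph to m/s
    braking = v * v / (2 * MU * G)         # braking distance (m)
    return (REACTION_S * v + braking) * FT_PER_M

# At ~45 mph the total comes in under 300 ft; at 70 mph it does not.
```

So on a road where 300 feet of sight distance is all you get, stopping in time is plausible at typical surface-street speeds but not at freeway speeds.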

Assuming camera-based and/or LIDAR-based detection of bicycles that are actually visible is working properly, that vehicles properly creep out to see past obstructions, and that cars are driving the speed limit, the only way this can realistically prevent a large number of accidents would be in situations where the cyclist does something incredibly stupid, like crossing a road without stopping at a traffic light or stop sign to verify that it is safe to cross. That isn't the cause of very many cyclist accidents, because most cyclists aren't complete idiots.

But the potential for this technology to cause serious accidents is much, much larger. Think about someone putting a transmitter on the underside of a 70 MPH freeway bridge, for example. The map system says that two roads cross, so it is at least plausibly a spot where a cyclist could be in the path of traffic. So now you have cars panic-braking at 70 MPH for a cyclist that doesn't exist, causing wreck after wreck after wreck in the same spot day after day.

Not to mention that if they aren't relying on vision for detection, cars aren't likely to know that the cyclist is too far above them vertically to plausibly ever cross their path, so every time a cyclist rides across a freeway bridge, they're going to cause a traffic jam. I mean, they haven't even solved the problem of radar return from bridges yet, and that's a much easier problem to solve (because of increased data) than the "is the cyclist on a bridge" problem.

And if you turn that feature off on the freeway to avoid those problems, you're likely to also cause it to not work at freeway exit ramps, which is one of the few spots where otherwise cautious cyclists might plausibly be saved by the system.

No, this idea seems like pretty much all downside to me, with no real upside, practically speaking. For every one life it saves, it could easily kill a hundred.
