Comment "Your NOx may vary" - one more thing. (Score 1) 383

So with ... selection pressure ... Engineers, with the best intentions, would tend to design engines that pollute a bit more when off the test.

By "a bit", after 30+ years of selection pressure I wouldn't be surprised by as much as 25 to 50% extra NOx on "off the test" readings from just optimizing with only the test and field mileage for feedback.

Unless there's something special about diesels that makes them inherently troublesome on some non-test cycles, though, 2x or more seems too high to be honest fallout, and should prompt a detailed search for explicit cheat code.

Comment I was there. "Your NOx may vary" (Score 1) 383

Much of my early career was consulting to the auto industry (in particular, Ford and GM) during the early periods of electronic engine controls and their interaction with the emissions test regime in question. I did some work with engine controls, but most of it was emissions testing automation and data reduction.

We all (executives, engine designers, test equipment designers, and regulators) knew:
  - The test conditions were arbitrary but standard.
  - Detecting them and switching modes would be trivial to implement and would look good at first, but it would also be illegal, immoral, and financially disastrous for the company when the cheats were eventually detected.
  - Because engineering was done to meet the regulations - which meant scoring well on the tests - even with honest efforts and no cheating, the vehicles would eventually evolve to do well on the tests but probably not so well on other operational cycles. (You see this with "your mileage may vary".)
  - Tests and design processes were VERY expensive and the companies highly competitive. They couldn't afford to engineer BOTH to the regulations AND to be good all the time out of niceness: the "nice guys" would "finish last" and be driven out of the market, and you'd STILL only get cars that just met the regulations. A level playing field was needed.
  - So it was the responsibility of the regulators to write test specifications that modelled the driving cycle well enough that engines tuned to them would also perform adequately in general, despite the "design to the test" evolutionary pressure - and the responsibility of the engineers to meet the law on the tests that were imposed, not to do so by explicit detect-the-test cheats.

The executives and early-stage engineering departments were aware of the temptation for engineers to write cheats, and (at least at one company I worked for) put in some draconian controls on software changes to the engine control to prevent them. (The official explanation given to the inconvenienced engineers was "ensuring regulatory compliance".)

I was told that the regulators came up with the standard test by
  - instrumenting a car (with a bicycle wheel speed recorder on the bumper and some event-recording switches),
  - parking behind various cars (in Denver?) and, when their owners started up, surreptitiously tailing them to their destination and recording their warm-up idle time, speeds, acceleration, braking, standing waiting for lights, etc. (but not the upslope/downslope and wind).
  - picking one of these trips, which contained both city and highway driving and looked pretty typical, and adding a "cold soak" to the start (engine is not run for several hours) to standardize the starting conditions and model an initial start, and a guesstimate of a final idling period before shutdown. (To meet the cold-soak requirement, cars were pushed into the test cell by hand or things like electric pallet jacks.)

The test measures exhaust airflow volume and concentration of CO2, CO, and unburned hydrocarbons. So gasoline consumption can be easily computed by "carbon balance" - you know how much carbon is in a gallon, you measure all of it as it comes out, none is lost and only a tiny bit of burned lube oil adds any. So you get mileage for free by postprocessing the data. The regulators got the bright idea of putting this computed mileage on the stickers for customers to make objective comparisons when shopping.
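
The arithmetic is simple enough to sketch. Here's a minimal illustration of the carbon-balance computation - the constants are round illustrative numbers, not the regulatory values:

```python
# Carbon-balance fuel economy: every carbon atom that goes in as fuel
# comes out the tailpipe as CO2, CO, or unburned hydrocarbons (HC).
# Sum the carbon mass in the exhaust, divide by the carbon content of a
# gallon of gasoline, and you have gallons burned - and therefore MPG.

CARBON_PER_GALLON_G = 2421.0  # grams of carbon per gallon of gasoline (approx.)

def mpg_from_exhaust(miles, co2_g, co_g, hc_g):
    """Fuel economy from total exhaust masses (grams) over a test cycle."""
    carbon_g = (co2_g * 12.0 / 44.0    # mass fraction of carbon in CO2
                + co_g * 12.0 / 28.0   # mass fraction of carbon in CO
                + hc_g * 0.866)        # approx. carbon fraction of gasoline-range HC
    gallons = carbon_g / CARBON_PER_GALLON_G
    return miles / gallons

# Hypothetical cycle: 11 miles driven, exhaust masses integrated from
# airflow volume x concentration over the whole test.
print(round(mpg_from_exhaust(11.0, 3800.0, 60.0, 5.0), 1))  # ~25.0
```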

It's easy to measure the average mileage of cars in the field: Just divide the odometer mileage by the gallons pumped to refill the tank, and average over several fillups to smooth out variation in how the tank was topped off (a sketch of the arithmetic follows this list). It quickly became apparent that:
  - Mileage in normal service varied substantially.
  - The trip defined as the standard one got substantially better mileage than was typical.
Thus was born the caveat "your mileage may vary" and a regulation change to partition the sticker mileage into separate pieces for the stop-and-go city portion and mostly-cruising highway portion. For gasoline engines, using those two, and a small nudge downward for the standard trip's deviation from the typical, gives customers a good guide.
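
A tiny sketch of the fillup arithmetic mentioned above (all numbers hypothetical):

```python
# Field mileage from fillup records: miles driven in the window divided by
# gallons pumped after the first (full-tank) fill. Averaging over several
# fillups smooths out variation in how full each top-off was.

fillups = [(10000.0, 0.0),    # starting full tank (gallons ignored)
           (10310.0, 12.4),
           (10655.0, 13.9),
           (10980.0, 13.1)]   # (odometer miles, gallons pumped)

miles = fillups[-1][0] - fillups[0][0]
gallons = sum(g for _, g in fillups[1:])
print(round(miles / gallons, 1))  # ~24.9 mpg
```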

Also, because it's easy to measure, mileage numbers from the field provided feedback that limited the tendency of "design to the test" to evolve gas consumption into complete optimization for the test. Any model that got horrible mileage in the field would soon get bad reviews, and the engineers would be on its case (if this hadn't already happened before it was released).

But emissions are NOT easily measured in the field. About the only field tests are the periodic checks some states require - and those tend to use a very abbreviated cycle. They're just intended to check that the stock emission-control equipment hasn't broken or been disconnected.

So with field feedback on mileage but not on emissions, the secondary selection pressure (after "do well on the standard test") is for the engine to get good mileage on other cycles, without regard to whether this affects emissions. Engineers, with the best intentions, would tend to design engines that pollute a bit more when off the test.

= = = =

I agree with most of what you say. But this is incomplete:

The higher temperatures and pressures (of diesels) help with CO and unburned hydrocarbons (they favor more complete combustion), but the scale of the added NOx and PM problems is much greater.

Which is true upstream of the catalytic converter. But the whole POINT of a (three-way) cat is to move oxygen from NOx to CO and unburned hydrocarbons. Get the fuel-air mixture right and the NOx, CO, and HC consume each other almost exactly, with essentially no oxygen left over. Getting this right with early engines - using fluidic and mechanical computation - was a real pain. With software and exhaust oxygen sensors it's a much easier job.
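
To show how much easier, here's a minimal sketch of the closed-loop idea - a bang-bang controller dithering around stoichiometry, in the style of early production systems. Everything here is invented for illustration, not any real ECU's code:

```python
# Toy closed-loop fuel trim: a narrow-band exhaust O2 sensor reads "rich"
# or "lean" relative to stoichiometry, and the controller steps the fuel
# delivery the other way, dithering around the stoichiometric point -
# exactly where a three-way catalyst needs the mixture held.

STOICH_AFR = 14.7   # stoichiometric air-fuel ratio for gasoline
TRIM_STEP = 0.002   # fraction of fuel added/removed per control tick

def update_trim(trim, o2_volts):
    """One tick: a narrow-band sensor reads ~0.9 V rich, ~0.1 V lean."""
    return trim - TRIM_STEP if o2_volts > 0.45 else trim + TRIM_STEP

def fuel_grams(air_grams, trim):
    """Base fueling from measured airflow, corrected by the learned trim."""
    return air_grams / STOICH_AFR * (1.0 + trim)
```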

As for particulate matter, the original emission-control regulations were designed around what was current when they were imposed: gasoline engines running the Otto cycle, which doesn't emit much PM unless horribly detuned, worn into burning lots of lube oil, or fed the wrong fuel (like accidentally topping off the tank from the green diesel-fuel pump hose). Diesels tend to put out a lot of PM, and (as big lumps of mostly carbon and unburned hydrocarbons) a surface catalyst can't do much with it. So PM pretty much has to be dealt with separately.

Comment Re:Not surprising and can you blame them? (Score 1) 383

This is also why various technology contests (such as the X Prize) rarely if ever produce any applicable technology. (And when they do, it almost always requires a great deal of R&D to move it into the real world.) The competitors seek to win the prize with a design optimized to win the prize.

The X Prize was designed to enable CATS (Cheap Access To Space) - but the winning design doesn't scale well from suborbital to orbital. Hell, it barely scales from a four-place suborbital to an eight-place suborbital.

Comment Re:Too little, too late (Score 4, Interesting) 256

But who is playing shenanigans, Samsung or Apple?
Did Apple spec out the correct requirements to Samsung, with Samsung making a cheap knockoff after sending a batch that seemed to meet initial QA, in a very German style? Or did Apple know about and agree to getting different-quality products?

There's a third possibility that should not be discounted out of hand - Samsung meets the specification, while TSMC exceeds it. Without access to internal information, it's hard to tell what's going on behind the curtain and all too easy to leap on the 'obvious' conspiracy.

Of course, the various mega corps routinely indulge in behavior that makes conspiracy theories not all that far fetched...

Comment Re:Good for them (Score 1) 191

Of course, there's always the question - is it because they completed the program, or because they were selected for the program? Not all prisoners are eligible, and not all who are eligible gain a berth. It could just as easily be that the prisoners who gained a berth would be within the 60% who don't come back to prison within three years regardless of their participation, due to personal drive and existing educational accomplishments (which are large factors in whether or not they qualify in the first place).

Don't get me wrong, education is always good - but with no control group, claiming a priori that education is the sole cause of the drop in recidivism seems a bit of a stretch.
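
A toy simulation makes the point: give every prisoner a latent "drive" score that raises both the odds of winning a berth and the odds of staying out, let the program itself do nothing, and the participants still look better. All numbers are invented:

```python
# Toy model of the selection effect: a latent "drive" score makes a
# prisoner both more likely to win a berth in the program AND less likely
# to reoffend. The program itself does NOTHING here, yet its participants
# still show much lower recidivism.
import random

random.seed(1)
n = [0, 0]           # [non-participants, participants]
reoffended = [0, 0]

for _ in range(100_000):
    drive = random.random()                          # latent trait, 0..1
    in_program = random.random() < drive             # high drive -> berth
    reoffends = random.random() < 0.6 * (1 - drive)  # high drive -> stays out
    n[in_program] += 1
    reoffended[in_program] += reoffends

for group, label in ((1, "program"), (0, "no program")):
    print(f"{label}: {reoffended[group] / n[group]:.1%} reoffend")
# program ~20%, no program ~40% - with zero actual treatment effect
```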

Comment Perhaps he's making flakes of Rydberg matter? (Score 1) 186

The secret sauce seems to be ultra-dense deuterium, "D(0)", whatever that means. Looking through the author's other papers, it looks like he's claiming to have made metallic hydrogen, which would be a Nobel Prize right there.

If he can demonstrate this, then fine ... he's a super genius.

Perhaps he's making flakes of Rydberg matter, floating in a near-vacuum.

(If I understand it correctly) this is matter where the individual atoms have been NEARLY ionized, by pumping an electron up to ALMOST, but not quite, the energy that would free it from the atom and leave an ion. (You can do this with a laser tuned to the energy difference between the state the electron WAS originally in - e.g. the ground state - and the state you want it in.) If you get the electron into one of the high, flat, circular orbitals, it looks almost like a classic Bohr atom (earth/moon style orbit), and the state lasts for several hours.
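
For scale, here's a back-of-envelope sketch of that tuning arithmetic using the Bohr levels E_n = -13.6 eV / n^2 (ignoring fine structure and the deuterium mass correction):

```python
# Photon energy to pump hydrogen from level n1 to a high Rydberg level n2,
# from the Bohr formula E_n = -13.6 eV / n**2. As n2 grows, the photon
# energy approaches - but never reaches - the 13.6 eV ionization energy:
# "nearly ionized, but not quite".

RYDBERG_EV = 13.6057

def photon_ev(n1, n2):
    return RYDBERG_EV * (1.0 / n1**2 - 1.0 / n2**2)

for n2 in (2, 10, 50, 100):
    print(f"1 -> {n2:3d}: {photon_ev(1, n2):.4f} eV (ionization: {RYDBERG_EV} eV)")
```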

Atoms in such a state associate into dense hexagonal clusters. (19-atom clusters are easy and heavily studied, and clusters of up to 91 atoms are reported.) The electrons bond the atoms by delocalizing, forming a metallic, hexagonal grid, similar to a tiny flake of graphite sheet. You can't make them very big. (There's some issue with the speed of light screwing up the bonding stability when the flakes get too big.) But you can make a lot of them, creating a "dusty plasma".
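
Those cluster sizes fit the centered hexagonal "magic numbers" - complete hexagonal shells around a central atom. (My observation, not the paper's.)

```python
# Centered hexagonal "magic numbers": an atom surrounded by k complete
# hexagonal shells contains 3*k*(k+1) + 1 atoms in the flake. The
# easily-made 19-atom clusters and the reported 91-atom clusters both
# land exactly on this sequence.

def centered_hex(k):
    return 3 * k * (k + 1) + 1

print([centered_hex(k) for k in range(6)])  # [1, 7, 19, 37, 61, 91]
```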

So hitting gas with the right laser pulse could end up with lots of flakes of this stuff, with deuterons held in tight (dense!) and well-defined flat hexagonal arrays by a chicken-wire of delocalized electrons, with zero (or tiny) net charge, floating around in a near vacuum and suitable for all sorts of manipulation. (Like slamming them into each other, for instance.)

Now how this interacts with substituting muons for electrons (something analogous to an impurity in a semiconductor crystal?), missing or extra electrons (ditto?), occasional oddball nuclei (again ditto?), or perhaps how it might generate muons when tickled by appropriate laser pulses, all look like good open questions for active research.

The point is that it's pretty easy to get these long-lived, self-organized, high-density, stable, regular-geometry crystal flakes of graphite-like deuterium floating in a near vacuum, where you can poke at them without any pesky condensed matter to get in the way.

Easy as in maybe you can do it on a desktop with diode lasers, producing "maker" level nuclear physics experiments. B-)

Comment Re:Not the total cost! (Score 1) 415

Speaking of renewables in the U.S. why is hydro never mentioned when discussing renewables?!?

Because it makes up a rather limited percentage of generation capacity in the US - and that percentage isn't going to go up significantly. (Weaseling because I'm still on my first cup of coffee and there may be some I'm unaware of.) We aren't building power generation dams in any significant quantity, and that's extraordinarily unlikely to change.

Comment Re:Benefit to end users? (Score 1) 686

Which would be a fine argument if there were not literally millions of people around the world who do amazing things daily without resorting to appalling behaviour.

Also, the "we're better off for his existence" argument only works if you can show all the shitty alternate timelines where he doesn't manage the kernel and it sucks. Along those lines I could claim we'll never know how good the kernel could be since a giant douchebag is sucking all the air out of the room.

Comment E-fields foul chromosome segregation. (Score 2) 34

Some recently approved cancer treatments (particularly for inoperable brain cancer) are based on a recent discovery:
  - The electric fields from changing magnetic fields interfere with chromosome segregation during mitosis.
  - The affected cells generally do one of two things:
      - Complete the division with missorted chromosomes - then both offspring cells commit suicide.
      - Give up on cell division - then the (now tetraploid) cell commits suicide.
Cells not undergoing mitosis keep perking along just fine. (Perhaps this is why long-range electric fields aren't present in cells except during division: electrical effects normally occur across membranes or at very close range between molecules. Because the chromosome-segregation mechanism uses these fields, any newly evolving "feature" that involved long-range E-fields would kill the cell partway to evolving it.)

This is great for brain cancer treatment: essentially nothing is dividing except the cancer cells. Maybe you lose some neural stem cells and have slightly lower brain plasticity over the coming decades - but that's a heck of a lot better than dying in agony and gradually increasing dementia over 6 months to a year.

But start poking at brains with this in the long term - especially brains of people under 21 or so, when the brains are still doing substantial interconnection and cell division - and you might start seeing some nasty damage.

Comment Re:Missing piece of a puzzle? (Score 3, Interesting) 186

Looked it up:

They replace an electron in a hydrogen atom/molecule - but are heavy so the resulting muonic atom/molecule is much smaller, allowing the nuclei to come within fusion distance.

H2 (D-D, D-T) molecule.

The fusion kicks the muon off and it repeats the process. [...] The problem has always been that it takes a lot of energy to make a muon and it has a tiny lifetime - long enough to do maybe four fusions before it decays.

Actually the muon lasts a couple of microseconds, which is a LONG time at molecular and nuclear speeds. But in addition to decaying, it has maybe a 1/2% to 1% chance per fusion of sticking to the helium and getting lost until it times out. So it only catalyzes maybe 100 to 200 reactions. You need somewhat more than 300 to pay back the energy used to create it in an accelerator (times a factor of about 2.5 to make up for accelerator inefficiency).
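
Putting those numbers together as a rough breakeven estimate (order-of-magnitude only; all figures are the ones quoted above, not from a paper):

```python
# Rough muon-catalyzed fusion bookkeeping.

DT_FUSION_MEV = 17.6                 # energy released per D-T fusion

sticking = 0.007                     # ~0.5%..1% chance/fusion of sticking to He
fusions_per_muon = 1.0 / sticking    # ~100..200 fusions before the muon is lost
yield_mev = fusions_per_muon * DT_FUSION_MEV

muon_cost_mev = 300 * DT_FUSION_MEV  # ">300 fusions to pay back" creation energy
wall_plug_mev = muon_cost_mev * 2.5  # accelerator efficiency factor

print(f"~{fusions_per_muon:.0f} fusions/muon -> ~{yield_mev:.0f} MeV out")
print(f"~{wall_plug_mev:.0f} MeV in -> gain {yield_mev / wall_plug_mev:.2f}")
# gain ~0.19: a factor of ~5 short of breakeven, hence the interest in
# muons that come "for free" from the reaction itself.
```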

Comment Missing piece of a puzzle? (Score 4, Informative) 186

I followed the link to the original paper. It's a bit sketchy. But on a skim I don't get quite as much of a "what did he do?" reaction as the author of that piece did.

What it looks to me like he did is:
  - Made some "ultra dense" deuterium - apparently by the same method as F&P: using electricity to force it into palladium by electrolysis, with the solid palladium holding it at high density and in particular orientations.
  - Hit it with a laser.
  - Got muons out - with energies above those that could be explained by the laser excitation, and apparently with total energy substantially more than was spent on the laser and the electrolysis drive power.

Now if this is real, and can be repeated and engineered:

1) High-energy charged particles, at well-defined energies, emerging from a well-defined location, and with adequate lifetimes to last through a few microseconds of the process, can easily have most of their kinetic energy collected as electricity by pretty trivial equipment.

2) Muons catalyze fusion - at room temperature (or even liquid hydrogen temperature). They replace an electron in a hydrogen atom/molecule - but are heavy so the resulting muonic atom/molecule is much smaller, allowing the nuclei to come within fusion distance. The fusion kicks the muon off and it repeats the process. This has been known for decades: Just point a muon beam at some hydrogen and watch the fun.
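
The size effect is just the Bohr radius scaling with the orbiting particle's reduced mass; a quick sketch with textbook constants:

```python
# Why the muonic molecule is so small: orbital size scales as 1/(reduced
# mass of the orbiting particle), and a muon is ~207x an electron's mass.
# Simple two-body estimate only.

M_E, M_MU, M_D = 1.0, 206.77, 3670.48  # masses in electron-mass units
A0_PM = 52.918                         # hydrogen Bohr radius, picometres

def reduced(m_orbiter, m_nucleus):
    return m_orbiter * m_nucleus / (m_orbiter + m_nucleus)

shrink = reduced(M_MU, M_D) / reduced(M_E, M_D)
print(f"muonic deuterium: ~{shrink:.0f}x smaller, radius ~{A0_PM / shrink:.3f} pm")
# ~196x smaller - close enough for the nuclei in a muonic molecule to
# tunnel together and fuse.
```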

The problem has always been that it takes a lot of energy to make a muon and it has a tiny lifetime - long enough to do maybe four fusions before it decays. So muon-catalyzed fusion (using accelerators to make muons) would never approach breakeven. If this guy has figured out how to make muons in a simple cell, with the energy to make the muon coming from a fusion reaction, it could change the game big-time.

Also: if muons manufactured by such a process were a step in the very sporadic, looked-like-fusion effects seen by the people trying to do cold fusion, it could explain why the effects were sporadic - and understanding the process might lead to being able to produce them reliably and consistently.

So maybe this is just another will-o'-the-wisp. Or maybe it's something that could lead to substantial, repeatable, interesting physics. Or maybe it could lead to real energy-producing reactors on a less-than-tokamak scale.

And just maybe it's a missing piece of a real room-temperature fusion process that led to the cold-fusion flap and might become practical. Wouldn't that be nice?

Regardless, this just got published within the last month or so. If it's real it should be pretty easy to reproduce, and from there not too hard to figure out. So let's see what happens. Maybe nothing, maybe little, just the off chance of another roller-coaster ride. B-)

"The hands that help are better far than the lips that pray." -- Robert G. Ingersoll