
Comment: Re:LENR is not fusion (Score 1) 162

by radtea (#48677161) Attached to: Bill Gates Sponsoring Palladium-Based LENR Technology

the best theory so far is that of Widom-Larsen

Widom-Larsen requires an implausible mix of scales. The effective mass of heavy electrons in the solid state is a collective phenomenon: it arises over distances and time-scales that are large relative to nuclear distances and time-scales, and it affects the dynamics of the electron's interaction with the lattice on those scales. To impute to these large-scale effects efficacy at the nuclear scale is very unlikely to be correct.

Consider a car analogy: a car moving along a freeway in dense traffic interacts with all the cars around it. If the driver accelerates, they will pull up close to the car ahead, and that driver may speed up a bit too, sending a diminishing wave of acceleration through the traffic. Compared to the same car alone on the road, the car in dense traffic appears to have a much higher effective mass. Alone, you hit the gas and speed up a lot. In traffic, you hit the gas and speed up a little. That's what the electron in the surface looks like: a car in traffic.

But on the scale of car-car interactions, the "bare" mass of the car is what matters. If two cars collide you get an energy of 0.5*m*v^2, not 0.5*Meff*v^2.

Yeah, there are multi-car pileups that muddy the analogy, but they add up to nothing like the effective mass of the whole traffic block, so there. And the difference in scales between "cars and traffic" is tiny compared to the difference in scales between "nuclei and the lattice", so the effect that analogy hopefully makes obvious will be that much larger in the latter case.
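The point of the analogy, that bare mass governs collisions while effective mass only governs the large-scale dynamics, is just arithmetic. A toy sketch in Python, with invented numbers:

```python
# Toy numbers for the car analogy: in a collision, only the bare mass matters.
# All values are illustrative, not physical measurements.

def kinetic_energy(mass_kg, speed_m_s):
    """Classical kinetic energy, 0.5 * m * v^2."""
    return 0.5 * mass_kg * speed_m_s ** 2

bare_mass = 1500.0        # the car itself, kg
effective_mass = 15000.0  # apparent inertia in dense traffic (collective effect)
speed = 20.0              # m/s

collision_energy = kinetic_energy(bare_mass, speed)      # what a crash delivers
apparent_energy = kinetic_energy(effective_mass, speed)  # what traffic dynamics "see"

print(collision_energy)  # 300000.0 J
print(apparent_energy)   # 3000000.0 J
```

The factor-of-ten gap between the two numbers is the analogy's point: using the effective mass where the bare mass belongs overstates the available energy by exactly the ratio of the two masses.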

Comment: Re:I see now (Score 4, Insightful) 58

by radtea (#48650339) Attached to: 26 Foot Long Boat 3D Printed In 100,000 Different Pieces

Ah, so he's an idiot.

Pretty much. He seems unaware of the huge selection bias--and logical contradiction--implied by the claim about "the inability that humans have exhibited in rectifying uncontrollable catastrophic challenges".

We've dealt with a huge number of challenges successfully, but a pretentious git like this would never even be aware of them, so his estimate of our track-record is off by light years.

Bacterial disease: rectified.

Unwanted pregnancy: rectified.

Polio: rectified.

Smallpox: rectified.

Growing enough food to feed ourselves: rectified.

And so on.

Sure there are hard problems left. They will be solved by engineers, scientists, bureaucrats and businesspeople willing to take risks and test ideas by publicly testing them via systematic observation, controlled experiment and Bayesian inference, not pretentious gits telling us how awful we all are.

Comment: Perspective (Score 3, Informative) 74

by radtea (#48638823) Attached to: NASA Video Shows What It's Like To Reenter the Earth's Atmosphere

For those like me, who just watched the video and didn't understand the point of view 'til quite late on, the camera is pointing back along the direction of flight.

Also, for some reason the video has strange out-of-focus side-pieces that are distracting and annoying. The view itself is gorgeous and amazing.

Comment: Re:More important: how is this happening? (Score 2) 70

by radtea (#48628139) Attached to: Terrestrial Gamma Ray Bursts Very Common

The distinction between X-Rays and gamma rays is not the way how they are produced but the energy level.

As others have pointed out, this is false. Here's a simple guide to the complex language of electromagnetic radiation:

1) If it was produced by an atomic process it's an x-ray, no matter what the energy.

2) If it was produced by a nuclear process, it's a gamma-ray, no matter what the energy

3) If the source is neither atomic nor nuclear, or unknown, it's field- and circumstance-dependent. I tend to think of bremsstrahlung as gamma radiation unless I'm talking about x-ray sources for imaging or medical treatment. This is a purely cultural difference, with the terms "x-ray" and "gamma ray" being understood as interchangeable by practitioners, but with one or the other being preferred depending on context. Annihilation radiation is called gamma or x-ray depending on the field as well.
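The three rules above amount to a naming convention keyed on production process, not photon energy. A sketch of that convention as a lookup (the category names and context defaults here are my own, not standard terminology):

```python
# A sketch of the naming convention described above: the label follows the
# production process, not the photon energy. Categories are illustrative.

def em_label(process, context="physics"):
    """Return the conventional name for a photon given its production process."""
    if process == "atomic":    # electron transitions in an atom: always an x-ray
        return "x-ray"
    if process == "nuclear":   # transitions between nuclear states: always a gamma
        return "gamma ray"
    # bremsstrahlung, annihilation, unknown origin: the name is field-dependent
    return "x-ray" if context in ("imaging", "medical") else "gamma ray"

print(em_label("atomic"))                     # x-ray, regardless of energy
print(em_label("nuclear"))                    # gamma ray, regardless of energy
print(em_label("bremsstrahlung", "medical"))  # x-ray in a medical context
```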

With regard to the EM radiation from storms, there are multiple possible origins. It's pretty easy to create neutrons from high-energy plasmas, as in the Farnsworth Fusor. Subsequent capture of those neutrons on nuclei will produce "true" gamma rays. On the other hand, various purely EM processes could be producing x-rays as well. So the EM radiation from storms could well be a mix of both nuclear and atomic processes. Call 'em gammas or x-rays, and don't make a big deal of it.

Comment: Re:Climate != single event (Score 3, Insightful) 222

by radtea (#48605469) Attached to: Linking Drought and Climate Change: Difficult To Do

This is only in the headlines because of how unfortunately politicized this topic has become.

It's news because Every. Single. Story. on weather ends up talking about climate change. Dunno if that's politicization or just flavour-of-the-week reporting, but it needs to be pointed out as the nonsense it is.

Climate is a distribution.

Weather is an event.

Distributions are made of events, but they are not events and they have properties (their mean and higher moments) that are emergent properties of the distribution, not properties of the events that make them up.

So long as idiots talk about climate change every time there is a warm spell or a cold snap, there will be a need to point out the difference between events and distributions, and the very small amount you can say about discerning between different distributions that largely overlap based on a single event, or even a small handful of events.
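Just how little one event says about two overlapping distributions can be put in numbers. A sketch with two normal distributions standing in for "old climate" and "new climate"; all values are invented for illustration:

```python
# How much does one hot day tell you about which climate (distribution) you
# are in? Two overlapping normals; the numbers are invented for illustration.
import math

def normal_pdf(x, mean, sd):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

old_mean, new_mean, sd = 15.0, 16.0, 5.0  # daily temperatures, degrees C
event = 25.0                              # one unusually hot day

# Likelihood ratio: evidence for "new climate" over "old climate" from one event.
lr = normal_pdf(event, new_mean, sd) / normal_pdf(event, old_mean, sd)
print(lr)  # ~1.46 -- a single hot day barely discriminates between the two
```

A likelihood ratio of about 1.5 is weak evidence; it takes many events (a distribution's worth) before the two hypotheses separate cleanly. That is the event/distribution distinction in Bayesian terms.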

Comment: Re:Wasn't there a book about this? (Score 1) 138

by radtea (#48602889) Attached to: How Birds Lost Their Teeth

The example I use is Butterflies, which change from a crawling creature to one that flies, mid life.

Except we have a pretty good idea of how it happens.

Do you believe in eggs? That is, do you believe in organisms--including insects--that reproduce by laying eggs? And do you believe that those eggs don't have shells?

If so, can you imagine a mutation that makes an egg very slightly motile? The outer layer of such eggs is typically some kind of protein. Suppose a mutation such that, after the egg has grown to a certain size, a biochemical response causes the protein coat to contract when exposed to light. Lots of biochemicals react to light, and some change shape on exposure, or react with other molecules in ways that cause them to change shape. It just has to be a tiny bit.

Now you have an "egg" that in its later stages of development moves away from light. Such an egg might plausibly be more likely to survive than one that stays put. So over time, the eggs of such insects that are very slightly motile come to predominate. There is no way around that if the mutation is heritable, so unless you don't believe in DNA--and chemistry--you have to accept that that happens. You could also deny the laws of probability, in which case I have a lottery ticket to sell you.

Now iterate this process over a few million generations. Can you see how you might go from a flying insect that lays eggs to a flying insect that lays slightly motile eggs to a flying insect that has a motile larval stage?
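The "iterate over a few million generations" step can be made concrete with the standard replicator update: a heritable variant with even a small fitness edge predominates. A minimal deterministic sketch, with an illustrative 5% advantage:

```python
# A minimal sketch of the selection argument: track the expected frequency of
# the motile-egg variant under a small fitness advantage. Numbers illustrative.

def next_frequency(p, w_motile=1.05, w_plain=1.00):
    """One generation of selection: reweight the variant's frequency by its
    relative fitness (the discrete replicator equation)."""
    mean_fitness = p * w_motile + (1 - p) * w_plain
    return p * w_motile / mean_fitness

p = 0.001  # the mutation starts vanishingly rare
for _ in range(500):
    p = next_frequency(p)

print(round(p, 3))  # 1.0 -- the variant is essentially fixed in the population
```

This is the "no way around it" in the text: as long as the trait is heritable and the advantage is positive, the frequency update is a ratchet, and the only question is how many generations fixation takes.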

What we can or cannot imagine is irrelevant to what does or does not exist, so I'm not arguing here that "evolution is imaginable and therefore true", but merely trying to extend your imagination to the point where you are motivated to look more deeply into the subject.

My own belief is that evolution by variation and natural selection is not just plausible, but mathematically necessary.

Comment: Re:Should Allah be translated to God? (Score 0) 880

by radtea (#48597713) Attached to: Apparent Islamic Terrorism Strikes Sydney

Since the meaning of the gibberish on the flag is, "I am stupid! I am really really stupid!" it doesn't seem like there's much point in arguing about the conventions of translation.

In English "God" is sometimes rendered Yahweh or Jehovah, but could equally well be given as "Silly Bugger" or "Twit", and it wouldn't change the meaning of Christian gibberish, so there is no reason to quibble about how Muslim gibberish is translated. You could swap God for Allah in the translation and it would still mean: "I am stupid! I am really really stupid!"

Since God/Allah/Twit/etc is a word for something that the vast preponderance of the evidence suggests does not exist--all kinds of things are true that an all-powerful, all-loving, all-vengeful Supreme Being would not permit, and all kinds of things don't exist that such a being would create--anyone who believes in such a Being is necessarily stupid. As stupid as someone who believes in the Tooth Fairy or Santa Claus. And since the translation should capture the gist of a sentence's meaning, "I am stupid! I am really, really stupid!" appropriately captures the gist of this one.

Comment: Re:serious question (Score 4, Insightful) 113

by radtea (#48585869) Attached to: 2014 Geek Gift Guide

is bennett haselton a real person? and if so, any idea what abnormal psychological diagnoses he might fit?

Narcissistic Marketing Disorder: the belief that whatever you have to say, no matter how banal, stupid, confused, idiotic, boring, passé, imbecilic or wrong, it is interesting and important because it's you saying it.

Comment: Re:What people want to read (Score 1) 368

by radtea (#48545825) Attached to: Overly Familiar Sci-Fi

The biggest problem with what Stross is saying is that people, in general, want to read about situations that are familiar to them. It's damn hard to come up with a truly believable far-future culture in the first place, but it's much harder to do so in a way that makes it both alien to us and something that people can identify with enough to actually enjoy reading.

The same is true of historical fiction. Protagonists from as little as a century ago, if depicted realistically, would be both wildly implausible and utterly unpalatable to modern audiences. Even modern novels from other cultures have a lot of heavy lifting to do if they want to get an audience in the Anglosphere.

Two reasonably good historical authors are Patrick O'Brian and George MacDonald Fraser. The former manages by making his characters genuinely alien to us, and the latter by having a hero (Flashman) who is a complete reprobate, so when he--for example--sells his nominal wife into slavery we are shocked but not surprised.

On this basis, even near future SF is hard to do well. I've written a near future novel, and even a decade or three in the future is hard to handle realistically while still keeping characters accessible to the modern reader.

I'd go further and say that when we read historical authors, from Shakespeare to Austen to Dickens, we often gloss over just how weird the worlds they are writing about actually are, and the pace of social change in the past generation or two accounts for most of that shift. If things keep up at this rate none of us will be able to communicate meaningfully with our grandchildren.

Comment: Pushes back discovery, not reality (Score 1) 59

by radtea (#48533631) Attached to: The Ancestor of Humans Was an "Artist" 500,000 Years Ago

Fossil finds are a very sparsely sampled distribution, which means that while the earliest evidence for art has been pushed back hundreds of thousands of years, the earliest making of art almost certainly predates it by a much longer span.

This is not a new idea, but it's one that continually evades reporters in this area. The date of first discovery of a sparsely sampled distribution is almost certainly much, much later than the first instance of the thing being sampled.
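The sampling argument is easy to check numerically: if a practice ran for a long span but only a handful of instances survive to be found, the earliest find lands well after the true origin. A sketch with invented numbers:

```python
# If only n_finds instances of a million-year practice survive, how far does
# the earliest *find* postdate the true *origin*? Numbers are illustrative.
import random

random.seed(0)

true_origin = 0      # the practice starts at year 0
span = 1_000_000     # and runs for a million years
n_finds = 20         # only 20 instances survive and are discovered

gaps = []
for _ in range(10_000):
    finds = [random.uniform(true_origin, span) for _ in range(n_finds)]
    gaps.append(min(finds) - true_origin)

avg_gap = sum(gaps) / len(gaps)
print(avg_gap)  # on average ~ span/(n_finds + 1), i.e. roughly 48,000 years late
```

With uniform sampling the expected gap is span/(n+1), so the sparser the record, the worse the underestimate; pushing the "earliest evidence" back is just the sample minimum creeping toward the true origin.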

Comment: Re:I don't get it (Score 2) 167

by radtea (#48511417) Attached to: Is a "Wikipedia For News" Feasible?

I don't think even-handed coverage is possible, when journalism as a whole is essentially paid trolling for one agenda or another.

We can at least hope for news stories that convey a minimal amount of relevant background information.

The cost of supplying a few concrete facts relevant to the background of each story is apparently too much for various news outlets, but with the kind of crowd-sourcing Larry is suggesting this could be done. It'll be interesting to see how this effort evolves.

Ideology may always be with us, in the sense that "there is no view from nowhere", but it is (precisely!) equally true that "there is no view of nowhere", and modern news organizations apparently forget that. They routinely distort the news to the point where it is almost unrecognizable (ask anyone who has been close to any matter reported in the news). Part of the value of sites like /. is that sometimes we get people here who can untangle the journalist's mix of ideology and ignorance from the subject of the story, which gives us all a better view of reality, which of course is possible (your smartphone wouldn't work if it wasn't.)

Comment: Re:Graphene: easy to use, hard to produce (Score 1) 129

by radtea (#48494525) Attached to: Graphene May Top Kevlar As a Bullet-Stopping Material

The situation was similar for transistors, if you recall: the first solid-state transistor was invented in 1947...

Actually, the situation was very different for the transistor. The 1947 invention was the point-contact transistor. The bipolar junction silicon transistor was invented in 1954 and the first commercial transistor radio was released the same year (both by TI).

So less than 7 years from "it's possible" to the first release of perhaps the most famous application.

Microchips, which you mention for some reason, are irrelevant: the impact of the transistor was huge long before microchips became relevant.

Graphene, by contrast, is a decade past discovery. Ten years ago we were told two things about graphene:

1) no one knows how to produce it in bulk

2) if we could produce it in bulk there would be awesome things that could be done with it.

Continuing to publish stories a decade later that amplify the awesome things that could be done with it, when there has apparently been little or no progress in its mass production, is some combination of boring/frustrating/stupid. We don't really need to be continually told, "The list of things you can't yet do with graphene, and won't ever be able to do with graphene in the foreseeable future, continues to lengthen."

It just isn't interesting to tell these stories. Come back and talk about graphene when there is progress on mass production. That is interesting. Adding to the already long list of things that will never be made out of it because no one can figure out how to mass produce it is not.
