
Comment Re:Who's to say? (Score 2) 52

If it were true that long-term low level radiation were unquestionably harmful, you'd expect to find a clear negative trend.

No, that's not what we'd expect to find at all.

We'd expect to find, at the high end, a certain level of radiation that is absolutely lethal, and as the dose is reduced, the impact drops steadily until reaching a zone where life expectancy is merely reduced. That reduction, though, is more or less on an absolute scale, and must be compared against the normal life expectancy of the species being exposed. An insect may survive high doses of radiation simply because it wouldn't normally live long enough to exhibit symptoms, while a longer-lived animal like a human will likely live long enough to develop a cancer that ultimately causes death.

At a very low dose, the chance of any noticeable symptom from radiation is low enough that a given symptom could equally well be attributed to millions of other factors, so usually nobody cares. There is still a negative trend in survivability, but it's dwarfed by all of the other fatal conditions.
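That "dwarfed by other factors" point can be illustrated with a toy linear dose-response model. Every number below is invented purely for illustration; this is not real epidemiology:

```python
# Toy model: a tiny dose-proportional excess risk disappears into the
# noise of baseline mortality. All constants are illustrative only.

BASELINE_LIFETIME_RISK = 0.20   # hypothetical baseline fatal-cancer risk
RISK_PER_SIEVERT = 0.05         # hypothetical excess risk per sievert

def total_risk(dose_sv):
    """Lifetime risk under a simple linear, no-threshold assumption."""
    return BASELINE_LIFETIME_RISK + RISK_PER_SIEVERT * dose_sv

# At 1 mSv the excess is a rounding error next to the baseline;
# at 4 Sv it is comparable to the baseline itself.
low = total_risk(0.001)
excess_fraction = (low - BASELINE_LIFETIME_RISK) / low
```

The model still has a strictly negative trend at every dose; it's just that at low doses the radiation term is orders of magnitude smaller than everything else.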

Too little radiation and the species dies due to inability to keep pace with changing environmental conditions.

Radiation isn't the only mechanism for mutation, though. Rather, it's the fast, cheap way to produce a lot of mutations, usually in places that cannot possibly contribute to evolution.

In order to change the species, an offspring's DNA must be mutated. That's dependent on a few thousand cells out of the trillions in a human body. Those particular cells are the ones involved in meiosis, splitting and reassembling the DNA that will become half of the offspring. During that reassembly process is where most mutations happen, usually by random chemical processes rather than any radiation. This enzyme doesn't successfully react with that protein, so a gene gets skipped or altered or inserted... It is extremely rare that a gene is altered by radiation during the process.

Once an offspring's development begins, though, the effects of mutations become more pronounced. If radiation mutates a single cell during early stages of growth, that fetus will develop with a cluster of mutated cells. Unless those cells are destined to become a gonad, however, the mutation will die with that generation, and the species will not change.

Similarly, radiation affecting a mature individual is unlikely to have any positive effect, as the mutation is almost always either destructive or irrelevant. The proper functioning of a human body requires millions of interactions between tens of thousands of proteins, so randomly changing one protein is more likely to break something than to add new functionality. Of course, as before, even breaking something is only going to affect the species if it happens to occur in a cell involved in reproduction.

It is important to remember that evolution is never towards anything. It is away from an inability to reproduce (usually due to death). As an illustration, you must realize that you are the result of an unbroken line of millions of ancestors dating back millions of years, and every single one of those millions of ancestors was fertile and successful in mating. There is no scorecard in evolution. Either you pass on your genes, or you don't. It doesn't matter if your changing environment caused you severe illness or discomfort. As long as you manage to find a mate and make a child, you've won the natural selection game.

In short, radiation is a purely random occurrence with purely random effects, and the odds of any particular radiation-caused mutation being beneficial are so absurdly small that it is absolutely safe to say that overall, there is no safe dose.

Comment Re:Init alternatives (Score 1) 330

Has the development of new cinder block designs 'stagnated' because nobody has designed a new cinder block for probably a century? No, the existing design is fine and doesn't need replacement.

That's a very interesting analogy, since I'm currently in a Japanese hotel overlooking a construction site.

Rather than the typical cinder blocks, the bottom two floors are being built from large reinforced concrete slabs, about 1 meter by 3 meters, which interlock and are studded at intervals with what appear to be plastic pieces. That has definitely increased the speed of construction over laying cinder blocks, and I suspect the slabs and plastic provide some measure of safety in an earthquake.

Elsewhere on the technological spectrum, several years ago I volunteered in Africa, where buildings were built with more traditional cinder blocks. The blocks, though, were formed with poor-quality cement, and crumbled when put under load. In America in the 1980s, I was involved in a remodeling project that had to replace some 50-year-old blocks because they were falling apart. Even the "modern" blocks of 30 years ago were far more consistent in quality, because manufacturers could by then chemically test the cement before using it.

In short, construction techniques and technology have indeed improved in the last century. Not even the traditional block dimensions are "fine" for all cases, as I see across the street, though in other areas compatibility is necessary. The requirements have changed, but you seem blissfully unaware. Fortunately, your ignorance of technological progress does not negate its existence.

Comment Re:Init alternatives (Score 1) 330

You're reading far too much into one word. You should try reading the rest of that paragraph.

The first init systems were damned little more than just a shell. After that, we moved to running a single script at startup, and eventually went to runlevels with some common conventions. That's where development stagnated for a decade or two, and that's where I'm drawing my "antique" line. At the time, systems couldn't handle multitasking very well (mostly in terms of race conditions and programmers' sanity), and the massive university systems didn't really need to boot quickly, either, so there wasn't much development in parallel initialization.

Since then, Linux has been created and moved to the desktop, and we have a whole slew of new init systems, most of which natively support modern perceptions of parallelism, security, configuration, hardware, and other new developments since their predecessors moved out of the design phase. It isn't so much that "old is bad" as that the new is more likely to have been designed with modern paradigms in mind. Despite your dismissal, parallelism in particular is important, especially as Linux has taken a role as the embedded OS of choice for smart devices and cheap laptops.

While "change for the sake of change" may be wasted effort, it must be compared against the effort of keeping the old system. For example, how much effort is required of a distro maintainer to write and maintain init scripts for all their packages, including functionality for checking that dependencies started correctly and that scripts follow current best practices? How much effort is required just to make sure the scripts are numbered so they start in the correct order? In an age where building a dependency tree takes only a few milliseconds, I would say it is wasted effort to make a sysadmin figure it out by hand.
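To give a sense of why the dependency-tree step is so cheap, here is a minimal sketch using Python's standard-library topological sort. The service names and their dependencies are made up for illustration; a real init system's graph just has more nodes:

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Hypothetical service -> dependencies map, in the spirit of an
# init system's unit graph. Names are invented for illustration.
deps = {
    "network": set(),
    "syslog": set(),
    "sshd": {"network", "syslog"},
    "nfs": {"network"},
    "cron": {"syslog"},
}

# One pass over the graph yields a start order that respects every
# dependency; independent services can then be launched in parallel.
order = list(TopologicalSorter(deps).static_order())
```

Sorting a graph of even a few hundred services this way is microseconds of work, which is the point: the machine can trivially compute what sysvinit made humans encode in script numbering.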

On the side of systemd proponents, I don't think the argument has ever been that "old means bad". Rather, the argument has been that we've learned a few lessons over the past thirty years, and we ought to put those lessons into our software.

Comment Re:Init alternatives (Score 2) 330

So let me get this straight... in order to say "Foo depends on some kind of bar, which happens to be baz on this system", I need to write a "bar" definition that actually runs "baz", and go modify a completely separate dependency file to add "foo".

...and you're suggesting this is clean?

Comment Re:Init alternatives (Score 3, Informative) 330

With all due respect, that comparison is awful.

In the effort to make an "apples to apples" comparison, it uses only the bare minimum of functionality from each toolset. There's no illustration of dependencies or capability control. It is useful for getting a rough idea of how the init systems' config files look, but not really as the basis for any kind of comparison, especially with regard to advanced features.

Comment Re:Init alternatives (Score 2, Interesting) 330

Well, on my home rolled NAS appliance, I really like the ability to reboot all of my VMs very quickly when applying security updates, because I'm not the only one that uses it.

A fair point.

The thing is, there's so much damn drama over it that I'm curious what its detractors want to use in its place.

Typically sysvinit or mostly-compatible equivalents. From my perspective, they don't want to learn something new, and they don't see the existing system as broken.

And why are some people going to go out of their way to say "you don't need a faster boot time" when they don't know my use case?

The obligatory XKCD applies. Most boot processes are fast enough now that it's not really worthwhile for an end user to shave a few seconds off the time. On the other hand, doing something as a hobbyist is entirely about wasting time, so I won't hold that against you.

The biggest improvement over antique boot systems is going to parallel boot chains. Rather than running scripts one at a time, in order, a tree is built to determine what services are dependent on what other services. For example, it doesn't make sense to start the SSH server until the network is live. There are several init systems that do this, differing mostly in how they define dependencies. Some rely on specific services ("openssh-server relies on network") while others work on more generic capabilities ("remote-shell relies on network, and openssh-server is what we'll use for remote-shell").
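The "generic capability" style amounts to one extra level of indirection: units depend on abstract capabilities, and a separate map chooses the concrete provider. A minimal sketch, with all names invented for illustration:

```python
# Hypothetical capability-based dependency resolution. Units depend on
# abstract capabilities ("remote-shell"), and the system maps each
# capability to one concrete provider. All names are made up.

providers = {
    "network": "NetworkManager",
    "remote-shell": "openssh-server",
}

unit_deps = {
    "remote-shell": ["network"],  # capability-level dependency
}

def resolve(capability):
    """Return the concrete services to start for a capability, deps first."""
    services = []
    for dep in unit_deps.get(capability, []):
        services.extend(resolve(dep))
    services.append(providers[capability])
    return services
```

Swapping openssh-server for another SSH daemon then only touches the `providers` map, not every unit that wants a remote shell; that's the practical advantage of the capability style over naming specific services.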

After parallelism, it gets tricky and subtle. Maybe we don't need all of a service to finish starting before its dependents can proceed. For example, we don't necessarily need all of our DHCP leases assigned before we know which network interfaces are connected. That requires a more granular service definition, but provides a lot more power, especially for systems with very complicated startup procedures. With that power, we can shave a few more seconds off the boot time, because we aren't required to wait while services settle, improving our overall parallelism. That's useful for me (professionally, I build systems that must boot within a strict time limit, and may reboot every few hours), but most folks don't really benefit from the added complexity.
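The usual mechanism for "don't wait while services settle" is readiness notification: the service itself tells the init system the moment it can actually serve, rather than the init system guessing. A minimal sketch in the style of systemd's notify protocol, where a service sends "READY=1" to the Unix datagram socket named in $NOTIFY_SOCKET (abstract-namespace socket handling and the other message fields are omitted here):

```python
import os
import socket

def notify_ready():
    """Send a minimal sd_notify-style readiness message.

    Sketch only: real sd_notify also handles abstract-namespace
    sockets (paths starting with '@') and other key=value fields.
    """
    path = os.environ.get("NOTIFY_SOCKET")
    if not path:
        return False  # not running under a notify-aware service manager
    with socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM) as s:
        s.connect(path)
        s.sendall(b"READY=1")
    return True
```

A daemon calls this exactly when it is able to handle requests, so dependents can be released at that instant instead of after an arbitrary settle delay.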

Furthermore, with the way I hack my Android smartphone, I'd love it if it booted faster.

I don't know much about Android init, but I think it uses its own system unrelated to systemd, sysvinit, or any of the alternatives listed in TFS.

Comment Re:Init alternatives (Score 4, Interesting) 330

I'm not sure if that's a serious question or an attempt to troll, but regardless...

Speed is not why you should want (or not) systemd. It's Linux. How often do you expect to reboot the thing, anyway?

In the spirit of "Do one thing and do it well", systemd's goal is to manage services and dependencies. To that end, the only real interaction you normally have with systemd is starting or stopping a service, and viewing the associated logs if a service misbehaves. In my opinion, then, I don't really see the point in changing one's distro (including support lifecycles, development trust, and organizational philosophy) just to swap out init. It's just not that big a deal.

Comment Re:Computer scientists don't understand sociology (Score 1) 1321

That's always been the standard method in this country. We've had the Boston Tea Party, anti-slavery protests, Black Panthers, Vietnam, Rodney King... and now we yell about pipelines and drilling, elections, foreign policy, and who can use what restrooms.

That said, there is still room for facts and consideration by those who are actually doing the work, and that's encouraging. I've been involved with a number of activist programs hosted by those "elites" who know what they're talking about. It's provided me a very interesting perspective on how complex the typical analysis really is.

Fortunately, the loud voices don't seem to have much effect on the people who actually affect policy... Unfortunately, the loud protests do distract from those folks, creating the appearance that nothing is being done and leading to further dissatisfaction. There has always been, and will always be, a certain fascination with the folks who complain loudly and reduce complex situations to a single simple cause. There are roughly 7 billion people on this planet, which works out to roughly 2.5 × 10^19 possible pairwise relationships. "Simple" is not a word that applies easily.

Comment Re:Sore losers (Score 2) 1321

Sort of...

It's also very disappointing that the polls and predictions were so wildly incorrect. That is highly irregular, though not indicative of foul play.

I'm not ambitious enough to find the article, but Nate Silver had an informative retrospective piece about how some of the most influential voting blocs ended up voting contrary to what they were assumed to do, while polls concentrated on getting accurate results in other areas more typically "on the fence".

Trump's campaign did a surprisingly good job of swaying typically-blue voters to his side, while Clinton's campaign focused on the traditional swing states. Ultimately, the states Clinton won couldn't compensate for the votes Trump captured elsewhere, so Trump won the election. And since the polls concentrated on those same swing states, they showed Clinton with a lead.

In the future, we'll need to have an analysis of the polls that considers their assumptions about who's important to poll. The reality is that every vote counts, even when you might not expect it to.

Comment Computer scientists don't understand sociology (Score 5, Insightful) 1321

Having read earlier reports of this analysis, I'm going to have to respectfully disagree.

From what I read, there was no attempt to find other explanations, like a demographic preference for e-voting over paper, or the local economic costs of maintaining one particular voting mechanism.

Nope, let's just jump straight to assuming hacking.

Comment Re: Lots of states have anti-scalping laws (Score 5, Informative) 171

Having worked in theater for a while, I can assure you that ticket sales (in a traditional venue, at least) have practically nothing to do with the house costs... but it's complicated.

First, the house charges rent. That typically covers the wear & tear, upkeep, and basic services for the show during its run. There may be basic crew costs included, or they may be negotiated separately, but the bottom line is a big up-front cost for the producers to put a show on stage at all.

As tickets go on sale, they are priced according to what the market will bear, through a joint agreement between the producers and the house. Front-row seats for a Broadway show pull in above-average prices, but the nosebleed section behind a pillar next to the air conditioner barely sells for enough to cover the processing cost. However, selling cheap seats allows the producers to boast about the number of tickets sold, and helps the house meet goals for community access (which is very important for nonprofit houses). A cut of the ticket sales goes to the house (justified as covering the box office processing costs), but the majority of it goes to the producers... After all, the producers are also paying a lot of the expense to promote the show, often through separate advertising deals with the house company.

To address the original point: Ticket sales are based on the market, not the expenses. A nonprofit house working toward promoting the arts might indeed sell tickets at a huge loss to please their patron donors. A promoter trying to increase a band's popularity might cut prices, expecting to lose money on the whole show in an effort to boost popularity for higher return later. On the other hand, a top-bill show with great reviews in other venues could be priced at a huge profit, and still expect to sell.

Finally, once the show opens, the producers have the captive audience in the seats, excited about the show, and that's when the real money-making starts. Concessions are usually handled mostly by the house, but merchandise is usually purely profit for the producers. That's why the adage about merchandise rings true. It does allow promoters to cut ticket prices and still make a profit on the show, or at least reduce the expense they paid for promoting the brand. For small bands who have to pay their own expenses, this is the best chance they get at turning a profit on the show.

Comment Re:If confirmed, does this make it realistic? (Score 4, Insightful) 477

While the OP did indeed ask for practical propulsion applications, the implications of a change in physics theory are enormous, as his example illustrates quite well.

Spectral lines led to the realization that energy is not continuous, but discrete in very small units which can interact with matter, and by inverting that principle, small changes to matter can result in large changes to energy. That directly led to the theory behind semiconductors, enabling transistors and other solid-state electronics, ultimately leading to the entirety of modern electronics technology.

Similarly, verifying a repeatable violation of the laws of physics means that those laws are inaccurate. By refining the theory to fit the new observations, we can also revisit our assumptions about what is possible using electromechanics. To address OP's question, energy, not fuel, becomes the limiting factor in propulsion. That in turn alters the theory of rocketry, which affects the limits of human expansion, providing new areas of study for anthropology and sociology.

However, the scope of the effect also extends beyond rocketry. If EM can produce thrust, we may be able to miniaturize the device to the nanotechnology scale, as a new tool for nanomachines. As one example off the top of my head, we might produce self-controlled materials that change shape by rearranging microscopic structures, similar to how animal muscles work by sliding actin and myosin filaments.

In short, the actual application of any discovery is the increase in understanding of how the universe works, and from that we can derive advances in technologies.

Comment Re:Of course (Score 1) 733

"Wrong" and "illegal" are two different things. Despite what certain politicians may say, the American judicial system is built to ensure that only people who do illegal things go to jail. I'm not saying it works perfectly at making sure people who don't do illegal things stay out of jail, but that's another discussion. What's important in this case is that we don't just decide "this person ought to be in jail", and invent retroactive mechanisms to imprison them.
