
Comment: a suggested approach (Score 1) 170

Solutions that depend on space travel and the like seem both expensive and hostage to the hope that future technology won't somehow make recovery too cheap. Ditto other high-tech solutions.

I have a notion of a different strategy, but cannot figure out all the necessary details. Suppose we could devise a strong encryption scheme that is unlikely to be broken within the time period of interest. (This is uncertain and questionable!) That encryption should be arranged to depend on a _long_ key -- assume for discussion a concatenation of a large number of numbers that _cannot_ be known before the target date. How do we define, years in advance, a large set of numbers that will magically appear at some specific future time? Here are two suggestions of indeterminate brittleness (where "brittleness" means the probability that the depended-upon machinery will no longer exist).

Pick some large number of U.S. and world cities -- perhaps thousands -- and on the magic date concatenate the ordered set of max/min temperatures reported by some identifiable set of weather-reporting entities. Provide fallbacks (default values?) for cities that no longer exist, or that are no longer reported, or whatever. Specify fallbacks for reporting organizations that disappear. The intent of the fallback definitions is to make the key computable algorithmically regardless of what has happened to the data-generating organizations over time.
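To make the mechanics concrete, here is a minimal sketch of the key-derivation step in Python. Everything in it is illustrative: the city list, the fallback default, and the use of SHA-256 to turn the concatenated readings into a 256-bit symmetric key are my assumptions, not a worked-out specification.

```python
import hashlib

# Illustrative city list and fallback; a real definition would name thousands
# of cities and the exact reporting entities.
CITIES = ["Boston", "Chicago", "Tokyo", "Sydney"]
MISSING = (999, -999)   # default used when a city or its report has vanished

def derive_key(reported):
    """reported maps city -> (max_temp, min_temp) observed on the target date."""
    parts = []
    for city in CITIES:                       # the fixed ordering is part of the spec
        hi, lo = reported.get(city, MISSING)  # apply the fallback default
        parts.append(f"{city}:{hi}:{lo}")
    material = "|".join(parts).encode("utf-8")
    return hashlib.sha256(material).digest()  # 256-bit key for a symmetric cipher
```

The same derivation run when the vault is sealed and again on the release date yields the same key only if the same temperatures (or the same fallback defaults) get plugged in.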

Obviously, this computation becomes more brittle the longer civilization runs. One would not want to depend upon temperatures reported in the NY Times, because the NYT might not be around in another century, or might not bother reporting weather since that data is more available on whatever has replaced the web. But it ought to be possible, with enough careful thinking, to devise a dataset definition that could be interpreted unambiguously after reasonable lengths of time.

As backup, several such dataset definitions should be defined. For example, use the stock market: the first N digits of the closing prices of a large number of stocks (or their well-defined successors), with defaults to ignore stocks that no longer exist. The stock market might not exist in 100 years, nor NOAA, but enough well-defined fallbacks could be specified. It doesn't matter much if any particular fallback is no longer well defined; just fall back to the next one. Nor does it matter much if this cascade of time-dependent data branches or requires expensive multiple tries. The principle is that legitimate decryption must be much, much less expensive than a brute-force attack.

So, on the target release date, the vault machine goes out on the internet (or is "manually" passed the necessary set of numbers, since whatever has replaced the internet won't be accessible to even 25-year-old systems), and if the thousands of collected digits match, it decrypts the payload. It is almost certain that any data-disambiguation algorithm will become ambiguous over time. But if the ambiguities don't branch into too many separate paths, each can be followed to see if any one works. Assume that processor time is very, very inexpensive.
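Here is a minimal sketch of that branch-following idea, building on the derive_key sketch above. The use of an authenticated cipher (AES-GCM from the Python cryptography package) is my assumption; it is convenient because a wrong key simply fails to authenticate, so the machine can tell which branch worked.

```python
import itertools
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.exceptions import InvalidTag

def try_release(candidates, nonce, ciphertext):
    """candidates maps each city to a list of plausible (max, min) readings
    wherever the historical record has become ambiguous."""
    for combo in itertools.product(*(candidates[c] for c in CITIES)):
        key = derive_key(dict(zip(CITIES, combo)))   # from the sketch above
        try:
            return AESGCM(key).decrypt(nonce, ciphertext, None)
        except InvalidTag:
            continue                                 # wrong branch; try the next
    return None                                      # no candidate dataset worked
```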

This sort of solution presumes that the vault machine can determine the time, so it cannot be tricked into thinking the release date has already passed. Some sort of high-capacity power backup and wipe-on-intrusion machinery is required. Technical details left to my Slashdot colleagues. Identifying enough likely-to-survive data sources over 25, 50, or 100 years is a very interesting techno-sociological problem!

Comment: Re:The fast lanes: a parable (Net Gnutrality?) (Score 1) 182

by smhsmh (#47015963) Attached to: FCC Votes To Consider Next Round of 'Net Neutrality' Rules

What can we learn from Boston's venerable Jamaicaway, a three lane road with two in each direction?

The FCC's concern has been directed by the moneyed interests of big media (e.g. Netflix, Google) and big networking (the ISPs and also Google). Both sides have legitimate business concerns, so this is understandable and not intrinsically evil. It is certainly true that connections with the media companies are responsible for a large share of internet bandwidth. But this completely ignores the rest of internet traffic, about which the FCC and Congress and certainly the News Media are nearly unaware. (Because that isn't where the money is.)

The original capability of the Internet was to provide (reliable?) connections between arbitrary machines. The original concerns with decentralization and automatic self-reconfiguration (nuclear war) are perhaps not so much the concern any more, but it is still the case that the Internet must support arbitrary connections between arbitrary devices. I use various SSL/VPN/RDP connections to work from my laptop at home to development machines at my employer's office. The bandwidth isn't trivial, but it's small compared to receiving video or even mp3s. Netflix and Google and the FCC and Congress and the Media aren't much concerned about that kind of service, but I sure am, as should be anyone concerned about the health of the economy. And we are moving into the so-called Internet of Things, so Google's subsidiaries can monitor the setting of my thermostat and the contents of my refrigerator for me. Where does this traffic fit into Net Gnutrality? Who advocates for it? Next year might be too late.

Of course, managing any limited resource requires policies, and usually some kind of cost pricing. (I'm ignoring here the technically attractive argument that net bandwidth should be implementationally cheap enough not to need measurement. The problem with unmeasured resources is that applications soon find ways to overuse them.) Assigning the "costs" of network traffic is more complex than it seems. My connection to my office (15 hops; 5 miles as a carrier pigeon would navigate it, but about 60 miles as the backbone sites route it) first traverses my local ISP (the company that bought the name of the company that in my childhood was the de facto national phone system), then a couple of hops through another backbone provider, then a couple more hops through the backbone/local provider my employer uses. Most of the time reliability and latency are admirable -- Emacs works just like a local program. But every few tens of minutes, latency suddenly jumps to about 1.5 seconds, I believe at one of the interfaces between these big three providers.

All three of these big network providers bear some cost from my traffic. (My emacs load is small compared to Netflix, but the issues of throughput and latency are similar.) The providers at each end have some direct cost-recovery mechanism (monthly fees) but the one in the middle does not. There are complex contractual reimbursement protocols between backbone providers. They all deserve to be compensated for the costs. I'd like to understand these contracts better (except I have other things to do with my life) but I'd insist the FCC _show_ that they understand them.

BTW, the nature of some of these compensation mechanisms is that they are (and should be) trans-national, and thus not directly subject to Congressional or Administrative (FCC) regulation.

My conclusion would be that the forces on connectivity providers resemble the cost structure of other public utilities -- the same as the nice people who provide you electricity, water, sewerage, and (in the past) telephone connections -- and that there should be a wall between these utilities and content providers. That would be the sane solution from the public policy perspective. Whoops, I used the concepts "sane" and "public policy" in the same sentence. Don't expect it to work out that way, as money and Congress are involved.

Comment: Re:The Real Solution (Score 1) 433

by smhsmh (#46743121) Attached to: UN: Renewables, Nuclear Must Triple To Save Climate

This "solution" is on the correct track, except the estimate of the reduction necessary is probably a severe underestimate.

My premise is that with current modes of energy use and energy production, the planet cannot support anything like the current load if most of the 7 billion aspire to energy consumption (actually, CO2 production) at any significant fraction of the rates of the developed world. The easiest approaches (both technically and morally) are improvements in energy efficiency and reduction of the carbon load of energy production, but they will never be enough to save the planet. Decades ago the Chinese government realized it needed to reduce population growth drastically, and implemented the one-child policy with that government's usual brand of unacceptable and heavy-handed bureaucracy, but despite the implementation it was the right idea. If the 2+ billion people of India and China aspire to achieve merely a third of the average carbon loading of a USA citizen, the planet is cooked.

There are many ways to reduce human population. Disease, famine, war, and all of the tired historical paradigms are possibilities, although it is impossible to predict which will arrive first. But without effective remediation, one or another will suffice to correct the current imbalance. It is _late_ in the game to try to reduce population (and to further reduce energy consumption and the carbon loading of energy production, though even together these will not be enough), but we have to try.

One good approach would be an international promise of pariahship. The developed countries that understand the problem (North America, Western Europe, Japan, perhaps a very few other East Asian countries, and some others that have achieved ZPG through the indifference of their populations (Russia?)) should unite and declare pariah status for any country that has not committed to Negative Population Growth, i.e. one child per couple. The pariahs could obtain no visas, no internet communication, no banking, and no commerce at all.

Unfortunately, this can't work because it would require intelligent cooperation between the governments of the several countries. So we're all cooked.

Comment: Re:How long is a song (Score 1) 100

by smhsmh (#46491947) Attached to: How Data Storage Has Grown In the Past 60 Years

At least an American football field is a well-defined standard unit. (Canadian is different, as is international football.) But you can find these sizes on Wikipedia or in the official rules of the organizing committees.

Not so the standard-unit "song," which does not appear on the Wikipedia disambiguation page for "song."

I believe I first saw "song" used as a measure of storage capacity in an Apple ad, perhaps for an early iPad. At the time I thought it was a slippery and slightly dangerous way for marketing hype to corrupt technical specifications. Perhaps I misremember, but a simple calculation put the size of a song somewhere in the low 3 MB range.

But a quick check of early iPod ads on YouTube suggests Apple used about 1 MB per song, derived from the claims that a 1 GB iPod could hold 1000 songs and a 2 GB iPod 2000. These are obviously rounded numbers, but they deviate significantly from 3+ MB.

A random spec sheet on kingston.com implies a song is 4 MB. A current spec sheet on sandisk.com implies about 3.85 MB.

BTW, did you know that more than 20 million unpackaged i7 chips will fit on one American football field, including end zones?
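A quick back-of-envelope check; the die area here is my assumption (roughly 180 mm^2 is typical of several i7 generations), not a figure from any spec sheet:

```python
# US field incl. end zones: 120 yd x 53.3 yd = 360 ft x 160 ft
MM_PER_FT = 304.8
field_mm2 = (360 * MM_PER_FT) * (160 * MM_PER_FT)   # ~5.35e9 mm^2

die_mm2 = 180.0                                      # assumed unpackaged i7 die area
print(field_mm2 / die_mm2)                           # ~29.7 million dies
```

So "more than 20 million" holds for any die smaller than about 265 mm^2.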

Comment: Re:not worth it (Score 1) 461

by smhsmh (#46461003) Attached to: The $100,000 Device That Could Have Solved Missing Plane Mystery

A little more thinking outside current boxes.

Continuous position and status reporting through a satellite communication system doesn't seem hugely costly compared to the cost of pumping fuel into those engines, but let's consider whether there are alternatives to satellite telemetry.

Suppose planes could redundantly record the black-box data stream for one another. At an average altitude of 30,000 feet, direct line-of-sight transmissions carry about 300 miles. If planes transmitted promiscuously to one another, and every plane recorded what it received, then the black-box data for every plane would usually be available within hours after an incident. Commercial airliners travel well under 600 MPH, so it is exceedingly rare that an airliner at cruise altitude is not within 300 miles of another airliner.
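As a sanity check on the 300-mile figure, here is a rough line-of-sight calculation; the 4/3-earth refraction model and the 30,000-foot altitudes are my assumptions, not data from the comment:

```python
import math

def radio_horizon_miles(alt_ft):
    alt_m = alt_ft * 0.3048
    return 4.12 * math.sqrt(alt_m) / 1.609   # 4/3-earth radio horizon, km -> statute miles

one_plane = radio_horizon_miles(30_000)       # ~245 miles to a receiver at sea level
two_planes = 2 * radio_horizon_miles(30_000)  # ~490 miles between two aircraft at altitude
print(one_plane, two_planes)
```

Two airliners both at cruise altitude should see each other well beyond the 300 miles claimed above.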

Comment: Re:not worth it (Score 1) 461

by smhsmh (#46460921) Attached to: The $100,000 Device That Could Have Solved Missing Plane Mystery

The costs of maintaining and testing new "black box" systems ought not differ in magnitude from the current costs of maintaining and testing the 1960s-era black box systems. Probably less.

Let's start thinking outside the box. What else could be improved in black box machinery? Two things occur to me right off.

(1) If I understand current practice, there are two independent black boxes. One collects a large number of instrument readings from the plane, and the other records sounds (voice) in the cockpit. This separation might have made sense in the '60s, but any trash piece of hardware today ought to be capable of recording both. By all means planes should have two independent recording black boxes, but if they each collect _all_ the data, then it would only be necessary to recover one after an incident.

(2) Black boxes are necessarily designed to survive serious impact and submersion in salt water under extreme pressure for nearly indefinite time. But if both are securely attached (to the airframe?) somewhere in back (where they are most likely to survive impact), then after a sea crash they will end up somewhere on the bottom of the sea. Perhaps some system could be devised so that one black box (or perhaps a third) sits external to the plane, attached in a way that quickly lets go in salt water or upon severe impact, and that furthermore has positive buoyancy (it floats).

Item (2) could turn a difficult retrieval problem into an easier one, but I recognize that anything that makes an external black box easy to separate from a plane creates a danger of unintended separation. Obviously, separation can't be triggered by water alone, since planes must taxi and fly through rainstorms. But perhaps a modern flight recorder with its independent power and sonar beepers could be built into a box weighing less than a pound, such that a rare separation from the host aircraft would be little hazard to those on the ground and an insignificant detriment to the plane's aerodynamics. I believe we humans can currently design and launch satellites with similar performance/weight ratios.

(Just thinking --- thinking isn't in my job description.)

Comment: Re:much ado about nothing (Score 1) 506

by smhsmh (#46372619) Attached to: Quebec Language Police Target Store Owner's Facebook Page

Being near a border is an irrelevant legal distinction. You're in one region or the other, and you have to comply with the laws of that region. And yes, you should have to.

There is a subtle cultural difference between Canadians and USAicans that originated more than two centuries ago.

In 1775 and the years immediately following, most colonials decided not to comply with the laws of the region and revolted. Our gentle neighbors to the North remained compliant. Some number of those living in revolutionary territory who did not want to revolt emigrated to what is now Canada.

That voluntary separation has served well, since we've mostly stayed friends and allies since then (except in hockey and curling). Compare the voluntary, amicable, and successful separations of Norway and Sweden, and of Slovakia and the Czech Republic. Compare the century-long mess that was the United States after the failure of separation in 1865. (This is not to imply that successful Southern secession would have been a good thing, but I digress.)

Comment: Re:Makes no sense. (Score 1) 478

by smhsmh (#46283239) Attached to: Ask Slashdot: Anti-Camera Device For Use In a Small Bus?

My father taught me photography with a camera that had no electronics whatever, above the quantum-mechanical layer of photons interacting with silver halide crystals. I still have one around somewhere. Let me know how anyone would propose disabling this device without subjecting the entire bus to dangerous effects such as harmful levels of X-rays.

Comment: Re:A "clipped" audio signal is still a valid signa (Score 1) 526

by smhsmh (#46217753) Attached to: Customer: Dell Denies Speaker Repair Under Warranty, Blames VLC

Protecting against thermal burnout doesn't require expensive DSP algorithms. The audio drivers could come close by simply maintaining a time-decayed value of the recent audio sample delta. Easy to compute. The time constant would be something a little faster than the rate at which the speaker can dump heat to the environment. That computation would track fairly closely the power dissipation (heat) in the speaker, and when it gets unsafely high, the driver should stop and flag some sort of error popup.
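A minimal sketch of that idea in Python -- the decay coefficient and the threshold are illustrative placeholders, not values tuned to any real speaker:

```python
DECAY = 0.999   # per-sample decay, set a bit faster than the speaker's thermal time constant
LIMIT = 0.25    # assumed "too hot" threshold in normalized power units

def stream_is_safe(samples):
    """samples: audio normalized to [-1.0, 1.0]."""
    heat = 0.0
    prev = 0.0
    for s in samples:
        delta = s - prev                               # sample-to-sample delta
        heat = DECAY * heat + (1.0 - DECAY) * delta * delta
        prev = s
        if heat > LIMIT:
            return False                               # driver should mute and flag an error
    return True
```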

Technology

Building a Better Bike Helmet Out of Paper 317

Posted by samzenpus
from the recycled-helmet dept.
An anonymous reader writes "Inspired by nature, a London man believes the solution to safer bike helmets is to build them out of paper. '"The animal that stood out was the woodpecker. It pecks at about ten times per second and every time it pecks it sustains the same amount of force as us crashing at 50 miles per hour," says Surabhi. "It's the only bird in the world where the skull and the beak are completely disjointed, and there's a soft corrugated cartilage in the middle that absorbs all the impact and stops it from getting a headache." In order to mimic the woodpecker's crumple zone, Anirudha turned to a cheap and easily accessible source — paper. He engineered it into a double-layer of honeycomb that could then be cut and constructed into a functioning helmet. "What you end up with is with tiny little airbags throughout the helmet," he says.'"

Comment: Re:A legit question (Score 1) 212

by smhsmh (#45868225) Attached to: First US Public Library With No Paper Books Opens In Texas

Your idea of cryptographically signing books is an extremely worthy one. The details may be tricky to work out (a recent /. item suggests that the NSA is planning to break all cryptography for all time with quantum computing) but we should all keep the idea in mind against the time it becomes necessary to protect our history and knowledge from Big Brother. And it will become necessary...
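For what it's worth, the mechanics are already cheap today. Here is a minimal sketch using Ed25519 signatures from the Python cryptography package; the file name is hypothetical, and the hard problems (key distribution, long-term archival of public keys) are not addressed:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

book = open("moby_dick.txt", "rb").read()   # hypothetical text to protect

private_key = Ed25519PrivateKey.generate()
signature = private_key.sign(book)          # the library/publisher signs the text once

public_key = private_key.public_key()       # published widely so anyone can verify
try:
    public_key.verify(signature, book)      # raises if the text has been altered
    print("text matches the signed original")
except InvalidSignature:
    print("text has been altered")
```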

BTW, my family dog has been running a small paperless library in the back yard for about a decade. No E-readers -- all the items are scratch-and-sniff.

