
Comment Re:prohibited ? (Score 3, Insightful) 214

The US Social Security number as an ID is seriously broken. After consideration, I'd expect my SSN to be in at least 100 poorly-secured databases: bank accounts, insurance accounts, doctor/dentist/hospital records, employers, etc. The number is hardly secret, yet there are about 350M persons in the USA and only 1000M distinct SSNs.

A better system would redefine the SSN as two components. A 9 (or 10?) digit public number would signify who you are -- lotsa entities need to know that -- and a 6-7 digit secret number would prove that you are the person associated with that SSN. It is pure hell when the first number needs to be changed (witness protection?), but the second number could be changed often and with little overhead or impact, whenever one suspects it has been compromised. The current electronic fiduciary networks would be sufficient and secure enough to support and manage this.
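
As a sketch of how the secret component might be verified without any institution ever storing it, it could be handled the way passwords are handled today: keep only a salted hash. Everything here (the function names, the 7-digit value, the iteration count) is made up for illustration.

```python
import hashlib
import hmac
import os

def enroll(secret_component: str) -> tuple[bytes, bytes]:
    """Store only a random salt and a salted hash of the secret, never the secret itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret_component.encode(), salt, 100_000)
    return salt, digest

def verify(secret_component: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the salted hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", secret_component.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

# Hypothetical 7-digit secret component:
salt, digest = enroll("4821937")
assert verify("4821937", salt, digest)
assert not verify("0000000", salt, digest)
```

A breach of any one database then leaks only salted hashes, and rotating the secret component invalidates them all at once.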

Unfortunately, the US Congress will eventually try to fix these problems by passing laws that make things illegal, rather than by mandating technology that makes violations nearly impossible.

Comment Re:Let me get this right.... (Score 1) 372

More toddlers than police officers??? This is a shitistic.

It means nothing without considering the size of the relative populations of toddlers and police officers. Also consider that toddlers are toddlers 24/7, while police officers are typically on duty less than 1/4 of that time.

Quick internet search says there are about 30 million deer in the US today, and about 6 million are taken by hunters each year. (About 1 million are taken by motor vehicles.) Is it statistically more hazardous to be a deer in the US today than a young black male? But why should anyone who is not a gun-rights loony care about a ridiculous statistic like this?

Comment Re:What the frack (Score 1) 350

You've been on the bridge of US warships -- I have not -- but I wonder why inertial navigation has not been installed to serve as a backup at least on capital ships.

Consider the navigation of nuclear subs. They may serve for months silently submerged. They can tow LF antennas, but they need to navigate in real time, and even semi-reliable techniques like sonar-pinging the sea floor would aid enemy tracking. I've heard that subs have well-developed inertial tracking for reliable navigation over nontrivial time periods. I'd like to know more, out of intellectual curiosity.

If inertial navigation works well enough, why isn't it also available on a $1e9 capital carrier?

Comment a suggested approach (Score 1) 170

Solutions that depend upon space travel etc. seem both expensive and vulnerable to future technology making recovery too inexpensive. Ditto other high-tech solutions.

I have a notion of a different strategy, but cannot figure out all the necessary details. Suppose we could devise a strong encryption scheme that could not likely be broken within the time period of interest. (This is uncertain and questionable!) That encryption should be arranged to depend upon a _long_ key -- assume for discussion a concatenation of a large number of numbers that _cannot_ be known before the target date. How to define, years in advance, a large set of numbers that will magically appear at some specific future time? Two suggestions follow, of indeterminate brittleness (where "brittleness" means the probability that the depended-upon machinery will no longer exist).

Pick some large number of U.S. and world cities -- perhaps in the 1000's -- and on the magic date concatenate the ordered set of max/min temperatures reported by some identifiable set of weather-reporting entities. Provide fallbacks (default values?) for cities that no longer exist, or which are no longer reported, or whatever. Specify fallbacks for reporting organizations that disappear. The intent of the fallback definitions is to provide algorithmic keys regardless of what has happened to the data-generating organizations over time.
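
A minimal sketch of what such a derivation might look like. The cities, temperatures, and fallback default are all made-up assumptions for illustration; the point is only that the ordered concatenation, with agreed defaults for missing reports, hashes down to reproducible key material.

```python
import hashlib

# Hypothetical reported max/min temperatures (deg F) on the magic date.
# A value of None stands for a city that is no longer reported.
REPORTS = {"Boston": (41, 28), "Chicago": (35, 22), "Atlantis": None}

# Agreed-upon default pair for any city with no report on the magic date.
FALLBACK = (0, 0)

def derive_key(city_order: list[str], reports: dict) -> bytes:
    """Concatenate the ordered max/min pairs (fallback where missing) and hash to key material."""
    parts = []
    for city in city_order:
        tmax, tmin = reports.get(city) or FALLBACK
        parts.append(f"{tmax:+04d}{tmin:+04d}")  # fixed-width, signed, so the string is unambiguous
    return hashlib.sha256("".join(parts).encode()).digest()

key = derive_key(["Boston", "Chicago", "Atlantis"], REPORTS)
assert len(key) == 32  # 256 bits of key material
```

Anyone following the same city order and fallback rules derives the same key, which is the whole game: the algorithm can be published today, while the inputs cannot exist until the target date.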

Obviously, this computation becomes more brittle the longer civilization runs. One would not want to depend upon temperatures reported in the NY Times, because the NYT might not be around in another century, or might not bother reporting weather since that data is more readily available on whatever has replaced the web. But it ought to be possible, with enough careful thinking, to devise a dataset definition that could be interpreted unambiguously after reasonable lengths of time.

As backup, several such dataset definitions should be defined. For example, use the stock market: the first N digits of the closing prices of a large number of stocks (or their well-defined successors), with defaults to ignore stocks that no longer exist. The stock market might not exist in 100 years, nor might NOAA, but enough well-defined fallbacks could be specified. It doesn't matter much if any particular fallback is no longer well defined; just fall back to the next one. Nor does it matter much if this fallback to different collections of time-dependent data branches or requires expensive multiple tries. The principle of the decryption is that its computation is much, much less expensive than a brute-force attack.

So, on the target release date, the vault machine goes out on the internet (or is "manually" passed the necessary set of numbers, since whatever has replaced the internet won't be accessible by even 25-year-old systems), and if the thousands of collected digits yield the right key, it decrypts the payload. It is almost certain that any data disambiguation algorithm will become ambiguous over time. But if the ambiguities don't branch into too many separate paths, each can be followed to see if any one works. Assume that processor time is very, very inexpensive.

This sort of solution presumes that the vault machine can determine the time, so it couldn't be tricked into thinking that the time has expired. Some sort of high-capacity power backup and wipe-on-intrusion machinery is required. Technical details left to my SlashDot colleagues. Determining enough likely-surviving data sources over 25, 50, or 100 years is a very interesting techno-sociological problem!

Comment Re:The fast lanes: a parable (Net Gnutrality?) (Score 1) 182

What can we learn from Boston's venerable Jamaicaway, a three lane road with two in each direction?

The FCC's concern has been driven by the moneyed interests of big media (e.g. Netflix, Google) and big networking (the ISPs, and also Google). Both sides have legitimate business concerns, so this is understandable and not intrinsically evil. It is certainly true that connections with the media companies are responsible for a large share of internet bandwidth. But this completely ignores the rest of internet traffic, about which the FCC and Congress and certainly the News Media are nearly unaware. (Because that isn't where the money is.)

The original capability of the Internet was to provide (reliable?) connections between arbitrary machines. The original concerns with decentralization and automatic self-reconfiguration (nuclear war) are perhaps not so much the concern any more, but it is still the case that the Internet must support arbitrary connections between arbitrary devices. I use various SSL/VPN/RDP connections to work from my laptop at home on development machines at my employer's office. The bandwidth isn't trivial, but it is small compared to receiving video or even mp3. Netflix and Google and the FCC and Congress and the Media aren't much concerned about that kind of service, but I sure am, as should be anyone concerned about the health of the economy. And we are moving into the so-called Internet of Things, so Google's subsidiaries can monitor the setting of my thermostat and the contents of my refrigerator for me. Where does this traffic fit into Net Gnutrality? Who advocates for it? Next year might be too late.

Of course, managing any limited resource requires policies, and usually some kind of cost pricing. (I'm ignoring here the technically-attractive argument that net bandwidth should be implementationally cheap enough as to not need measurement. The problem with unmeasured resources is that applications soon find ways to overuse them.) Assigning the "costs" of network traffic is more complex than is obvious. My connection to my office (15 hops, 5 miles as a carrier pigeon would navigate it, but about 60 miles as the backbone sites route it) first traverses my local ISP (the company that bought the name of the company that in my childhood was the de facto national phone system), then a couple hops through another backbone provider, then a couple more hops through the backbone/local provider my employer uses. Most of the time reliability and latency are admirable -- Emacs works just like a local program. But every few tens of minutes latency suddenly jumps to about 1.5 seconds, I believe at one of the interfaces between these big three providers.

All three of these big network providers bear some cost from my traffic. (My emacs load is small compared to Netflix, but the issues of throughput and latency are similar.) The providers at each end have some direct cost-recovery mechanism (monthly fees) but the one in the middle does not. There are complex contractual reimbursement protocols between backbone providers. They all deserve to be compensated for the costs. I'd like to understand these contracts better (except I have other things to do with my life) but I'd insist the FCC _show_ that they understand them.

BTW, the nature of some of these compensation mechanisms is that they are (and should be) trans-national, and thus not directly subject to Congressional or Administrative (FCC) regulation.

My conclusion would be that the forces on connectivity providers are similar to the cost structure of other public utilities, the same as the nice people who provide you electricity, water, sewerage, and (in the past) telephone connection, and that there should be a wall between these utilities and content providers. That would be the sane solution from the public policy perspective. Whoops, I used the concepts "sane" and "public policy" in the same sentence. Don't expect it to work out that way, as money and Congress are involved.

Comment Re:The Real Solution (Score 1) 433

This "solution" is on the correct track, except the estimate of the reduction necessary is probably a severe underestimate.

My premise is that with current modes of energy use and energy production, the planet cannot support anything like the current load if most of the 7 billion aspire to energy consumption (actually, CO2 production) at any significant fraction of the rates of the developed world. The easiest approaches (both technically and morally) are improvements in energy efficiency and reduction of the carbon load of energy production, but they will never be enough to save the planet. Decades ago the Chinese government realized it needed to drastically reduce population growth, and implemented the one-child policy with that government's usual brand of unacceptable and heavy-handed bureaucracy; despite the implementation, it was the right idea. If the 2+ billion people of India and China aspire to achieve merely a third of the average carbon loading of a USA citizen, the planet is cooked.

There are many ways to reduce human population. Disease, famine, war, and all of the tired historical paradigms are possibilities, although it is impossible to predict which will arrive first. But without effective remediation, one or another will suffice to correct the current imbalance. It is _late_ in the game to try to reduce population (and to further reduce energy consumption and the carbon loading of energy production, though even together those will not be enough), but we have to try.

One good approach would be an international promise of pariahship. The developed countries who understand the problem (North America, Western Europe, Japan, perhaps a very few other East Asian countries, and some others who have achieved ZPG through the indifference of the population -- Russia?) should unite and declare pariahship of any country that has not committed to Negative Population Growth, i.e. one child per couple. The pariahs could obtain no visas, no internet communication, no banking, and no commerce at all.

Unfortunately, this can't work because it would require intelligent cooperation between the governments of the several countries. So we're all cooked.

Comment Re:How long id a song (Score 1) 100

At least an American football field is a well-defined standard unit. (Canadian is different, as is international football.) But you can find these sizes on Wikipedia or in the official rules of the organizing committees.

Not so the standard-unit song, which does not appear on the Wikipedia disambiguation page for "song."

I believe I first saw "song" used as a measure of storage capacity in an Apple ad, perhaps for an early Ipad. At the time I thought it was a slippery and slightly dangerous way for marketing hype to corrupt technical specifications. Perhaps I misremember, but simple calculation put the size of a song somewhere in the low 3 MB range.

But a quick check of early Ipod ads on YouTube suggests Apple used about 1 MB, derived from the claims that a 1GB Ipod could hold 1000 songs and a 2GB Ipod 2000. These are obviously rounded numbers, but they deviate significantly from 3+ MB.

A random spec sheet on implies a song is 4 MB. A current spec sheet on implies about 3.85 MB.

BTW, did you know that more than 20 million unpackaged i7 chips would fit on one American football field, including the end zones?
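
A quick sanity check of that factoid, assuming a die area of roughly 1.8 cm^2 (an assumed figure; actual i7 die sizes vary by generation):

```python
# American football field including end zones: 120 yd x 53 1/3 yd = 360 ft x 160 ft.
field_area_m2 = (360 * 0.3048) * (160 * 0.3048)  # about 5351 m^2
die_area_m2 = 1.8e-4                             # assumed ~1.8 cm^2 per unpackaged die
chips = field_area_m2 / die_area_m2
print(f"{chips / 1e6:.1f} million chips")        # about 29.7 million with these assumptions
assert chips > 20e6
```

So the claim survives the arithmetic with comfortable margin, even before anyone starts stacking them.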

Comment Re:not worth it (Score 1) 461

A little more thinking outside current boxes.

Continuous position and status reporting to a satellite communication system doesn't seem hugely costly compared to pumping fuel into those engines, but let's consider whether there are alternatives to satellite telemetry.

Suppose planes could redundantly record the black-box data stream for one another. At an average altitude of 30,000 feet, line-of-sight transmissions carry about 300 miles. If planes transmitted promiscuously to one another, and every plane recorded what it received, then the black-box data for every plane would usually be available within hours after an incident. Commercial airliners travel well under 600 MPH, so it is exceedingly rare that an airliner at cruise altitude is not within 300 miles of another airliner.
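
A back-of-the-envelope check on that 300-mile figure, using the standard geometric-horizon formula d = sqrt(2Rh). Two aircraft at the same altitude can see each other at up to the sum of their individual horizon distances, so the range is actually a bit better than 300 miles.

```python
import math

R_EARTH_M = 6.371e6  # mean Earth radius in meters

def horizon_m(alt_m: float) -> float:
    """Geometric line-of-sight distance to the horizon from a given altitude."""
    return math.sqrt(2 * R_EARTH_M * alt_m)

alt = 30_000 * 0.3048  # 30,000 ft in meters
# Sum of both aircraft's horizon distances, converted to statute miles.
max_range_mi = 2 * horizon_m(alt) / 1609.344
print(round(max_range_mi))  # roughly 424 miles, comfortably above the 300-mile figure
```

Real radio range would be reduced by transmitter power, antennas, and weather, so 300 miles is a sensibly conservative working number.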

Comment Re:not worth it (Score 1) 461

The costs of maintenance and testing of new "black box" systems ought not be different in magnitude from the maintenance and testing costs for the current 1960s-vintage black box systems. Probably less.

Let's start thinking outside the box. What else could be improved in black box machinery? Two things occur to me right off.

(1) If I understand current practice, there are two independent black boxes. One collects a large number of instrument readings from the plane, and the other records sounds (voice) in the cockpit. This separation might have made sense in the '60s, but any trash piece of hardware today ought to be capable of recording both. By all means planes should have two independent recording black boxes, but if they each collect _all_ the data, then it would only be necessary to recover one after an incident.

(2) Black boxes are necessarily designed to survive serious impact and submersion in salt water under extreme pressure for a nearly indefinite time. But if both are securely attached (to the airframe?) somewhere in back (where they are most likely to survive impact), then after a sea crash they will be somewhere at the bottom of the sea. Perhaps some system could be devised so that one black box (or perhaps a third) could be external to the plane, attached in a way that would quickly disintegrate in salt water or upon severe impact, and that would furthermore have positive buoyancy (it floats).

Item (2) could turn a difficult retrieval problem into an easier one, but I recognize that anything that makes an external black box easy to separate from a plane would create a danger of unintended separation. Obviously, separation can't be triggered by water alone, since planes must taxi and fly through rainstorms. But perhaps a modern flight recorder with its independent power and sonar pingers could be built into a box weighing less than a pound, such that a rare separation from the host aircraft would be less of a hazard to those on the ground and an insignificant detriment to the plane's aerodynamics. I believe we humans can currently design and launch satellites with similar performance/weight ratios.

(Just thinking --- thinking isn't in my job description.)

Comment Re:much ado about nothing (Score 1) 506

Near a border is an irrelevant legal distinction. You're in one region or the other, and you have to comply with the laws of that region. And yes, you should have to.

There is a subtle cultural difference between Canadians and USAicans that originates more than two centuries ago.

In 1775 and the years immediately following, most colonials decided not to comply with the laws of the region and revolted. Our gentle neighbors to the North remained compliant. Some number of those living in revolutionary territory who did not want to revolt emigrated to what is now Canada.

That voluntary separation has served well, since we've mostly stayed friends and allies since then (except in hockey and curling). Compare the voluntary, amicable, and successful separations of Norway and Sweden, and of Slovakia and the Czech Republic. Compare the century-long mess that was the United States after the failure of separation in 1865. (This is not to imply that successful Southern secession would have been a good thing, but I digress.)

Comment Re:Makes no sense. (Score 1) 478

My father taught me photography with a camera that had no electronics whatever, above the quantum-mechanical layer of photons interacting with silver halide crystals. Still have one around somewhere. Let me know how anyone would propose disabling this device without subjecting the entire bus to dangerous effects such as harmful levels of X-rays.

Comment Re:A "clipped" audio signal is still a valid signa (Score 1) 526

Protecting against thermal burnout doesn't require expensive DSP algorithms. The audio drivers could come close by simply maintaining a time-decayed value of the recent audio sample delta. Easy to compute. The time constant would be something a little faster than the rate at which the speaker can dump heat to the environment. That computation would track fairly closely the power dissipation (heat) in the speaker, and when it is unsafely high, the driver should stop and flag some sort of error popup.
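
A sketch of the idea, substituting an exponentially decayed average of squared amplitude (a standard power proxy) for the raw sample delta. The decay constant and threshold are made-up values; real ones would depend on the speaker's thermal mass and heat-dumping rate.

```python
def make_thermal_guard(alpha: float = 0.001, threshold: float = 0.5):
    """Return a per-sample guard tracking a decayed estimate of signal power."""
    state = {"avg_power": 0.0}
    def feed(sample: float) -> bool:
        """Feed one normalized sample in [-1, 1]; returns True while the level is safe."""
        # One-pole exponential moving average of instantaneous power (sample^2).
        state["avg_power"] += alpha * (sample * sample - state["avg_power"])
        return state["avg_power"] < threshold
    return feed

guard = make_thermal_guard()
assert all(guard(0.2) for _ in range(10_000))   # moderate level: estimate settles at 0.04, safe
loud = make_thermal_guard()
assert not all(loud(1.0) for _ in range(2000))  # sustained full-scale signal eventually trips it
```

The nice property is that brief loud transients pass untouched; only sustained high power, the thing that actually cooks a voice coil, trips the guard.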
