


Comment Re:This is taking the wrong approach (Score 1) 291

Why should machines use UTC at all? We have a time standard that doesn't use leap seconds - Atomic Time (TAI). We can convert between the two fairly easily. So why not instead push for software to use TAI in place of UTC, and then convert for output or whatever?

Yup; down at the lowest ("OS") level, that's exactly what's done, on all computer systems except MS's DOS/Windows (and who among us actually knows what those systems do internally? ;-)

If you examine the source code of a Unix or Unix-like OS such as Linux, you'll find that it has and uses the second counter that the time(3) function returns, and has no need for anything above that. There are various user-level library routines that convert the time() value to assorted human-readable formats. Leap seconds are a feature of user-level code, not of the underlying system.
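
The division of labor above can be sketched in a few lines of Python (my illustration, not anything from the comment itself). The kernel hands out a bare second counter; everything human-readable, including leap-second and TAI bookkeeping, lives in user-level code. The 37-second TAI-UTC offset below is the cumulative leap-second count as of 2017; real code would look it up in a current leap-second table rather than hard-coding it.

```python
import time

# Kernel-level view: just a second counter (seconds since the Unix
# epoch, 1970-01-01 00:00:00 UTC). No dates, no time zones, no leaps.
now = time.time()

# Everything human-readable is user-level library code layered on top:
utc_struct = time.gmtime(now)  # broken-down UTC fields
print(time.strftime("%Y-%m-%d %H:%M:%S UTC", utc_struct))

# Converting UTC seconds to TAI is just an offset. 37 s is the
# cumulative leap-second count as of 2017; keep it current from a
# leap-second table in real code.
TAI_MINUS_UTC = 37
tai_seconds = now + TAI_MINUS_UTC
```

The point of the sketch is that the counter itself never needs to know about leap seconds; only the display/conversion layer does.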

I've seen any number of cases where a project attempts to deal with time via higher-level data formats than the simple second counter. All have eventually failed, and reverted to the second counter, which just keeps ticking along and leaves the conversion to complex time/date "display" formats to Someone Else.

The one remaining problem is all the date formats that can't be reliably converted back to seconds. I keep running across dates like 10/8/12, for which it's utterly impossible to decide which field is the year, which is the day, and the other one must be the month. This often goes along with a time format that doesn't bother to include the time zone (or whether DST is in effect ;-). But this is a social problem, not a technical one. The programmers producing such formats apparently don't understand that they aren't usable by the software, or even by humans in the near future.
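
To make the ambiguity concrete, here's a small sketch (my example, not from the comment): the same "10/8/12" string parses to three different dates depending on which convention you guess, and nothing in the data itself tells you which one the writer meant.

```python
from datetime import datetime

s = "10/8/12"

# Three plausible readings of the same string:
us  = datetime.strptime(s, "%m/%d/%y")  # US: month first  -> 2012-10-08
eu  = datetime.strptime(s, "%d/%m/%y")  # day first        -> 2012-08-10
ymd = datetime.strptime(s, "%y/%m/%d")  # year first       -> 2010-08-12

# All three are distinct, valid dates:
assert len({us, eu, ymd}) == 3

# An ISO 8601 rendering removes the ambiguity (and can carry a
# UTC offset as well):
print(us.strftime("%Y-%m-%d"))
```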

(Actually, I've seen some hints that a "formatted" time is in use inside Macs, perhaps in OS X itself, or in a very low-level library. This would explain some time anomalies that have been observed in a few apps. I wouldn't be surprised if they'd done something like this, to go along with the way the kernel munges file names so that strcmp() says the name in the directory is not equal to the name passed to open() when the file was created, thus breaking a lot of software developed on other systems. Maybe some day it'll be found and fixed. Or maybe not.)
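
The filename behavior described matches the decomposed-Unicode normalization that HFS+ applied to file names. A minimal sketch of the effect (my illustration, assuming that's what the comment refers to): the name you pass to open() and the name stored in the directory can be different byte sequences for the "same" string, so a byte-wise strcmp()-style comparison fails.

```python
import unicodedata

# A file name created with precomposed "é" (U+00E9) ...
name_nfc = "caf\u00e9"

# ... may come back from the directory decomposed (U+0065 + U+0301),
# which is the NFD-style form HFS+ stored.
name_nfd = unicodedata.normalize("NFD", name_nfc)

# A byte-wise comparison says these are different names:
assert name_nfc != name_nfd
assert name_nfc.encode() != name_nfd.encode()

# Portable code has to normalize before comparing:
assert unicodedata.normalize("NFC", name_nfd) == name_nfc
```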

Comment Re:This seems contradictory (Score 1) 210

Snowden isn't accused of a sex crime as Assange is, and that ultimately is the only difference really that I can see. ...

Wonder why not? You'd think it'd be just as easy to find (and fund) a few women to accuse Snowden of sexual assault. It's not like US prosecutors have never used such tactics in the past.

One conjecture is that the US government keeps thinking it can get an assassin in to take him out, but that's turning out to be a bit trickier in Russia than it was in Pakistan or a few other countries we might list. Maybe the judicious thing would be to abandon that approach, and revert to the tried-and-true sex accusations. Of course, given their history, the Russian leaders might just lol at such an approach.

Comment Re:"and time runs backward as well as forward." (Score 4, Funny) 257

Remember that these are the same people arguing against the Universe having a creator....

Nah; they've just found that the "creator" worked at what we consider the (heat) death of the universe, and the creation has run backward since then. We don't remember something until the universe reaches the event we're trying to remember, and then it sends a description of the event forward along our time line. This transmission has a significant error rate, of course.

Does that clear it all up? If not, wait a bit, and someone farther back will send a more detailed explanation. Of course, since it'll be traveling longer, there'll be more dropped bits, so we may not be able to make as much sense of it.

Comment Re:Fair use exception for research purposes? (Score 1) 204

My understanding is that a lot of scientific work are funded via public money, yet the copyright gets assigned to private entities. In the context of copying vs. 'taking', their behavior is closer to 'taking' than what the researchers are doing. Simply because they prevent access to it by others.

If viewed as a public "investment", limiting access to the knowledge actually reduces the "payback" by not spreading the findings to anyone who wants it. This in turn probably lowers overall quality by having fewer (and perhaps less qualified) people examining the findings. ...

A number of historians have made a similar argument. The idea is that the "scientific method" is hardly new, and can't account for the rapid development of modern technology over the past few centuries. We have plenty of evidence that the scientific approach has been widely understood since prehistory, everywhere in the world. But new knowledge has generally been closely held by small "guilds" that keep it secret, so the only knowledge is what's in the mind of the current members of a small group. The result is loss of information over generations, and widespread rediscovery of the same results in different societies.

The important thing that happened in Europe a few centuries back was the concept of open publication. The result of this was what Isaac Newton characterized as "standing on the shoulders of giants". By this he meant the passing of information in a print form, to anyone able and willing to read it, learn from it, and go on to new discoveries rather than laboriously rediscovering what others had known years before.

The copyright system is a throwback to the old method of closely-held information that others can't build on, and sometimes can't even learn. Maybe it wasn't meant that way, but that's what 20th-century changes in copyright law have turned it into. Anything we can do to defeat it and revert to an open-publication system is for the good of all of us. (This includes those who are using it to block medical advancement that could have produced treatment for whatever eventually kills them.)

Comment Re:LOL .. RICO (Score 1) 136

So if you're one of those reading this story and thinking "OMGWTFBBQ that so unfaaaaair make it illeeeeegal now OMG," then do us all a favor and don't expose yourself to contract negotiations with 800lb gorillas like Oracle. There are grownups for that work.

Nah; you don't need a (human) grownup; you need a bigger, more aggressive gorilla. If you don't have one, the grownups in your organization should have the good sense not to try doing the job themselves. The sort of battles that gorillas engage in are not the sort that even the "best" human would want to tackle. Humans won out over gorillas not by being more powerful, but by being more intelligent. If you fight them on their own terms, you lose (no matter which species you are ;-).

Comment Re:No way I'd agree to give away private data (Score 1) 32


On one hand, you can try to prevent abuses of personal genome data by having all kinds of laws to try to keep people's genome data private. On the other hand, you can make it illegal to abuse people on the basis of their genome data.


This isn't a new idea. I've run across a number of explanations of a decades-old bunch of statistics: Scandinavia has contributed medical information extracted from health databases far out of proportion to the size of their populations or the number of medical researchers. The explanation seems to be that, rather than making medical data secret, they decided to make it fairly open (especially to medical and biological researchers), and passed laws with serious punishment for "abusing" the information. It's not perfect, of course, but the large fines levied against a number of companies for things like discriminating in hiring on the basis of medical records or DNA have resulted in general acquiescence to having personal data in the databases. As a result, researchers can do all sorts of statistical studies on the data, publish the results, use the results to get funding for more research, etc., etc. People don't cooperate as much in the rest of the world due to the widespread attempts to keep the data private.

Attempting to keep medical data secret really just means a secret "market" for the information, and a lot of difficulty proving who was responsible for the sorts of abuses we see in a lot of the rest of the world. People understand this, so it should be no surprise that they might not want their personal medical information in the databases.

Information is important if you want to live a long, healthy life. If we make the information secret, it'll mostly be used by those with the clout to get it for their own financial (and/or political) purposes. If we share it, the chances are good that it'll be used to diagnose and control or cure medical problems that you may have in the future.

Comment Re:Racism v. Bias v. Intelligence (Score 1) 445

If there's anything the Jews are gifted at, it's nepotism.

I've had a good number of Asian friends who've claimed that they're better at it than the Jews.

(Actually, you mostly hear this claim from people in the "Chinese diaspora" population, who like to point out that this population has a social role in Asia very similar to the Jews, Gypsies and Greeks in Europe, and the Arabs in southern Asia. They're historically a population of merchants who've lived in shoreline "ghetto" enclaves outside of China proper, and they've faced all the same sorts of prejudice and discrimination as a result. So it's not surprising that they'd have a lot of similar "social support" traditions.)

Comment Re:Internet Phase-Out (Score 1) 57

The Internet was built from the ground up with fault-tolerant collaboration at the heart. It never occurred to the well meaning scientists and engineers that some of the users would be out and out assholes.

Huh? The design and implementation of the Internet, and its predecessor the ARPAnet, was done with roughly 99% military funding. The fault tolerance was there from the start, because the military explicitly wanted a comm system that would survive constant attack by enemies under battle conditions. The scientists and engineers involved understood this quite well, and testing by implementing and running "cyberattack" software was routine from the very early days.

Saying that such attacks "never occurred to the well meaning scientists and engineers" not only shows ignorance of how the Internet came to be, but also dismisses the hard work of a lot of the people who created it. I worked on a number of test suites back in the 1980s that could be (and sometimes explicitly were) characterized as "attack" packages. This was neither a joke nor an accusation; it was a simple description of how the test suites worked. Stress testing and testing-to-destruction are old concepts in most kinds of engineering, and the ARPA/Internet was no exception.

One of the real problems is that the commercial Internet is managed by companies that have a strong motive to save money by cutting back on "unnecessary" things like testing and redundancy (so that the saved money can be redirected to managers' bonuses, of course ;-). But this was actually understood quite well by the military funders. It's part of why the design didn't include low-level security, but emphasized redundancy and a "just deliver the bits undamaged" approach. It was understood that the only meaningful security is the type called "end-to-end", where the participants in a conversation are the ones that provide and manage the security. If you rely on the suppliers of the low-level equipment, they'll always take shortcuts that make the security worthless. That's pretty much exactly how the Internet works, and always will.

Comment Re:Lets outlaw the word Cyber (Score 1) 57

Nah; the "cyber-" prefix is useful. It's a clear clue that the writer/speaker is relatively clueless about all that interwebs stuff, and only knows a few techie-sounding terms that they use to sound like they know something. Banning the use of such linguistic clues would merely make it a bit more difficult to recognize cluelessness, since we'd have to actually read their comments to decide that they're not worth reading.

It's similar to the use of "hacker", which is another scare term, but it's useful as a clue that the writer is relatively clueless about computer-security issues.

The (mis)use of such terms is also a useful clue to those of us who are trying to find the people who need some educating about technical issues. But that's a different topic, so we should start a new thread if we want to talk about it.

Comment Re:Decimate (Score 1) 231

[...]the H1-B system is totally broken and is being used to help decimate the American middle class.

Dec.i.mate: kill one in every ten of (a group of soldiers or others) as a punishment for the whole group.

As long as it's only one in ten, I'm kind of OK with this. Also, I'm kind of OK with the idea that such punishment is actually deserved, since it implies 90% "good apples" and 10% "bad apples", which, if you've ever worked a middle class job, is very easy to credit as an underestimation.

Except you're a couple of millennia out of date with that definition. I decided to check with a few online dictionaries before commenting. Most of them give a definition much like that of the Cambridge dictionary: [T]o kill a large number of something, or to reduce something severely: Populations of endangered animals have been decimated.

Some do also give the original Latin "kill 1/10th of" definition, but they generally make it clear that that was the Latin meaning, not the modern English meaning. Some even say that it's considered poor form in English to bother specifying the fraction eliminated, on the grounds that it's redundant for people who understand the word and confusing for those who don't.

Comment Re:Is the NYT Racist? (Score 1) 231

**rolls eyes**

I'm so fucking tired of people pretending there is only one definition for the word "decimate".

Heh; another victim of the "etymology is destiny" doctrine (as one linguist - whose name I've forgotten - called it a few years back).

Yup, in Latin, "decimate" meant to kill every 10th man. In modern English, it means to destroy a significant but unspecified portion of a set. How large depends on the speaker/writer, who usually can't be bothered to give the fraction.

It's yet another case of English raiding another language for useful words, and mangling both their pronunciations and their meanings to the point that speakers of the original languages wouldn't recognize either the sound or the meaning.

But if you want to continue using English, you should recognize this general problem, and take it into account. You'll understand that, while the original sound and meaning of a word in the source language is of historic interest, it's nearly useless in decoding the usage in English.

Comment Re:Foxnews.onion (Score 1) 37

How about theonion.onion? Would that be a meta-site for faking fake news, and hiding the people that are thus releasing actual valid information disguised as satire? Sounds like a useful site for the world's whistle blowers ...

(The folks over at theonion.com have been known to "complain" about all the dummies who post their stories as factual news reports. Maybe we could help them out here.)

Comment Makes sense ... (Score 2) 37

In recent elections here in the US, we've been reading of studies showing that the voters who are most knowledgeable about the candidates and the issues are those who follow various satirical news sites. The Daily Show, the Colbert Report, the Onion, and even Wait Wait Don't Tell Me have been named as being highly correlated with informedness. So yes, it makes at least minimal sense to have a satire/parody/humor top-level domain.

Of course, Poe's Law applies even here, and we'll continue to see articles posted as fact, even when they're clearly labelled as satire by their URL.

What I'm looking forward to is someone setting up an actual news site there that specializes in stories that really seem like parody or satire, but are actually true. The world has enough such stories to keep at least a small team of journalists busy.

(And I do expect a reply to the above saying "correlation is not causation", so don't disappoint me ...)

Comment Just the first stage. (Score 1, Insightful) 108

It's probably just a matter of time, perhaps not much time, before some entrepreneurs figure out that this is a generally useful marketing tactic. We can expect that the little "selfie" cameras on phones and tablets are being turned on briefly by assorted ads delivered along with the web page you looked at, and the images sent back to the mother ship for later use. You won't have to go through the bother of signing in or otherwise identifying yourself, since your ISP/cell company can supply them with that info (for a price). They can then use the photo and your info to persuade you that you should buy some of their products. Or they can just fake the session in which you ordered what they want to sell you.

I generally keep a bit of opaque tape over those cameras except when I actually want to use them.

Lessee, I took the tape off this laptop's camera; let's see if the slashcode knows how to send y'all my photo. It's a Macbook Pro, which should tell you which exploit to use. I'm currently sitting on the patio, in the shade of a grape vine, waiting for the temperature to reach a new historic high here in the Boston area. If you can find my photo, tell me the text on my t-shirt. If anyone succeeds, it'll show that this story isn't just someone's imagination. ;-)
