Comment Re:Better Writeup (Score 1) 156
That's how perl was fixed - XOR each incoming key with a random value generated when the hash is initialized.
The real problem here is that it's fairly easy to compute a set of hash keys that are known to generate collisions on a specific hash table implementation. The easiest fix by far - the fix that perl implemented in 2003 - is to generate a random value when the hash is initialized, and XOR each incoming key with it before processing. That breaks collision prediction on the attacker's side quite effectively.
To be precise, it's elements with *equal* hash values that collide - an identical hash key will simply overwrite the prior value. Internally, the language runs a hash algorithm against the key and uses the resulting value to generate an index into the array that *actually* holds the key/value pair. If multiple keys hash to the same index, then the value at that index will actually be another array, containing all the key/value pairs that mapped to it. You then need to walk that array to find the key you're looking for.
The downside of this, of course, is that if all of your keys map to the same hash value, then you have to walk the list of *all* key/value pairs to find your value. Producing this scenario on demand is how you kill servers with it.
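The bucket walk described above can be sketched with a toy chained hash table - illustrative names and a deliberately weak hash, not any real language's internals:

```python
class ChainedHashTable:
    """Toy separate-chaining hash table for illustration only."""

    def __init__(self, size=8):
        self.buckets = [[] for _ in range(size)]

    def _index(self, key):
        # Deliberately weak hash: sum of character codes.
        # Trivial to produce collisions against on purpose.
        return sum(ord(c) for c in key) % len(self.buckets)

    def put(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(bucket):
            if k == key:                 # same key overwrites the prior value
                bucket[i] = (key, value)
                return
        bucket.append((key, value))      # colliding key joins the chain

    def get(self, key):
        # Worst case: every key landed in this one bucket,
        # so this loop walks *all* key/value pairs.
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        raise KeyError(key)


t = ChainedHashTable()
t.put("ab", 1)
t.put("ba", 2)   # "ab" and "ba" collide under the toy hash
```

An attacker who can feed in thousands of keys that all land in one bucket turns every lookup into a linear scan, which is the whole attack.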
The "real" code fix so far is to transmute the key with a random value (generated at application startup, or at instantiation of the hash map) before running the hash algorithm, thus making it impossible to predict which keys will generate hash collisions. This is how perl was fixed back in 2003.
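A minimal sketch of that per-instance seeding idea, assuming a djb2-style hash step - this is not perl's actual implementation, just the shape of the fix:

```python
import os


class SeededHash:
    """Sketch of a hash that mixes in a per-instance random seed."""

    def __init__(self):
        # Random seed chosen when the hash is initialized.
        self.seed = int.from_bytes(os.urandom(4), "big")

    def hash_key(self, key: bytes) -> int:
        h = self.seed                             # mix the seed in up front
        for b in key:
            h = ((h * 33) ^ b) & 0xFFFFFFFF       # djb2-style step
        return h


a = SeededHash()
b = SeededHash()
# Within one instance the hash is deterministic, but two instances
# almost certainly disagree on where any given key lands, so an
# attacker cannot precompute a set of colliding keys.
```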
Most folks seem to simply be setting limits on the number of fields in POST (or the maximum size of a POST payload) for now until they can fix their code. Putting limits on the number of HTTP headers in a request is needed as well, as apache itself puts headers in a hash map.
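For apache specifically, those caps can be set with its stock directives - the directives are real, but the limits shown here are illustrative, not recommendations:

```apacheconf
# Stopgap mitigations until the hash implementation itself is fixed.
LimitRequestFields 50       # max number of HTTP header fields per request
LimitRequestBody 1048576    # max request (POST) body size in bytes (1 MiB)
```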
Entirely plausible. Conficker's phone-home mechanism was an algorithm that hashed the current date/time to generate a nonsense domain name, which it would then try to look up and grab a payload from. All the Bad Guys had to do was register one a few hours in advance, put up the payload, and wait. The groups who were fighting the thing managed to decompile the algorithm and play it forward, generating a list of hundreds of thousands of domain names that they then took to the various registries to get blocked. Paul Vixie was a big part of this, and here's a pretty good article on the group.
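A hedged sketch of the date-seeded domain-generation idea - not Conficker's real algorithm, just the general shape of one, with illustrative names throughout:

```python
import hashlib
from datetime import date


def domains_for_day(day: date, count: int = 5, tld: str = ".com"):
    """Generate deterministic nonsense domain names for a given date.

    Anyone running the same algorithm for the same date gets the same
    list - which is what let defenders play it forward and pre-block.
    """
    names = []
    for i in range(count):
        digest = hashlib.md5(f"{day.isoformat()}-{i}".encode()).hexdigest()
        # Map part of the hex digest onto letters to get a hostname.
        name = "".join(chr(ord("a") + int(c, 16) % 26) for c in digest[:10])
        names.append(name + tld)
    return names


# Defenders (or attackers) can run the algorithm for a future date:
# domains_for_day(date(2009, 4, 1))
```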
It would not surprise me at all if CIA/Mossad/etc managed to get one of those domains un-blocked and used to deliver the Stuxnet payload.
More to the point, it seemed that the biggest initiatives within Yahoo while I was there (from 2009 until early this year) were *all* centered around profit, not users - mainly, cost-cutting and ad tech. As if the goal wasn't to grow users, just grow revenue and profit per existing user. What opened my eyes was when the cost-cutting initiatives that made sense - primarily the data center consolidations, which definitely needed to get done ASAFP - started getting pushed back due to the need for quarter-to-quarter profit management. Bartz should have grown a pair, pushed forward the consolidation even if it meant missing the street for the quarter, allowing Yahoo to reap the rewards much sooner.
I'll also never forget the quarterly all-hands meeting where the major product announcement for the quarter was...*full-page ads on the login page*.
Sorry I didn't stick around to see Bartz go, but I couldn't risk her *not* going.
I'm guessing the $3 comes from $1 for each of the three charges in the original suit - the lowest amount a US Judge is allowed to award a plaintiff. In other words: "I have to decide in your favor, but I'll be damned if you actually get anything out of it".
Richard Branson, Bill Gates, Warren Buffett, Michael Bloomberg, Brin/Page, Zuck, and Larry Ellison would all like to have a word with you.
"Work For Hire" provisions are unenforceable in recording contracts because US copyright law is rather specific about what can be considered a work for hire - and sound recordings are not on the list. It was briefly added in 1999 but was removed a year later.
http://www.salon.com/technology/feature/2000/06/14/love/print.html
Apparently a "work for hire" provision did get slipped into federal copyright law - and I mean literally slipped in while no one was paying attention. After Love's speech brought attention to this, the provision was repealed a year later.
So unless the laws get changed again (and the RIAA *will* try), the artists have the upper hand. Sad to imagine how much they'll spend in legal fees to get to their money though.
As a 39-year-old who switched from perl to python about 3 months ago, I can agree with this statement.
No, unfortunately...I think I just renamed it 'irc' or 'company-irc' or similar.
Is this the same scanning that Google does with GMail? If so, why no outcry there?
True story - when I was implementing an internal IRC network for a former employer, I was instructed to add BitchX to our desktop UNIX builds - but rename the binary.
Or Royksopp's video for "Happy Up Here"...
The RPC system they're using is Thrift (http://thrift.apache.org/), which they developed because JSON was becoming a bottleneck. And yeah, there's a metric crapload of memcached in their data centers as well. The multi-hour outage Facebook had late last year was due to a near-complete failure of the memcached layer, resulting in an overload of requests to the main mysql farms.