The point I was making was not that you should create an unaltered copy of the original, but that you should collect enough data on the variation between the copies to mess up the watermark enough to render it useless: at each deviating section, randomly pick the formatting/wording from one of the N copies you obtained. The result may be that you have variants A, B and C as source material and your scrambling makes the output look like variant K, so the buyer of variant K will be blamed until they figure out that they are chasing in the wrong direction.
And if the publisher does vary the text between different e-book copies, anyone who wants to get around it would just need a few copies and a statistical analysis to blank out the differences.
This is similar to what steganography does, so if you scramble the inserted punctuation it will be really hard to track down the perpetrator - or the wrong party may even be pointed out.
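Roughly what I mean, as a quick sketch - assuming the copies are plain text and the watermark shows up as per-line differences; the line granularity and the usage are just made up for the illustration:

/* Toy watermark scrambler: read N differently-watermarked copies of the
 * same text and, wherever the copies disagree on a line, emit a randomly
 * picked variant.  Line-based granularity is an assumption made for the
 * sake of illustration - a real e-book would need real parsing. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

#define MAXLINE 4096

int main(int argc, char **argv)
{
    if (argc < 3) {
        fprintf(stderr, "usage: %s copy1.txt copy2.txt [copy3.txt ...]\n", argv[0]);
        return 1;
    }

    int n = argc - 1;
    FILE **f = malloc(n * sizeof *f);
    for (int i = 0; i < n; i++) {
        f[i] = fopen(argv[i + 1], "r");
        if (!f[i]) { perror(argv[i + 1]); return 1; }
    }

    srand((unsigned)time(NULL));
    char (*line)[MAXLINE] = malloc(n * sizeof *line);

    for (;;) {
        int got = 0;
        for (int i = 0; i < n; i++)
            if (fgets(line[i], MAXLINE, f[i]))
                got++;
        if (got < n)
            break;  /* copies assumed equally long; stop at the shortest */

        /* If any copy deviates from the first, emit a random variant,
         * otherwise just emit the (identical) line. */
        int differs = 0;
        for (int i = 1; i < n; i++)
            if (strcmp(line[0], line[i]) != 0)
                differs = 1;

        fputs(line[differs ? rand() % n : 0], stdout);
    }

    for (int i = 0; i < n; i++)
        fclose(f[i]);
    free(f);
    free(line);
    return 0;
}

The more copies you feed it, the less likely it is that the scrambled output maps cleanly back to any single buyer.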
So now Pandora's box has been opened.
I don't even trust myself.
And you will always have people who fly under the radar and can create trouble. The Unabomber is a good example of a person who really didn't leave much trace for the investigators to follow. You can't catch everyone; the measures taken with passport controls etc. are annoying for the public and won't really achieve anything.
Even more "fun" is that the name is the key they search on, not the individual - so someone who changes the name on their passport may pass through unchecked, while someone who happens to have an identical name but isn't the same person may be scrutinized six ways to Sunday for no result.
And then require the supplier to be on site to do the upgrades, to make sure they do it right. Screw anyone who complains; bring it to the highest level of the organization with hard numbers on how much a stoppage will cost.
Total isolation of mission critical networks is the only thing that works.
And there's another blog entry on it: Where Things Fall Apart: Protocols (Part 2 of 2)
The summary is that there's a mutual authentication key (MAK) that should be different for each vehicle on the road. However, if some manufacturer has taken a shortcut and used the same key for a large number of vehicles, then all those cars are at risk, and looking at the article that seems to be the case - the device works for some vehicles but not others.
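A toy sketch of why that matters - mix() is just a stand-in for whatever MAC the real protocol uses and the key values are made up; the point is only that a challenge-response proves knowledge of the key, so a key pulled out of one unit answers the challenge for every vehicle provisioned with that same key, while a vehicle with a unique key stays closed:

/* Toy challenge-response with a shared secret.  mix() is NOT a real MAC,
 * just a stand-in.  The car accepts anyone who can compute a valid
 * response, i.e. anyone who knows the mutual authentication key - so if
 * a whole model range shares one key, a key extracted from a single unit
 * opens every car in the range. */
#include <stdio.h>
#include <stdint.h>

static uint64_t mix(uint64_t key, uint64_t challenge)
{
    uint64_t x = key ^ challenge;      /* illustration only, not crypto */
    x *= 0x9E3779B97F4A7C15ULL;
    x ^= x >> 31;
    return x;
}

/* Car side: issue a challenge, check the response against its own key. */
static int car_accepts(uint64_t car_key, uint64_t challenge, uint64_t response)
{
    return response == mix(car_key, challenge);
}

int main(void)
{
    uint64_t shared_mak = 0xDEADBEEFCAFE1234ULL; /* same key in many cars */
    uint64_t car_a_key  = shared_mak;
    uint64_t car_b_key  = shared_mak;
    uint64_t unique_key = 0x0123456789ABCDEFULL; /* properly provisioned car */

    /* Attacker has extracted the key from one unit... */
    uint64_t stolen_key = shared_mak;

    uint64_t ch_a = 0x1111111111111111ULL;
    uint64_t ch_b = 0x2222222222222222ULL;

    /* ...and can now answer challenges from any car sharing that key. */
    printf("car A (shared key) accepts attacker: %d\n",
           car_accepts(car_a_key, ch_a, mix(stolen_key, ch_a)));
    printf("car B (shared key) accepts attacker: %d\n",
           car_accepts(car_b_key, ch_b, mix(stolen_key, ch_b)));
    printf("car with unique key accepts attacker: %d\n",
           car_accepts(unique_key, ch_a, mix(stolen_key, ch_a)));
    return 0;
}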
As for their habit of going in on the passenger side - that's where the glove compartment is, and where any valuables are most likely to be found.
The world is not enough.
It's a bit tricky there - the old drugs may be effective but have side effects, while the new drugs are more targeted and have fewer side effects.
However, a more targeted drug may not be effective in all cases, because the problem may not be what the drug targets, while the old drug was broad enough to work regardless.
The cause of depression may vary greatly between individuals, and even when it can be narrowed down to a genetic cause there may still be a lot of variation.
Just add a pile of Dallas DS18B20 1-wire sensors.
But if you want to measure other things too, or want a method that doesn't require much effort to apply, then the Roomba approach is interesting: you can set it up in the area you want to monitor in a few minutes, leave it for a week and then come back to a well-swept room and a decent amount of data.
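If you go the DS18B20 route the readout side isn't much work either - on Linux with the w1-gpio and w1-therm drivers loaded every sensor shows up under /sys/bus/w1/devices/, and a sketch like this (the sensor ID is made up) polls one of them:

/* Minimal DS18B20 readout on Linux via the w1-therm sysfs interface.
 * Assumes the w1-gpio and w1-therm modules are loaded and that the
 * sensor ID below is replaced with a real one from /sys/bus/w1/devices/. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *path =
        "/sys/bus/w1/devices/28-000005e2fdc3/w1_slave"; /* hypothetical ID */
    char buf[256];

    FILE *f = fopen(path, "r");
    if (!f) { perror(path); return 1; }

    /* The file has two lines; the second ends with "t=<millidegrees C>". */
    size_t n = fread(buf, 1, sizeof buf - 1, f);
    buf[n] = '\0';
    fclose(f);

    char *t = strstr(buf, "t=");
    if (!t || !strstr(buf, "YES")) {   /* "YES" means the CRC checked out */
        fprintf(stderr, "bad reading\n");
        return 1;
    }

    long milli;
    sscanf(t + 2, "%ld", &milli);
    printf("%.3f C\n", milli / 1000.0);
    return 0;
}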
And both the long and short tons should be obsolete by now. Even the UK is metric these days.
Why do you need to specify metric tons? It should be enough to say tons, a.k.a. 1000 kg. Stating "metric" these days is redundant.
By the way, the Swedish mile is 10 km while the British/American statute mile is 1.609344 km.
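For reference, the exact values (the avoirdupois pound is defined as exactly 0.45359237 kg):

/* The various "tons" and "miles" in SI units, for reference. */
#include <stdio.h>

int main(void)
{
    const double lb        = 0.45359237;      /* kg                 */
    const double short_ton = 2000.0 * lb;     /* US ton             */
    const double long_ton  = 2240.0 * lb;     /* UK/imperial ton    */
    const double tonne     = 1000.0;          /* metric ton, kg     */
    const double mile      = 1609.344;        /* statute mile, m    */
    const double mil       = 10000.0;         /* Swedish mile, m    */

    printf("short ton: %10.4f kg\n", short_ton);  /*  907.1847 kg */
    printf("long  ton: %10.4f kg\n", long_ton);   /* 1016.0469 kg */
    printf("tonne    : %10.4f kg\n", tonne);
    printf("mile     : %10.3f m\n", mile);
    printf("mil      : %10.3f m\n", mil);
    return 0;
}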
Of course performance has improved since the P4, but the point is that it has been done by tweaking things like caches, speculative execution of branches with the unwanted paths discarded, prefetching, various types of pipelining, out-of-order execution and so on.
All this means that in order to achieve high performance you have an architecture that does a lot of work which eventually gets thrown away because it was done just in case. The catch is that it costs energy and adds complexity. The benefit, of course, is a processor that executes multiple instructions per clock cycle, which improves performance.
However, there's a limit to the performance gain these improvements can provide.
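To get a feel for how much that speculative machinery actually does, the classic sorted-versus-unsorted demo is enough - same loop, same branch, but the branch predictor can only do its job in the second case. Compile without heavy optimization, otherwise the compiler may replace the branch with a conditional move and hide the effect:

/* Same data, same branch, very different running time depending on
 * whether the branch predictor can guess the outcome. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 20)

static long long sum_over_threshold(const int *data, int n)
{
    long long sum = 0;
    for (int i = 0; i < n; i++)
        if (data[i] >= 128)   /* this is the branch being predicted */
            sum += data[i];
    return sum;
}

static int cmp_int(const void *a, const void *b)
{
    return *(const int *)a - *(const int *)b;
}

int main(void)
{
    int *data = malloc(N * sizeof *data);
    for (int i = 0; i < N; i++)
        data[i] = rand() % 256;

    clock_t t0 = clock();
    long long s1 = 0;
    for (int rep = 0; rep < 100; rep++)
        s1 += sum_over_threshold(data, N);   /* random order: ~50% mispredicted */
    clock_t t1 = clock();

    qsort(data, N, sizeof *data, cmp_int);   /* sorted: branch becomes predictable */

    clock_t t2 = clock();
    long long s2 = 0;
    for (int rep = 0; rep < 100; rep++)
        s2 += sum_over_threshold(data, N);
    clock_t t3 = clock();

    printf("unsorted: %.2fs  sorted: %.2fs  (sums %lld %lld)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC,
           (double)(t3 - t2) / CLOCKS_PER_SEC, s1, s2);
    free(data);
    return 0;
}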
It's an interesting addition which can be useful for some.
But when it comes to general performance improvement it's rather disappointing. It looks like they have fine-tuned the current architecture without actually adding anything that increases performance at the rate we have seen over the last decades. To some extent it looks like we have hit a ceiling with the current overall computer architecture and that new approaches are needed. The clock frequency is basically the same as for the decade-old P4, and the number of cores on a chip seems limited too, at least compared to other architectures.
One interesting path for improving performance is what Xilinx has done with the Zynq-7000, which combines ARM cores with an FPGA, but it will require a change in the way computers are designed.
Those extra pixels in height make a difference, especially if you do something other than watching movies and playing games.
Widescreen for computers isn't really that good, since a lot of computer work is about reading and writing, not active content. A 4:3 monitor makes sense if you work with static content where you want a good overview without resorting to scrolling up and down.
A 1920x1440 monitor would be interesting if it was decently priced.