That article is actually linked to in the post. It's just one of the worst submissions I've seen in a while at making it hard to find the article it's referencing.
That was actually linked to in the post. Sadly, the post was remarkably unclear as to what it was actually referring to.
Which is exactly what the paper is about: the trapezoidal rule it has 'invented' is equivalent to drawing straight lines through the points (pretty much the simplest curve you can fit) and integrating.
There are indeed better ways of doing it, but what's interesting is that the paper claims researchers were using techniques that were even worse.
This isn't integration. This is a numerical technique for estimating the area under the curve (the trapezoidal rule). It's a somewhat different branch of mathematics from integral calculus, which deals in infinitesimal limits to provide exact results. You can't use integral calculus here, as there is no formula to integrate, only experimental results.
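To make that concrete, here is a minimal sketch of the trapezoidal rule applied to sampled data (the function names and sample values are mine, for illustration); note that it copes with unevenly spaced sample points:

```python
def trapezoid_area(xs, ys):
    """Estimate the area under sampled points (xs, ys) using the
    trapezoidal rule; handles unevenly spaced xs."""
    total = 0.0
    for i in range(1, len(xs)):
        # Area of one trapezoid: average of the two heights times the width.
        total += (ys[i] + ys[i - 1]) / 2.0 * (xs[i] - xs[i - 1])
    return total

# Samples from y = x on [0, 2]; the rule is exact for straight lines,
# so this gives the true area, 2.0.
print(trapezoid_area([0.0, 0.5, 1.3, 2.0], [0.0, 0.5, 1.3, 2.0]))
```

Each segment is literally "draw a straight line through two adjacent points and take the area under it", which is why this is the simplest reasonable estimate.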
It looks like this area is indeed in need of some interdisciplinary communication: what they really need is for a statistician to come up with a robust formula for this that takes the measurement errors into account.
Though a very valid comment (Simpson's rule would be better), note that you may not be able to apply Simpson's rule here directly. The basic form of Simpson's rule needs evenly spaced sample points, which may not be the case for experimental results.
Actually, from the abstract this looks like a moderately interesting paper. Also note that the slashdot summary is (as often the case) wrong. You can't solve the problem the paper is referring to with integral calculus.
The curve that the paper is talking about is an experimental result, not a formula. All you have are the experimental samples from the curve. Without a formula, you CAN'T do integration, and must rely on a numerical technique. What he's 'invented' here is the trapezoidal rule. He'd do even better with something like Simpson's rule, but that may be impossible to apply if the sample points are not evenly spaced. Similar problems affect the various Runge-Kutta methods, which need function evaluations at points you don't have samples for.
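As a sketch of why Simpson's rule is the better choice when the spacing requirement is met (function name and sample values are mine): it fits parabolas through triples of points rather than straight lines, so it is exact for quadratics that the trapezoidal rule can only approximate.

```python
def simpson_area(xs, ys):
    """Simpson's rule for evenly spaced samples; needs an even
    number of intervals (an odd number of points)."""
    n = len(xs) - 1
    assert n % 2 == 0, "Simpson's rule needs an even number of intervals"
    h = (xs[-1] - xs[0]) / n
    total = ys[0] + ys[-1]
    for i in range(1, n):
        # Interior points alternate with weights 4, 2, 4, 2, ...
        total += (4 if i % 2 else 2) * ys[i]
    return total * h / 3.0

# Evenly spaced samples of y = x^2 on [0, 2]; the exact area is 8/3.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [x * x for x in xs]
print(simpson_area(xs, ys))  # ~2.6667, exact for polynomials up to cubics
```

The assertion makes the limitation from the parent comment explicit: hand this unevenly spaced experimental samples and the fixed-weight formula simply doesn't apply.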
Although the numerical technique claimed to have been invented here is indeed a basic one, the paper is worthwhile for pointing out that the even cruder numerical techniques used before were overestimating the area under the curve, and that is an interesting result.
Because you are too lazy to add it?
It's special because most Android phones are NOT getting a security update for the known flaw.
Because it has supposedly become easy to create two plaintexts that both hash to the same SHA-1 value. See the section titled "SHA-1", which talks about attacks on the hash function.
Um, the very article you link to lists no found collisions, only theoretical attacks (where an algorithm could be used to find one faster than a brute force search).
Given that I've yet to see an actual SHA-1 collision published, it's hardly "easy" to do...
The problem with the FireSheep discussion is that there is no current solution to this.
People keep saying that the social media sites should use https. However, they CAN'T use https for the entire session: advertising content delivery networks like AdSense don't support https, so it won't work.
As I've mentioned in other messages, this is the real problem. Advertisements can't be served over https as the major networks like Google's AdSense don't support https. This is exactly the kind of third-party content you mention.
So sites that are funded by advertisement will use http not https.
This kind of thing is the fundamental problem. Interoperability issues like this are why the major advertising content delivery networks (including Google's AdSense) don't support https.
As they don't support https, social media sites can't use https for the entire session as they wouldn't be able to serve ads, and so wouldn't make any money.
So we get insecure social media sites, as these are the only ones that can stay in business.
The real problem is that most social media sites CAN'T use https by default.
Most of the advertising content delivery networks (and this does include Google's AdSense) don't support https.
Thus, if the social media site used https for the entire session, then they wouldn't be able to serve ads, and wouldn't be able to fund the service. So it isn't going to happen.
There is a real problem with current web protocols: security is all or nothing. You can use http and be insecure, or use https and break all kinds of network technologies (e.g. proxy caches). There is no way to have authenticated but unencrypted data, and browser security features make it very hard to mix content from different sources.
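The "authenticated but not encrypted" idea is worth spelling out, since HTTP offers no standard mechanism for it. Here is a sketch of the concept at the application layer using an HMAC (Python's hmac/hashlib; the key and page body are invented examples): the content stays in the clear and cacheable, but tampering is detectable by anyone holding the key.

```python
import hashlib
import hmac

key = b"shared-secret"          # hypothetical pre-shared key
body = b"<html>public page content</html>"

# Sender attaches a MAC over the body; the body itself is not encrypted,
# so an intermediary cache could still store and serve it.
tag = hmac.new(key, body, hashlib.sha256).hexdigest()

# Receiver recomputes the MAC and compares in constant time; any change
# to the body would make the comparison fail.
ok = hmac.compare_digest(tag, hmac.new(key, body, hashlib.sha256).hexdigest())
print(ok)
```

Nothing like this is built into http/https as deployed, which is exactly the gap the parent comment is describing.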
Blu-ray full HD video tops out at 40 megabits a second. This is easily handled by any half-decent USB flash drive.
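A quick sanity check on the arithmetic (the USB figure is the theoretical USB 2.0 bus rate, as an assumed point of comparison):

```python
# Peak Blu-ray video bitrate vs. flash-drive throughput.
bluray_mbit_per_s = 40
bluray_mbyte_per_s = bluray_mbit_per_s / 8    # 5 MB/s needed for playback
usb2_bus_mbyte_per_s = 480 / 8                # 60 MB/s theoretical USB 2.0 bus rate
print(bluray_mbyte_per_s, usb2_bus_mbyte_per_s)
```

Even allowing for real-world flash drives achieving only a fraction of the bus rate, 5 MB/s of sustained sequential reading is a very modest requirement.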
There is no need to emulate a particular Windows platform; you just need a stable win32 library for compatibility.
It's pretty easy to write Windows software that will work on any Windows version from 2000 onwards. It's not much harder to support 95 and later. The basics of win32 have been stable for years.
The 16-bit elements are now irrelevant (unless you want to run some very old Windows games). Even Microsoft doesn't support 16-bit software on the 64-bit Windows versions.