With an aging population, it is a worrying thought that it may become possible to attack people wirelessly through their implanted devices.
That is the important part: not now, but in the future. This research is just a demonstration of what is possible, and of how the mistakes being made now may affect all of us later.
From a recent talk by Cory Doctorow, http://boingboing.net/2012/01/10/lockdown.html
As a member of the Walkman generation, I have made peace with the fact that I will require a hearing aid long before I die. It won't be a hearing aid, though; it will really be a computer. So when I get into a car—a computer that I put my body into—with my hearing aid—a computer I put inside my body—I want to know that these technologies are not designed to keep secrets from me, or to prevent me from terminating processes on them that work against my interests.
We need to change the way that the industry and the regulators think about these kinds of devices. Security by obscurity is just not good enough.
As patients (now and in the future), we should demand that all of the software in these devices be open source before they can be certified for use as implants.
Many people on this site have said something along the lines of "If I were designing these devices then I would use [xyz] to make them secure".
The important point is that geeks like us aren't designing these devices, and for the companies that are, security isn't a priority.
Good security is expensive, both in employing extra staff with the relevant expertise and in the developer time needed to implement and test it. Unless peer-reviewed security is required by customers or by government regulation, it is just not enough of a priority to justify the additional cost.
The worst result from this kind of research would be that our politicians jump at a sound-bite solution and make it illegal to own or design a device that could interfere with implanted medical devices. That would prevent the good guys from testing their own devices, while making life easier for the bad guys by allowing manufacturers to get away with poor security.
The best result from this kind of research would be that we make peer-reviewed security and open source code part of the requirements for certification of implanted devices. But that won't happen unless we keep pushing to make it happen.