Two Different Studies Find Thousands of Bugs In Pacemakers, Insulin Pumps and Other Medical Devices 47

Two studies are warning of thousands of vulnerabilities found in pacemakers, insulin pumps and other medical devices. "One study solely on pacemakers found more than 8,000 known vulnerabilities in code inside the cardiac devices," reports the BBC. "The other study of the broader device market found only 17% of manufacturers had taken steps to secure gadgets." From the report: The report on pacemakers looked at a range of implantable devices from four manufacturers as well as the "ecosystem" of other equipment used to monitor and manage them. Researchers Billy Rios and Dr Jonathan Butts from security company Whitescope said their study showed the "serious challenges" pacemaker manufacturers faced in trying to keep devices patched and free from bugs that attackers could exploit. They found that few of the manufacturers encrypted or otherwise protected data on a device or while it was being transferred to monitoring systems. Also, none was protected with even the most basic login name and password system, or checked that the devices it was connecting to were authentic. Often, wrote Mr Rios, the small size and low computing power of internal devices made it hard to apply the security standards that helped keep other devices safe. In a longer paper, the pair said device makers had more work to do to "protect against potential system compromises that may have implications to patient care." The separate study, which quizzed manufacturers, hospitals and health organizations about the equipment they used when treating patients, found that 80% said devices were hard to secure. Bugs in code, lack of knowledge about how to write secure code and time pressures made many devices vulnerable to attack, the study suggested.
Comments Filter:
  • by Snotnose ( 212196 ) on Friday May 26, 2017 @08:21PM (#54495255)
    Companies used to building medical hardware have discovered microcontrollers and hired the cheapest programmer or two they could find to program them. Companies not used to software hire low-skilled programmers and probably give them unreasonable schedules and requirements. Color me shocked.

    I'd love to hear from one of the programmers who worked on one of these things, to hear what they have to say.
    • by Anonymous Coward on Friday May 26, 2017 @10:35PM (#54495783)

      I used to program pacemakers. You are wrong that these are the cheapest programmers they could find, and in general about your assumptions as to how the industry works.

      The medical industry doesn't work that way, at least for life-critical, highest-risk medical devices like pacemakers and implantable defibrillators. If a company operated as you describe, the FDA would shut it down very quickly. The pacemaker company I worked for hired highly skilled people, and used an internal software and hardware development process that makes the other software companies that I have worked for look like a bunch of amateurs by comparison. ISO 9001 compliance and FDA certification is no joke. There were detailed reviews at every level of the software development process: proposal reviews, system design reviews, detailed design reviews, code standards, code reviews, detailed regression tests, near-100% branch coverage (not just statement coverage) by regression tests, static analysis tools, simulation, animal testing, and so forth. It is quite focusing to realize that a coding mistake on your part may very well lead to someone's death.
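      For readers unfamiliar with the distinction the parent draws, branch coverage is strictly stronger than statement coverage. A toy Python illustration (the function name and values are invented for this sketch):

      ```python
      def limit_rate(bpm, hi=120):
          # Clamp a requested pacing rate to an upper bound.
          if bpm > hi:
              bpm = hi
          return bpm

      # A single test such as limit_rate(150) executes every statement
      # (100% statement coverage), but the False branch of the `if` is
      # never taken. Branch coverage additionally demands a call like
      # limit_rate(80), which exercises the untaken branch.
      print(limit_rate(150))  # 120
      print(limit_rate(80))   # 80
      ```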

      Also, the industry is incredibly conservative, since every change, no matter how trivial, has to be justified to the FDA as to why it won't make a device unsafe. At the time I left, about 15 years ago, they were still using a DOS clone in their pacemaker programmers (basically a device that allows a doctor to download a set of configuration parameters into a pacemaker using something that looks like a mouse placed on the patient's chest) because Windows was too poorly understood as to its runtime characteristics (especially with regards to reliable real-time behavior, memory allocation, and interrupt behavior), and too poorly controllable to be trusted, especially as we couldn't get source code for it. (Note: something like Linux has most of the same problems. We were looking into the possibility of using QNX in the future, which looked more promising.) Still, while in that industry, I rarely worked with technology that was less than 10 years old, because it was assumed that in 10 years' time, most of the bugs in the technology would have been known, documented, and understood as to their impact. This applied to hardware, operating systems, and development tools. Having tools be reliable (or at least unreliable in well-understood ways) was considered more important than having them be cutting-edge.

      However, that elaborate software process was geared towards patient safety in the face of ordinary threats such as misconfiguring by physicians, low battery brownout scenarios, bits being twiddled by cosmic rays (Yes, really! We took precautions for this!), stray magnetic or electrical fields, unexpected patient heart behavior patterns, and so forth. Many, many precautions ("risk mitigations") were taken for events such as these. Deliberate hacking by outsiders was not really a concern on anyone's mind, particularly since to do so would have required close physical contact (due to one of the aforementioned risk mitigations, which I am not going to go into the details of), and so the data transmission protocols were geared towards detecting errors due to mistransmission or data corruption, rather than ones due to deliberately constructed valid-but-evil data.

      Still, the close physical contact requirement would make it fairly difficult for someone to hack one of these devices in most practical scenarios. I don't see it as much of a threat. If someone is close enough to download a bad configuration into a pacemaker, they are close enough to do much more simple and direct harm to that person, which would be vastly easier to do. I can imagine very convoluted scenarios in which someone could try to perform murder-by-pacemaker, but realistically, there would usually be much more simple and reliable ways that wouldn't require stealthily breaking into a company to steal their specifications, or reverse-engineering their hardware and software. It's just too much effort for even a nation-state.
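    The parent's distinction between protocols that detect mistransmission and protocols that detect deliberately constructed valid-but-evil data can be sketched in a few lines of Python (purely illustrative; no real pacemaker protocol is being modeled, and the key and payload are invented):

    ```python
    import hashlib
    import hmac
    import zlib

    KEY = b"shared-secret"  # hypothetical pre-shared key

    def crc_ok(payload: bytes, crc: int) -> bool:
        # A CRC catches random corruption, but an attacker who forges a
        # payload can simply recompute the CRC over it.
        return zlib.crc32(payload) == crc

    def mac_ok(payload: bytes, tag: bytes) -> bool:
        # An HMAC can only be produced by a holder of the key, so a
        # forged frame without the key fails verification.
        expected = hmac.new(KEY, payload, hashlib.sha256).digest()
        return hmac.compare_digest(expected, tag)

    forged = b"set_rate=220"
    # The attacker can attach a perfectly valid CRC to the forged frame...
    assert crc_ok(forged, zlib.crc32(forged))
    # ...but cannot produce a valid HMAC without the key.
    assert not mac_ok(forged, b"\x00" * 32)
    ```

    The point isn't that HMAC-SHA256 is what implants should use; it's that error-detecting codes and message authentication solve different problems, and the protocols described above only attempted the first.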


    • I don't even see how it's possible to hack a pacemaker. How do you even connect to it? I doubt those things connect over wifi, the battery would get drained too fast.
      • by Ihlosi ( 895663 )
        I don't even see how it's possible to hack a pacemaker

        That's an issue with your creativity

        How do you even connect to it?

        The same way the doctor connects to it for things like status queries, configuration changes, etc.

        And you're right, not wifi.

    • by Ihlosi ( 895663 )
      Companies used to building medical hardware have discovered microcontrollers and hired the cheapest programmer or two they could find to program it.

      This isn't how it works.

      Firstly, they discovered microcontrollers about thirty years ago, so that's not really new.

      Secondly, they just make the same mistake many other companies do - they want one person that does hardware and firmware development (check some job postings), because that saves money (haha). Which is bad. Someone who's excellent at analog an

    • The really small implants surely use an ASIC with embedded microcontroller, rather than an off the shelf discrete microcontroller and all the other stuff in the ASIC. In which case, the power consumption excuse for not running encryption (it needs too much processing horsepower) doesn't cut the mustard. Intel designed instructions to speed AES processing (AES-NI) https://en.wikipedia.org/wiki/AES_instruction_set/ [wikipedia.org]. It's a design time, design cost issue, and sheer arseholiness for companies to claim that it

      • by Ihlosi ( 895663 )
        In which case, the power consumption excuse for not running encryption (it needs too much processing horsepower) doesn't cut the mustard.

        Using encryption uses more power than not using encryption. And the more power the device uses, the more often you need to cut holes in the patient to replace it. Most patients don't like that part. And encryption doesn't just use power through CPU usage, it also requires more RAM and possibly flash than no encryption, and both memories also draw power.

        Also, consider

  • killing off the less fit?
  • Overly complex (Score:4, Insightful)

    by Gravis Zero ( 934156 ) on Friday May 26, 2017 @08:23PM (#54495277)

    Honestly, if you have 8000 bugs in your system then you haven't just done a bad job of securing your code, you have done a bad job of architecting your software and hardware. Bottom line, they should fire the people in charge of designing this shit and everyone in management who pushed these devices out before they were ready. Alternatively, start holding individuals inside corporations personally liable for things like criminal negligence and you'll find devices will get properly secured instead of being pushed out the door.

    • Re:Overly complex (Score:5, Informative)

      by Ihlosi ( 895663 ) on Saturday May 27, 2017 @04:58AM (#54496555)
      Honestly, if you have 8000 bugs in your system then you haven't just done a bad job of securing your code,

      May I suggest you read the paper first before heading off on a rant?

      Basically all of the 8000 vulnerabilities (not bugs) are due to third-party libraries used in one of the components examined, which include "home monitoring devices" and "physician programmers", both systems that probably run Linux/Windows and hence inherit a lot of vulnerabilities from there.
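      That reading matches how such headline counts usually arise: the tally is dominated by known CVEs in bundled third-party components, not by distinct bugs in the device firmware proper. A toy Python sketch of the arithmetic (every component name, version, and CVE identifier below is invented):

      ```python
      # Toy vulnerability tally: sum known advisories over shipped components.
      shipped = {"netstack": "2.1", "tls-lib": "0.9", "pacing-fw": "4.0"}

      # Hypothetical advisory database keyed by (component, version).
      advisories = {
          ("netstack", "2.1"): ["CVE-XXXX-0001", "CVE-XXXX-0002"],
          ("tls-lib", "0.9"): ["CVE-XXXX-0003"],
          # The in-house firmware itself has no published advisories here,
          # yet the product-level count is still nonzero.
      }

      total = sum(len(advisories.get(item, [])) for item in shipped.items())
      print(total)  # 3
      ```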

  • by Applehu Akbar ( 2968043 ) on Friday May 26, 2017 @08:29PM (#54495317)

    If the FDA had to approve all these devices, even at the cost of making the price of everything exorbitant, their rigorous testing would ensure that the firmware wouldn't be riddled with all these bugs.

    Oh, wait --

    • The FDA doesn't actually do testing. The device maker supplies evidence that development followed a process that includes testing, the types and amounts of testing being based on the risks posed by the device.

      A company can potentially lie, and claim they did testing that they didn't - but Goddess help you if FDA figures that out: you're in deep, deep trouble. And they can definitely figure it out during a regular inspection, or if people get injured by your product, etc.

      • Certainly, but the FDA is supposed to manage a testing process that assures, in a case like this, that ISIS wouldn't be able to pull a trick like setting off all the stimulating pacemakers (the kind that can actually reboot a failing heart) in Hollywood right in the middle of Academy Awards night.

  • by bfmorgan ( 839462 ) on Friday May 26, 2017 @08:46PM (#54495411)
    Please understand that money is the reason that companies make these devices. When security concerns raise their ugly heads, they get slapped down. Ergo, we have devices that are open to exploitation. Security will also be ignored when $$$ are your objective. If customers stop buying these devices because they are insecure, then, and only then, will manufacturers add the cost of security into the price of these devices.
    • To exploit a pacemaker, you will have to reverse engineer the communication protocol and API for a particular brand and model of pacemaker, then ghetto-hack your own portable pacemaker programmer, then find a victim who happens to have that exact brand and model inside their chest, and then hold your device right up against their chest for more than an awkward amount of time while your device reconfigures the victim's pacemaker. There are cheaper, faster, and far more effective ways to mess people up, but go
  • No question the CIA knows all about these bugs, this vector is an ideal assassination technique. Just think if they had this tech in the 60's and Fidel had had a pacemaker or other medical device like one of these. The bar for political assassinations for Western nations has risen (a little), but the Russians would be all over this as well.

  • I'm working with some other folks to start a company to develop, manufacture, and market open-source medical devices. We all have extensive experience in developing commercial medical devices - defibrillators, radiation therapy for tumors, etc - and we're convinced that getting more eyeballs to review software and hardware will substantially increase safety and reduce costs.

    Yes, we know how to work with the FDA and so forth.

    Stay tuned...

    • You say open-source, but will your organisation commit, explicitly and irrevocably, to publishing ALL the code and hardware (including any ASIC HDL) in a way that lets any individual duplicate any design independently without agreeing to any contract, such as an NDA?

      In which case, how does your business model work?

  • by Ihlosi ( 895663 ) on Saturday May 27, 2017 @05:14AM (#54496573)
    Are debugging interfaces (e.g., JTAG and UART) present on home monitoring devices or physician programmers? Are the interfaces or functionality disabled prior to distribution?

    So how is the manufacturer supposed to diagnose devices that malfunctioned out in the field? If you lock the debugging interfaces, they can usually only be reactivated by completely clearing the device's flash memory - or not at all.

    Are third-party libraries used in software development?

    What kind of question is that?

    Ok, I've seen code without any third party libraries. It was all assembly, only available in hardcopy, written for an 8051 and about 30 years old.

    Is the firmware image for the implantable cardiac device mapped into protected memory to prevent arbitrary writing to memory addresses?

    I would guess the implantable device doesn't use a microcontroller beefy enough to have an MPU. That would reduce battery lifetime.
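    For readers unfamiliar with what an MPU buys you, here is a toy Python model of the idea (a real MPU is a hardware block on the microcontroller that faults offending bus accesses; the addresses and region layout below are invented):

    ```python
    class ToyMPU:
        """Minimal model of write-protecting the firmware region."""

        def __init__(self, fw_start: int, fw_end: int):
            self.fw_range = range(fw_start, fw_end)
            self.mem = {}

        def write(self, addr: int, value: int) -> None:
            # An MPU raises a fault on writes into a protected region,
            # so a wild pointer can't scribble over the firmware image.
            if addr in self.fw_range:
                raise PermissionError(f"write to protected address {addr:#x}")
            self.mem[addr] = value

    mpu = ToyMPU(0x0000, 0x4000)   # hypothetical firmware region
    mpu.write(0x8000, 0xFF)        # RAM write: allowed
    try:
        mpu.write(0x1000, 0x00)    # firmware write: faults
    except PermissionError as e:
        print(e)
    ```

    The parent's battery-life point stands either way: on the tiniest parts even this much hardware is a cost someone has to justify.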
