
Submission + - Homeland Security Preparing Guide for Securing Connected Things

chicksdaddy writes: The Department of Homeland Security is readying a set of security guidelines for Internet of Things device makers and for consumers that it will soon release, The Security Ledger reports.

DHS, which houses the U.S. Computer Emergency Readiness Team (CERT) as well as the U.S. Secret Service, is assembling a set of strategic principles that it says will help safeguard and secure the Internet of Things by providing high-level guidance to industry on how to design and manufacture secure connected devices. For consumers, DHS will lay out guidelines on how to manage the risks posed by Internet-connected devices in their homes, cars and businesses.

Robert Silvers, the DHS Assistant Secretary for Cyber Policy, told Security Ledger that the agency thinks it can play a key role in setting cross-industry standards for the Internet of Things.

“What we’ve come to recognize is that the Internet of Things is a full-blown phenomenon,” Silvers said. “We think everyone, from government to industry to consumers, needs to get serious about reasonable security being built into IoT devices. And we need to do it now, before we’ve deployed an entire ecosystem,” he said.

Silvers will outline the agency's forthcoming guidance in a speech at The Security of Things Forum in Cambridge, Mass., on Thursday.

Submission + - FTC Warns Consumers: Don't Sync To Your Rental Car!

chicksdaddy writes: The Federal Trade Commission is warning consumers to beware of new ‘connected car’ features that allow rental car customers to connect their mobile phones or other devices to in-vehicle infotainment systems, The Security Ledger reports.

“If you connect a mobile device, the car may also keep your mobile phone number, call and message logs, or even contacts and text messages,” the FTC said in an advisory released on Tuesday. “Unless you delete that data before you return the car, other people may view it, including future renters and rental car employees or even hackers.”

The Commission is advising renters to avoid syncing their mobile phones to their rental car and to avoid charging devices via a USB port, since settings on a device may allow automatic syncing of data. Consumers who do connect a device should scrutinize any requests for permissions. Renters are also urged to remove their device from the vehicle’s memory before handing the car back to the rental firm.

Submission + - Was St. Jude Medical Device Hack Report Just Armchair Engineering?

chicksdaddy writes: The battle of words over warnings from a Wall Street trader about serious security flaws in implantable medical devices continued on Tuesday, as researchers from the University of Michigan joined St. Jude itself in raising doubts about research that was used by the investment firm Muddy Waters to bet against (or “short”) the stock of St. Jude Medical, a major medical device maker, The Security Ledger reports.

In a statement released on Tuesday, Kevin Fu and Thomas Crawford of the Archimedes Center for Medical Device Security did not directly challenge the findings of the report by Muddy Waters and the firm MedSec. They did suggest, however, that rather than being evidence of a successful attack, the output observed by the researchers may have been typical for a home-monitored implantable cardiac defibrillator (ICD) being tested while not properly connected to a patient.

“The U-M team reproduced error messages the report cites as evidence of a successful ‘crash attack’ but the messages are the same set of errors that display if the device isn’t properly plugged in,” the University said in a statement.

“We’re not saying the report is false. We’re saying it’s inconclusive because the evidence does not support their conclusions,” said Fu, U-M associate professor of computer science and engineering and director of the Archimedes Center for Medical Device Security. Fu is also co-founder of medical device security startup Virta Labs.

In a separate blog post, Fu said the research that informed the Muddy Waters report may be an example of 'armchair engineering.'

The conflict may come down to how different viewers interpret the same events. The behavior witnessed by the MedSec researchers and described in their report may not have been a security issue, but simply evidence of the device acting as designed, Fu and his colleagues say.

A defibrillator’s electrodes are connected to heart tissue via wires that are woven through blood vessels; the wires are used both for sensing and to deliver shocks to the heart, if necessary. Not surprisingly, when the defibrillator is not connected to a human host, the data transmitted by the device is quite different.

“When these wires are disconnected, the device generates a series of error messages: two indicate high impedance, and a third indicates that the pacemaker is interfering with itself,” said Denis Foo Kune, former U-M postdoctoral researcher and co-founder of Virta Labs, in a statement.

That behavior is very similar to what is described in the Muddy Waters report on St. Jude as evidence of a successful attack.

While medical knowledge isn’t necessary to find vulnerabilities in a medical device, or even to hack one, it is critical to understanding the clinical implications of any software flaws and whether there is the possibility of causing harm to patients, Fu said.

Submission + - The Big Short: Security Flaws Fuel Bet Against St. Jude

chicksdaddy writes: Call it The Big Short, or maybe just the medical device industry’s “Shot Heard Round The World”: a report from Muddy Waters Research recommends that its readers bet against (or “short”) St. Jude Medical after learning of serious security vulnerabilities in a range of the company’s implantable cardiac devices, The Security Ledger reports.

The Muddy Waters report set off a steep sell-off in St. Jude Medical’s stock, which finished the day down 5%, helping to push down medical stocks overall.

The report cites the “strong possibility that close to half of STJ’s revenue is about to disappear for approximately two years” as a result of “product safety” issues stemming from remotely exploitable vulnerabilities in STJ’s pacemakers, implantable cardioverter defibrillators (ICDs), and cardiac resynchronization therapy (CRT) devices. The vulnerabilities are linked to St. Jude’s Merlin@home remote patient management platform, Muddy Waters said.

The firm cited research by MedSec Holdings Ltd., a cybersecurity research firm that identified the vulnerabilities in St. Jude’s ecosystem. Muddy Waters said that the affected products should be recalled until the vulnerabilities are fixed.

In an e-mailed statement to Security Ledger, St. Jude’s Chief Technology Officer, Phil Ebeling, called the allegations “absolutely untrue.” “There are several layers of security measures in place. We conduct security assessments on an ongoing basis and work with external experts specifically on Merlin@home and on all our devices,” Ebeling said.

More controversial: MedSec CEO Justine Bone acknowledged in an interview with Bloomberg that her company did not first reach out to St. Jude to provide it with information on the security holes before working with Muddy Waters.

Information security experts who have worked with the medical device industry to improve security expressed confusion and dismay.

"If safety was the goal then I think (MedSec's) execution was poor," said Joshua Corman of The Atlantic Council and I Am The Cavalry. "And if profit was the goal it may come at the cost of safety. It seems like a high stakes game that people may live to regret."

Submission + - One in Five Vehicle Software Vulnerabilities are 'Hair on Fire' Critical

chicksdaddy writes: One of every five software vulnerabilities discovered in vehicles in the last three years is rated “critical” and is unlikely to be resolved through after-the-fact security fixes, according to an analysis by the firm IOActive, The Security Ledger reports.

“These are the high priority ‘hair on fire’ vulnerabilities that are easily discovered and exploited and can cause major impacts to the system or component,” the firm said in its report, which it released last week. The report was based on an analysis of more than 150 vehicle security flaws identified over three years by IOActive or publicly disclosed by third-party firms.

The report studied a wide range of flaws, most discovered in IOActive’s work with automakers and suppliers to auto manufacturers, said Corey Thuen, a Senior Security Consultant with IOActive. Thuen and his colleagues considered what kinds of vulnerabilities most commonly affect connected vehicles, what types of attacks are most often used to compromise vehicles, and what kinds of vulnerabilities might be mitigated using common security techniques and tactics.

The results, while not dire, are not encouraging. The bulk of the vulnerabilities identified stemmed from a failure by automakers and suppliers to follow security best practices, including designing in security or applying secure development lifecycle (SDL) practices to software creation. “These are all great things that the software industry learned as it has progressed in the last 20 years. But (automakers) are not doing them,” Thuen said.

Submission + - Software Errors Already Affect Patient Outcomes. (We Just Don't Measure Them.)

An anonymous reader writes: Medical errors linked to the failure of medical device hardware and software may already impair patient health, but little is known about the problem because it is rarely measured, The Security Ledger reports.

Speaking on a panel focused on medical device security at Codenomicon 2016 in Las Vegas on Tuesday, a group of leading medical device and information security experts said that software errors that affect patient care almost certainly occur, but more needs to be done to identify and measure them if care delivery organizations hope to improve patient outcomes.

“I believe there has already been patient harm,” said Dr. Dale Nordenberg, the co-founder and Executive Director of the Medical Device Innovation, Safety & Security Consortium.

Nordenberg told Security Ledger that the number of discrete interactions patients have with medical devices each year in U.S. healthcare settings is in the billions, making errors and malfunctions that affect patient care in some way a certainty.

Only rarely do such incidents warrant notice. In May the Food and Drug Administration published an alert about an incident in which antivirus software caused a medical diagnostic computer to fail in the middle of a cardiac procedure, denying physicians access to data and potentially endangering patient safety.

Recent news reports have also underscored the fragile nature of many clinical networks. Widespread infections of ransomware like SamSam have crippled clinical networks and forced clinical staff to cancel patient appointments, delay procedures and fall back to paper record keeping.

Despite such incidents, there is no official effort to track the link between software or hardware failures, malicious software infections or user-related errors and patient outcomes.

“In medicine, outcomes drive decisions about what to do, and we don’t have data that’s clear enough to design intervention programs,” Nordenberg told the audience at the event.

Submission + - Thousands of Bugs Found on Medical Monitoring System

chicksdaddy writes: The Department of Homeland Security has warned of hundreds of vulnerabilities in a hospital monitoring system sold by Philips. Security researchers who studied the system said the security holes may number in the thousands, according to a report by The Security Ledger.

The Department of Homeland Security’s Industrial Control Systems Cyber Emergency Response Team (ICS-CERT) issued an alert on July 14 about the discovery of 460 vulnerabilities in the Philips Xper-IM Connect system, including 360 rated “high” or “critical” severity. But one of the researchers who analyzed the Xper system said in an interview that the true number of vulnerabilities was much higher, numbering in the thousands.

Xper IM Connect is a “physiomonitoring” system that is widely used in the healthcare sector to monitor and manage other medical devices. Research by two companies, Synopsys and Whitescope LLC, working in collaboration with Philips, found that the system is directly afflicted by 460 software vulnerabilities, including 272 in the Xper software itself and 188 in the Windows XP operating system that Xper IM runs on. The vulnerabilities include remote code execution flaws that could allow malicious code to be run on the Xper system as well as vulnerabilities that could expose sensitive information stored on Xper systems.
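As a sanity check, the per-component counts reported above do add up; here is a trivial tally using only the figures in this summary:

```python
# Tally check on the ICS-CERT figures cited above.
xper_flaws = 272       # vulnerabilities in the Xper software itself
winxp_flaws = 188      # vulnerabilities in the underlying Windows XP OS
total = xper_flaws + winxp_flaws          # 460, matching the advisory

high_or_critical = 360
share = high_or_critical / total          # roughly 78% rated "high" or "critical"

print(total, round(share, 2))             # → 460 0.78
```

In other words, more than three quarters of the flaws in the advisory carry the two most serious severity ratings.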

Submission + - Auto Industry Publishes Cybersecurity Best Practices

chicksdaddy writes: The automotive industry’s main group for coordinating policy on information security and “cyber” threats has published a “Best Practices” document, giving individual automakers guidance on implementing cybersecurity in their vehicles for the first time.

The Automotive Information Sharing and Analysis Center (ISAC) released the Automotive Cybersecurity Best Practices document on July 21st, saying the guidelines are for auto manufacturers as well as their suppliers.

The Best Practices cover organizational and technical aspects of vehicle cybersecurity, including governance, risk management, security by design, threat detection, incident response, training, and collaboration with appropriate third parties.

Taken together, they move the auto industry closer to standards pioneered decades ago and embraced by companies like Microsoft. They call on automakers to design software to be secure from the ground up and to take a sober look at risks to connected vehicles as part of the design process.

Submission + - Facebook Offers Innumerate Explanation For Its 1% Black Tech Workforce

theodp writes: Back in 2014, Gas Station Without Pumps patiently explained that while the case can clearly be made for female and black students being under-represented in Advanced Placement Computer Science exams, pointing to states with zero female or Black AP CS test takers is not the way to do it. Of the eleven states that had no Black test takers in 2013, GSWP explained: "The zero black AP CS test takers for the nine states can be fairly confidently attributed to the lack of AP CS test takers, and in Maine to the shortage of black students. For Alaska, the lack of black AP CS test takers is probably due to the shortage of AP CS test takers in the state."

But that didn't stop Facebook from using the dramatic-but-statistically-fallacious arguments on Thursday to explain away its still-1% Black tech workforce. "It has become clear that at the most fundamental level, appropriate representation in technology or any other industry will depend upon more people having the opportunity to gain necessary skills through the public education system," said Facebook Global Director of Diversity Maxine Williams, who was tasked with explaining why Facebook's diversity efforts don't seem to be working (Facebook's tech workforce is 48% White, 46% Asian, 3% Hispanic, 1% Black, 2% Other). "Currently, only 1 in 4 US high schools teach computer science," Williams continued. "In 2015, seven states had fewer than 10 girls take the Advanced Placement Computer Science exam and no girls took the exam in three states. No Black people took the exam in nine states including Mississippi where about 50% of high school graduates are Black, and 18 states had fewer than 10 Hispanics take the exam with another five states having no Hispanic AP Computer Science (CS) test takers. This has to change."

To give Facebook's innumerate explanation some context, according to 2015 AP Data, Mississippi had a grand total of five AP CS test takers. And in the three states where no girls took the exam — Montana, Mississippi, and Wyoming — boys respectively took zero, five, and three AP CS exams.

Submission + - Ransomware in Hospitals Violates HIPAA Patient Privacy Law

chicksdaddy writes: Ransomware infections have been plaguing the healthcare field for much of the last two years. But amidst all the reports of hospitals hamstrung by encrypted, clinical systems, there’s been precious little talk about whether such incidents are violations of patients’ privacy under the federal HIPAA legislation. Now we have an answer: yes.

Security Ledger reports that the U.S. Department of Health and Human Services on Monday issued new guidance that suggests strongly that ransomware infections affecting electronic patient health information (ePHI) are reportable violations under HIPAA.

“When electronic protected health information (ePHI) is encrypted as the result of a ransomware attack, a breach has occurred because the ePHI encrypted by the ransomware was acquired,” HHS said in its guidance.

The new guidance comes after a period of consideration and debate within policy circles about whether having patient records encrypted by ransomware should count as a “breach” of patient privacy. In theory, the files aren’t being accessed and viewed, simply scrambled and held for ransom. Or so the thinking went.

Writing on the Virta Labs blog, Virta CEO and University of Michigan researcher Kevin Fu noted that the HHS guidelines get a lot right, such as ruling out an exemption for systems running full-disk encryption (ransomware, by its very nature, operates when the machine is running and the operating system and file system are accessible).

Fu expected that the guidelines would be “bad news” for the majority of Health Delivery Organizations (HDOs) covered by HIPAA. “The OCR guidance means you just got clarity on whether ransomware results in a breach. Sorry, the answer is yes, unless you have methodical evidence to the contrary.”

Submission + - Open Source Downloads Almost Doubled in 2015

chicksdaddy writes: The use of open source software exploded in 2015, almost doubling from the year before, according to a report from the firm Sonatype. The company, which manages the world’s largest repository of open source components, said it received 31 billion download requests from its Central Repository during 2015, up from over 17 billion such requests in 2014. The average enterprise downloaded 229,000 open source components during the same period.

However, software quality continues to be an issue, with a survey of 25,000 applications revealing that close to 7 percent of open source components in use had a known security defect that could lead to successful attacks.

While 7% (actually 6.8%) might not sound like much (just one of every 16 components), in the supply chain world it's a pretty ugly statistic, Sonatype warned. “Imagine if one in every 16 of the parts in your iPhone were known defective, or 1 in every 16 parts in your car,” Derek Weeks, a Vice President and advocate for DevOps at Sonatype, told The Security Ledger.
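To put the "1 in 16" framing in perspective, here is a back-of-the-envelope sketch. The per-enterprise figure is an illustration only: it assumes the 6.8% defect rate applies uniformly to an average enterprise's 229,000 downloads, which the report does not claim.

```python
# Back-of-the-envelope on Sonatype's figures (illustrative only; assumes
# the 6.8% defect rate applies uniformly to an enterprise's downloads).
defect_rate = 0.068                   # share of components with a known defect
downloads_per_enterprise = 229_000    # average enterprise downloads in 2015

one_in = 1 / defect_rate              # ~14.7, which the report rounds to "1 in 16"
expected_defective = defect_rate * downloads_per_enterprise  # ~15,600

print(round(one_in, 1), round(expected_defective))  # → 14.7 15572
```

Under that (hypothetical) uniform-rate assumption, an average enterprise would pull in on the order of fifteen thousand components with known defects in a single year.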

The State of the Software Supply Chain report draws on data from Sonatype’s Central Repository, a public repository of open source components for the Java development community, to reveal high-level trends within the open source industry. Sonatype also tapped data from other open source repositories, including NPM, DockerHub and Nexus, the company’s private repository.

In 2015, that data showed a hockey-stick-like curve marking the increase in open source component use and activity across the space. Sonatype said that the volume of open source download requests has increased 64-fold since 2007, driven by a shift in application development toward component-based architectures that rely heavily on open source to accelerate development by leveraging already-created software components.
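The 64-fold growth figure implies a steep annual rate. A quick sketch, treating 2007 to 2015 as eight full years of compounding (my assumption; the report gives only the endpoints):

```python
# Implied annualized growth from the figures above (illustrative; assumes
# the 64x growth compounds evenly over the eight years from 2007 to 2015).
years = 2015 - 2007                       # 8
annual_factor = 64 ** (1 / years)         # ~1.68, i.e. roughly 68% per year

yoy = 31e9 / 17e9                         # 2015 vs. 2014 requests, ~1.82x

print(round(annual_factor, 2), round(yoy, 2))  # → 1.68 1.82
```

Notably, the 2014-to-2015 jump (about 1.8x) was well above that long-run average, which is what gives the curve its hockey-stick shape.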

Submission + - Feds Contemplate Bounty Program for Medical Devices

chicksdaddy writes: The Security Ledger notes that the U.S. Department of Health and Human Services is considering a bug bounty program for medical devices and healthcare technology, modeled after the Department of Defense's recently launched Hack the Pentagon program.

The Chief Privacy Officer at the Department of Health and Human Services (HHS) has made public statements that suggest HHS is considering a similar program.

Speaking at the Collaboration of Health IT Policy and Standards Committees meeting on June 23, Lucia Savage, chief privacy officer at HHS’s Office of the National Coordinator for Health Information Technology, said that the practice could show promise at HHS if it was scaled up to meet health care needs, Federal Times reported.

“This is a struggle for devices as well,” she said. “You can’t hack something in the field, because what if the hacker disrupts the operation of the device? Similarly, health data and EHRs, we may not want to have the hacker accessing your live data because that might cause other problems relative to your obligation to keep that data confidential.”

"Given that space and given the need to improve cybersecurity, is there something that ONC can do to improve that rate at which ethical hacking occurs in health care?” Savage wondered.

On June 17, U.S. Secretary of Defense Ash Carter announced preliminary results from that program, which invited some 1,400 vulnerability hunters to try their luck on DOD systems. The DOD paid bounties for 138 vulnerabilities submitted by 250 researchers, paying out $150,000 in all, with about half going to the hackers.
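Those figures imply a fairly modest average reward. A rough sketch, taking "about half" as exactly half (my assumption; the program's actual split isn't broken out above):

```python
# Rough average payout per rewarded bug in Hack the Pentagon (illustrative;
# treats "about half" of the $150,000 as going to researchers).
total_program_cost = 150_000
rewarded_vulns = 138
paid_to_hackers = total_program_cost / 2       # "about half"

avg_bounty = paid_to_hackers / rewarded_vulns  # ~$543 per rewarded vulnerability
print(round(avg_bounty))                       # → 543
```

That is a small fraction of what a single penetration-testing engagement typically costs, which is part of the appeal of such programs for agencies like HHS.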

Submission + - Study Finds Password Misuse in Hospitals is Endemic

chicksdaddy writes: Hospitals are pretty hygienic places — except when it comes to passwords, it seems.

That's the conclusion of a recent study by researchers at Dartmouth College, the University of Pennsylvania and USC, which found that efforts to circumvent password protections are "endemic" in healthcare environments and mostly go unnoticed by hospital IT staff.

The report documents what can only be described as a wholesale abandonment of security best practices at hospitals and other clinical environments, with the bad behavior being driven by necessity rather than malice.

"In hospital after hospital and clinic after clinic, we find users write down passwords everywhere," the report reads. "Sticky notes form sticky stalagmites on medical devices and in medication preparation rooms. We’ve observed entire hospital units share a password to a medical device, where the password is taped onto the device. We found emergency room supply rooms with locked doors where the lock code was written on the door--no one wanted to prevent a clinician from obtaining emergency supplies because they didn’t remember the code."

Competing priorities of clinical staff and information technology staff bear much of the blame. Specifically, IT staff and management are often focused on regulatory compliance and securing healthcare environments; they are excoriated for lapses in security that result in the theft or loss of data. Clinical staff, on the other hand, are focused on patient care and ensuring good health outcomes, Ross Koppel, one of the authors of the report, told The Security Ledger.

Those two competing goals often clash. “IT want to be good guys. They’re not out to make life miserable for the clinical staff, but they often do,” he said.

Submission + - At Black Hat's Oscars: An Award for Hacking Junk

chicksdaddy writes: In a sign that hacking connected “things” is joining the information security mainstream, The Pwnies, a long-running awards ceremony that is the hacker community’s equivalent of The Oscars (or at least The People’s Choice Awards), is adding an award for “Junk Hacking” to its 2016 roster, The Security Ledger reports.

The awards, which are handed out at the annual Black Hat Briefings conference in Las Vegas in August, added a “Pwnie for Best Junk Hack” to the list. In a nod to the security industry’s penchant for stunt hacking and the technology industry’s penchant for unwarranted complexity, the award will be given to researchers who “discovered and performed the most needlessly sophisticated attack against the most needlessly Internet-enabled ‘Thing.'”

Justine Bone, Chief Technology Officer at the firm, said that combination applies to the Junk Hacking category. The Internet of Things has only amped up the silliness, giving an IP address to everything from kitchen appliances to toothbrushes to stuffed animals. (See also: @InternetofShit)

Despite all the silliness, however, Bone said that the community can learn from efforts to compromise connected stuff, which can still inspire subtle and creative hacks that have wider applications. “It may be that there’s some exploit in your connected toothbrush that could also be used against a home security system,” she said.

The Best Junk Hack category is among a slew of new award categories that are being added this year, the 10th year that the Pwnie Awards have been held. Among other new categories that are being added are Pwnies for the “Best Cryptographic Attack,” the “Best Backdoor,” and the closely related “Best Stunt Hack,” awarded to “the researchers, their PR team, and participating journalists for the best, most high-profile, and fear-inducing public spectacle that resulted in the most panic-stricken phone calls from our less-technical friends and family members.”
