The rash of recent data breach disclosures in the healthcare industry lays bare some very poor security programs and lax behavior. Whether sensitive data was lost through carelessness or weak controls made it almost impossible for hackers not to steal it, the impression is that the healthcare industry is still in the Stone Age (say, around 2005) when it comes to data protection. While it’s fair to say that healthcare has tended to lag in security, let’s not throw it all on this particular sector. There but for fortune…. But, overall, the healthcare security record looks pretty dismal.
In part, we’re hearing about all these healthcare records breaches because the HITECH Act required disclosure of lost or stolen unencrypted patient information. So, you’d expect to be hearing about more and more of these types of incidents, which seemed rare a couple of years ago. This year’s Symantec Internet Security Threat Report showed that healthcare had by far the highest percentage of data breaches of any sector: 43%, compared to the next highest sectors, government (14%) and education (13%). Symantec believes that the HITECH reporting obligations are a major factor in this surge. Another likely factor is the HITECH mandate to convert health records to digital format (electronic health records, or EHR, which my autocorrect keeps trying to change to HER).
The latest reported breaches could form the basis for a data protection white paper entitled “Don’t Let This Happen to You”:
- The hack into health records in Utah that resulted in the theft of 780,000 claims records, some 280,000 of which included Social Security numbers, revealed a disturbing lack of basic good security practices. In stark contrast to officials’ claims of a strong, multilayered defense in depth, the evidence is that the defense, while perhaps multilayered, was neither strong nor deep. The stolen data was unencrypted because federal regulation didn’t require encryption. A single admin password, in this case a weak one resulting from a failure to follow policy, was all the hackers needed to get at all that data.
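A weak password that slips past policy is a process failure as much as a technical one. As a minimal sketch (the length and complexity rules below are invented for illustration, not Utah's actual policy), this is the kind of automated check that, enforced when a password is set, would have rejected a weak admin password regardless of whether anyone remembered the policy:

```python
import re

MIN_LENGTH = 12  # hypothetical minimum; real policies vary


def meets_policy(password: str) -> bool:
    """Return True only if the password satisfies a basic complexity policy."""
    if len(password) < MIN_LENGTH:
        return False
    required_patterns = [
        r"[a-z]",         # at least one lowercase letter
        r"[A-Z]",         # at least one uppercase letter
        r"[0-9]",         # at least one digit
        r"[^A-Za-z0-9]",  # at least one symbol
    ]
    return all(re.search(p, password) for p in required_patterns)
```

The point is not the specific rules but that the check runs automatically at set time, so a single weak credential can't quietly become the only thing standing between attackers and hundreds of thousands of records.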
- A laptop holding 34,500 patient records from Howard University Hospital was stolen from the car of a former contractor. The records were not encrypted, but that’s only one issue. Why was the contractor allowed to, or able to, put the data on his own laptop? Or any laptop, for that matter. Laptops are lost or stolen all the time: left at airports and in taxis, taken from cars. One study a few years ago counted thousands lost per week in major U.S. airports alone. They should never hold sensitive information, certainly not thousands of patient or customer records. The theft of a Department of Veterans Affairs laptop containing financial and personal information of 25 million veterans (the number still causes me to stare at the computer screen with my mouth open) was the incident that really focused our attention on this particular data loss problem and on data breaches in general. And that was in 2006. You’d think six years was sufficient time to learn our lesson, but maybe not so much until something goes wrong. Publicly wrong.
- Backup tapes holding the records of 800,000 people enrolled in the California Department of Child Support Services program were apparently lost by FedEx while being returned to storage contractor Iron Mountain after IBM used them to test whether the support services could be run remotely in an emergency. Again, the data was unencrypted.
Both the California and Howard incidents also focus on the need for masking or obfuscating sensitive data, or creating phony data in the same formats as genuine data for testing, development, etc. There are few if any cases in which contractors or even internal development or testing personnel need actual data, particularly sensitive client information entrusted to the organization’s care.
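The masking idea above can be sketched simply. This is a hypothetical illustration, not any vendor's product or any of these organizations' actual practice: derive a consistent, correctly formatted fake Social Security number from the real one plus a secret, so test datasets keep their shape and referential integrity without exposing genuine numbers.

```python
import hashlib


def mask_ssn(ssn: str, secret: str = "per-project-secret") -> str:
    """Deterministically replace a real SSN with a fake one in the same format.

    The same input always maps to the same fake value (preserving joins across
    test tables), but the output reveals nothing useful about the original.
    The default secret is a placeholder; a real deployment would use a
    protected, per-project key.
    """
    digest = hashlib.sha256((secret + ssn).encode()).hexdigest()
    digits = str(int(digest, 16))[:9]  # first 9 decimal digits of the hash
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"
```

Hand contractors and test environments only the masked column, and a lost laptop or misplaced backup tape full of test data stops being a reportable breach.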
- Another 10 backup disks, containing the records of 315,000 former Emory University Hospital patients, are simply missing. Vanished. No one seems to know how, but the hospital says an employee who misunderstood security procedures was responsible. Oops. The employee was neither fired nor disciplined. The hospital’s response is to re-educate its employees about proper handling procedures. Good luck with that. Even trained humans make mistakes.
- In South Carolina, a state employee was charged in connection with the transfer of records of 228,000 Medicaid recipients, including names, phone numbers, addresses, birth dates and Medicaid ID numbers. There are numerous possible security issues concerning trusted insiders: Was access to confidential information necessary to perform his job? If so, did he need the full information, or could some of it be masked or obfuscated? Were there proper access controls and reporting/alerting mechanisms (from IAM systems, DLP, etc.) that would set off an alert on transfers of this kind of data, in this case to an email account? Are there policies prohibiting the use of personal email accounts on department-managed devices, especially for employees with this type of access?
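As a toy illustration of the alerting question raised in the South Carolina case (the threshold and domain list below are invented for the example), a DLP-style rule can be as simple as flagging any outbound transfer that moves an unusual number of records or targets a personal email domain:

```python
# Hypothetical policy values for illustration only.
PERSONAL_DOMAINS = {"gmail.com", "yahoo.com", "hotmail.com"}
RECORD_THRESHOLD = 100  # transfers above this size warrant review


def should_alert(record_count: int, recipient: str) -> bool:
    """Flag a transfer that is unusually large or goes to a personal email domain."""
    domain = recipient.rsplit("@", 1)[-1].lower()
    return record_count > RECORD_THRESHOLD or domain in PERSONAL_DOMAINS
```

Real DLP products inspect content, not just counts and destinations, but even a rule this crude would have fired long before 228,000 records reached a personal email account.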
These are lessons that all organizations should have learned by now, though they clearly have not. They indicate not blips in security programs but systemic weaknesses. These types of incidents can and do occur in all sectors, but sectors such as financial services, government and government contractors, and even retail have had more experience developing security programs, paid more attention to them, and been more strictly scrutinized and, at times, penalized by regulatory bodies than the healthcare industry has. Enforcement of HIPAA security rules, even since the advent of HITECH, has been spotty.
Annual security surveys conducted by the Healthcare Information and Management Systems Society (HIMSS) over the last few years continue to show that the majority of healthcare organizations spend 3% or less of their total IT budget on security. A separate HIMSS report says that only 45 percent of U.S. hospitals protect electronic health records by conducting and reviewing a security risk analysis. My guess is the organizations victimized in the latest incidents were not among the 45%. If they were, they did not do a very good job.
And getting back to the Symantec findings, 43% is 43%, the disclosure requirement factor notwithstanding. The clear implication is that organizations holding healthcare information are doing a pretty terrible job protecting sensitive data. Organizations across sectors can draw lessons from the recent incidents, and all can be doing a better job. But healthcare, as a sector, it seems, is lagging.
It is timely, therefore, that 14 member organizations of the Health Information Trust Alliance (HITRUST) and the U.S. Department of Health and Human Services have announced the formation of the HITRUST Cybersecurity Incident Response and Coordination Center, a clearinghouse for sharing information about hacker attacks on healthcare organizations as well as best security practices. There's a lot of work to be done.