HHS Publishes Ransomware Guidance

HHS has published guidance for hospitals and other covered entities in light of recent prominent ransomware attacks on hospital data.  The Q&As address the Security Rule safeguards that can help prevent ransomware and other malware, and that assist in identifying, investigating, responding to and mitigating ransomware attacks. Specifically, HHS notes that the presence of ransomware or any other malware on a covered entity's or its business associate's systems is a "security incident" as defined under HIPAA.  HHS also notes that, although a breach determination is a fact-specific inquiry,

When [ePHI] is encrypted as the result of a ransomware attack, a breach has occurred because the ePHI encrypted by the ransomware was acquired (i.e., unauthorized individuals have taken possession or control of the information), and thus is a "disclosure" not permitted under the HIPAA Privacy Rule.

Unless the covered entity or business associate can demonstrate that there is a "...low probability that the PHI has been compromised," based on the factors set forth in the Breach Notification Rule, a breach of PHI is presumed to have occurred.

HHS provides the following examples for consideration as part of the risk assessment that must be conducted to determine whether there is a low probability that the ePHI was compromised:

  • the exact type and variant of malware discovered;
  • the algorithmic steps undertaken by the malware;
  • communications, including exfiltration attempts, between the malware and attackers' command and control servers; and
  • whether the malware propagated to other systems, potentially affecting additional sources of ePHI.

HHS further states,

Correctly identifying the malware involved can assist an entity to determine what algorithmic steps the malware is programmed to perform.  Understanding what a particular strain of malware is programmed to do can help determine how or if a particular malware variant may laterally propagate throughout an entity's enterprise, what types of data the malware is searching for, whether or not the malware may attempt to exfiltrate data, or whether or not the malware deposits hidden malicious software or exploits vulnerabilities to provide future unauthorized access, among other factors.

The full ransomware guidance can be found here.  

Copiers result in $1.2 million settlement and CAP for Affinity Health

Yet another covered entity has been hit with over $1 million to settle potential violations of HIPAA, this time for improper disposal of photocopiers.  Last week, OCR announced a settlement had been reached with Affinity Health Plan, Inc. ("Affinity"), a managed care plan in New York, for potential HIPAA violations stemming from a breach in 2010. 

Affinity had reported the breach after it was informed by CBS Evening News that confidential medical information was on the hard drive of a photocopier previously leased by Affinity.  Although the breach was originally estimated to have affected over 400,000 individuals, as reported by DataBreaches.net, OCR noted in its press release regarding the Resolution Agreement that up to 344,579 individuals were reported as potentially affected by the breach.

CBS had purchased the copier along with three others as part of an investigatory report on digital photocopiers and identity theft.  As reported by CBS,

...it wasn't until hitting "print" on the fourth machine - from Affinity Health Plan, a New York insurance company, that we obtained the most disturbing documents: 300 pages of individual medical records. They included everything from drug prescriptions, to blood test results, to a cancer diagnosis. A potentially serious breach of federal privacy law.

OCR stated in its press release that Affinity failed to erase the data contained on copier hard drives when it returned multiple copiers to leasing agents. In addition, the photocopiers were not addressed as part of Affinity's risk assessments or policies.  OCR Director Leon Rodriguez stated,

This settlement illustrates an important reminder about equipment designed to retain electronic information: Make sure that all personal information is wiped from hardware before it's recycled, thrown away or sent back to a leasing agent....

As part of the Corrective Action Plan ("CAP"), Affinity must use "best efforts" to retrieve all photocopier hard drives that were previously leased and safeguard all ePHI maintained therein, within five days, certifying such to OCR in writing (and yes, these copiers may have been returned years ago).  Affinity is also required to conduct an analysis of the risks and vulnerabilities to ePHI, incorporating all electronic equipment and systems, and to develop a plan to address and mitigate any risks and vulnerabilities that are identified.

I hardly need to point out that this resolution agreement is yet another scary reminder and lesson to all covered entities that electronic PHI (ePHI) needs to be disposed of and properly wiped wherever it may reside.  Although servers, desktops, laptops, and mobile devices immediately come to mind as potentially holding ePHI, many covered entities may still be unaware that most newer photocopiers are capable of saving a digital image of documents that are faxed, scanned or copied.  And many covered entities may also not be properly conducting HIPAA-required security risk analyses/assessments for all other equipment that could maintain or transmit ePHI.

To that end, I re-emphasize the following lessons that we've learned from Affinity and some of its companions on OCR's hot seat who have paid dearly as shining examples of how NOT to safeguard PHI and ePHI:

  • Address the need for encryption for everything that holds ePHI, from laptops to mobile devices and yes, even photocopiers
    • Idaho Hospice ($50K)
    • Providence Health ($100K)
    • Mass Eye/Ear ($1.5M)
    • Alaska DHSS ($1.7M)
  • Dispose of ePHI properly
    • CVS ($2.25M)
    • Rite Aid ($1M)
  • Do not remove PHI or ePHI from your facilities without assessing the risks and safeguarding it
    • Mass General ($1.5M)
  • Choose your Business Associates wisely (and have written BAAs with them)
    • BCBS Tennessee ($1.5M)
    • Arizona Cardiologists ($100K)
  • Conduct COMPLETE risk assessments that address all ePHI no matter where it may be located (and update them as needed)
    • BCBS Tennessee ($1.5M)
    • Idaho State ($400K)
    • Arizona Cardiologists ($100K)
    • Wellpoint ($1.7M)
  • Have written policies (and actually implement them)
    • Rite Aid ($1M)
    • CVS ($2.25M)
    • Cignet Maryland ($4.3M)
    • Mass General ($1.5M)
  • COOPERATE with OCR!
    • Cignet Maryland ($4.3M)

The full Press Release and Resolution Agreement with CAP can be found on OCR's website.

Document Disposal Company Responsible for Old Patient Records Found in Park

Over 277,000 patients were notified by Texas Health Harris Methodist Hospital in Fort Worth ("Texas Health Fort Worth") earlier this month of a breach of their health information.  Only patients seen between 1980 and 1990 whose records were maintained on microfiche are affected or potentially affected by the breach. 

Texas Health Fort Worth's business associate, document destruction company Shred-it, was contracted to dispose of the old microfiche records. As reported by the Star-Telegram, because the microfiche could not be destroyed on-site, Shred-it was to transfer the records to another facility for destruction.

Somehow "lost" or misdirected during transit, the records found themselves in a park where a concerned citizen found them and contacted the Dallas police.  Records were reportedly found in at least two other public locations, and contained names, addresses, Social Security numbers, birth dates and health information. As Texas Health Fort Worth stated in a press release,   

We have no knowledge that any of the information included on the microfiche has been accessed or used inappropriately.  Furthermore, microfiche is no longer commonly used and specialized equipment is needed to read the information it contains.

While certainly it is unlikely that the average Joe has access to microfiche equipment, it is inexcusable that the records wound up in a park, of all places, to begin with. Although Shred-it "assured" Texas Health Fort Worth that it took appropriate action as a result of the incident, Texas Health Fort Worth has switched vendors.  I would expect other hospitals in the area to follow suit. It remains to be seen whether OCR will investigate Shred-it for this incident. 

WellPoint Hit with $1.7 Million for Security Weaknesses in Online Application

The increasingly heavy-handed OCR announced yesterday yet another resolution agreement for HIPAA violations, this time hitting WellPoint Inc., a managed care company, with $1.7 million for an Internet breach that occurred between 2009 and 2010 and affected over 600,000 individuals.  HHS stated in the press release,

This case sends an important message to HIPAA-covered entities to take caution when implementing changes to their information systems, especially when those changes involve updates to Web-based applications or portals that are used to provide access to consumers’ health data using the Internet.

Data (including names, birth dates, Social Security numbers and health information) was left unsecured in a web-based application database after an upgrade.  The resolution agreement alleges that the data was disclosed improperly over a five-month period.  HHS indicated that:

  • WellPoint failed to implement policies for authorizing access to ePHI;
  • WellPoint failed to perform an "adequate" technical evaluation after a software upgrade affected authentication controls; and
  • WellPoint failed to implement technology to verify (authenticate) access to ePHI by authorized individuals.

Covered Entities affiliated with WellPoint include certain Anthem, Blue Cross and Blue Shield, and UNICARE health plans, among others.  There was no Corrective Action Plan accompanying the resolution agreement, which seems to indicate that OCR was satisfied with the mitigation steps WellPoint took after the fact. However, the Indiana attorney general's office had filed suit against WellPoint back in 2010 for failing to provide notification as required under state breach laws, and the Connecticut attorney general's office opened an investigation as well.

For entities planning software and other upgrades and modifications (all you "Meaningful Users," to start), you can retrieve a copy of the news release and resolution agreement here to give to, and hammer home with, your Security Officer and IT department.

Lessons from the Idaho State University CAP

Back in 2011, Idaho State University (Idaho State) experienced a breach of PHI affecting approximately 17,500 individuals after firewalls on servers at one of its outpatient clinics were disabled. It appropriately notified HHS in August of that year, whereafter (surprise, surprise) HHS informed Idaho State that it would be investigating its compliance with HIPAA.

HHS released news of its settlement with Idaho State on May 21, 2013, with Idaho State agreeing to pay $400,000 and enter into a Corrective Action Plan (CAP) to resolve allegations that:

  • It did not conduct a risk analysis for over five years;
  • It did not implement security measures sufficient to reduce risks and vulnerabilities to ePHI for that same period; and
  • It did not implement procedures to regularly review information systems activity for that same period.

As part of the CAP, Idaho State, which operates as a hybrid entity with several covered entity components, must beef up its documentation and specifically designate its covered entity components (i.e., its outpatient clinics). Unsurprisingly, Idaho State is also required to provide HHS with its most recent risk management plan and information systems activity policies for "review and approval" by HHS.  Idaho State must also complete and submit a compliance gap analysis indicating all changes to compliance status with the required provisions of the Security Rule. 

Although Idaho State experienced a breach of PHI AND was informed in November of 2011 that HHS was investigating its compliance with HIPAA, according to HHS, Idaho State did not get around to performing a risk assessment, reviewing information systems activity or identifying gaps in security measures until the summer of 2012 and post-Thanksgiving, November 26, 2012. It is baffling that, after experiencing a breach which was caused by firewall protections being physically disabled for over 10 months, Idaho State appears to have not done much to assess and safeguard against future problems. 

Or did it? Maybe it was just too little, too late. But part of Idaho State's problem could simply have been that it couldn't prove what steps it had taken toward HIPAA security compliance.  Although Idaho State clearly dropped the ball in failing to realize firewall protections were disabled for almost a year at its Pocatello Family Medicine Clinic, it may have been more compliant than the CAP suggests and simply had nothing to show.

Increasingly, covered entities are realizing that saying and believing they are HIPAA compliant is about as effective with OCR as your teenager telling you he cleaned his room as he runs out the door to the movies.  It's like high school all over again: if you can't "show your work" and prove your HIPAA compliance through documentation, regular reports and reviews, and clearly defined privacy and security policies and procedures, OCR simply isn't going to buy it when it shows up at your door.

To be sure, many covered entities have been completely lax about security until now.  Conducting a comprehensive risk assessment (documenting that it was done and periodically reviewed) and having processes in place for ongoing risk management are some of the biggest things OCR has repeatedly been driving home.  Too often, as Idaho State's CAP illustrates, security risk assessments are inadequate and fail to properly identify security risks and vulnerabilities to ePHI. 

On the other hand, many covered entities think that they are compliant with the Security Rule, but really aren't.  A covered entity may conduct a risk assessment of its EHR or EMR, for example, but fail to assess the security risks and vulnerabilities associated with other systems that feed into it or maintain PHI, or with workflow processes, resulting in PHI accidentally being made available online (think Phoenix Cardiac Surgery or Stanford Hospital). Furthermore, where risks and vulnerabilities are identified, appropriate security measures are not always evaluated and action taken as needed to correct them.

As we can see from Idaho State, performing a comprehensive risk assessment now isn't necessarily going to cure your failure to do so before, and an overwhelming number of covered entities could still be in the hot seat even if they are actively beefing up their HIPAA privacy and security programs. And there's still the risk that what has been and is being done is simply too little to satisfy OCR.

However, good faith efforts and diligence to bring your organization into compliance with the Security Rule implementation standards and specifications will go a long way toward lessening the likelihood and impact of an unwanted OCR investigation, not to mention minimizing the risk of breach and harm to your patients and organization.  It is far easier to seek forgiveness for past transgressions from OCR with a robust updated HIPAA security management program in hand. 

The $1.7 Million Flash Drive...Alaska Medicaid Settles HIPAA Violations

Even state agencies are not invisible to the all-seeing eye of OCR.  The use, and subsequent theft, of an unencrypted flash drive cost the Alaska Medicaid agency $1.7 million, according to the Office for Civil Rights (OCR) in a news release issued yesterday. According to OCR, an employee of the Alaska Department of Health and Social Services (ADHSS), the state's Medicaid agency, had an unencrypted flash drive possibly containing PHI stolen from his car back in October 2009.  ADHSS reported the breach promptly to OCR, which began an investigation in the beginning of 2010.

In the Resolution Agreement, OCR stated that ADHSS had failed to:

  • Complete a HIPAA risk analysis;
  • Implement sufficient risk management measures;
  • Complete security training for ADHSS workforce members;
  • Implement device and media controls; and
  • Address device and media encryption.

The Resolution Agreement requires ADHSS to revise and submit to OCR its policies and procedures relating to access to e-PHI, specifically with regard to tracking and safeguarding devices containing e-PHI, encryption, disposal and re-use of such devices, responding to security incidents, and appropriately applying sanctions for violations. In addition, ADHSS is required to conduct a risk assessment of the confidentiality, integrity and availability of e-PHI, and to implement security measures sufficient to reduce the risks and vulnerabilities identified.  The Resolution Agreement also requires ADHSS to provide specific training on the new policies.

We all know the considerable security risks that accompany the use of unencrypted flash drives, laptops and other portable devices and media by employees, residents and other workforce members -- now with a hefty price tag of $1.7 million.  Even for entities that have policies and procedures in place prohibiting the use of such unencrypted devices, or that implement software that automatically encrypts any information saved to such devices, clearly communicating and enforcing these and the entity's other security policies and procedures is critical to avoiding security breaches and defending against potential OCR audits.

While encryption isn't per se required to be implemented by HIPAA, it is an "addressable" implementation specification of the Security Rule.  This means that you must assess whether encryption would be "reasonable and appropriate" for ePHI "at rest" and in transmission, and if it is not appropriate, clearly have in place alternative safeguards and mechanisms to secure electronic PHI.  It has become all too clear that not encrypting flash drives, laptops, hard drives and other devices and media that can potentially leave the safety of your facility can not only result in a reportable security breach, but also some serious explaining to OCR when it comes knocking on your door. And remember, if a security incident occurs and the information that was stored or transmitted was encrypted, you are likely not required to notify patients that a security breach has occurred.
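For readers on the technical side, below is a minimal, illustrative sketch of encrypting a file "at rest" before it is copied to a portable device. It is not a statement of what HIPAA or HHS guidance requires; it simply shows the idea that data on a lost device is unreadable without a key kept elsewhere. The sketch assumes Python 3 and the third-party "cryptography" package, and the file names and key-handling approach are hypothetical examples only.

# A minimal sketch, assuming Python 3 and the "cryptography" package
# (pip install cryptography). File names below are hypothetical examples.
from cryptography.fernet import Fernet

def encrypt_file(plaintext_path: str, ciphertext_path: str, key: bytes) -> None:
    """Encrypt the contents of plaintext_path and write them to ciphertext_path."""
    fernet = Fernet(key)
    with open(plaintext_path, "rb") as f:
        data = f.read()
    with open(ciphertext_path, "wb") as f:
        f.write(fernet.encrypt(data))

def decrypt_file(ciphertext_path: str, key: bytes) -> bytes:
    """Return the decrypted contents of ciphertext_path."""
    fernet = Fernet(key)
    with open(ciphertext_path, "rb") as f:
        return fernet.decrypt(f.read())

if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice, store the key separately from the device
    encrypt_file("patient_export.csv", "patient_export.csv.enc", key)
    # Only someone holding the key can recover the data from the encrypted copy.
    print(decrypt_file("patient_export.csv.enc", key)[:50])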

To help assess whether your security management process will stand up to OCR review, keep an eye out for our next post reviewing the newly released OCR Audit Protocol for the HIPAA performance audits.

Mass. AG Levies $750K Judgment on Hospital for Data Breach

Massachusetts Attorney General Martha Coakley announced on May 24, 2012 having reached a settlement agreement with South Shore Hospital for failure to protect personal and confidential health information of over 800,000 patients. 

“Hospitals and other entities that handle personal and protected health information have an obligation to properly protect this sensitive data, whether it is in paper or electronic form,” AG Coakley said. “It is their responsibility to understand and comply with the laws of our Commonwealth and to take the necessary actions to ensure that all affected consumers are aware of a data breach.”

The consent judgment requires South Shore Hospital to pay a total of $750,000, including $250,000 in civil penalties and $225,000 toward an education fund for the protection of PHI and personal information.  However, South Shore Hospital received a $275,000 "credit" for security measures it implemented after the breach occurred, leaving $475,000 payable.

The consent judgment also requires South Shore Hospital to undergo an audit of certain security measures and report the results, as well as take steps to ensure compliance with HIPAA business associate provisions and other federal and state security requirements.  In addition to failing to comply with HIPAA business associate obligations, South Shore Hospital also failed to comply with HIPAA and state obligations to implement appropriate safeguards, policies, and procedures to protect patient information and to appropriately train its workforce in safeguarding the privacy of PHI. It also neglected to ensure that the contractor itself had procedures in place to protect such PHI, according to the AG.

Three boxes full of unencrypted computer backup tapes had been sent to a subcontractor of Archive Data Solutions in 2010 to be erased and resold; however, the subcontractor only received one of the boxes and the remaining two were never recovered.  According to the AG's office, South Shore Hospital did not have a business associate agreement in place with the contractor nor had it informed Archive Data that the backup tapes contained PHI.

The backup tapes contained Social Security numbers, names, financial account numbers, and medical diagnoses.  As reported by HealthDataManagement, South Shore Hospital had determined in July 2010 that the loss of the backup tapes was not a breach requiring individual notice to affected and potentially affected individuals.  Rather, it posted a prominent notice on its website, citing state law provisions permitting alternative notification where costs would exceed $250,000 or where over 500,000 residents are affected.

It is unclear whether this breach was reportable and therefore actually reported to the Department of Health and Human Services (HHS) under the HITECH Breach Notification Rule.  Although the PHI here was unencrypted and therefore "unsecured" within the meaning of the HITECH Breach Notification Rule, covered entities are also required to conduct an assessment to determine whether an incident poses a "significant risk of harm" to the individual(s) that would give rise to a reportable breach.  Most importantly, a breach in and of itself does not automatically mean a HIPAA violation has occurred.

If a covered entity determines that there was a breach, all affected individuals and individuals reasonably believed to be affected are required to receive written notice of the breach, as is HHS where over 500 individuals have been affected.  HITECH also permits alternative notification, but only where the contact information of an individual is incomplete or where written notice has been returned undeliverable to the covered entity attempting to notify such individual of a reportable breach.

Aside from South Shore Hospital's obvious failure to obtain a business associate agreement and, apparently, even to inform Archive Data that it was a business associate subject to certain HIPAA provisions, it is unclear what else South Shore Hospital did or failed to do that contributed to the $750,000 settlement agreement and the other alleged HIPAA and state law violations.  The AG's office noted that multiple shipping companies had handled the backup tapes, but did not otherwise indicate whether it was the lack of policies and procedures for safeguarding PHI and training the workforce in such safeguards that resulted in the missing backup tapes (again, a breach itself does not automatically mean a HIPAA violation has occurred) or whether the focus was on the hospital's overall HIPAA and state law compliance program.

What is even more noteworthy is that the AG stated South Shore Hospital failed to determine whether Archive Data had sufficient safeguards in place to protect the PHI it would receive on the backup tapes prior to destruction.  This clearly places an obligation upon covered entities to go beyond ensuring that the business associate agreement itself is in compliance with HIPAA by requiring the business associate to implement reasonable safeguards to protect PHI.

While covered entities have always been, and should be, responsible for appropriate oversight and monitoring of their business associates, just how far is a covered entity responsible for going?  Does a hospital need to request that the business associate provide copies of its policies and procedures for safeguarding PHI? Policies and procedures for data destruction or erasing data?  Information on how its staff is trained on the business associate's obligations under HIPAA and the business associate agreement? 

And if a hospital is not satisfied with a business associate's policies and procedures, can it require additional safeguards and processes be implemented? Should a hospital also require notification by a business associate of potential breaches and security incidents to safeguard against bad calls? With business associates frequently resisting the inclusion of any provisions in a business associate agreement beyond the bare minimum required by HIPAA, covered entities may find it increasingly difficult to provide the required levels of oversight, safeguards and assigned responsibility.

With over 22% of reported breaches since 2009 involving business associates, as reported by HealthcareInfoSecurity, and with only one case so far (see the Minnesota AG's case against Accretive Health) targeting business associates directly for HIPAA violations, covered entities remain liable for the actions of their business associates, even though business associates are now directly subject to certain HIPAA provisions. Covered entities also bear the brunt of a breach, as it is their patients who may be seriously harmed.  As the allocation of liability for breaches and other security incidents between a covered entity and the business associate involved remains quite uncertain for now, the business associate regulations (expected "soon" ever since last year) will be a welcome ray of clarity for covered entities and business associates alike.

Yet Another Medicaid Breach; Emory Loses Back-up Discs

This April appears to have been designated "National Breach" month.  In what is the second massive breach of Medicaid data this month, over 200,000 South Carolina Medicaid beneficiaries have been notified of a breach of their health information.  The South Carolina Department of Health and Human Services discovered on April 10 that an employee had emailed 17 spreadsheets of beneficiary information to his personal email account, including names, addresses, Social Security numbers and Medicaid ID numbers, but no medical information.

The former employee and project manager, Christopher Lykes, has since been fired and arrested, charged with five counts of confidentiality violations under the South Carolina Medically Indigent Assistance Act and one count of disclosure of confidential information, according to ABC News, Charleston. According to Department of Health and Human Services Director Anthony Keck, the records were transferred to at least one other person, although it is not yet known why the information was accessed.

Investigations showed that the information was available through normal reporting processes; however, Department policies and procedures did not require employees to justify their need for the information, a gap the Department has since rectified.  An external IT consultant has also been hired to conduct a full risk assessment of all data and IT systems.

As I posted earlier this month (see my previous blog, Utah Medicaid Claims Data Hacked), this is the second Medicaid breach this month.  Utah, at least, can blame European hackers for its breach, rather than its own policies and procedures. The affected count there has since skyrocketed from the original estimate of 24,000 to almost 800,000 Medicaid beneficiaries, individuals who received health services and whose Medicaid status may have been inquired about by their health care provider, and CHIP recipients, making it one of the top breaches reported over the past few years. The Utah Department of Health has updated its toll-free number for Medicaid clients to call and added additional information about the breach on its website.

And finally, continuing the April breach theme, Emory Healthcare Systems reported this past week that 10 back-up discs went missing from storage at Emory University Hospital, containing data of 315,000 patients, including likely its own CEO's information.  Oops.  The data related to surgical patients treated at several Emory facilities from September 1990 through April 2007 and contained names, social security numbers, dates of surgery, diagnoses, and surgical codes, as well as names of surgeons and anesthesiologists. 

Patients were notified beginning April 17, although the discs went missing sometime in February.  Emory stated in its online notice that it does not believe any of the data was or will be misused, as the backup discs were for an obsolete software system long deactivated by Emory.  However, Emory has offered one year of free credit monitoring and has implemented additional data security control measures.

Along with the recent $100,000 settlement agreement between HHS and a Phoenix cardiac surgeons group, these breaches hammer home the need for a comprehensive HIPAA Compliance Program and periodic risk assessments.  See Helen's post last week for the significance of this settlement agreement and the steps covered entities can take to protect themselves against breaches and privacy and security violations. 

Utah Medicaid Claims Data Hacked Affecting Over 24,000

The Utah Department of Health (UDOH) has experienced a data breach of its Medicaid claims data of over 24,000 individuals.  The breach was reported to UDOH by the Utah Technology Services Department on Monday, April 2nd, and while the initial hacking is suspected to have occurred on Friday, March 30th, UDOH stated that information began to be removed from the server on Sunday, April 1 (perhaps merely coinciding with April Fools' Day...). 

Currently, UDOH suspects the hackers originated from Eastern Europe and, according to Reuters, has been able to pinpoint the attack to within certain countries.  The Department of Technology Services had recently moved the claims data to a new server, and, despite a multi-layered security system, the hackers were able to circumvent it and potentially access client names, addresses, birth dates, Social Security numbers, physicians' names, national provider identifiers, addresses, tax identification numbers, and procedure codes for billing.

UDOH is still investigating the scope of the breach and has yet to determine exactly what types of information were compromised as well as the identities of all of the affected Medicaid clients.  So far, UDOH believes only one server was hacked.  The affected server was shut down and new security measures were implemented, according to Reuters and UDOH.

UDOH is currently advising all Medicaid clients to monitor their credit and bank accounts until those affected can be fully identified and notified.  According to KSL.com, Technology Services Executive Director Steve Fletcher said the server had "weaker controls" than the original server it replaced.  However, Fletcher stated that the agency will investigate further to assess how the hackers were able to circumvent the security system and will do whatever may be necessary to prevent future breaches.

"These hackers are very, very sophisticated and that's one of the things that we want to document so that we can to put more controls in place to make sure that it will not happen again," stated Fletcher.

For more information, check out the UDOH official statement and the Reuters and KSL.com articles.    

Peeling Back BCBS's $1.5 Million HIPAA Settlement Onion


As many of our readers have already heard, on March 13, 2012, HHS announced that Blue Cross Blue Shield of Tennessee had entered into a Resolution Agreement for $1.5 million to settle potential violations of HIPAA. You can access a copy of the Resolution Agreement here.

I find this new case both instructive and frightening, but one has to peel back the layers of this HIPAA-onion to really understand why the Resolution Agreement between BCBS of Tennessee (BCBSOTenn) and HHS/OCR creates an even greater nerve-racking precedent than may be immediately apparent.

First, it must be noted that OCR initiated its investigation of the Breach incident and BCBSOTenn only after BCBSOTenn submitted its HITECH Breach Report "in compliance with" 45 CFR §164.408.  Therefore, HHS/OCR appears to acknowledge that BCBSOTenn's reporting of the Breach was timely, proper and otherwise in compliance with the Breach Notification Rule.  And, while BCBSOTenn did not seem to get much reprieve here for its diligent Breach reporting, it's important to point out that just because a covered entity experiences a Breach does not in and of itself mean that the covered entity has violated the HIPAA Privacy or Security Rule.  A covered entity must actually fall short of or be non-compliant with a HIPAA Privacy Rule standard or Security Rule standard before an actual violation can be found.

So, at least hypothetically, a covered entity could still be in full compliance with the HIPAA Privacy and Security Rules, even if it experienced a Breach involving or potentially compromising PHI.

In such a situation, as long as the covered entity properly and timely reports the Breach as required under the HITECH Breach Rule, and has a fully compliant, current and effective HIPAA compliance program implemented, then the covered entity should be able to assert that there were no violations of HIPAA or HITECH to give rise to HHS/OCR assessing penalties against it.  However, at least for BCBSOTenn, apparently the costs and burden of going through an investigation to prove that the Breach was not due to an underlying lapse in its HIPAA compliance program were not worth it, or at least not $1.5 million.

What may be most chilling from a compliance perspective here, however, is that the Breach incident itself was allegedly caused by an intervening criminal act, and that BCBSOTenn had presumably paid Eastgate to provide security services to safeguard the data closet where the video and audio recordings were being temporarily stored until their scheduled relocation at the end of November 2009.  Indeed, it seems that Eastgate did have a number of appropriate physical safeguards in place, including biometric and keycard scan security with a magnetic lock, an additional door with a keyed lock, and basic security services.

So, if BCBSOTenn contracted, paid for and relied on Eastgate to provide security services, one would think that it would be reasonable for BCBSOTenn to believe that it had taken appropriate steps to safeguard the e-PHI while it was temporarily stored at the data closet.  What is not discussed in the Resolution Agreement, but would be interesting to know, is whether BCBSOTenn's contract with Eastgate included HIPAA BAA-type language to ensure that Eastgate was aware of the sensitive nature of what it was securing (i.e., e-PHI), and to contractually obligate Eastgate to have in place at least minimum administrative, technical and physical safeguards with regard to how it ensured the security of the data closet.  This illustrates a good lesson: while a security vendor or a building manager may not technically be a HIPAA BA as historically defined by HHS (because such third parties are not required to access PHI to perform their function on behalf of the covered entity), in any instance where a covered entity relies on a third party to ensure the security of its PHI or e-PHI -- including software vendors, data warehouses, cloud providers and other similar types of third parties -- it is important to have such third party contractually agree to have in place HIPAA BA-type safeguards, and to agree to be responsible for any damages that may arise from a Breach that is due to its own negligence. In this case, Eastgate did not respond to evaluate an unresponsive gate for the entire weekend. While it is not clear whether this may or may not have been negligent on the part of Eastgate, hopefully BCBSOTenn had provisions in its agreement with Eastgate that required insurance coverage for such incidents and that will allow BCBSOTenn to potentially make a claim for indemnification if there was indeed fault on the part of Eastgate.

Finally, despite the fact that the theft of the e-PHI was the event that precipitated HHS/OCR's investigation here, it almost seems that the settlement with BCBSOTenn had less to do with the actual Breach incident itself and more to do with what HHS/OCR may have believed could be lacking in BCBSOTenn's general HIPAA compliance program.  In fact, the corrective action plan (CAP) in the Resolution Agreement does not include any requirement to take actions, like encryption, with regard to similarly stored data devices.  Instead, the CAP focuses on HHS/OCR having the opportunity to review BCBSOTenn's written policies for conducting a risk assessment, conducting a risk management plan, addressing facility access controls and a facility security plan, and addressing physical safeguards governing the storage of e-PHI.  The CAP also requires such policies to be revised if HHS/OCR suggests "material changes" to the policies, and to be distributed to all BCBSOTenn workforce members, who must then sign a certification of receipt and be retrained. Now, while that is all well and good, I wonder about HHS/OCR's focus on BCBSOTenn's workforce when it was Eastgate's employees who failed to respond to the lapse in security.  At least in this instance, then, the real security gap seemed to be with BCBSOTenn's contracted security vendor's workforce, not its own.

This case certainly raises questions and concerns with investigation and enforcement processes, but also offers some instruction.  First, it is important for covered entities to review their contracts with third parties that may have access to PHI, and most certainly if such third party may be directly or indirectly responsible for ensuring the security of its PHI.  Covered entities should include clear language regarding allocation of responsibility for security, and severe repercussions, including potential indemnity, if the vendor falls short. Contracts with technology vendors, cloud providers, facility security providers, and the like are all potential areas where security weaknesses and gaps may exist.

Finally, while the outcome of the BCBSOTenn situation may tempt many to be more hesitant with reporting Breaches to HHS, that is not advisable.  Not reporting a Breach incident when it is legally required to be reported under the law could just lead to additional potential penalties for violations of the HITECH Breach Rule.  Thus, while Breach reporting clearly can lead to an OCR investigation, as it did here, the best defense may be for covered entities and business associates to ensure that their HIPAA Policies and Procedures are well-developed, updated, and implemented so that they can all be handed to HHS/OCR as proof of full HIPAA compliance, despite any Breach incident having occurred.

 

Feb 29th is Last Day to Report Breaches of <500 to HHS!

For those who have been logging their "small" Breaches (i.e., those affecting fewer than 500 individuals) and waiting to report them to HHS at the end of the year, next Wednesday, February 29th, is the LAST day to get your information entered into HHS' Breach reporting website.  While covered entities may opt to report each small Breach to HHS throughout the year (i.e., including the onesies and twosies), the other option is to log each small Breach during the calendar year and report all such small Breaches to HHS within 60 days of the end of that calendar year.

A couple of important points to note about reporting small breaches to HHS:  

First, the HHS-reporting "buck" stops with the covered entity, not the Business Associate. Even if a breach was caused by a Business Associate (BA), under the current HITECH Breach Rule the BA's only reporting obligation is to the covered entity; the covered entity is solely responsible for reporting all reportable Breaches to HHS. 

Second, follow a 'GOLDILOCKS rule' of 'Not too much, not too little -- just right'. Covered entities must report all relevant information requested on HHS' online reporting form. However, there are several fields that ask for a typed response.  For example, HHS asks for a "brief description of the breach," including how it happened, any additional information about the breach, and the type of media and PHI involved. HHS similarly asks the covered entity to describe "other actions taken" in response to the Breach. But, while a covered entity must report what it is required to report by law, offering too much information (including impermissibly disclosing patients' PHI, among other things) could land the covered entity in hot water.

Finally, you had better have remembered to collect ALL the required information on your Breach Log!  A covered entity that is planning to report small Breaches at the end of the calendar year must plan ahead and know what information to collect and document, and hint: it's a lot of information that you might not be able to recall at the end of the year unless you documented it as you went along (a simple log structure is sketched after the list below).  The information that covered entities should be collecting about each "small" breach includes:

  • Date of the Breach? 
  • Date the Breach was Discovered?
  • Approximate number of individuals affected?
  • What "type" of breach was it? (select: theft, loss, improper disposal, unauthorized access, hacking/IT incident, other, or unknown)
  • Location of the Breach? (select: laptop, desktop computer, network server, e-mail, portable electronic devices, electronic medical record, paper, other)
  • What type of information was involved? (select: demographic info, financial info, clinical info, other)
  • What safeguards were in place prior to the Breach? (select: firewalls, packet filtering, secure browser sessions, strong authentication, encrypted wireless, physical security, logical access control, antivirus software, intrusion detection, biometrics)
  • Date individuals were notified? (note: this date should never be more than 60 days after the Date of Discovery entered, and in any case any "unreasonable delay" in notifying individuals (even if less than 60 days) could trigger a closer look by HHS)
  • Actions taken in response? (select: privacy & security safeguards, mitigation, sanctions, policies and procedures, or other)
Even though HHS withdrew the final Breach Notification Rule from review during the summer of 2010 (and even though we continue to wait for a final revised version of that rule to be published), covered entities are still required to report all Breaches (if there is a positive "Harm" determination) to HHS. HHS specifically points out on its website that "[u]ntil such time as a new final rule is issued, the Interim Final Rule that became effective on September 23, 2009, remains in effect."
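As promised above, here is a minimal, illustrative sketch of a breach-log record that captures the fields listed above. It assumes Python; the field names, category strings and the sample entry are hypothetical and should be adapted to the choices on HHS' actual form.

# A minimal sketch of a "small" breach log entry; field names and example
# values are hypothetical illustrations, not HHS-defined values.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class SmallBreachLogEntry:
    breach_date: date                # date the breach occurred
    discovery_date: date             # date the breach was discovered
    individuals_affected: int        # approximate number of individuals affected
    breach_type: str                 # e.g., "theft", "loss", "improper disposal"
    breach_location: str             # e.g., "laptop", "paper", "network server"
    information_types: List[str]     # e.g., ["demographic info", "clinical info"]
    safeguards_in_place: List[str]   # e.g., ["physical security", "encrypted wireless"]
    notification_date: date          # date individuals were notified
    actions_taken: List[str] = field(default_factory=list)

    def notice_within_60_days(self) -> bool:
        """True if individual notice went out within the 60-day outer limit."""
        return (self.notification_date - self.discovery_date).days <= 60

# Hypothetical example of one logged "small" breach
entry = SmallBreachLogEntry(
    breach_date=date(2011, 3, 2),
    discovery_date=date(2011, 3, 4),
    individuals_affected=3,
    breach_type="loss",
    breach_location="portable electronic device",
    information_types=["demographic info"],
    safeguards_in_place=["physical security"],
    notification_date=date(2011, 4, 1),
    actions_taken=["mitigation", "policies and procedures"],
)
print(entry.notice_within_60_days())  # True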

For Breach Notification training & education, click our Workshops button.

HIPAA Auditor Responsible for Breach in 2010

In June of 2010, a large healthcare system was informed by its business associate that a breach had occurred, affecting thousands of patients at its hospital.  The breach had occurred the previous month when an employee of the business associate lost an unencrypted flash drive that may have contained patient information.  Although the breach was reported last year, news regarding the breach appears to have begun circulating this past week, most likely due to the new role of the business associate in question.

The real kicker is that the business associate was none other than KPMG, the prominent auditing, advisory and tax company that was recently awarded $9.2M by the Office for Civil Rights (OCR) to conduct HIPAA privacy and security compliance audits.  Although the flash drive reportedly did not contain patient information such as Social Security numbers, addresses, personal identification numbers, dates of birth or financial information, the embarrassing fact remains that a KPMG employee used an unencrypted flash drive to carry around patient information.

Not only was I surprised at KPMG's responsibility for the breach, but also at the length of time that passed between KPMG's discovery of the loss of the flash drive (May 10, 2010) and the report that was sent to the covered entity regarding the loss (June 29, 2010).  Although KPMG just barely notified its customer within the HITECH sixty-day notice requirement, one has to wonder why it took so long for KPMG to discover that the device was missing and/or report it.

Although I am also curious as to why a KPMG employee would need to carry around patient information on a flash drive to begin with (especially an unencrypted one), this shows that a breach can happen to the best of us.  It also highlights a big problem for hospitals and other health care providers when it comes to security of patient information.  All too often residents, nurses and other health care providers copy patient information onto flash drives, laptops or other unencrypted devices which are easily lost or stolen.  These risks must be identified and aggressively managed by health care organizations and their business associates to minimize the risk of breach to such organizations and the patients they serve.

HealthLeadersMedia reports that Susan McAndrew, OCR deputy director for health information privacy, wrote in an email that the case was currently under investigation and, as such, OCR could not address KPMG's involvement in the breach.  When asked whether KPMG's involvement in the breach had been considered prior to awarding it the HIPAA audit contract, McAndrew stated,

The award of the HIPAA audit contract was the result of HHS' usual, rigorous, competitive process. Specific questions regarding the contract award are procurement sensitive.

The public notice made available by the hospital on its website stated that,

KPMG has told us the company is implementing measures to avoid similar incidents in the future, including additional training and the use of improved encryption for its flash drives.

Improved encryption? The flash drive that went missing reportedly did not have any encryption mechanisms.  One would hope, though, that KPMG has followed through and improved its security measures, given that it is now an OCR HIPAA auditor with the potential to access patient PHI and other information in the course of its auditing activities.

Spartanburg Breach Affects 400,000...But They're Not Telling

According to the Office for Civil Rights (OCR) webpage listing breaches of PHI affecting over 500 individuals, a theft affecting an originally unreleased number of patients turned out to have impacted approximately 4,000 patients...times 100.  You'd never guess from the short Press Release available on the Spartanburg Regional Healthcare System website, or, it appears, from any other information released by Spartanburg itself (see HealthDataManagement), but approximately 400,000 patients were affected by the theft of a Spartanburg computer from an employee's car on March 28, 2011.  Although certainly not the largest number of affected patients for a given breach incident (see, on the OCR website, for example, AvMed in 2009 with 1,220,000 affected patients, the North Bronx Healthcare Network with 1,700,000 last year, as well as Health Net with 1,900,000 for a breach this past January), the number places Spartanburg squarely within some of the largest breaches of patient information in the past few years.

Notice to thousands of patients of the theft began in late May of 2011.  According to the Press Release, the employee was authorized to have possession of the computer that was stolen.  It stated that Spartanburg had no reason to believe any information had been misused, as the file containing patient Social Security numbers, names, addresses and dates of birth had been password protected.  However, Spartanburg notified affected individuals that it had made available, free of cost, identity theft consultation and restoration as well as ongoing credit monitoring.

Surprisingly, the Press Release is devoid of any information regarding how many patients had been or could have been affected, and it does not appear that Spartanburg has acknowledged the high count other than in its required notice to HHS of the breach.  Although the full extent of the breach may not have been known to Spartanburg when it first discovered the breach and began to notify patients, the fact that it has still not publicly acknowledged the substantial number of patients affected is perplexing.

While the HITECH Act does not require that patient notification include how many individuals have been affected by a given breach incident, nor does it require the release of any sensitive information regarding the incident, downplaying (or at least avoiding acknowledgment of) the magnitude of the breach certainly wouldn't seem to me to be the top choice among PR options. Given that notice to HHS is required for all breaches affecting over 500 individuals and such information is made available on OCR's website, the information was destined to come out eventually.

Hospital Theft Leads to HIPAA Criminal Charges

An Alabama woman has been slapped with criminal charges in connection with the theft of patient information from Trinity Medical Center in Birmingham, Alabama, as reported by The Birmingham News.  Section 1320d-6 imposes criminal penalties where any person knowingly uses a unique health identifier or obtains or discloses individually identifiable health information in violation of HIPAA.

The young woman, identified as Chelsea Catherine Stewart, allegedly stole paper surgery schedules from a closed patient registration area at the hospital while visiting a patient.  Stewart was arrested at the beginning of June after hundreds of pages of the schedules were found by police in the house where she was staying, in connection with an ongoing investigation into mail theft and credit card fraud.

The schedules contained the names, dates of birth, Social Security numbers and certain medical information of approximately 4,500 patients of the hospital.  In addition to the patient information, an affidavit by postal inspector John Bailey stated there were handwritten notes containing information about other individuals that could be used for identity theft, as well as a "to-do" list of sorts for fraud.  Notes allegedly read, "Get hospital records together and run credit reports on people to get info."

The notice of the theft on Trinity Medical Center's website states,

"All stolen information has been recovered....The hospital has no reason to believe this information has been or will be used in a way that would cause harm."

However, Trinity Medical Center will be offering free credit monitoring to affected patients.  In addition to the notice on its website, the hospital also notified affected individuals of the theft by mail.

If convicted, Stewart could face the maximum criminal penalties under §1320d-6 for "intent to sell, transfer, or use individually identifiable health information for commercial advantage, personal gain, or malicious harm": up to 10 years in jail and $250,000 in penalties.  Stewart also faces unrelated charges of credit card fraud and breaking into a vehicle.

OCR Will Address Almost Everything in HITECH Omnibus Rule

HealthDataManagement has quoted Susan McAndrew, deputy director of health information privacy in the Department of Health and Human Services' Office for Civil Rights (OCR), as saying that the final rules implementing the HITECH Act are to be released within months, if not weeks.  Deputy Director McAndrew recently spoke at the Safeguarding Health Information conference OCR hosted with the National Institute of Standards and Technology (NIST) in Washington.

The long-awaited rule will be an "omnibus regulation" that is said to include final versions of:

  • the proposed rule to expand HIPAA privacy and security protections;
  • the Breach Notification Interim Final Rule;
  • the Enforcement and Compliance Interim Final Rule; and
  • the GINA proposed rule.

Notably, McAndrew is quoted as saying:

We want to ensure that when we do the final HITECH action it contains as much activity as we can

Significantly, HealthDataManagement reports that the omnibus final rule will cover new information protection requirements for:

  • business associates and subcontractors,
  • electronic access,
  • research authorizations,
  • student immunization records,
  • restrictions on marketing,
  • restrictions on fundraising, and
  • prohibition on the sale of protected health information.

McAndrew also indicated that a separate proposed rule will be issued after the omnibus regulation and will govern accounting for disclosures (AOD), even for payment, treatment and health care operations.  McAndrew is quoted as saying that the AOD proposed rule is "very close" to being ready.

Class Action Sought for Charleston Area Medical Center Breach

Patients affected by a West Virginia hospital breach that went undetected for several months are seeking certification as a class action, as reported by Health Data Management.  Five of the approximately 3,655 affected patients have filed suit against the Charleston Area Medical Center in circuit court, seeking damages based on four counts:

  • breach of confidentiality
  • negligence
  • invasion of privacy by intrusion on seclusion
  • invasion of privacy by unreasonable publicity of private life

The lawsuit, Tabata v. Charleston Area Medical Center, stems from the availability of a database containing patient names, Social Security numbers, medical information and demographic information on the Internet.  A family member of a patient had found the information while searching the web.

Although the database was created in September of 2010 by a third party for patient case management in a research subsidiary of the hospital, the fact that it had inadvertently been made publicly available went unnoticed until February 2011.  However, the hospital acted quickly upon being made aware of the breach and promptly notified all potentially affected patients within 8 days.

The hospital had originally offered to pay for one year of credit monitoring as well as an immediate credit freeze at the three credit bureaus for all affected patients.  Free credit reports were also made available to affected patients through the West Virginia Attorney General's Office.  In addition, after discussion with the Attorney General's Office, the hospital hired a risk management group to conduct a security assessment and undertook a number of other measures to protect against further breaches.

As part of the damages, the patients seek to have the hospital extend additional credit and identity protection and monitoring services.  They also ask the court to require that the hospital establish a specific security program, as well as award monetary damages for annoyance, embarrassment and emotional distress, and for the lack of security and violation of their privacy.

Although it is unclear yet what repercussions the hospital may face from the Department of Health and Human Services for the breach, the breach and accompanying lawsuit highlight the importance of monitoring business associates who have access to PHI and the resulting work product.  In addition, frequent and periodic security assessments are crucial to identifying issues before an incident or breach occurs.  A robust and proactive security assessment coupled with a strong information security program will go a long way toward effectively safeguarding patient electronic PHI as well as cutting the costs associated with incident response.

    Security Breach Response: Lessons Learned from the Epsilon Breach

    Does the notice below look familiar?

    Chase is letting our customers know that we have been informed by Epsilon, a vendor we use to send e-mails, that an unauthorized person outside Epsilon accessed files that included e-mail addresses of some Chase customers.  We have a team at Epsilon investigating and we are confident that the information that was retrieved included some Chase customer e-mail addresses, but did not include any customer account or financial information.

    If it does, congratulations on being one of the unlucky millions affected by the data breach which occurred at Epsilon last week.  The largest distributor of "permission-based" email marketing, Epsilon serves some 2,500+ clients from JPMorgan and Chase to Target and Walgreens, sending over 40 billion emails on their behalf each year. 

    At some point on Wednesday, March 30, Epsilon's systems were hacked, resulting in millions of email addresses and names being stolen, presumably in order for hackers to send mass spam and convincing "phishing" emails to consumers.  I first became aware of the breach on Monday, April 4, when I received the above notice from Chase, followed quickly by notices from Target, 1-800-Flowers and a variety of other smaller companies over the next two days. 

    As I received the latest emails this morning (World Financial Network National Bank, or WFNNB, and Citibank), I couldn't help but be impressed with how quickly Epsilon was able to detect the data breach, notify law enforcement, and notify its clients affected by the breach, reportedly about 50 companies.  The turnaround time within which many of the affected clients notified their consumers was equally impressive, especially given that these companies likely only received notice from Epsilon right before or over the weekend.

    I immediately wondered: would such a response have been equally efficient and effective if the data breach had occurred within the HIT systems of a business associate of a hospital, or within the hospital itself?  Maybe yes and maybe no. 

    HITECH places stringent security breach notification requirements and timeframes on covered entities and business associates who experience breaches of PHI.  In addition, state laws, such as New Jersey's Identity Theft Prevention Act, place breach notification requirements on these and other entities with regard to certain personal information.   

    Covered entities, as we are all too aware, are certainly not immune from the risk of security breaches.  Many covered entities may not have detailed policies and procedures for detecting and responding to breaches of PHI.  For those that do, are these procedures effectively communicated to key management and employees so that they know how to appropriately react from the first sign of a breach through the sending of required notices?  In addition, how soon and by what mechanisms are business associates required to report breaches, or even suspected breaches, of PHI to the covered entity?

    Although only email addresses and names were stolen, the Epsilon breach underscores how important it is for covered entities to assess their security breach notification policies and procedures and to ensure key personnel know the steps for detecting, assessing and mitigating breaches of PHI, and their respective roles and responsibilities, BEFORE these individuals are placed in such a situation.

    A mere five calendar days (including the weekend) is quite impressive for a breach response involving so many different companies.  Although five days may be improbable or even impossible for a covered entity under the circumstances of a given breach, immediate and efficient action and communication are still crucial to an effective breach response.

    Just When You Think the Breach is Over, the Lawsuit Comes

    On November 16th, a class of plaintiffs sued AvMed for a massive breach that resulted in their personal information being put at risk.  In December of 2009, unencrypted laptop computers were stolen from an AvMed facility in Gainesville, Fla.  AvMed initially believed information on about 208,000 members was at risk, but by June 2010 it became apparent that the information of over 1.22 million members was at risk.  Information contained on the laptops included a mixture of names, addresses, dates of birth, Social Security numbers, phone numbers, and diagnosis, procedure and prescription information.  The attorneys representing the class of plaintiffs maintain that had AvMed taken the time to encrypt its laptops, this simple step would have obviated any harm done by the theft.  
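
    As an illustration only (the file name and key handling below are hypothetical, not a description of AvMed's actual environment), a minimal sketch using the Python cryptography package's Fernet recipe shows how little code separates readable member records from ciphertext that is useless to a thief who does not also have the key:

        # Minimal sketch (hypothetical): encrypting a member data file at rest so a
        # stolen laptop yields only ciphertext without the separately stored key.
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()   # keep the key off the laptop, e.g., in a key vault
        cipher = Fernet(key)

        with open("member_records.csv", "rb") as f:
            plaintext = f.read()

        with open("member_records.csv.enc", "wb") as f:
            f.write(cipher.encrypt(plaintext))

        # An authorized user holding the key can later recover the data:
        # original = Fernet(key).decrypt(open("member_records.csv.enc", "rb").read())

    In practice, whole-disk encryption tools such as BitLocker or FileVault accomplish the same end transparently for everything stored on the device, which is generally the more workable approach for laptops.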

    Like other breaches under HITECH involving PHI of 500 or more individuals, the AvMed breach is posted on HHS's Web site. However, although the federal government has enforcement jurisdiction over HITECH, there is still no private right of action that would allow an individual to sue under HITECH for breaches (although in the future, individuals may be eligible to collect a percentage of any Civil Monetary Penalties collected for violations of HIPAA and/or HITECH that result in "harm" to such individuals). 

    Attorneys attempting to sue for damages resulting from a breach are often hard-pressed to keep their complaint from being tossed unless they can demonstrate the plaintiff suffered actual harm caused by the breach. However, the attorneys representing the class of plaintiffs in the AvMed case are commercial litigators, so it will be interesting to see whether they come up with novel causes of action under consumer protection or other laws, and how these will be tested in court.  Stay tuned...

    Oh where, Oh where has the Security Breach Rule gone?

    Today, I was going to draft a follow-up article to my previous post to address whether notification was required under the Security Breach Notification Rule. However, when I sat down to begin typing, I discovered that the Breach Rule was gone! Well, maybe not exactly gone, but at least “withdrawn”.

    HHS recently posted on its website the following:

    At this time, however, HHS is withdrawing the breach notification final rule from OMB review to allow for further consideration, given the Department’s experience to date in administering the regulations. This is a complex issue and the Administration is committed to ensuring that individuals’ health information is secured to the extent possible to avoid unauthorized uses and disclosures, and that individuals are appropriately notified when incidents do occur. We intend to publish a final rule in the Federal Register in the coming months. Until such time as a new final rule is issued, the Interim Final Rule that became effective on September 23, 2009, remains in effect.

    So now what?

    For starters, HHS made clear in its last statement that the Interim Final Rule that went into effect in September remains in effect. Therefore, whatever “difficulties” HHS/OMB/OCR may be having with regard to administering or enforcing compliance, this does not automatically give covered entities or business associates a “free pass” to not report Breaches if an incident warrants such notification – to patients, OCR or otherwise – under the Interim Final Rule. As of today, OCR’s website set up to receive breach notifications is still up and running, and covered entities should continue to make any required reports of breaches through that website. Also, entities should be mindful that if their state has implemented a breach notification statute (as many have), then such state law must still be complied with. To see if your state has such a law, visit NCSL's website.  

    As for what HHS will do with the Breach Rule, it’s not clear. It has been suggested that the withdrawal may have been prompted by certain privacy groups’ concerns regarding the “Harm” threshold. In particular, the Harm threshold has been heavily criticized by some as creating a loophole to avoid reporting by Business Associates, who are required ONLY to notify the Covered Entity regarding a Breach (which then triggers the Covered Entity’s obligation to notify HHS and patients of the Breach caused by the BA). Specifically, as the Breach Rule currently reads, a Business Associate is entitled to go through the same Harm analysis in order to decide whether or not to report the Breach to the Covered Entity. Some have compared this to the “fox guarding the chicken coop”. Nevertheless, I don’t believe that this warrants complete removal of the Harm balancing from the Breach Rule, and here is why:


    Aetna "forgets" file cabinet full of patient information

    A reminder to all covered entities out there that may be considering selling their business – don’t forget your file cabinet!! (or computers ... or disks ... or seemingly “empty” boxes where PHI may be lurking ... well, you get the picture).

    NJ Times reports today that Aetna is notifying 7,250 people after paper files containing their PHI were accidentally left in a file cabinet that was being sold after an office move. The press release indicates that 2,346 New Jersey residents and 4,013 Pennsylvania residents were affected, along with a few in Connecticut and Delaware. Apparently, the files were voluntarily returned to Aetna after the individual who purchased the file cabinet discovered them. Aetna issued a press release indicating that it “has no reason to believe the information will be misused in any manner.” Nevertheless, Aetna is notifying affected individuals and offering them a credit-monitoring service. Aetna also indicates that it has many privacy policies and processes in place, but that corrective action will be taken to ensure that such a “mistake” does not happen again.

    The Aetna “breach” raises a number of interesting questions, many of which I am often asked about in similar contexts. Specifically: 1) Can PHI be disclosed in connection with a sale of a business? 2) Must a seller purge or maintain PHI that is not transferred in connection with the sale of such business? and 3) Who do I have to notify in the event of a breach?

    I’ll tackle Questions #1 & #2 in today’s post, and save #3 for follow-up.

    In certain limited circumstances, HIPAA actually does not require a patient’s written authorization to use or disclose PHI in connection with the sale of a business. A sale of a business is considered a “health care operation,” which is defined in the HIPAA Privacy Rule to include:

    “the business management and general administrative activities of the covered entity including, but not limited to … (iv) the sale, transfer, merger, or consolidation of all or part of such entity with another covered entity, or an entity that following such activity [or completed purchase] will become a covered entity, and the due diligence related to such activity.” See §164.501.

    Therefore, if Aetna had sold its filing cabinet to an entity that was acquiring its health plan business, then there would have been no breach under the federal standards. However, in this situation, it appears that the patients’ files were simply inadvertently left in Aetna’s file cabinet after furniture was sold to a random buyer in connection with an office move.  As such, there appears to have been a lapse in either following or implementing adequate safeguards.

    The HIPAA Privacy Rule requires covered entities to implement appropriate administrative, technical, and physical safeguards to protect PHI from intentional and unintentional uses or disclosures that violate the Privacy Rule (see § 164.530(c)(1)-(2)). However, it is the Security Rule that provides more detailed guidance on the types of safeguards that may be useful. Specifically, the Security Rule requires covered entities to:

    “implement policies and procedures that govern the receipt and removal of hardware and electronic media that contain electronic protected health information into and out of a facility, and the movement of these items within a facility.” (see §164.310(d)(1)).

    The Rule then goes on to require covered entities to implement policies and procedures to address the final disposition of electronic protected health information, and/or the hardware or electronic media on which it is stored (see §164.310(d)(2)(i) – Disposal). The Security Rule also requires covered entities to maintain a record of the movements of hardware and electronic media and any person responsible therefore (see §164.310(d)(2)(iii) – Accountability).
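
    To make the Accountability standard concrete, here is a minimal sketch of the kind of movement log a covered entity might keep; the field names, asset IDs and file name are purely illustrative and are not prescribed by the Rule:

        # Minimal sketch (illustrative only; the Security Rule does not prescribe a format):
        # a simple log of hardware/media movements and the person responsible for each.
        import csv
        from datetime import datetime, timezone

        def log_media_movement(asset_id, description, action, responsible_person,
                               log_path="media_movement_log.csv"):
            """Append one movement record (removal, transfer, sanitization, disposal) to a CSV log."""
            with open(log_path, "a", newline="") as f:
                csv.writer(f).writerow([
                    datetime.now(timezone.utc).isoformat(),
                    asset_id,
                    description,
                    action,
                    responsible_person,
                ])

        # Example entries: a laptop leaving the facility and a copier drive wiped before disposal.
        log_media_movement("LT-0412", "Laptop, billing department", "removed from facility", "J. Smith")
        log_media_movement("CP-0031", "Copier hard drive", "sanitized and disposed", "ABC Vendor (hypothetical)")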

    Although the Security Rule technically applies only to electronic PHI, the Aetna situation illustrates why it makes sense to implement similar sorts of controls for paper PHI. After all, if it makes sense to keep track of computers that store electronic PHI so that such information does not inadvertently end up in the hands of someone who should not have it, would it not make sense to implement similar safeguard controls for a file cabinet that “houses” paper PHI?

    It would seem so.