
$4 Million Stanford Settlement – Business Associate Pays Majority

Remember the $20 Million class action lawsuit against Stanford over a business associate's posting of an Excel file online?  The lawsuit, driven by California state privacy laws, recently settled for $4 Million, with the business associate paying the bulk of the settlement.  The class action suit, one of five large Stanford-related HIPAA breaches, stems from a 2010 disclosure of emergency room patient data affecting 20,000 patients. The majority of the settlement fund, $3.3 million, will come from Stanford's business associate. Stanford is contributing $500,000 for a vendor education fund and is paying $250,000 in settlement administrative costs.  Though a significant reduction from the original $20 Million claim, the $4 Million settlement price tag is not a drop in the bucket.

The major lesson to glean from this case is that covered entities should investigate their vendors more thoroughly before transmitting PHI.  That means not simply executing a Business Associate Agreement with indemnification and insurance provisions (though that is advisable), but also reviewing and evaluating the vendor's current security policies, staff training, use of subcontractors, and encryption standards.  For more information about HIPAA please contact Elana Zana.

Stolen Laptop Leads to Stanford’s Fifth HIPAA Breach

Earlier this month Stanford reported its fifth HIPAA breach since 2009.  This is Stanford's third largest breach, affecting nearly 13,000 patients.  A broken laptop containing protected health information of pediatric patients was stolen from a restricted area of the Lucile Packard Children's Hospital at Stanford.  The laptop was unencrypted and contained patient information including: names, medical record numbers, ages, telephone numbers, surgical procedures, and treating physicians.  Though the laptop had a broken screen, the data can still be extracted from the computer.

Stanford's other breaches include a disclosure of 20,000 patient records that occurred when a subcontractor of a business associate posted patient information on the web while seeking help with using Excel; the data was left on the website for nearly a year.  This breach has resulted in a $20 Million class action lawsuit under California law.

Earlier this year, Stanford announced its largest breach, affecting 57,000 patient records, when an unencrypted laptop with patient information was stolen from a physician's car.  In addition, Stanford reported a breach in 2012 of 2,500 patient records following the theft of an unencrypted laptop from a physician's office.  Lastly, in 2010, Stanford was fined after failing to notify the state of California of an employee's theft of a laptop containing over 500 patient records.

Considering Stanford's previous breaches, encrypting its laptops would be a good course of action to prevent future HIPAA data breaches.  Stanford has reported that it now encrypts its laptops, but the one most recently stolen was unencrypted because its screen was broken.

Lessons learned from Stanford's misfortunes: encrypt all PHI and destroy broken devices (remember: even though a device is broken, its data is still valuable to thieves).

For assistance with HIPAA and/or the breach notification rules please contact Elana Zana.

Rady HIPAA Breach – Access Controls & Training

Rady Children's Hospital in San Diego announced this week that it has discovered two instances of impermissible disclosure of patient information, both arising from employees sending spreadsheets containing PHI to job applicants.  Surprisingly, Rady employees did not learn the lesson from their Northern California neighbor, Stanford, which recently settled a lawsuit for $4 Million based on similar circumstances of a vendor releasing patient information to a job applicant.  In both Rady situations (and at Stanford), identifiable patient information was sent to job applicants in order to evaluate those applicants' skill sets.  The spreadsheets contained names, dates of birth, diagnoses, insurance carriers, claim information, and additional information.  Combined, the breaches affected over 20,000 patients.

Rady has announced that it will take the following actions to prevent future events:

• Only commercially available and validated testing programs will be used to evaluate job applicants who will be tested onsite.
• We are increasing data security by further automating flagging of emails that may contain potential protected health or other sensitive information, and requiring an added level of approval before it can be sent.
• Rady Children’s is working with our email encryption provider to further strengthen our protection of sensitive data.
• Rady Children’s continually provides employees with education regarding privacy policies. We will be using these incidents as examples to better inform our leadership team and employees about the risks and the importance of the policies we have in place and train them in these new measures we are taking.
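The automated flagging step Rady describes can be sketched simply. The patterns and function names below are hypothetical illustrations, a minimal outline of how an outbound mail gateway might hold messages that appear to contain PHI for an added level of approval, not a production data-loss-prevention rule:

```python
import re

# Hypothetical patterns that suggest PHI in an outbound message or attachment.
# A real data-loss-prevention ruleset would be far more extensive.
PHI_PATTERNS = [
    re.compile(r"\bMRN[:#]?\s*\d{6,10}\b", re.IGNORECASE),  # medical record number
    re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),                   # date of birth
    re.compile(r"\bdiagnosis\b", re.IGNORECASE),            # diagnosis keyword
]

def flag_for_review(message_text: str) -> bool:
    """Return True if the message should be held for an added level of approval."""
    return any(p.search(message_text) for p in PHI_PATTERNS)
```

For example, a spreadsheet row pasted into an email such as `"MRN: 1234567, DOB 01/02/1990"` would be flagged, while routine business email would pass through.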

Though these steps are important, it is alarming that breaches such as these are still happening.  Why are job applicants receiving spreadsheets with patient information?  As Rady notes above, testing programs are commercially available.  Breaches such as those at Rady and Stanford reveal several flaws in HIPAA compliance, but two in particular rise to the surface.

1.  Access Controls.  The HIPAA Security Rule stresses the importance of access controls both internally and externally for a covered entity (and now business associates). Who gets access to the PHI, who gives that person access, and what access do they have?  The administrative, physical, and technical safeguard requirements all touch on whether a workforce member's access to PHI is appropriate.  For example, a technical safeguard requirement specifically addressing access controls requires that covered entities and business associates “implement technical policies and procedures for electronic information systems that maintain electronic protected health information to allow access only to those persons or software programs that have been granted access rights as specified in 164.308(a)(4).”  45 CFR 164.312.  Covered entities and business associates alike should evaluate who within their organizations actually needs access to PHI to perform job functions.  Does the HR Department or an internal/external recruiter, arguably in charge of hiring new staff, need PHI in order to perform their job duties?  (Note, I do not opine here as to whether access to PHI was properly granted to the workforce members at Rady, as I lack sufficient information to make that judgment.)  Determining if access to PHI is appropriate is both a requirement of the HIPAA Security Rule (though it is “addressable,” you still need to address it!) and a good mitigation tactic to avoid impermissible breaches, such as the one here.
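The access-rights principle above is easy to picture in code. The roles and permission sets here are hypothetical, a minimal sketch of granting PHI access only to workforce members whose job functions require it:

```python
# Hypothetical role-to-permission mapping: only roles whose job functions
# require PHI are granted access rights (cf. 45 CFR 164.308(a)(4)).
ROLE_PERMISSIONS = {
    "treating_physician": {"read_phi", "write_phi"},
    "billing_clerk": {"read_phi"},   # limited, job-related access
    "hr_recruiter": set(),           # hiring staff do not need PHI
}

def can_access_phi(role: str) -> bool:
    """Return True only if the role has been granted PHI read access."""
    return "read_phi" in ROLE_PERMISSIONS.get(role, set())
```

Under a mapping like this, a recruiter preparing skills tests for job applicants would simply have no PHI to send, and an unknown role defaults to no access at all.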

2.  Training.  All covered entities and business associates are responsible for HIPAA Security training for all members of the workforce.  45 CFR 164.308.  Though training may vary depending on the workforce member's use of PHI, all staff must be trained.  Training does not end with an initial session.  Periodic security updates are specifically identified in the Security Rule as an implementation specification.  These updates do not have to be limited to information about new virus protection software installed on the system; they can include valuable tidbits like case studies, HIPAA rule reminders, and HIPAA-related headlines.  For some workforce members, HIPAA may not be top of mind (specifically those in business roles who may not deal with patients or patient information on a routine basis).  Providing periodic training updates and reminders, including examples of other HIPAA breaches (e.g., the Stanford breach discussed here), may be very useful in driving home how easy HIPAA breaches can be…and how expensive they are.

Avoidance of HIPAA breaches altogether is nearly impossible, but proper access controls and training can help mitigate against breaches such as the one that occurred here.

For more information about HIPAA Security contact Elana Zana.


High Number of HIPAA Mobile Device Breaches – Time to Use Safe Harbor Encryption

Most breaches of electronic protected health information (ePHI) reported to the Department of Health and Human Services (HHS) have related to the theft or loss of unencrypted mobile devices. These breaches can lead to potentially hefty civil fines, costly settlements and negative publicity (e.g. Stanford and Idaho laptops or APDerm thumb drive). Given the increasing use of mobile devices and the significant costs of breach notification, healthcare organizations and their business associates would be wise to invest in encryption solutions that fall within the “safeharbor” for HIPAA breach notification.

Encryption and the “Safeharbor” for HIPAA Breach Notification

Under HHS guidance, ePHI is not considered “unsecured” if it is properly encrypted by “the use of an algorithmic process to transform data into a form in which there is a low probability of assigning meaning without use of a confidential process or key” and such confidential process or key that might enable decryption has not been breached.  To avoid a breach of the confidential process or key, these decryption tools should be stored on a device or at a location separate from the data they are used to encrypt or decrypt.  Encryption processes “consistent with” (for data at rest) or which “comply, as appropriate, with” (for data in motion) the National Institute for Standards and Technology (“NIST”) guidelines are judged to meet the law’s standard for encryption.  If ePHI is encrypted pursuant to this guidance, then no breach notification is required following an impermissible use or disclosure of the information—this is known as the HIPAA breach notification “safeharbor”. [78 FR 5664]
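The safe-harbor analysis above reduces to two questions: was the ePHI encrypted consistent with the NIST guidelines, and has the confidential key or process itself been breached? As a rough sketch (the two conditions from the HHS guidance, not legal advice):

```python
def notification_required(encrypted_per_nist: bool, key_breached: bool) -> bool:
    """Breach notification decision per the HHS safe harbor (78 FR 5664):
    no notification is required only when the ePHI was properly encrypted
    AND the confidential decryption key/process was not itself breached."""
    return not (encrypted_per_nist and not key_breached)
```

A stolen unencrypted laptop triggers notification; a stolen laptop encrypted to the NIST guidelines, with the key stored separately and intact, does not. This is also why the guidance insists the key be stored apart from the data: if the key is compromised along with the device, the safe harbor is lost.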

NIST Guidelines for Data at Rest

The NIST guidelines for data at rest do not provide specific requirements for encryption technology; instead, they describe common storage encryption technologies (full disk, volume, virtual, and file/folder encryption) and offer recommendations for implementing a storage encryption solution. A main takeaway from this guide is that “the appropriate encryption solution for a particular situation depends primarily upon the type of storage, the amount of information that needs to be protected, the environments where the storage will be located, and the threats that need to be mitigated.” Despite the lack of bright-line rules, the NIST guide does offer some key recommendations, such as:

  • When selecting a storage encryption technology, consider solutions that use existing system features (such as operating system features) and infrastructure.
  • Use centralized management for all deployments of storage encryption except for standalone deployments and very small-scale deployments.
  • Select appropriate user authenticators for storage encryption solutions.
  • Implement measures that support and complement storage encryption for end user devices.

Encryption Technology for Apple iOS Devices: A Case Study

The good news is that the technology is available to properly encrypt ePHI without being too burdensome.  For instance, Apple’s popular iPhones and iPads fortunately have their own built-in encryption technology.  Every iOS device has a “dedicated AES (Advanced Encryption Standard) 256 crypto engine built into the DMA (Direct Memory Access) path between the flash storage and main system memory, making file encryption highly efficient.”  Setting a passcode turns on Data Protection, and the passcode becomes a key to encrypting mail messages and attachments (or other apps), using 256-bit AES encryption. Notably, Apple’s encryption technology (CoreCrypto Module and CoreCrypto Kernel Module) has been FIPS (Federal Information Processing Standards) certified, a standard that the NIST guide references and approves.

Based on the NIST guidelines for data at rest, the following are some basic steps for implementing a storage encryption technology solution specifically with Apple iOS devices:

  • Ensure that users have up-to-date devices and operating systems (e.g. iPhone 4 or higher running iOS 4 or higher).
  • Work with an IT administrator or security expert to manage deployment of iPhones.
  • Select appropriate passcode requirements to meet your security needs, including timeout periods, passcode strength and how often the passcode must be changed. The effectiveness of data protection depends on a strong passcode, so it is important to require and enforce a passcode stronger than 4 digits when establishing passcode policies.
  • Store/transmit the minimum amount of ePHI necessary to effectuate communication.
  • Disable access to Notification Center and Alerts from locked screen to prevent display of potentially sensitive data.
  • Revise and document organizational policies as needed to incorporate appropriate usage of the storage encryption solution.
  • Make users aware of their responsibilities for storage encryption, such as physically protecting mobile devices and promptly reporting loss or theft of devices.
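The passcode recommendation in the steps above (require and enforce something stronger than a 4-digit PIN) can be sketched as a simple policy check. The length threshold and rules here are illustrative assumptions, not values mandated by the NIST guide:

```python
def passcode_meets_policy(passcode: str, min_length: int = 6) -> bool:
    """Illustrative policy check: reject short passcodes and numeric-only
    PINs; require at least min_length characters with a non-digit present."""
    if len(passcode) < min_length:
        return False
    if passcode.isdigit():  # an all-numeric passcode is far easier to guess
        return False
    return True
```

A mobile device management profile would enforce an equivalent rule on the device itself (along with timeout periods and rotation intervals), rather than in application code.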

For additional guidance on mobile device security, the HHS Office of the National Coordinator for Health Information Technology (“ONC”) has also provided helpful tips in “How Can You Protect and Secure Health Information When Using a Mobile Device?”.

As healthcare becomes more mobile, covered entities, business associates, and health information technology vendors should become familiar with the “safeharbor” for HIPAA breach notification and the NIST guidelines for encryption of data at rest and in transit.  For more information about the HIPAA “safeharbor”, encryption standards, or HIPAA in general, please contact Jefferson Lin, Lee Kuo or David Schoolcraft.