Healthcare Mobile Device Encryption: Is It Required?

Encryption of mobile devices has become essential in the eyes of the OCR. Although HIPAA treats encryption as an “addressable” safeguard rather than a “required” one under the Security Rule, the following OCR settlements involving unencrypted mobile devices indicate that encryption is, in practice, obligatory for HIPAA compliance.

As new technologies emerge and the use of mobile technology in healthcare expands, covered entities and business associates must ensure that their administrative and security measures keep pace with evolving risks. In each case below, the sanctioned party failed to properly implement a risk management plan and to deploy encryption to protect the data stored on mobile technology.

Stolen USB results in $2.2 million settlement

On January 18, 2017, OCR announced a HIPAA settlement with MAPFRE Life Insurance Company of Puerto Rico (MAPFRE) after a USB data storage device containing the electronic protected health information (ePHI) of 2,209 individuals was stolen from MAPFRE’s IT department.

In September 2011, MAPFRE filed a breach report after a USB data storage device was stolen from its IT department, where it had been left overnight without safeguards; the device included the complete names, dates of birth, and Social Security numbers of the affected individuals. OCR’s investigation revealed that MAPFRE failed to conduct a risk assessment and implement security measures sufficient to reduce risk to a reasonable and appropriate level. MAPFRE also failed to implement policies and procedures and workforce security awareness training, and did not deploy encryption or an equivalent alternative measure on its laptops and removable storage media.

In addition to paying $2.2 million, MAPFRE agreed to conduct a risk analysis, implement a risk management plan, develop policies and procedures, conduct workforce training, and provide ongoing reports to OCR.

Lost mobile phone and laptop results in $3.2 million civil money penalty

On February 1, 2017, OCR issued a Notice of Final Determination including a civil money penalty for HIPAA violations against Children’s Medical Center of Dallas (Children’s) after two impermissible disclosures of the unsecured ePHI of over 6,200 individuals stored on mobile technology devices. Children’s is a pediatric hospital in Dallas, Texas, and is part of Children’s Health, the seventh largest pediatric health care provider in the nation.

Children’s filed a breach report in January 2010, reporting the loss of an unencrypted, non-password-protected BlackBerry device containing the ePHI of 3,800 individuals at the Dallas/Fort Worth International Airport. Then, in July 2013, Children’s filed a separate breach report indicating that an unencrypted laptop containing the ePHI of 2,462 individuals was stolen from its premises. OCR’s investigation revealed that Children’s failed to implement a risk management plan despite prior recommendations to do so, and failed to deploy encryption on its laptops, workstations, mobile devices, and removable storage media. Despite Children’s knowledge of the risk of maintaining unencrypted ePHI on its devices as far back as 2007, Children’s issued unencrypted BlackBerry devices to nurses and allowed workforce members to continue using unencrypted laptops and other devices until 2013.

Laptop stolen from workforce member’s car costs wireless health services provider $2.5 million

On April 24, 2017, OCR announced a $2.5 million settlement with CardioNet after the unsecured ePHI of 1,391 individuals was impermissibly disclosed when a workforce member’s unencrypted laptop was stolen from a vehicle parked outside the employee’s home. CardioNet is a Pennsylvania-based wireless health services provider, offering remote mobile monitoring of and rapid response to patients at risk for cardiac arrhythmias.

OCR’s investigation revealed that CardioNet failed to conduct a risk assessment and finalize and implement policies and procedures for compliance with the HIPAA Security Rule. OCR also cited gaps in policies governing the receipt and removal of hardware and electronic media into and out of its facilities, the encryption of such media, and the movement of mobile devices within its facilities.

According to the Corrective Action Plan, CardioNet agreed to conduct a risk assessment, develop and implement a risk management plan, implement secure device and media controls, review and revise its HIPAA training program, and produce ongoing reports for HHS.

For additional information about the use of encryption technology for HIPAA compliance, see HHS’s Guidance to Render Unsecured Protected Health Information Unusable, Unreadable, or Indecipherable to Unauthorized Individuals. Also, see The Office of the National Coordinator for Health Information Technology’s guidance regarding Mobile Device Privacy and Security.

Please contact Anthony Halbeisen or Elana Zana if you have any questions about securing health data on mobile devices.


Stolen Laptop Costs Research Institute Millions

The Feinstein Institute for Medical Research (Feinstein) recently agreed to pay the U.S. Department of Health and Human Services, Office for Civil Rights (OCR) $3.9 million to settle allegations that Feinstein violated the HIPAA Privacy and Security Rules. This settlement confirms the OCR’s position that nonprofit research institutes are held to the same standards as all other HIPAA covered entities.

The OCR began its investigation after Feinstein filed a breach report revealing that a laptop computer containing electronic protected health information (ePHI) had been stolen from an employee’s car. The laptop, which was unencrypted, contained the ePHI of approximately 13,000 patients and research participants.
In addition to the breach, OCR’s investigation determined that Feinstein failed to:

(1) conduct a risk analysis of all of the PHI held at Feinstein, including the PHI on the stolen laptop;

(2) implement policies and procedures for granting access to ePHI to workforce members;

(3) implement physical safeguards for the laptop;

(4) implement policies and procedures managing the movement of hardware that contains ePHI; and

(5) implement encryption technology or ensure that an alternative measure to encryption was deployed to safeguard the ePHI.

HIPAA does not expressly require encryption of ePHI; however, covered entities and business associates that do not encrypt ePHI are required to document why encryption is not reasonable or appropriate. They are also required to implement measures equivalent to encryption to safeguard ePHI.

In addition to other violations, the OCR’s investigation revealed that Feinstein failed to document why encrypting the laptop was not reasonable or appropriate. Further, far from having measures equivalent to encryption in place, Feinstein lacked policies and procedures both for the receipt and removal of laptops containing ePHI from its facilities and for authorizing access to ePHI.

This settlement provides three lessons. First, research institutes are held to the same standards as other covered entities. To the extent a research institute maintains PHI, it is essential to develop adequate policies and procedures to protect that PHI; failing to do so exposes the institute to considerable risk. Second, encrypting ePHI goes a long way toward reducing liability. Had Feinstein’s laptop been encrypted to the NIST standard, the ePHI would have been secured and Feinstein would not have been required to report a breach. Instead, as is often the case, the OCR’s investigation revealed multiple additional HIPAA violations. By not encrypting ePHI, covered entities and business associates risk not only the cost of a breach, but also the potential for added costs following an OCR investigation. Lastly, covered entities and business associates that do not encrypt their ePHI are required to document why encryption is not reasonable or appropriate. Failing to do so is itself a HIPAA violation and subjects them to liability.

The Myth of a HIPAA Compliant Product

Purchasing a “HIPAA compliant” technology product does not guarantee HIPAA compliance.

There. I said it.

In today’s healthcare marketplace, a vendor’s representation that its product is “100% HIPAA Compliant” is an important assurance for covered entities and business associates. Due to the complex and confusing HIPAA regulations, the idea of “purchasing” compliance can be very attractive.

Unfortunately, you cannot buy HIPAA compliance. To explain, allow me to use the example of encryption technology.

HIPAA Compliant Encryption

Nearly every vendor of an encryption product that targets the healthcare market will claim that the product is HIPAA compliant. This representation is critical because health information that is properly encrypted is exempt from the HIPAA breach notification rules.

But when a vendor states that its encryption product is “HIPAA compliant,” the vendor is merely stating that the product meets the HIPAA encryption guidelines for data at rest (stored data) and data in motion (data that is transmitted over networks).

In reality, the HIPAA Security Rule requires more than merely using technology that meets the encryption guidelines.

The HIPAA Security Rule – What Product is “Reasonable and Appropriate”?

The HIPAA Security Rule standard related to encryption states that covered entities and business associates must: “Implement a mechanism to encrypt and decrypt electronic protected health information.”

Because this standard is “addressable,” an entity must carefully analyze its operations to determine what type of encryption product is reasonable and appropriate for its business.

The analysis must focus on a number of different factors related to the entity, including:

  • The entity’s size, complexity and capabilities;
  • The entity’s technical infrastructure, hardware and software security capabilities;
  • Costs of encryption measures; and
  • Probability and criticality of potential risks to electronic PHI.

For example, if a small entity simply wants to send a limited number of secured e-mails containing patient information, a top-of-the-line encryption product for all IT systems may not be necessary. Rather, a basic e-mail encryption product may suffice.

However, if a large health system regularly transmits a large amount of health information over public networks, a basic e-mail encryption product is probably not appropriate.

The vendor of the e-mail product might claim that its product is “HIPAA compliant,” but under the Security Rule, a deluxe encryption solution for the health system’s various IT systems probably makes more sense.

In all cases, it is important for the entity to document why it believes that a selected encryption product is appropriate for its operations.

Conclusion

The takeaway is that HIPAA compliance takes real work. While the idea of buying compliance might be attractive, HIPAA requires covered entities and business associates to look inward and conduct a thorough analysis of their operations.

Do not be misled by thinking that HIPAA compliance can be achieved by entering credit card information and pushing a button.

If you would like more information about HIPAA compliance, please contact Casey Moriarty.

Premera Breach: Is HIPAA Compliance Enough?

Many health care businesses assume that HIPAA compliance guarantees protection from data breaches. Unfortunately, this is not a correct assumption.

The health insurance company Premera Blue Cross recently announced that it was the target of a sophisticated cyber attack.  It is estimated that the personal information of eleven million individuals may have been accessed by hackers.

In the days following the breach, the Seattle Times ran an article about an audit that the federal Office of Personnel Management (OPM) and Office of Inspector General (OIG) conducted on Premera’s operations prior to the breach.

Due to the health insurance coverage that Premera provides to federal employees, OPM and OIG had the right to audit Premera’s systems to ensure the security of the employees’ personal information.  According to the Seattle Times article, the federal agencies warned Premera of potential vulnerabilities with its information technology security prior to the breach.

What Did OPM and OIG Actually Find?

After reading the article, I assumed that the federal agencies found massive problems with Premera’s HIPAA security compliance.  Clearly, Premera would not have suffered the breach if it had complied with the HIPAA Security Rule, right?

Nope.

Page ii of the audit states the following:

Health Insurance Portability and Accountability Act (HIPAA)

Nothing came to our attention that caused us to believe that Premera is not in compliance with the HIPAA security, privacy, and national provider identifier regulations.

Instead, the security issues that the OPM and OIG found with Premera’s system appear to have involved more advanced features, including:

  • Lack of Piggybacking Prevention; and
  • Although Premera had a “thorough incident response and network security program,” it needed a better methodology for applying software patches, updates, and server configurations. Note that failing to appropriately patch software can lead to serious HIPAA consequences, including OCR investigations and settlements. For more information about patching and HIPAA, please read: “Failure To Patch Software Leads to $150,000 Settlement”.

Upon review of the audit report, it appears that Premera did have fairly robust security safeguards. For example, although it did not have the physical access control of piggybacking prevention, it had installed a multi-factor authentication key pad for each staff member.

The OPM and OIG certainly found issues with Premera’s security procedures, but the report repeatedly makes it clear that Premera:

  • Had adequate HIPAA privacy and security policies and procedures;
  • Updated its HIPAA policies annually and when necessary; and
  • Required employees to complete HIPAA compliance training each year.

HIPAA Compliance May Not Be Enough

The unfortunate takeaway from Premera’s data breach is that HIPAA compliance may not be enough to ensure security from attacks carried out by sophisticated hackers.

Although a covered entity’s security policies and procedures may technically comply with the HIPAA Security Rule, it is still critical to go further and address any known vulnerabilities that HIPAA may not even require to be addressed.

Contact Casey Moriarty for more information about HIPAA compliance.

Large Data Breach Highlights Risks from Foreign Hackers

Community Health Systems (CHS) has announced that the personal information of approximately 4.5 million patients has been breached. According to CHS, the information includes patient names, addresses, Social Security numbers, telephone numbers, and birthdates.

Although the breached records do not contain the details of the patients’ treatment at CHS’ hospitals, the identifying information in the records still meets the HIPAA definition of “protected health information.”  Therefore, CHS will have to follow the HIPAA breach notification requirements.

According to CHS’ filing with the Securities and Exchange Commission, CHS has hired the data security firm Mandiant to investigate the breach. Mandiant has pointed blame at a group originating from China that apparently orchestrated the breach through the use of sophisticated malware.

This large breach should be another reminder for health care providers to safeguard their electronic systems and educate staff members on security policies and procedures.  The type of malware that contributed to the CHS breach can often be installed by a staff member who clicks on a link in an e-mail, or responds to an e-mail from hackers who pose as security personnel.  In addition, health care providers should consider the use of encryption technology that meets the HIPAA breach safe harbor standards.

When in doubt about a suspicious e-mail, phone call, or other communication, staff members should always check with the provider’s information technology personnel and the HIPAA Privacy Officer before taking any action.

If you have any questions about the HIPAA breach notification requirements, please contact Casey Moriarty.

Violation of Privacy Rule Leads to $800,000 HIPAA Settlement

Indiana-based Parkview Health System (“Parkview”) has agreed to settle potential violations of the HIPAA Privacy Rule with the HHS Office for Civil Rights (“OCR”) by paying $800,000 and adopting a corrective action plan to address deficiencies in its HIPAA compliance program. The resolution agreement can be found here.

According to the HHS press release, the OCR opened an investigation after receiving a complaint from a retiring physician alleging that Parkview had violated the HIPAA Privacy Rule. In September 2008, Parkview took custody of medical records pertaining to approximately 5,000 to 8,000 patients while assisting the retiring physician to transition her patients to new providers, and while considering the possibility of purchasing some of the physician’s practice. On June 4, 2009, Parkview employees, with notice that the physician was not at home, left 71 cardboard boxes of these medical records unattended and accessible to unauthorized persons on the driveway of the physician’s home, within 20 feet of the public road and a short distance away from a heavily trafficked public shopping venue. It is unclear whether any of these medical records were actually viewed by anyone else.

In addition to the $800,000 payment, Parkview entered into a corrective action plan that requires it to:

  • Develop, maintain and revise, as necessary, written policies and procedures addressing requirements of the Privacy Rule and the corrective action plan (“Policies and Procedures”).  Specifically these Policies and Procedures must at a “minimum, provide for administrative, physical and technical safeguards (“safeguards”) to protect the privacy of non-electronic PHI to ensure that such PHI is appropriately and reasonably safeguarded from any intentional, unintentional or incidental use or disclosure that is in violation of the Privacy Rule.”
  • Provide Policies and Procedures to HHS within 30 days of Resolution Agreement’s Effective Date for HHS’s review and approval.
  • Distribute Policies and Procedures to all Parkview workforce members.
  • Periodically review the Policies and Procedures and update them to reflect changes in operations at Parkview, federal law, HHS guidance and/or any material compliance issues discovered by Parkview.
  • Notify HHS in writing within 30 days if Parkview determines that a workforce member has violated the Policies and Procedures (“Reportable Events”).
  • Provide general safeguards training to all workforce members who have access to PHI, as required by the Privacy Rule.
  • Provide training on its approved Policies and Procedures to all workforce members.
  • Submit to HHS a final report demonstrating Parkview’s compliance with the corrective action plan.

Organizations should pay careful attention to the transfer and disposal of both electronic and paper patient records. The OCR has provided helpful FAQs about HIPAA and the disposal of protected health information. For more information about complying with the HIPAA Privacy Rule, please contact Jefferson Lin or Elana Zana.


Rady HIPAA Breach – Access Controls & Training

Rady Children’s Hospital in San Diego announced this week that it has discovered two instances of impermissible disclosure of patient information, both arising from employees sending spreadsheets containing PHI to job applicants.  Surprisingly, Rady employees did not learn the lesson from their Northern California neighbor, Stanford, which recently settled a lawsuit for $4 million based on similar circumstances: a vendor releasing patient information to a job applicant.  In both Rady situations (and at Stanford), identifiable patient information was sent to job applicants in order to evaluate those applicants’ skill sets.  The spreadsheets contained names, dates of birth, diagnoses, insurance carriers, claim information, and additional information.  Combined, the breaches affected over 20,000 patients.

Rady has announced that it will take the following actions to prevent future events:

• Only commercially available and validated testing programs will be used to evaluate job applicants who will be tested onsite.
• We are increasing data security by further automating flagging of emails that may contain potential protected health or other sensitive information, and requiring an added level of approval before it can be sent.
• Rady Children’s is working with our email encryption provider to further strengthen our protection of sensitive data.
• Rady Children’s continually provides employees with education regarding privacy policies. We will be using these incidents as examples to better inform our leadership team and employees about the risks and the importance of the policies we have in place and train them in these new measures we are taking.

Though these steps are important, it is quite alarming that breaches such as these are still happening.  Why are job applicants receiving spreadsheets with patient information?  As Rady notes above, commercially available testing programs exist for this purpose.  Breaches such as the ones at Rady and Stanford reveal several flaws in HIPAA compliance, but two in particular rise to the surface.

1.  Access Controls.  The HIPAA Security Rule stresses the importance of access controls both internally and externally for covered entities (and now business associates): who gets access to the PHI, who grants that access, and what level of access do they have?  The administrative, physical, and technical safeguard requirements all touch on whether workforce members’ access to PHI is appropriate.  For example, a technical safeguard requirement specifically addressing access controls requires that covered entities and business associates “implement technical policies and procedures for electronic information systems that maintain electronic protected health information to allow access only to those persons or software programs that have been granted access rights as specified in 164.308(a)(4).”  45 CFR 164.312.  Covered entities and business associates alike should evaluate who within their organizations actually needs access to PHI to perform job functions.  Does the HR department or an internal/external recruiter, arguably in charge of hiring new staff, need PHI in order to perform their job duties?  (Note, I do not opine here as to whether access to PHI was properly granted to the workforce members at Rady, as I lack sufficient information to make that judgment.)  Determining whether access to PHI is appropriate is both a requirement of the HIPAA Security Rule (though it is “addressable,” you still need to address it!) and a good mitigation tactic to avoid impermissible disclosures such as the one here.

2.  Training.  All covered entities and business associates are responsible for HIPAA security training for all members of the workforce.  45 CFR 164.308.  Though training may vary depending on the workforce member’s use of PHI, all staff must be trained, and training does not end after an initial session.  Periodic security updates are specifically identified in the Security Rule as an implementation specification.  These updates need not be limited to information about new virus protection software installed on the system; they can include valuable tidbits like case studies, HIPAA rule reminders, and HIPAA-related headlines.  For some workforce members, HIPAA may not be top of mind (specifically those in business roles who may not deal with patients or patient information on a routine basis).  Providing periodic training updates and reminders, including examples of other HIPAA breaches (e.g., Stanford here), can be very useful in driving home how easily HIPAA breaches can occur…and how expensive they are.

Avoiding HIPAA breaches altogether is nearly impossible, but proper access controls and training can help mitigate breaches such as the ones that occurred here.
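The access-control principle discussed in point 1 can be sketched as a simple deny-by-default, role-based check. All role names and data categories below are hypothetical, invented purely for illustration; real access rights must flow from the entity’s own risk analysis.

```python
# Hypothetical role-based access sketch: the roles, permissions, and record
# categories are invented for illustration, not drawn from any cited entity.
ROLE_PERMISSIONS = {
    "nurse":     {"phi"},   # clinical staff need PHI for treatment
    "billing":   {"phi"},   # billing staff need PHI for claims processing
    "recruiter": set(),     # hiring workflows have no need for PHI
}

def can_access(role: str, resource_type: str) -> bool:
    """Allow access only to roles explicitly granted rights (cf. 45 CFR 164.312)."""
    return resource_type in ROLE_PERMISSIONS.get(role, set())

# A PHI spreadsheet should never reach a job-applicant evaluation workflow:
assert can_access("nurse", "phi")
assert not can_access("recruiter", "phi")
```

Under a deny-by-default check like this, exporting PHI into a recruiting exercise would require an explicit, auditable grant rather than happening by accident.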

For more information about HIPAA Security contact Elana Zana.


$4.8 Million HIPAA Settlement – Patient Data on the Web

On May 7, 2014, HHS announced that New York-Presbyterian Hospital (“NYP”) and Columbia University (“CU”) agreed to collectively pay $4.8 million in the largest HIPAA settlement to date. The organizations settled charges that they potentially violated the HIPAA Privacy and Security Rules by failing to secure thousands of patients’ electronic protected health information (“ePHI”).

NYP and CU operate a shared data network that links patient information systems containing ePHI. On September 27, 2010, the two entities submitted a joint breach report following the discovery that the ePHI of 6,800 individuals had been impermissibly disclosed due to a deactivated server, resulting in ePHI being accessible on internet search engines. The ePHI included patient statuses, vital signs, medications, and laboratory results.

HHS Office for Civil Rights’ (“OCR”) subsequent investigation determined that neither entity had conducted an accurate and thorough risk analysis or developed an adequate risk management plan to address potential threats and hazards to ePHI security. Further, OCR found that NYP failed to implement appropriate policies and procedures for authorizing access to its databases and failed to comply with internal policies on information access management.

NYP agreed to pay $3.3 million and CU agreed to pay $1.5 million. In addition, both entities agreed to Corrective Action Plans that require each entity to:

  • Conduct a comprehensive and thorough risk analysis;
  • Develop and implement a risk management plan;
  • Review and revise policies and procedures on information access management and device and media controls;
  • Develop an enhanced privacy and security awareness training program; and
  • Provide progress reports.

Additionally, CU must also “develop a process to evaluate any environmental or operational changes” that impact the security of ePHI it maintains.

This settlement again highlights the necessity for healthcare organizations and business associates to create and implement Security policies and procedures, and to engage in a security management process that ensures the security of patient data.

For assistance on the HIPAA Security Rule requirements, drafting and implementing Security policies and procedures, or general HIPAA assistance please contact Elana Zana or Jefferson Lin.


Skagit County Agrees to Pay $215,000 for HIPAA Violations

On March 6, 2014, the U.S. Department of Health and Human Services, Office for Civil Rights (“OCR”) reached a $215,000 settlement with Skagit County in northwest Washington state for violations of the HIPAA Privacy, Security and Breach Notification Rules, according to terms of the Resolution Agreement.  This represents the first OCR settlement with a county government for HIPAA non-compliance. For two weeks in September 2011, the electronic protected health information (“ePHI”) for 1,581 individuals was exposed after the ePHI had been inadvertently moved to a publicly accessible web server maintained by Skagit County.  The accessible files included protected health information about the testing and treatment of infectious diseases.

The OCR investigation revealed that Skagit County failed to provide notification to individuals as required by the Breach Notification Rule and that the county failed to implement sufficient policies and procedures to prevent, detect, contain, and correct security violations. Further, Skagit County failed to provide necessary and appropriate security awareness and training for its workforce members.  As part of the settlement, the county has agreed to enter into a Corrective Action Plan to address deficiencies in various HIPAA compliance areas, including written policies and procedures, documentation requirements, training, and other measures.

This settlement highlights the importance for all covered entities and business associates, whether in the government or private sector, to implement policies and procedures to safeguard ePHI and, in case of a breach, to respond promptly and effectively. For more information about this OCR settlement or for assistance with HIPAA compliance, please contact Jefferson Lin or David Schoolcraft.

High Number of HIPAA Mobile Device Breaches – Time to Use Safe Harbor Encryption

Most breaches of electronic protected health information (ePHI) reported to the Department of Health and Human Services (HHS) have involved the theft or loss of unencrypted mobile devices. These breaches can lead to potentially hefty civil fines, costly settlements, and negative publicity (e.g., the Stanford and Idaho laptops or the APDerm thumb drive). Given the increasing use of mobile devices and the significant costs of breach notification, healthcare organizations and their business associates would be wise to invest in encryption solutions that fall within the “safe harbor” for HIPAA breach notification.

Encryption and the “Safe Harbor” for HIPAA Breach Notification

Under HHS guidance, ePHI is not considered “unsecured” if it is properly encrypted by “the use of an algorithmic process to transform data into a form in which there is a low probability of assigning meaning without use of a confidential process or key” and the confidential process or key that might enable decryption has not itself been breached.  To avoid a breach of the confidential process or key, these decryption tools should be stored on a device or at a location separate from the data they are used to encrypt or decrypt.  Encryption processes “consistent with” (for data at rest) or which “comply, as appropriate, with” (for data in motion) the National Institute of Standards and Technology (“NIST”) guidelines are judged to meet the law’s standard for encryption.  If ePHI is encrypted pursuant to this guidance, then no breach notification is required following an impermissible use or disclosure of the information; this is known as the HIPAA breach notification “safe harbor.” [78 FR 5664]
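To make the guidance concrete, here is a minimal sketch (illustrative only, using the third-party Python `cryptography` package) of the two elements it highlights: strong encryption of the data itself, and a decryption key stored apart from the encrypted data. Using code like this does not by itself establish safe harbor status.

```python
# Illustrative sketch only: AES-256-GCM encryption with the key kept separate
# from the ciphertext, the two elements the HHS guidance highlights.
# Requires the third-party "cryptography" package; not a compliance guarantee.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_ephi(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt data with AES-256-GCM; return (key, nonce + ciphertext)."""
    key = AESGCM.generate_key(bit_length=256)  # store this key on a SEPARATE system
    nonce = os.urandom(12)                     # unique nonce per message
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return key, nonce + ciphertext

def decrypt_ephi(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key, blob = encrypt_ephi(b"patient record")
assert decrypt_ephi(key, blob) == b"patient record"
```

A stolen laptop holding only `blob` would expose no readable ePHI so long as `key` lives elsewhere, which is exactly why the guidance insists the decryption key be stored apart from the data it protects.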

NIST Guidelines for Data at Rest

The NIST guidelines for data at rest do not prescribe specific encryption technologies; instead, they describe common storage encryption technologies (full disk, volume, virtual disk, and file/folder encryption) and offer recommendations for implementing a storage encryption solution. A main takeaway from this guide is that “the appropriate encryption solution for a particular situation depends primarily upon the type of storage, the amount of information that needs to be protected, the environments where the storage will be located, and the threats that need to be mitigated.” Despite the lack of bright-line rules, the NIST guide does offer some key recommendations, such as:

  • When selecting a storage encryption technology, consider solutions that use existing system features (such as operating system features) and infrastructure.
  • Use centralized management for all deployments of storage encryption except for standalone deployments and very small-scale deployments.
  • Select appropriate user authenticators for storage encryption solutions.
  • Implement measures that support and complement storage encryption for end user devices.

Encryption Technology for Apple iOS Devices: A Case Study

The good news is that the technology is available to properly encrypt ePHI without being too burdensome.  For instance, Apple’s popular iPhones and iPads fortunately have their own built-in encryption technology.  Every iOS device has a “dedicated AES (Advanced Encryption Standard) 256 crypto engine built into the DMA (Direct Memory Access) path between the flash storage and main system memory, making file encryption highly efficient.”  Setting a passcode turns on Data Protection, and the passcode becomes a key to encrypting mail messages and attachments (or other apps), using 256-bit AES encryption. Notably, Apple’s encryption technology (CoreCrypto Module and CoreCrypto Kernel Module) has been FIPS (Federal Information Processing Standards) certified, a standard that the NIST guide references and approves.

Based on the NIST guidelines for data at rest, the following are some basic steps for implementing a storage encryption technology solution specifically with Apple iOS devices:

  • Ensure that users have up-to-date devices and operating systems (e.g. iPhone 4 or higher running iOS 4 or higher).
  • Work with an IT administrator or security expert to manage deployment of iPhones.
  • Select appropriate passcode requirements to meet your security needs, including timeout periods, passcode strength and how often the passcode must be changed. The effectiveness of data protection depends on a strong passcode, so it is important to require and enforce a passcode stronger than 4 digits when establishing passcode policies.
  • Store/transmit the minimum amount of ePHI necessary to effectuate communication.
  • Disable access to Notification Center and Alerts from locked screen to prevent display of potentially sensitive data.
  • Revise and document organizational policies as needed to incorporate appropriate usage of the storage encryption solution.
  • Make users aware of their responsibilities for storage encryption, such as physically protecting mobile devices and promptly reporting loss or theft of devices.
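The passcode-strength point above is easy to quantify. The following sketch uses only Python’s standard library and illustrative parameters (the iteration count is invented, not Apple’s actual value) to show a passcode-derived encryption key and why a 4-digit PIN offers a tiny search space compared with even an 8-character alphanumeric passcode.

```python
# Illustrative only: derives an encryption key from a passcode with PBKDF2
# (conceptually similar to passcode-based data protection) and compares
# passcode search spaces. Parameters are hypothetical, not Apple's.
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # 100,000 iterations is an illustrative work factor, chosen for the example
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000, dklen=32)

salt = os.urandom(16)
key = derive_key("correct-horse-battery", salt)
assert len(key) == 32  # 256-bit key

four_digit_pins = 10 ** 4   # only 10,000 possible 4-digit PINs
alnum8_codes = 62 ** 8      # ~2.2e14 possible 8-char alphanumeric passcodes
assert alnum8_codes > four_digit_pins * 10 ** 9
```

Because the derived key is only as unpredictable as the passcode behind it, an attacker with the device can try all 10,000 PINs quickly, while a longer alphanumeric passcode pushes the search far out of reach.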

For additional guidance on mobile device security, the HHS Office of the National Coordinator for Health Information Technology (“ONC”) has also provided helpful tips in “How Can You Protect and Secure Health Information When Using a Mobile Device?”.

As healthcare becomes more mobile, covered entities, business associates, and health information technology vendors should become familiar with the “safe harbor” for HIPAA breach notification and the NIST guidelines for encryption of data at rest and in transit.  For more information about the HIPAA “safe harbor,” encryption standards, or HIPAA in general, please contact Jefferson Lin, Lee Kuo or David Schoolcraft.