Contributed by Marcia Augsburger and Scott Koller.
The Health Insurance Portability and Accountability Act of 1996 (HIPAA) is 15 years old this year – still acting a bit like an uncertain, wide-eyed teenager responding to new developments. Although more mature, clarified by regulations, and supplemented by the HITECH Act, at its core HIPAA has remained relatively unchanged since its enactment. Societal changes implicating HIPAA, however, have been significant. Over the past five years alone, we saw the rise of Facebook, the domination of Google, and the introduction of powerful personal electronic devices such as Apple’s iPhone and iPad. In addition, technologies such as cloud computing, wireless communication, and telemedicine have reached a level of reliability and affordability that has allowed healthcare providers to expand their reach and services. With every emerging technology, the specter of HIPAA compliance remains a key concern, while its application becomes murkier.
HIPAA was designed to be technology neutral. Accordingly, the statute is worded in terms of principles of compliance rather than specific measures to be implemented. While this permits flexibility so that the law can remain relevant as time and technology progress, it also creates ambiguity. Indeed, the HIPAA statute is so ambiguous that debate continues over its application to a technology as ubiquitous as email.
Nonetheless, HIPAA offers a methodical, step-by-step process for reviewing new programs, applications, and technologies to ensure technical safeguards are in place. The safeguards cover five areas: access controls, audit controls, integrity controls, authentication, and transmission security. This series addresses each of these areas and explains the challenges they present in evaluating compliance issues as applied to emerging technologies.
I. Access Controls
The first area addressed by the Technical Safeguards is Access Control. HIPAA requires the covered entity (CE) or business associate (BA) to implement technical policies and procedures that allow only authorized persons access to protected health information (PHI).[i] Apart from this rather broad requirement, HIPAA leaves the implementation of access controls to CEs and BAs, who may select the technologies that best fit their organizations, so long as the controls are consistent with the four areas of focus within the Access Control standard: a unique user identification system, an emergency access procedure, automatic logoff after a period of inactivity, and encryption.
A. Unique User Identification (Required). HIPAA requires each user to be assigned a unique name or number.[ii] The purpose is to allow the CE to track specific user activity and to hold those users accountable for functions performed while logged into covered systems. When selecting an identification scheme, CEs and BAs should consider how the unique identifier will be used internally and externally. If the identifier is used primarily within an organization by employees, then an entity may use the employee name or similar variation (e.g., jsmith). However, using a random set of numbers and characters may be preferred if the name itself may express PHI (e.g., jsmith on a list of Alcoholics Anonymous members).[iii]
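As a minimal sketch of the random-identifier approach suggested above (written in Python; the `random_user_id` helper and the 12-character length are illustrative assumptions, not anything HIPAA prescribes), an organization could generate opaque identifiers that reveal nothing about the user's name:

```python
import secrets
import string

ALPHABET = string.ascii_lowercase + string.digits

def random_user_id(length: int = 12) -> str:
    """Generate an opaque alphanumeric identifier that, unlike 'jsmith',
    reveals nothing about the person it belongs to."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# An internal lookup table (not shown) maps the opaque identifier back
# to the employee, so audit logs can still attribute activity to a person.
print(random_user_id())
```

Note the trade-off DHHS itself identifies: a random identifier is harder to guess, but also harder for users to remember, so it typically lives alongside an internal directory.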
System designers must also be careful to limit the use of the selected unique identifiers. Software programmers for Apple’s iPhone learned this lesson the hard way when, at Apple’s suggestion, they identified users by the unique device identifier (UDID) built into every iPhone.[iv] These UDIDs were accessible by other apps, some of which had significantly less security in place for the protection of personal information. When multiple apps share the same UDID, the protection accorded that identifier is only as good as the least secure app using it. This illustrates the advisability of avoiding a user identification scheme that other applications or software also use.
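One way to avoid the shared-identifier problem is to derive an app-specific identifier from the shared one, so that records in different applications cannot be linked. The sketch below (Python; the `app_scoped_id` function and the example secrets are hypothetical) uses a keyed hash for this purpose:

```python
import hashlib
import hmac

def app_scoped_id(device_id: str, app_secret: bytes) -> str:
    """Derive an identifier unique to this app from a shared device ID.
    Apps that do not know app_secret cannot link their records to ours."""
    return hmac.new(app_secret, device_id.encode(), hashlib.sha256).hexdigest()

# The same physical device yields unrelated identifiers in different apps,
# so a breach of a less secure app does not expose our users.
id_app_a = app_scoped_id("device-1234", b"secret-for-app-A")
id_app_b = app_scoped_id("device-1234", b"secret-for-app-B")
assert id_app_a != id_app_b
```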
B. Emergency Access Procedure (Required). The CE must have a procedure in place for obtaining necessary electronic PHI (ePHI) during an emergency.[v] In an emergency, especially where a natural disaster cuts off power, electronically stored information is imperiled. After all, the lifeblood of technology is electricity. To determine the need for back-up generators or paper files, CEs and BAs should evaluate from the outset what information will be needed in an emergency for patient care and treatment and how best to create redundancies to preserve it.
C. Automatic Logoff (Addressable)[vi]. When reasonable and appropriate, a CE must implement electronic procedures that terminate an electronic session after a predetermined period of inactivity.[vii] This is particularly important when dealing with applications for personal electronic devices such as smart phones, which are highly portable and can be easily misplaced. While HIPAA does not mention a specific timeframe, the termination or logoff function should take into account the likelihood of an unauthorized user encountering the system. In addition, screensavers and/or automatic logoffs, which are built into many systems, should always be enabled.
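The logic behind an automatic logoff is simple enough to sketch. The following Python fragment is illustrative only (the `Session` class and the 15-minute window are assumptions; HIPAA specifies no particular timeframe):

```python
import time

SESSION_TIMEOUT = 15 * 60  # hypothetical policy: 15 minutes of inactivity

class Session:
    """Track last user activity and terminate the session when idle too long."""

    def __init__(self):
        self.last_activity = time.monotonic()
        self.active = True

    def touch(self):
        """Record user activity, resetting the inactivity clock."""
        self.last_activity = time.monotonic()

    def check(self, now=None):
        """Return whether the session is still active, logging off if the
        inactivity window has elapsed."""
        if now is None:
            now = time.monotonic()
        if now - self.last_activity >= SESSION_TIMEOUT:
            self.active = False
        return self.active
```

In practice the timeout would be tuned to the setting: a workstation at a public nursing station warrants a shorter window than a locked back office.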
D. Encryption and Decryption (Addressable). To take advantage of a HIPAA “safe harbor,”[viii] CEs must use encryption to protect data from unauthorized access.[ix] Of the four Access Control safeguards, encryption is by far the most difficult to implement. Whereas a unique identifier, an emergency back-up, and an automatic log-off are fairly easy to implement, encryption involves the use of complex algorithms and a series of confidential “keys” used to code or access the data.
At the core of every encryption scheme is a mathematical algorithm, the strength of which depends on its key length, measured in bits. HIPAA does not mandate the use of any specific type or strength of encryption. Most financial institutions use 256-bit encryption for banking transactions, while several reputable e-commerce sites use key lengths of 128 bits to process credit cards. Although HIPAA permits flexibility, it would be inadvisable to implement a key shorter than 128 bits. To put this in perspective, it would take a modern computer 149,745,258,842,898 years to break a 128-bit key, whereas the same computer could crack a 64-bit key in approximately four minutes.[x]
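The arithmetic behind that gap is straightforward: each additional key bit doubles the search space. The Python snippet below works it out from the four-minute figure above (an illustrative back-of-the-envelope calculation, not the exact model used in the cited source):

```python
# A 128-bit keyspace is 2**64 times larger than a 64-bit keyspace.
ratio = 2 ** 128 // 2 ** 64          # == 2**64, roughly 1.8e19

# If brute-forcing a 64-bit key takes ~4 minutes, the same machine
# needs 2**64 times as long for a 128-bit key:
minutes_for_128 = 4 * ratio
years_for_128 = minutes_for_128 / (60 * 24 * 365)
print(f"{years_for_128:.3e} years")  # on the order of 10**14 years
```

The result lands on the order of 10^14 years, consistent in magnitude with the figure cited above.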
Even if a lengthy key is used, a mistake or flaw in the mathematical formula can render the entire encryption scheme vulnerable. New or customized encryption schemes pose a greater risk of discoverable flaws than encryption algorithms that have been certified by the National Institute of Standards and Technology (NIST).[xi] This may explain why using NIST standards for encryption qualifies as a “safe harbor” under HIPAA.[xii]
In addition to using flawed or short encryption keys, a common mistake in encryption is failure to secure the key itself. When the online whistle-blower website WikiLeaks distributed classified government cables to the press, it used top-of-the-line AES-256 encryption but failed to secure the key. The key was published, rendering the entire encryption scheme useless. The security surrounding encryption keys, including old and retired keys, should receive the same level of scrutiny as the data they protect.
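The point that a leaked key defeats any cipher can be shown with a toy example. The Python sketch below uses a one-time-pad-style XOR, not AES, purely as an illustration (all names here are hypothetical, and real systems should use a vetted cipher):

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy one-time pad: output is data XOR key; applying it twice decrypts.
    Illustration only -- production systems should use a vetted cipher
    such as AES."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"patient record"
key = secrets.token_bytes(len(message))   # the key must stay secret
ciphertext = xor_cipher(message, key)

# Anyone holding a published key decrypts instantly, no matter how
# strong the cipher -- the lesson of the WikiLeaks cables.
assert xor_cipher(ciphertext, key) == message
```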
This is part one of a three-part series discussing HIPAA and emerging technologies. Part two explores Authentication, Audit, and Integrity controls under HIPAA.
[i] 45 C.F.R. § 164.312(a) for covered entities and business associates under the HITECH Act.
[ii] 45 C.F.R. § 164.312(a)(2)(i).
[iii] Department of Health & Human Services (DHHS), Security Standards. Published 5/2002, revised 3/2007 (“A randomly assigned user identifier is more difficult for an unauthorized user (e.g., a hacker) to guess, but may also be more difficult for authorized users to remember and management to recognize.”).
[iv] Hardawar, Devindra, Apple phasing out iOS UDID access to solve privacy woes. Retrieved September 18, 2011, from http://venturebeat.com/2011/08/23/ios-5-udid-privacy/.
[v] 45 C.F.R. § 164.312(a)(2)(ii).
[vi] DHHS provides flexibility to covered entities by designating each specification as "required" or "addressable." If a specification is "required," the CE must implement it as stated in the Security Rule. If a specification is "addressable," the CE must: (1) assess whether the specification is a reasonable and appropriate safeguard in its environment and likely to contribute to protecting the entity's electronic protected health information; and (2) implement the specification, or document why implementation would not be reasonable and appropriate and implement an equivalent alternative measure if reasonable and appropriate. DHHS, What is the difference between addressable and required implementation specifications in the Security Rule? Retrieved September 19, 2011 from http://www.hhs.gov/ocr/privacy/hipaa/faq/securityrule/2020.html.
[vii] 45 C.F.R. § 164.312(a)(2)(iii).
[viii] Specifically, HHS has stated that if an organization uses recommended technologies and methodologies that render PHI unusable, unreadable, or indecipherable to unauthorized individuals, then that PHI would not qualify as “unsecured” PHI for purposes of the breach notification requirements, which only apply to “unsecured” PHI. “Guidance Specifying the Technologies and Methodologies That Render Protected Health Information Unusable, Unreadable, or Indecipherable to Unauthorized Individuals for Purposes of the Breach Notification Requirements Under Section 13402 of Title XIII (Health Information Technology for Economic and Clinical Health Act) of the American Recovery and Reinvestment Act of 2009; Request for Information,” 74 Fed. Reg. 19006 (April 27, 2009).
[ix] 45 C.F.R. § 164.312(a)(2)(iv).
[x] Clayton, Richard, Brute force attacks on cryptographic keys. Retrieved September 18, 2011, from http://www.cl.cam.ac.uk/~rnc1/brute.html.
[xi] National Institute of Standards and Technology, Computer Security Division. Retrieved September 18, 2011 from http://csrc.nist.gov/.
[xii] “Guidance Specifying the Technologies and Methodologies That Render Protected Health Information Unusable, Unreadable, or Indecipherable to Unauthorized Individuals for Purposes of the Breach Notification Requirements Under Section 13402 of Title XIII (Health Information Technology for Economic and Clinical Health Act) of the American Recovery and Reinvestment Act of 2009; Request for Information,” 74 Fed. Reg. 19006 (April 27, 2009).