Creating and conducting an organizationwide risk analysis: Part 2
Editor's note: This is part two of a series about implementing an organizationwide risk analysis. See the May 2016 issue of BOH for part one.
Performing a regular organizationwide risk analysis is a basic HIPAA requirement and also simply good business practice. Beyond checking off an item on the HIPAA compliance list, a risk analysis will help an organization identify and rank security weaknesses, efficiently use resources to address them, and ultimately protect the security and integrity of an organization's data, including PHI, financial, and business operations information. Yet in a world of competing demands and limited resources, a risk analysis may be put off until it's too late. Even if one is completed, security officers may encounter obstacles when trying to act on the results of the risk analysis.
The purpose of a risk analysis is to develop a strategic plan of action that addresses and corrects vulnerabilities, and shouldn't be used to simply create a report on the current state of security, says Kate Borten, CISSP, CISM, HCISPP, founder of The Marblehead Group in Marblehead, Massachusetts. "Only when an organization performs periodic and as-needed risk assessments, and then mitigates significant risks, can the ISO [information security officer] and leadership have the confidence that their security program is functioning and adequate," she says.
A risk analysis is one of several activities that are part of a risk management program, says Rick Ensenbach, CISSP-ISSMP, CISA, CISM, CCSFP, manager of risk advisory and forensic services at Wipfli, LLP, in Eau Claire, Wisconsin. The risk management program is about managing risks to the organization (i.e., business mission, image, reputation, and patient safety and privacy), organizational assets, and workforce. An organization can't mitigate risks it isn't aware of and doesn't understand.
Risks are first identified, then analyzed and evaluated based on what action is needed, Ensenbach says. They also must be monitored on an ongoing basis, a vital step that, if missed, can undermine an otherwise solid risk management program.
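The identify-analyze-evaluate cycle described above is often implemented as a simple risk register, where each risk is scored and ranked so the most significant vulnerabilities are mitigated first. The sketch below is purely illustrative: the 1-5 scales, the likelihood-times-impact scoring, and the sample risk names are assumptions, not drawn from any specific framework or from the article.

```python
# Hypothetical sketch of a risk register with likelihood x impact scoring.
# Scales, scoring formula, and example risks are illustrative only.
from dataclasses import dataclass


@dataclass
class Risk:
    name: str
    likelihood: int  # 1 (rare) to 5 (almost certain)
    impact: int      # 1 (negligible) to 5 (severe)

    @property
    def score(self) -> int:
        # A common convention: risk score = likelihood x impact
        return self.likelihood * self.impact


def rank_risks(risks: list[Risk]) -> list[Risk]:
    """Return risks ordered from most to least significant."""
    return sorted(risks, key=lambda r: r.score, reverse=True)


# Example register entries (hypothetical)
register = [
    Risk("Unencrypted laptops containing PHI", likelihood=4, impact=5),
    Risk("Stale user accounts after termination", likelihood=3, impact=3),
    Risk("Data center power failure", likelihood=2, impact=4),
]

for r in rank_risks(register):
    print(f"{r.score:>2}  {r.name}")
```

Ranking this way gives leadership a defensible order in which to spend limited remediation resources, and rescoring the register on a schedule supports the ongoing monitoring Ensenbach describes.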
The Health Information Technology for Economic and Clinical Health (HITECH) Act, part of the larger American Recovery and Reinvestment Act of 2009, was created to encourage and regulate the use of technology in healthcare. HITECH brought meaningful use, an incentive plan designed to increase the use of certified electronic medical records, and amendments to the Security Rule of the Health Insurance Portability and Accountability Act of 1996 (HIPAA). Although some provisions of HITECH have not been implemented (e.g., the more robust three-year accounting of disclosures for electronic protected health information [PHI]), the law amended many major areas of HIPAA.
A breach of PHI is the last thing a privacy or security officer wants but, large or small, breaches can happen. The best-laid defenses can be undermined by simple human error or a cybercriminal using cutting-edge techniques. When that happens, you need a security incident response plan.
Disaster plan
A formal security incident response plan should be developed and maintained similar to a data center disaster response plan, Kate Borten, CISSP, CISM, HCISPP, founder of The Marblehead Group, Marblehead, Massachusetts, says. IT departments should be accustomed to disaster recovery plans that guide the department's response to any disaster (e.g., fire, flood, earthquake) that affects computer systems. Security incident response plans can be seen as comparable and equally important.
When a breach is identified, the first step should be to stop the bleeding. Take steps to prevent a recurrence or limit the damage. This could be especially important for security breaches that involve hacking or PHI that was accidentally made accessible to the public on a website or cloud service. In such a situation, it would be prudent to shut down affected websites or portals, or remove access to data repositories, according to Frank Ruelas, MBA, principal of HIPAA College in Casa Grande, Arizona.
Follow a plan from the start to ensure that risks are mitigated quickly. The plan should include appropriate steps to take depending on the type of security incident, who should be part of the incident response team, and how information about the breach should be communicated within the organization, according to Chris Apgar, CISSP, president of Apgar and Associates in Portland, Oregon. Having a detailed plan that lists members of the incident response team means more time can be spent addressing the breach than asking questions about who should be involved.
A security incident response plan will also help an organization determine what level of action it needs to take. "There will be some incidents, including breaches, where it's not necessary to pull together the whole team and go through every step in the plan," Apgar says. "For example, if a patient notifies you that she received another patient's EOB [explanation of benefits], it may not be necessary to call everyone together."
In that example, Apgar says, because the organization already knows who was impacted by the breach, the response is simply a matter of following the breach notification steps set by HIPAA and any applicable state laws.
Creating and conducting an organizationwide risk analysis: Part 1
Editor's note: This is part one of a series about implementing organizationwide risk analyses. Look for part two in an upcoming issue of BOH.
Breach settlements, corrective action plans (CAPs), and penalties from the Office for Civil Rights (OCR) often take organizations to task for not completing a regular organizationwide risk analysis, yet it's all too easy for this important job to fall by the wayside. A lack of resources and competing demands within an organization can push the risk analysis to the bottom of the list of priorities. But this leaves an organization vulnerable to threats it will only see in hindsight. It also often leads to scrutiny from OCR and the public.