The Role of the IRB and Information Security

On February 26, 2015, PRIM&R hosted a webinar titled Data Security Incidents: The Role of the IRB and Information Security. This topic was submitted by Teresa Doksum, PhD, MPH, and Sean Owen, CISP, CAP, CRISC, through the 2014 Call for Webinar Proposals. The potential for data security incidents in research with human subjects requires institutional review boards (IRBs) to work closely with information security experts to prevent these types of incidents and, if they do occur, to respond effectively and meet strict reporting requirements. Topics covered during the webinar included reporting requirements and regulatory definitions, the roles of the IRB and the information security department, and procedures for coordinating the response to security breaches. Following a lively question-and-answer period at the conclusion of the webinar, Teresa and Sean kindly agreed to respond in writing to some additional questions to share with the readers of Ampersand.

1. Would you be willing to share your data security template?
Yes! You can download our guide with instructions for developing a detailed data security plan, along with sections you can copy/paste to create your own.

2. What procedures should be in place in case research data is confiscated by officials at international sites?
Study teams should always follow the local laws and contractual requirements of their international studies. If your data has been confiscated, you may be dealing with issues that are not security related but legal or contractual. If your computers or devices have been confiscated, you will probably need to work with customs or other authorities to get the devices released. The event should be documented as an incident until the issue is resolved.

There are preventative measures you should take when working internationally. The first is to make sure you don’t disclose the data to unauthorized parties. For this, full drive encryption is the best solution (make sure you are in compliance with the International Traffic in Arms Regulations). If you are using USB devices or removable media (e.g., microSD cards) to store or transfer data, they should also be encrypted. That alone doesn’t solve your problem of needing to analyze the data, however. We recommend that our project teams upload the data at regular intervals (as permitted) to a cloud service or the institution’s file transfer service; a brief sketch of this encrypt-then-stage approach follows below. Planning ahead of time for when those transfers will happen is very important.
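To make the encrypt-then-stage idea concrete, here is a minimal sketch in Python. It assumes the third-party cryptography package; the file names, the "outbox" folder, and the key handling shown are hypothetical placeholders for your own institution's transfer service and key management procedures.

```python
# A minimal sketch, assuming the third-party "cryptography" package.
# The file paths and the "outbox" folder are hypothetical stand-ins for
# whatever location your institution's file transfer service picks up.
from pathlib import Path
from cryptography.fernet import Fernet

def encrypt_for_transfer(source: Path, outbox: Path, key: bytes) -> Path:
    """Write an encrypted copy of `source` into `outbox` for a later scheduled upload."""
    cipher = Fernet(key)
    outbox.mkdir(parents=True, exist_ok=True)
    destination = outbox / (source.name + ".enc")
    destination.write_bytes(cipher.encrypt(source.read_bytes()))
    return destination

if __name__ == "__main__":
    # In practice the key would be generated once at the home office and stored
    # separately from both the device and the encrypted files, never hard-coded.
    key = Fernet.generate_key()
    staged = encrypt_for_transfer(Path("interview_data.csv"), Path("outbox"), key)
    print(f"Staged for upload: {staged}")
```

Pairing a step like this with a fixed upload schedule, and with the full drive encryption mentioned above, limits what a confiscated device could disclose.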

3. What is the fundamental difference between faxing unencrypted protected health information (PHI) and text-messaging unencrypted PHI?
When it comes to security, at a basic level they are the same: both methods send information in an unencrypted form over an untrusted network. When it comes to the risk posed, there are some differences. Fax machines communicate by placing a direct call from one machine to the other, which means that unless someone is listening in on that call, no other parties are involved in the transmission. Other risks include sending to the wrong fax number, someone other than the addressee picking up your fax, or your recipient leaving the fax in an insecure location. In general, aside from digital fax services (e.g., myfax.com, fax servers, and similar digital-to-paper services), there are no other copies of the information.

Short message service (SMS, also known as “texting”) is also not encrypted, but involves a third party. Every SMS communication is relayed by the cellular companies involved. While most cellular companies state they don’t retain the content of the message, the sending and recipient devices will retain a copy. That copy may be backed up into a cloud service or onto a local computer. So the risk with SMS is that the data are quickly, sometimes unknowingly, copied to multiple locations.

4. The webinar presenters stated that disclosing a data breach within 60 minutes is a federal requirement found under OMB Circular 322 and 615. Can you verify that this is accurate?
Yes, the 60-minute requirement mentioned in the presentation comes from OMB M-06-19. Depending on the type of relationship your study has with the federal government, it may not be applicable. Since many webinar attendees work with HHS, this HHS policy may also help.

5. Does the 60-minute timer for reporting data security incidents apply only to Department of Health and Human Services (DHHS)-funded studies or other funders as well?
This rule applies to all federal contracts regardless of agency. Details can be found in the Office of Management and Budget (OMB) memorandum released in 2006 (OMB M-06-19).

6. Can you expand more on how you arrived at bringing information security into the IRB? How did you make the case for it and/or entice information security to join you?
It took a couple of years. It was clear from the start that the amount and complexity of digital data in studies would only increase, and that the risks associated with hacking and similar threats are best understood by those who work in the field every day. The first year was spent figuring out whether our security team’s goals aligned with our IRB’s goals: the IRB protects human subjects, and information security protects data. Since we are not a medical IRB and rarely deal with studies that pose a risk of physical or mental harm, most of our focus is on protecting subjects’ data from misuse or disclosure and on the institution’s back office data (e.g., human resources information, financial data). However, Abt’s IRB made the case that, due to the volume and complexity of regulations, research data collection activities presented a larger risk to Abt than back office data.

Initially, information security served in an external capacity as an advisor to the IRB. Over the next couple of years, that advisory role became so instrumental in reducing risk to participants and their data that we invited an information security person to become a voting member of the IRB. The goals and strategic plans of the information security team and the IRB are now linked and co-developed. It’s important to note that information security was not absorbed into the IRB.

On a corporate level, the decision to bring in security was tied to the company’s values and mission. Research institutions collect and process an abundance of sensitive data, and security breaches and incidents erode funders’ trust that we will protect participants, damaging the funders’ reputation as well as our own. The integration was seen as a way to reduce the likelihood of an incident with very little added cost, a win/win from the corporate standpoint.

The policy formally expanding the IRB’s scope of authority beyond human subjects was adopted after information security and the IRB were integrated. Information security was already making many of the decisions around project security, and the IRB still had a hand in approving the protocol. Changing the policy meant that the integration between the two teams would survive the people who started it.

7. Throughout the [Data Security] webinar, the speakers did not specify that if there is a security incident at your institution, the first call should be to your corporate compliance department or your privacy officer. This is a mandate and not just an individual institutional policy. I’m concerned that webinar attendees new to this subject matter would walk away not knowing their reporting responsibilities. 
There are no hard and fast requirements for incident response. The process for each institution can be different, and can also vary based on the data involved, the type of event, the regulations the institution falls under, and the maturity of those processes. Some corporate compliance departments do not deal with information security-related issues; in larger organizations, these responsibilities flow up to the chief information officer (CIO)/chief information security officer (CISO) or through a legal department. Similarly, not every organization is required to have a privacy officer; a privacy officer is mandated only for institutions that fall under the Health Insurance Portability and Accountability Act (HIPAA) or similar regulations.

As noted, reporting responsibilities vary based on the regulation, the data, and the contract. This can also be complicated by internal policies, such as notifying legal counsel (not required) before notifying OHRP (required) for a human subjects research issue. We cannot account for every combination of institutional policy, data type, and regulatory umbrella; for this reason, we recommend that attendees work with their contracts department and legal counsel to understand which combination applies and to develop a process around it. Our presentation was one example.
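As one way to picture "develop a process around it," the sketch below records routing decisions a study team might have already agreed on with its counsel and contracts office, so the first call during an incident is not improvised. Every entry is hypothetical and would be replaced by your institution's own determinations.

```python
# Illustrative only: a lookup of incident-reporting decisions an institution has
# already agreed on with legal counsel and its contracts office. The entries
# below are hypothetical examples, not a statement of what any regulation
# requires of your organization.
from dataclasses import dataclass

@dataclass
class ReportingRule:
    notify_first: str        # internal role to contact first
    external_report_to: str  # external body, if any
    deadline: str            # reporting window your counsel has confirmed

RULES = {
    # (data type, applicable regime) -> agreed-upon routing
    ("PHI", "HIPAA"): ReportingRule("Privacy Officer", "HHS Office for Civil Rights", "per the breach notification rule"),
    ("federal contract PII", "OMB M-06-19"): ReportingRule("CISO", "US-CERT", "60 minutes"),
    ("human subjects data", "Common Rule"): ReportingRule("IRB Chair", "OHRP", "promptly, per institutional policy"),
}

def first_call(data_type: str, regime: str) -> ReportingRule:
    """Return the pre-approved routing for an incident involving this data and regime."""
    return RULES[(data_type, regime)]

print(first_call("federal contract PII", "OMB M-06-19"))
```

The value of writing the table down in advance is that the judgment calls are made with counsel before an incident, not during one.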

Please feel free to reach out to Teresa Doksum and Sean Owen if you have any additional questions.

The 2015 Call for Webinar Proposals opens May 1—we invite you to share your idea for a future webinar with us! Please contact us with any questions about submitting a proposal.