Webinar Follow Up: Meaningful Metrics – How HRPPs Can Use Their Data Effectively

I want to thank everyone who submitted questions during the webinar Meaningful Metrics: How HRPPs Can Use Their Data Effectively. In this post, I will address one central theme from the questions I didn't have time to answer during the live session: connecting metrics to quality improvement initiatives and HRPP self-assessments/internal audits. After reading this post, you will be able to define additional HRPP metrics not discussed in the webinar, and to identify metrics that can drive quality improvement initiatives within your Human Research Protection Program (HRPP).

One hallmark of effective HRPP metrics use is systematically aligning goals among the HRPP's stakeholders and using those shared targets to drive collaborative improvement efforts. Our HRPP does this consistently by running quality improvement projects that involve both the IRB and our Quality Improvement Program (QIP), and by collaborating on similar projects with other institutional HRPP stakeholders such as the Sponsored Research, Research Compliance, IT, and Conflict of Interest offices.

One significant area of focus for our HRPP self-assessments/internal audits is the revised Common Rule. Although the revised Common Rule has been in effect for a year and a half, we still conduct routine checks on the updated regulations to ensure consistency and accuracy. A few examples, and their related quality improvement initiatives, are:

Pre-2018 Requirements vs. 2018 Requirements

Our online submission and reporting system allows us to code projects by which set of regulations applies: Pre-2018 Requirements or 2018 Requirements. These categorizations generate a host of reports used to check several dimensions of the revised Common Rule. One dimension is applicability: studies under the 2018 Requirements are internally audited for correct regulatory citations of waivers and alterations, use of Key Information in consent forms, posting of clinical trial consent forms, etc. Another dimension is regulatory oversight: reports are used to ensure that studies with regulatory oversight from the FDA and/or DOJ are correctly categorized under the Pre-2018 Requirements (since these two agencies have regulations that differ from the revised Common Rule).
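The regulatory-oversight check described above can be sketched as a simple report query. This is a hypothetical illustration only; the field names (`study_id`, `regulation_set`, `oversight`) and coding values are assumptions, not our system's actual schema.

```python
# Hypothetical sketch: flag studies coded under the 2018 Requirements even
# though they have FDA and/or DOJ oversight (these agencies' regulations
# differ from the revised Common Rule, so such studies should be Pre-2018).
NON_ADOPTING = {"FDA", "DOJ"}

def flag_miscategorized(studies):
    """Return IDs of studies coded '2018' despite FDA/DOJ oversight."""
    return [
        s["study_id"]
        for s in studies
        if s["regulation_set"] == "2018" and NON_ADOPTING & set(s["oversight"])
    ]

studies = [
    {"study_id": "S-1", "regulation_set": "2018", "oversight": ["FDA"]},
    {"study_id": "S-2", "regulation_set": "Pre-2018", "oversight": ["DOJ"]},
    {"study_id": "S-3", "regulation_set": "2018", "oversight": []},
]
print(flag_miscategorized(studies))  # prints ['S-1']
```

A report like this turns the categorization metric into an actionable audit list rather than a static count.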

Studies reviewed by the Limited IRB Review Procedure

As the revised Common Rule states, limited IRB review is a condition of exemption (under §46.104(d)(2)(iii), (d)(3)(i)(C), (d)(7), and (d)(8)). For those projects, the IRB must conduct a limited IRB review to make the determination required by §46.111(a)(7), which states that there are adequate provisions to protect the privacy of subjects and to maintain the confidentiality of data. The regulations do not mandate what those provisions must entail; however, the revised Common Rule preamble provides some suggestions (CITI has a nice handout summarizing these considerations as well). Our HRPP uses these suggestions as guidance, which allows our QIP to conduct internal audits against these criteria for projects that have received limited IRB review.

Studies requiring Continuing Review

One of the most significant changes brought about by the revised Common Rule was the removal of continuing review for minimal risk research. Furthermore, the revisions include a specific mandate that a "rationale for conducting continuing review of research that otherwise would not require continuing review as described in §46.109(f)(1)" must be documented in IRB records (under §46.115(a)(3)). Because our online submission and reporting system provides a required field for documentation of this rationale, our HRPP can easily audit 1) the documentation of the rationale and 2) the quality of the rationale. This is an example of where qualitative metrics can be useful for an HRPP. A resulting self-assessment initiative would be to code the rationales to determine which are adequately documented and which are not. One outcome from such an initiative would be a formula or rubric enabling IRB reviewers to generate high-quality rationales.
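A first pass of such a rationale audit can be automated before any qualitative coding begins. The sketch below is purely illustrative: the field names (`study_id`, `cr_rationale`) and the 10-word threshold for a "thin" rationale are assumptions, not our system's actual schema or rubric.

```python
# Hypothetical sketch: triage continuing-review rationales exported from a
# submission system into missing, thin, and documented buckets.
MIN_WORDS = 10  # illustrative threshold for a minimally substantive rationale

def audit_rationales(studies):
    """Split studies by whether a rationale is absent, brief, or documented."""
    missing, thin, ok = [], [], []
    for s in studies:
        text = (s.get("cr_rationale") or "").strip()
        if not text:
            missing.append(s["study_id"])
        elif len(text.split()) < MIN_WORDS:
            thin.append(s["study_id"])
        else:
            ok.append(s["study_id"])
    return {"missing": missing, "thin": thin, "documented": ok}

studies = [
    {"study_id": "IRB-101", "cr_rationale": ""},
    {"study_id": "IRB-102", "cr_rationale": "High risk."},
    {"study_id": "IRB-103", "cr_rationale": (
        "Continuing review retained because the protocol involves a "
        "vulnerable population and ongoing monitoring of consent is needed.")},
]
print(audit_rationales(studies))
```

The "thin" bucket is where qualitative review and rubric development would focus; the automated pass simply narrows the audit sample.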

Another core area of focus for our HRPP quality improvement initiatives is research subject to other, non-Common Rule federal and/or legal oversight requirements. A few examples, and their related quality improvement initiatives, are:

Certificates of Confidentiality (CoC)

Per the 21st Century Cures Act, NIH-funded researchers are automatically issued a CoC if they collect identifiable, sensitive information about participants. The NIH policy states that “for studies in which informed consent is sought, NIH expects investigators to inform research participants of the protections and the limits to protections provided by a Certificate.” As such, our QIP conducts internal audits of the consent forms of such projects to ensure CoC information is included. In addition to this compliance check, two other benefits have emerged as part of this initiative:

  1. We added standard CoC language to our consent form template so researchers don't have to draft it themselves.
  2. As part of a self-assessment on how to identify studies with a CoC, we enhanced our online submission and reporting system to include a CoC checkbox for applicable studies. This metric not only assists in our reporting efforts but also serves as a point of validation for the IRB reviewer to ensure all of the NIH Policy requirements are met at the time when they check this box in the system. 

The European Union (EU) General Data Protection Regulation (GDPR)

GDPR applies to the processing of personal data from individuals in the European Economic Area (EEA). Because GDPR contains numerous requirements regarding consent, data collection, and data transfer, our HRPP (in conjunction with the IT, Provost's, and Negotiating Offices) continually monitors all GDPR-applicable studies in our portfolio. The two main compliance areas on which our HRPP focuses are:

  1. Ensuring the required consent form language is present on studies that collect data from individuals in the EEA; and
  2. Referring studies that involve secondary data/specimen analysis only to the offices that oversee contracts and agreements with the data/specimen providers.

As part of a self-assessment on how to identify GDPR studies, we enhanced our online submission and reporting system to include a GDPR checkbox for applicable studies. This metric not only assists in our reporting efforts but also serves as a point of validation for the IRB reviewer to ensure all of the GDPR requirements are met at the time when they check this box in the system.
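The two GDPR compliance areas above amount to a simple triage decision. The sketch below is a hypothetical illustration; the field names (`gdpr_applicable`, `secondary_data_only`, `gdpr_consent_language`) are assumptions, not our system's actual checkbox fields.

```python
# Hypothetical sketch of the GDPR triage described above: secondary-use
# studies are referred to the contracting offices, while primary data
# collection is checked for the required consent form language.
def gdpr_action(study):
    """Route a study record to the relevant GDPR compliance step."""
    if not study.get("gdpr_applicable"):
        return "no GDPR action needed"
    if study.get("secondary_data_only"):
        return "refer to contracts/agreements office"
    if not study.get("gdpr_consent_language"):
        return "flag: required consent language missing"
    return "GDPR requirements met"

example = {"gdpr_applicable": True, "secondary_data_only": True}
print(gdpr_action(example))  # prints "refer to contracts/agreements office"
```

Encoding the routing rule this way also documents the review logic itself, which helps when training new IRB reviewers on the checkbox's meaning.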

Finally, institutional policies drive many of our HRPP quality improvement initiatives as well. Because institutional policies differ widely between organizations, I present just two broad examples and their related quality improvement initiatives, below:

Reliance on External IRBs

As part of the OHRP and NIH Single IRB policies, our HRPP requires researchers to annually upload the most recent approval letter from the IRB of Record to our online tracking system. Because our system is designed to capture the expiry date, we run continuous metrics on these studies and generate automated reminder notifications to researchers to submit the most current letter.
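The expiry-driven reminder metric above can be sketched as a date-window query. This is an illustrative assumption-laden example: the field names (`study_id`, `letter_expiry`) and the 30-day reminder window are hypothetical, not our system's actual configuration.

```python
from datetime import date, timedelta

# Hypothetical sketch: flag reliance studies whose IRB-of-Record approval
# letter expires within the reminder window, so automated notices can be sent.
REMINDER_WINDOW = timedelta(days=30)

def due_for_reminder(studies, today):
    """Return study IDs whose letter expires within the reminder window."""
    return [
        s["study_id"]
        for s in studies
        if today <= s["letter_expiry"] <= today + REMINDER_WINDOW
    ]

studies = [
    {"study_id": "REL-01", "letter_expiry": date(2024, 7, 10)},
    {"study_id": "REL-02", "letter_expiry": date(2024, 12, 1)},
]
print(due_for_reminder(studies, today=date(2024, 6, 20)))  # prints ['REL-01']
```

Running this continuously (e.g., as a nightly job) is what turns a static expiry field into the automated reminder workflow described above.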

Projects Requiring a Conflict of Interest Management Plan

Like many institutions, ours has a policy on disclosure of conflicts of interest (financial, scientific, personal, etc.) when such conflicts are related to the research. As part of these disclosures, we require researchers to describe the interest(s) and their plans for managing the interest(s). These plans vary widely depending on the nature of the interest(s); however, they are captured within our online submission system. Documentation of these plans is required, and confirmation of any updates/changes is also required annually at the time of continuing review (where applicable). These requirements allow our HRPP to ensure the management plan is on file with the IRB, to monitor such plans, and to assess whether items in the plan are being upheld.

From the various metrics reports noted above, our HRPP conducts internal audits to validate that IRB review complies with the applicable regulation and/or requirement. If discrepancies are found, we use those metrics to determine whether the inconsistency is systemic or a one-off. Quality improvement initiatives typically result in targeted education and training. They also give IRB leadership the opportunity to revise the HRPP Toolkit by identifying instructional gaps and improving SOPs, and to create case studies for IRB reviewers for remediation purposes. Case study training takes place at staff meetings and/or during member education sessions at convened IRB meetings. We find that full-group training avoids singling out any one reviewer and enhances learning for the entire IRB. Finally, because our quality improvement initiatives involve many HRPP stakeholders, they create a culture of collaboration and compliance across the institution that benefits both the HRPP and the research community.

Kim Serpico, MEd, CIP, EdD Candidate is the Assistant Director of IRB Operations for the Harvard Longwood Campus (HLC) Schools, including Harvard TH Chan School of Public Health, Harvard Medical School, and Harvard School of Dental Medicine. Kim holds a BS in psychology, an MEd in higher education administration, and a graduate certificate in legal studies, and is currently working on her doctoral degree in education at Vanderbilt University. Her professional areas of expertise are the ethical conduct and regulatory compliance of human subjects research, and protection of the rights and welfare of participants involved in biomedical and social-behavioral research. Additionally, she serves as co-chair of the Harvard Catalyst Emerging Technologies, Ethics, and Research subcommittee, where she co-leads a federally funded committee dedicated to enhancing data security and human subjects research involving novel technologies (e.g., cloud platforms, apps, smart devices). She is also a Certified IRB Professional (CIP).


PRIM&R thanks Kim Serpico for sharing her expertise.

The recording of this webinar (and recordings of all past PRIM&R webinars) is available for purchase. If you would like to purchase the webinar for group viewing, please download the order form (PDF) and send it to registration@primr.org.