As we approach the end of our second year of the COVID-19 global pandemic, the United States has experienced a decline in overall mental health. According to the CDC, roughly 4 in 10 adults have reported depressive symptoms. To address the risk of COVID-19 exposure from in-person counseling sessions (and in response to the high demand for, and low supply of, therapists), companies began offering mobile mental health app subscriptions to their employees to help them cope with the crisis. Two years later, these mobile app vendors have amassed a goldmine of mental health data, and researchers are scrambling to get their secondary use research applications approved by their IRBs. (Secondary use is the use in research of data originally collected for a purpose other than the current research purpose.)
The data collected from these mobile mental health apps, in combination with mental health outcomes as identified in a person's electronic health records, promises benefits for the field of mental health support. It may help scientists understand how a prolonged crisis affects our nation's mental health and identify ways to improve it. The risk associated with using data for secondary research purposes is usually limited to breach of confidentiality, and investigators mitigate that risk by implementing processes to restrict access to the data. In studies involving algorithms and personalized medicine, however, the risks increase. So, how do IRBs determine when risks outweigh benefits?
While at the 2021 Advancing Ethical Research Conference, I attended the panel "Data Advocacy, Governance, and Ethics in an Age of Big Data." The panel speaker's overall message was that we must be data advocates and take a proactive approach to data ethics. We must understand the "political nature of data from policy making...and ensure that data is not misused."
I use this specific example of mental health mobile apps because we are seeing an uptick in these applications due to COVID-19. While it is up to us as individuals to make conscious choices regarding how we share our information, we also depend on IRBs to protect our data when it comes to research.
When users sign up for a mental health application, there is a note about how privacy is managed—a short statement such as "we will not sell or share your data with others." For those with paid subscriptions through their employer, ancillary contractual privacy statements may read, "no health information will be shared between your employer and our company." This reassures subscribers that their companies will not access their mental health data, which encourages widespread participation. But what happens when investigators apply to use people's identifiable app data?
To make an effective risk-benefit assessment of mobile health secondary use research, there are three privacy issues that IRBs must consider:
- The vendor's general privacy policy (what user data is shared with third parties)
- The vendor's contractual obligations with its customers (companies that purchase subscriptions for their employees)
- The technology's algorithms (if applicable)
First, general privacy policies lack transparency about what data is being shared with third parties. Second, while contracts between the vendor and the research institution typically include language restricting access to user health data, the definition of "health data" is applied subjectively, allowing vendors to share large amounts of user data while remaining compliant. Lastly, the technology may use algorithms that "learn" about the user to personalize their experience. Not only do algorithms pose a risk of bias and error, but personalized apps require users to provide highly personal and sensitive information that they may not want shared.
In an ideal world, IRBs would review privacy policies, contractual obligations, and algorithmic risks. However, unless investigators are forthcoming about these limitations, IRBs are generally not aware of them. Researchers are responsible for explaining why a study is minimal risk, and IRBs take their word for it. This is the nature of secondary use research.
One may argue that because secondary use projects are minimal risk, the benefits will outweigh the risks. However, when no consideration is given to privacy policies, contractual limitations, and algorithmic bias, we cannot, with confidence, claim that we have adequately justified the risk-to-benefit ratio.
The panel reminded us of the importance of being advocates for our own data, and of the significant failings of governance and ethics in an age of big data. IRBs are in a unique position in which they can—and should—factor in these issues when reviewing secondary use research protocols. As the speaker stated, "data is power," and if that data is used unethically, we cannot conclude that the benefits outweigh the risks. Rather than continuing to "accept the risks," we must engage in policy discussions and insist on real protections as IRB professionals charged with protecting human subjects and their data.
Tamiko T. Eto, MS, CIP, is the Division of Research Manager for Compliance, Privacy, and IRB Support Services at the Permanente Medical Group of Kaiser Permanente, Northern California. She has over 16 years of experience in human subjects research protections. Prior to that, Tamiko served as Acting Director of Stanford Research Institute's (SRI) Office of Research Integrity and Chair of the SRI IRB. She leverages her experience to apply regulatory policies to health care research projects involving AI and other novel technologies.
Tamiko works closely with AI researchers and institutional/regulatory bodies in addressing ethical and regulatory challenges related to AI, and has developed tools and checklists for IRBs to use in their review of AI research. She also actively collaborates on research, in an effort to be at the forefront of developing an ethical and regulatory framework for research involving human subjects.
Conference recording packages from PRIM&R's 2021 Advancing Ethical Research (AER) and Social, Behavioral, and Ethical Research (SBER) Conferences are available in PRIM&R's Knowledge Center. You can also search "data" in the Knowledge Center to find a selection of other programs dealing with data and research.
And don't forget to save the date for the 2022 PRIM&R Annual Conference, November 14-18, 2022, which will include sessions like this and so much more to help you learn best practices for your IRB, IACUC, or other research program!