The use of genetic testing to diagnose health conditions, identify risks, and direct treatment has led to significant improvements in health outcomes since it first became available in the 1950s, when chromosome analysis was developed to identify the extra copy of chromosome 21 that causes Down syndrome. Within the last decade, however, it has become apparent that most research in genetics and genomics has excluded minority and underrepresented populations. A 2016 review found that approximately 80 percent of subjects included in genome-wide association studies were of European ancestry; a similar meta-analysis found that less than three percent of all subjects were individuals of African ancestry. The reasons for this bias in subject diversity are “likely to be the result of logistical, systemic, and historical factors” and are the subject of much contemporary discussion. For genetically based medical treatment, in particular precision medicine, to succeed, the data used in genomic and genetic studies must accurately mirror the population that will ultimately be treated. It is therefore important to understand as fully as possible the factors that may lead people to distrust medical research and decline to participate in it.
When considering historic incidents of mistreatment that may have contributed to mistrust of research and clinical trials among the Black community, many turn to the USPHS Syphilis Study at Tuskegee or the story of Henrietta Lacks. These, however, are only the best known of numerous historic examples of research on nonconsenting Black people that have likely contributed to this mistrust. Less well known is the rollout of testing for sickle cell disease (SCD), the implementation of mandatory screening laws, and the subsequent stigmatization of African Americans, all of which may have had similar impacts on the Black community’s view of research and testing. For Black History Month, we want to highlight this critical story, which most likely helped shape this climate of mistrust.
Sickle Cell Disease
Sickle cell disease (SCD) is an inherited blood disorder that specifically affects red blood cells. People with sickle cell disease have red blood cells that contain mostly hemoglobin S, an abnormal form of the oxygen-carrying protein hemoglobin. This causes the cells to become sickle- or crescent-shaped, making it difficult for them to pass through small blood vessels and reducing blood flow to parts of the body. Tissue that does not receive enough blood flow eventually becomes damaged, which can cause pain and severe complications.
Sickle cell disease occurs in people who inherit two copies of the disease-causing gene. Those who inherit only one copy of the sickle cell gene are said to have sickle cell trait (SCT), which typically causes no symptoms. Sickle cell trait is more common in populations whose ancestry is from sub-Saharan Africa, the Mediterranean, South or Central America, or some parts of the Middle East. Carrying a single copy of the sickle cell gene also provides protection against malaria, which is thought to have driven evolutionary selection for the trait in parts of the world where malaria is common. Around three million Americans carry sickle cell trait, including approximately 1 in 13 Black Americans.
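The inheritance pattern described above — two copies of the gene cause disease, one copy confers the symptomless trait — is the classic autosomal recessive pattern. As a minimal illustrative sketch (not part of the original article, and with allele labels chosen purely for illustration), the expected outcomes for a child of two SCT carriers can be enumerated directly:

```python
from itertools import product

# Each SCT carrier has one normal hemoglobin allele ("A") and one
# sickle allele ("S"). A child inherits one allele from each parent.
parent1 = ["A", "S"]
parent2 = ["A", "S"]

# Enumerate the four equally likely allele combinations (Punnett square),
# sorting each pair so ("A", "S") and ("S", "A") count as the same genotype.
outcomes = [tuple(sorted(pair)) for pair in product(parent1, parent2)]
n = len(outcomes)

p_disease = outcomes.count(("S", "S")) / n  # two sickle alleles -> SCD
p_trait   = outcomes.count(("A", "S")) / n  # one sickle allele  -> SCT
p_neither = outcomes.count(("A", "A")) / n  # no sickle allele

print(p_disease, p_trait, p_neither)  # -> 0.25 0.5 0.25
```

Each child of two carriers thus has a 1 in 4 chance of SCD, a 1 in 2 chance of SCT, and a 1 in 4 chance of carrying no sickle allele — information that competent genetic counseling would convey alongside any screening result.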
In 1910, James Herrick noted the sickle shape of red blood cells in a research subject, leading him to coin the term “sickle cell anemia.” Several decades later, in 1949, Linus Pauling described the mechanism by which sickle cells lead to SCD, making it the first inherited disease whose molecular cause had been identified. This and other articles published at the time raised public awareness of the condition.
Eventually, diagnostic tests were developed to help identify the sickle cell trait in patients. In 1972, Congress passed the National Sickle Cell Disease Control Act (signed by President Richard Nixon), which established sickle cell screening, counseling, public and professional education, research, and training. This legislation also created federal funding opportunities for SCT/SCD, the first genetic disorder to be recognized with targeted funding. In 1973, scientists developed neonatal screening methods for SCD using blood spots on filter paper; in the years that followed, all 50 US states came to require that newborns be screened for the sickle cell gene.
Criticisms and Controversy
There were many problems with the early sickle cell screening programs, including racial and cultural bias, controversy surrounding the accuracy and validity of the screening tests, and inadequate protection of patients’ rights.
The most serious criticism concerns the racist bias surrounding testing for the disease. The first screening programs were based on the population frequency of SCT, which is rarer in white Americans (approximately 0.3% of births, compared to 7% of Black newborns). Screening programs were enacted into laws that applied almost exclusively to African Americans. For example, a New York State law ordered that all persons “not of the Caucasian, Indian, or Oriental races” be tested for sickle cell trait before being allowed to obtain a marriage license. Later, the state would require all “urban” schoolchildren to be screened before they could attend school. These compulsory screening laws targeting African Americans obscured two key facts: that populations other than African Americans can carry the trait, and that the disease is not contagious.
Testing African Americans almost exclusively for a disease that wasn’t contagious created a strong, damaging stigma around Black health. By the early 1970s, sickle cell carrier status led to formal discrimination, including the denial of health and life insurance, employment opportunities, and acceptance into the armed forces. (The story of screening for SCT/SCD in the US military is similarly fraught, complex, and ongoing.) A national media focus that presented SCT as the “most vital health issue” facing the African American community exacerbated racist attitudes toward Black people by exaggerating the degree to which SCD was an unmanageable health crisis.
These laws and testing programs coincided with the civil rights activity of the early 1970s, a pivotal moment in the push by Black people for equality in education, desegregation of schools, voter registration, and housing equity. Observers have noted that the public health focus on SCT/SCD directed limited resources to a cause that stigmatized Black people at a moment when activists were making clear that other interventions might have done more to improve their health and justice outcomes.
In addition to these social and political missteps, the eagerness to capitalize on the first diagnostic test for an inherited genetic disorder led to similar mistakes in the medical and scientific rollout of testing. Most of those screened were school-aged children or young adults who, if they actually had SCD, would most likely already have been clinically diagnosed, rendering the screenings burdensome, stigmatizing, and unnecessary. Because no cures or therapies existed at the time, many people questioned the purpose and goal of mandatory testing. Other criticisms of the initial screening programs concerned the validity of the screening methods and the potential for misinterpretation of results: there were reports of misuse of the tests and of poor quality control in many of the laboratories performing them. Furthermore, state and local programs often lacked adequate knowledge of the genetics of SCD to use the information from screenings productively.
A third major failure of these early sickle cell screening programs was that, amid the push to implement testing and enact laws restricting the freedoms of SCT carriers, many mechanisms that could have mitigated the ensuing stigmatization were not used. These options might have included ensuring test result confidentiality, providing competent genetic counseling and adequate public education, and creating guidelines to ensure quality control of the testing and laboratory facilities.
Treatments and Possible Cure
In 1994, long-term blood transfusions were found to decrease hospitalizations among SCD patients, and in 1998 a treatment for SCD was approved by the US Food and Drug Administration. In 2009, blood stem cell transplants were shown to halt progression of sickle cell disease. Additional drugs to treat SCD and pain crises were developed in 2017 and 2019.
In 2021, early results from a clinical trial using CRISPR-Cas9 gene-editing technology to correct the disease-causing mutation in patients’ own blood stem cells showed great promise. In this investigational gene therapy study, which enrolled 35 adults and adolescents with sickle cell disease, a single dose restored red blood cells to their normal shape and, in some patients, eliminated the most serious complications of the disease, including episodes of severe pain, for at least three years. This new treatment is a strong positive development in the treatment of SCD and a first step toward a possible cure.
Throughout the 1970s, many African Americans were denied educational opportunities, jobs, insurance, and marriage licenses, because they were carriers of the sickle cell trait. Misinformation was rampant, and the general public and many healthcare professionals did not understand the inheritance of SCD and the meaning of SCT carrier status.
It is not clear whether these events contributed to the significant underrepresentation of Black Americans in research and clinical trials. However, one can imagine that the cumulative effect of historic nonconsensual research on the Black community has contributed to that underrepresentation. While a number of studies have documented negative attitudes toward genetic testing among African Americans, other studies have identified broad support within this community for genetic testing and research. Many factors in study design and deployment have a significant effect on subject population bias, and these need to be addressed. But regardless of the role the SCD testing story may have played (or continues to play) in the underrepresentation of people of color in research and clinical trials, there are important lessons to be learned and remembered.
PRIM&R thanks Jule Gomes Noack, President and Chief Executive Officer, HMEA, for contributions to this piece.
Sharon Shriver, PhD, is the Director of Programs at PRIM&R.