Victor Ekuta

Race Against the Machine: Leveraging Inclusive Technology as an Antiracist Tool for Brain Health

“Zeros and ones, if we are not careful, could deepen the divides between haves and have-nots, between the deserving and the undeserving – rusty value judgments embedded in shiny new systems.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code

Introduction

[Figure: Achieving fairness in medical devices. From Kadambi A, Science. 2021;372(6537):30-31, doi:10.1126/science.abe9195.]

When we think about racism, we often imagine people, systems, and institutions. But what we often fail to acknowledge is that technology—medical devices, algorithms, and machines—can be racist too. Technological advancements have often been hailed for their promise of objectivity and impartiality, offering a glimmer of hope in combating systemic biases. Yet technology is immune to neither racism nor bias. On the contrary, history has repeatedly shown that even machines designed to be objective can inadvertently perpetuate discrimination and disparities.

Racism in technology is not a distant concept for me. In the spring of 2021, I took part in the MIT linQ Catalyst Healthcare Innovation Fellowship—a unique program that brings together experts from various fields to collaboratively identify and validate unmet medical and health-related needs, explore new project opportunities, and develop action plans. Tasked with identifying an unmet need, I embarked on a search for a project and eventually stumbled upon a striking case study of “systemic racism in miniature”—pulse oximeters. Pulse oximeters are ubiquitous in medicine. Thousands of times a day, healthcare workers use them to measure the oxygen saturation of the blood and make vital treatment decisions for conditions such as asthma, COPD, and now COVID-19. These devices were presumed to be unbiased tools, functioning uniformly across racial and ethnic groups. Yet both my own research and that of others revealed a disturbing reality: pulse oximeters are less accurate in patients with darker skin tones, a deficiency that has already led to misdiagnoses, delayed treatment, and compromised patient care.

This jarring revelation hits home even more when we recognize that oxygen is vital for brain health. The presence of racial bias in a tool meant to measure such a crucial substance for the brain’s well-being underscores just how easily systemic racism embedded in technology can go on to compromise the brain health of an entire community.

This ultimately poses a pressing question: What is the role of technology in health and society? Is technology a tool meant to help us progress or a weapon that will be used to oppress? The answer, I believe, hinges on how we wield it.

Today, the field of medicine, particularly neurology and mental health, is in the midst of a renaissance, riding an unprecedented wave of technological innovation. The story of the pulse oximeter serves as a cautionary reminder that the technologies we develop now will invariably shape the brain health of tomorrow. Hence, it is imperative that we remain vigilant in addressing and rectifying the systemic racism present in our technologies, so that the future is one of equity and inclusion in brain health. Only then can we harness the true potential of technology as a tool of progress rather than a weapon of oppression.

Understanding Racism in Medical Technology

Regrettably, the pulse oximeter case is just one example of systemic racism infiltrating technology. While technological innovations are relatively recent, there is already a troubling body of evidence revealing hidden biases within medical technology.

Bias in Algorithms: Algorithms intended to serve as impartial judges of health can end up perpetuating systemic racism and exacerbating inequalities. For example, an algorithm that used health costs as a proxy for health needs wrongly concluded that Black patients were healthier than equally sick white patients: because poverty and unequal access mean less money is spent on Black patients' care, the algorithm mistook lower spending for better health. Similarly, diagnostic algorithms and practice guidelines that adjust or “correct” their outputs on the basis of a patient's race or ethnicity have been shown to guide medical decision-making in ways that may direct more attention or resources to white patients than to members of racial and ethnic minorities. A simple simulation of the cost-proxy problem appears below.
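
To make that mechanism concrete, here is a minimal sketch in Python of how training on cost rather than need can encode bias even when race is never an input. Every number below is an illustrative assumption, not data from the published study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
race = rng.integers(0, 2, n)      # 0 = white, 1 = Black (illustrative)
need = rng.gamma(2.0, 2.0, n)     # true health need, same distribution in both groups

# Assumption: unequal access means less money is spent per unit of need on Black patients.
spend_rate = np.where(race == 1, 0.7, 1.0)
cost = need * spend_rate + rng.normal(0, 0.5, n)

# The algorithm's "risk score" is trained to predict cost; here cost stands in for it.
flagged = cost >= np.quantile(cost, 0.97)   # top 3% flagged for extra care

for group, label in [(0, "white"), (1, "Black")]:
    sel = race == group
    print(f"{label}: share flagged = {flagged[sel].mean():.1%}, "
          f"mean need among flagged = {need[sel & flagged].mean():.2f}")
# Flagged Black patients turn out to be sicker than flagged white patients:
# at the same score, the cost proxy has understated their need.
```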

[Figure: Potential sources of racial bias in psychophysiological data collection. Both the effects of racialized negative life experiences on neural responses and embedded phenotypic bias (against darker skin and/or coarse, curly hair) in devices may influence recorded data. Historically, these confounds have not been considered, leading to the exclusion of Black participants from analyses and the mislabeling of participants as ‘non-learners’, ‘non-responders’, or ‘difficult subjects’. Adapted from Webb EK, Etter JA, Kwasa JA. Addressing racial and phenotypic bias in human neuroscience methods. Nature Neuroscience. 2022;25(4):410-414.]

This cascade of failures raises a crucial question: Who should be held accountable when technology fails? The commonly held belief that technology, due to its scientific and objective nature, is immune to human biases and errors needs to be debunked. Technologies are not developed in isolation; they are products of human design, shaped by the identities, values, and objectives of their creators. Likewise, the social implications of technological tools reflect the larger system within which technology operates.

It stands to reason, then, that if a technology is racist, it is because it is operating in a racist society. As Deborah Raji, a Mozilla Fellow in Trustworthy AI, has argued, our data itself encodes systemic racism: models trained on the records of an unequal society learn to reproduce its patterns.

Therefore, accountability for technology failures lies not solely with the technology itself but with its designers and the broader system they operate within. Only by acknowledging this shared responsibility can we confront and rectify the systemic racism present in medical technology and strive toward a future of equitable and inclusive healthcare.

Equity-Centered Design: The Vanguard of the Technological Revolution

How, exactly, can we ensure that we are creating inclusive technology that does not suffer from the fatal pitfalls of systemic racism? Equity-centered design. Equity-centered design is the guiding light in the tumultuous sea of technological innovation: understanding the complexities of diverse backgrounds, beliefs, and practices is the compass that can steer technology away from automating racism.

To accomplish this, we can implement the following strategies:

  1. Assembling Diverse Teams: A critical step in promoting equity-centered design is to build teams that reflect the rich tapestry of society. By bringing together individuals from diverse racial, ethnic, and cultural backgrounds, we can infuse different perspectives and experiences into the development process. This diversity of thought enables the identification and rectification of potential biases, resulting in more inclusive and unbiased technologies.
  2. Investing in Inclusive Research: Emphasizing inclusive research practices helps ensure that the data collected and analyzed represent various racial and ethnic groups. Too often, research fails to adequately include underrepresented communities, leading to biased outcomes. For example, FDA guidance for approving pulse oximeters says clinical trials should include at least two darkly pigmented people or 15% of the subject pool—whichever is larger (a floor sketched in the first example after this list). However, recent research highlighting the shortcomings of pulse oximeters suggests that this standard might be insufficient. By investing in inclusive research, we broaden the scope of knowledge and ensure that technology is developed with a comprehensive understanding of the needs and challenges faced by all individuals.
  3. Actively Involving Underrepresented Communities in Technology Design: Communities that have historically been marginalized must be actively engaged in the design and development of technology. Their firsthand insights are invaluable in identifying potential biases and ensuring that technologies are culturally sensitive and responsive to the needs of all users.
  4. Auditing Technology for Equity: Regularly assessing technology for hidden biases and potential racial discrimination is crucial. Conducting equity audits can help identify disparities and prompt the adjustments needed to ensure that technology functions fairly and inclusively for all users. However, most data analysts do not have access to widely used proprietary algorithms and devices, so they typically cannot identify the precise mechanisms that produce disparate outcomes. But what if we could design a culturally competent tool that does just that: peering behind the black box of proprietary algorithms to determine, or predict, whether a given technology is likely to code for inequity? (A minimal outcome-based version appears in the second sketch after this list.)
  5. Reevaluating the Inputs to Algorithms and Technology: A recent study shows the power of this approach. By training an AI model to learn from Black patients' self-reports of pain, rather than sharpening or replicating a doctor's own judgment of that pain, researchers dramatically reduced unexplained racial disparities in pain scores.
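
On point 2, the enrollment floor in the FDA guidance is easy to state in code. A minimal sketch, assuming the 15% share rounds up (the guidance specifies only "at least 2, or 15% of the subject pool, whichever is larger"; the helper name is my own):

```python
import math

def min_darkly_pigmented(total_subjects: int) -> int:
    """Minimum darkly pigmented participants per the FDA 510(k) guidance:
    at least 2, or 15% of the subject pool, whichever is larger."""
    return max(2, math.ceil(0.15 * total_subjects))

for pool in (10, 13, 40, 200):
    print(f"pool of {pool:>3} -> at least {min_darkly_pigmented(pool)}")
# A typical 10-subject study clears the bar with just 2 darkly pigmented
# participants, which is why researchers argue the floor may be too low.
```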
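
And on point 4, here is one shape an outcome-based equity audit might take for a pulse oximeter. The readings are hypothetical, and the "hidden hypoxemia" definition (arterial saturation below 88% despite a device reading of 92% or higher) loosely follows the pulse-oximetry literature:

```python
import pandas as pd

# Hypothetical paired readings: device SpO2 vs. gold-standard arterial SaO2 (%).
df = pd.DataFrame({
    "group": ["light"] * 4 + ["dark"] * 4,
    "spo2":  [94, 96, 97, 95, 96, 97, 95, 96],
    "sao2":  [93, 95, 96, 94, 87, 92, 86, 93],
})

df["bias"] = df["spo2"] - df["sao2"]   # how far the device overestimates
df["hidden_hypoxemia"] = (df["sao2"] < 88) & (df["spo2"] >= 92)

# No access to the device's internals is required: a systematically larger
# bias or more missed hypoxemia in one group is an equity red flag.
print(df.groupby("group")[["bias", "hidden_hypoxemia"]].mean())
```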

By implementing these strategies, we don the armor necessary for this battle. Only through cultural competency can technology be molded into an instrument of progress. Embracing equity-centered design allows us to navigate the challenges posed by systemic racism, paving the way for a future where technology becomes a catalyst for positive change, fostering inclusivity and equal opportunities for all.

Conclusion

Many things about the future are uncertain, but when it comes to technology, at least this much is known: it will continue to play a prominent role. But will that role be as a tool or as a weapon? The answer depends in part on perspective—who wields power and how they do it. The future is not predetermined; it is created. By taking race(ism) out of the equation through equity-centered design, we can ensure that future technology is an ally, not a foe, on the battleground for equity in neurology and mental health. By leveraging inclusive technology as an antiracist tool for brain health, we can dismantle the barriers that systemic racism erects and pave the way for a future where every individual's well-being is prioritized and respected. Only then can we truly race against the machine to build a world where technology is a force for progress and equality.

Citations

  1. Kadambi A. Achieving fairness in medical devices. Science. 2021;372(6537):30-31. doi:https://doi.org/10.1126/science.abe9195
  2. Zou J, Schiebinger L. AI can be sexist and racist — it’s time to make it fair. Nature. 2018;559(7714):324-326. doi:https://doi.org/10.1038/d41586-018-05707-8
  3. Barbour C. Can a Machine Be Racist? Artificial Intelligence Has Shown Troubling Signs of Bias, but There Are Reasons for Optimism. The Conversation. Published March 6, 2023. https://theconversation.com/can-a-machine-be-racist-artificial-intelligence-has-shown-troubling-signs-of-bias-but-there-are-reasons-for-optimism-197893
  4. Catalyst Fellowship | MIT linQ Catalyst. Accessed July 27, 2023. https://catalyst.mit.edu/fellowshipoffseason/
  5. Students Who Rocked Public Health 2021 – JPHMP Direct. jphmpdirect.com. Published January 14, 2022. Accessed July 27, 2023. https://jphmpdirect.com/2022/01/14/students-who-rocked-public-health-2021/
  6. Sjoding MW, Dickson RP, Iwashyna TJ, Gay SE, Valley TS. Racial Bias in Pulse Oximetry Measurement. New England Journal of Medicine. 2020;383(25):2477-2478. doi:https://doi.org/10.1056/nejmc2029240
  7. Moran-Thomas A. How a Popular Medical Device Encodes Racial Bias. Boston Review. Published August 5, 2020. https://www.bostonreview.net/articles/amy-moran-thomas-pulse-oximeter/
  8. McFarling UL. Inaccurate pulse oximeter readings tied to less supplemental oxygen for darker-skinned ICU patients. STAT. Published July 11, 2022. Accessed July 27, 2023. https://www.statnews.com/2022/07/11/inaccurate-pulse-oximeter-readings-tied-to-less-supplemental-oxygen-for-darker-skinned-icu-patients/
  9. Inaccurate pulse oximeter measurements delayed COVID treatment for people of color. NPR.org. https://www.npr.org/2022/06/05/1103145033/inaccurate-pulse-oximeter-measurements-delayed-covid-treatment-for-people-of-col.
  10. After fallow decades, neuroscience is undergoing a renaissance. The Economist. Accessed July 27, 2023. https://www.economist.com/technology-quarterly/2022/09/21/after-fallow-decades-neuroscience-is-undergoing-a-renaissance
  11. The New Jim Code: Reimagining the Default Settings of Technology & Society. www.youtube.com. Accessed July 27, 2023. https://www.youtube.com/watch?v=aMuD_lAy2zQ
  12. Obermeyer Z, Powers B, Vogeli C, Mullainathan S. Dissecting racial bias in an algorithm used to manage the health of populations. Science. 2019;366(6464):447-453. doi:https://doi.org/10.1126/science.aax2342
  13. Choy T, Baker E, Stavropoulos K. Systemic Racism in EEG Research: Considerations and Potential Solutions. Affective Science. 2021;3. doi:https://doi.org/10.1007/s42761-021-00050-0
  14. Ricard JA, Parker TC, Dhamala E, Kwasa J, Allsop A, Holmes AJ. Confronting racially exclusionary practices in the acquisition and analyses of neuroimaging data. Nature Neuroscience. 2023;26(1):4-11. doi:https://doi.org/10.1038/s41593-022-01218-y
  15. Webb EK, Etter JA, Kwasa JA. Addressing racial and phenotypic bias in human neuroscience methods. Nature Neuroscience. 2022;25(4):410-414. doi:https://doi.org/10.1038/s41593-022-01046-0
  16. Hailu R. Fitbits, other wearables may not accurately track heart rates in people of color. STAT. Published July 24, 2019. https://www.statnews.com/2019/07/24/fitbit-accuracy-dark-skin/
  17. More on Racial Bias in Pulse Oximetry Measurement. New England Journal of Medicine. 2021;384(13):1278. doi:https://doi.org/10.1056/nejmc2101321
  18. Raji D. How our data encodes systematic racism. MIT Technology Review. Published December 10, 2020. https://www.technologyreview.com/2020/12/10/1013617/racism-data-science-artificial-intelligence-ai-opinion/
  19. Benjamin R. Assessing risk, automating racism. Science. 2019;366(6464):421-422. doi:https://doi.org/10.1126/science.aaz3873
  20. Center for Devices and Radiological Health. Pulse Oximeters – Premarket Notification Submissions [510(k)s]. U.S. Food and Drug Administration. Published 2019. https://www.fda.gov/regulatory-information/search-fda-guidance-documents/pulse-oximeters-premarket-notification-submissions-510ks-guidance-industry-and-food-and-drug
  21. AI could make health care fairer—by helping us believe what patients say. MIT Technology Review. https://www.technologyreview.com/2021/01/22/1016577/ai-fairer-healthcare-patient-outcomes/
  22. Taking Race(ism) Out of the Equation. opmed.doximity.com. Accessed July 27, 2023. https://www.doximity.com/articles/d36afd5e-c99a-4d82-85ea-81aa5979c26d

ABOUT THE THOUGHT LEADERSHIP FOR PUBLIC HEALTH FELLOWSHIP

The Boston Congress of Public Health Thought Leadership for Public Health Fellowship (BCPH Fellowship) seeks to:

  • Incubate the next generation of thought leaders in public health;
  • Advance collective impact for health equity through public health advocacy; and
  • Diversify, democratize, and broaden evidence-based public health dialogue and expression.

It is guided by an overall vision to provide a platform, training, and support network for the next generation of public health thought leaders and public scholars to explore and grow their voice.