By Natasha Williams, Ph.D., J.D., LL.M., M.P.H.
Former NIMHD Legislative Liaison
National Institute on Minority Health and Health Disparities
Over the last 20 years, the diagnosis and treatment of disease have advanced at breakneck speed. Today we have technologies that have revolutionized the practice of medicine, such as telemedicine, precision medicine, Big Data, and medical artificial intelligence (AI). These technologies, especially AI, promise to improve the quality of patient care, lower health care costs, and improve patient treatment outcomes. However, the impact of AI on minority health and health disparities has been largely understudied.
What is AI? The definition of AI is broad and encompasses many subareas, but the common theme is the ability to “automate or replicate intelligent behavior.”1 Machine learning, a subcategory of AI, is the ability of computers to teach themselves from data rather than being explicitly programmed. Deep learning, another AI technique, mimics the human brain by creating artificial neural networks. Natural language processing (NLP), which National Institute on Minority Health and Health Disparities (NIMHD)–funded researchers at the Medical University of South Carolina (MUSC) applied in the work discussed later in this post, helps computers interpret human language. All of these methods recognize patterns in data. Because AI is fueled by data, it is imperative that the data be of good quality, inclusive, and free from bias.2 If we fail to ensure these three principles, we could exacerbate health disparities.
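To make the idea of “recognizing patterns in data” concrete, the sketch below shows a minimal machine learning text classifier. It is purely illustrative: the note snippets, labels, and prediction are invented, and it is not the software used by any NIMHD-funded team. It simply shows how an NLP model learns whatever patterns its training data contain.

```python
# A minimal, illustrative sketch of a machine learning / NLP pipeline that
# "recognizes patterns in the data." The example notes and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny, synthetic "training data": short note snippets and a label indicating
# whether each one reflects social isolation (1) or not (0).
notes = [
    "Patient reports living alone and having no family contact.",
    "Patient states she rarely leaves the house and has no close friends.",
    "Patient attends weekly church group and lives with spouse.",
    "Patient is active in a community walking club with neighbors.",
]
labels = [1, 1, 0, 0]

# The pipeline converts raw text into word-frequency features (the NLP step)
# and then fits a simple statistical model to those features (the machine
# learning step).
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(notes, labels)

# The model can score a note it has never seen, but it only "knows" the
# patterns present in its training data, which is why unrepresentative or
# biased data produce unrepresentative or biased predictions.
new_note = ["Patient reports feeling cut off from friends and family."]
print(model.predict(new_note))
```

The point of the example is not the specific tools but the dependency: whatever the training data do or do not contain determines what the model can recognize.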
AI systems digest large amounts of data from many sources, including but not limited to medical records, medical imaging, and clinical research data. Data bias can occur at multiple levels, from the selection of the data itself to the person curating the data, and it can be introduced through programmers’ values and perceptions.3 The potential for bias is perpetuated by the underrepresentation of women and racial and ethnic minorities in the AI field, because their ideas, perceptions, and values are not reflected in these systems.4 Equally important, bias occurs when certain populations, such as racial and ethnic minority, rural, and socioeconomically disadvantaged populations, are missing from the data.5 These populations disproportionately experience adverse health outcomes compared with the general population. AI may be a tool to decrease health disparities and improve minority health, but only if these populations are adequately represented in the data. Diversity in the AI technology workforce is equally essential; a homogeneous field further promotes bias.
The phrase “garbage in, garbage out” describes the vulnerability of AI to its data. If the data digested by an AI system are flawed, its predictive power to identify disease and treatment options is equally flawed. Therefore, if health disparity populations are not adequately represented in the data, the promise of AI for these populations cannot be realized. Furthermore, AI may create a greater divide between those populations whose data are included and those whose data are not.
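One practical guard against “garbage in, garbage out” is to audit a dataset for representation before any model is built. The sketch below is hypothetical: the file name, column names, and 5% threshold are assumptions for illustration, not from any particular study. It simply tabulates who is in the data and flags groups that are nearly absent.

```python
# A hypothetical pre-modeling data audit (file name, column names, and the
# 5% threshold are illustrative assumptions): tabulate how well each
# population is represented so that missing groups are caught early.
import pandas as pd

# Load a de-identified patient table; "race_ethnicity" is an illustrative
# column name.
patients = pd.read_csv("deidentified_patients.csv")

# Count and share of each racial/ethnic group in the data.
counts = patients["race_ethnicity"].value_counts()
shares = patients["race_ethnicity"].value_counts(normalize=True).round(3)
print(pd.DataFrame({"n": counts, "share": shares}))

# Flag groups that fall below an arbitrary 5% representation threshold.
underrepresented = shares[shares < 0.05]
if not underrepresented.empty:
    print("Underrepresented groups:", list(underrepresented.index))
```

A check like this does not remove bias by itself, but it makes missing populations visible before flawed data are fed into an AI system.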
MUSC, an NIMHD grantee, is using AI to address social isolation. According to MUSC researchers, social isolation and other social determinants of health are not always captured in coded electronic health record (EHR) data but are embedded in clinical notes. By training NLP software to comb through thousands of clinical notes looking for references to social isolation, the team identified socially isolated patients with 90% accuracy. The researchers hope to use machine learning to identify clinical and other traits that help them spot socially isolated patients. If you would like to read more about the MUSC research, please click the following link: https://www.eurekalert.org/pub_releases/2019-05/muos-iho051619.php.
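As a rough illustration of the “combing” step, a simple first pass might flag notes containing phrases associated with social isolation for human review. This is a simplified keyword sketch, not the MUSC team’s actual NLP software, and the phrases and note text are invented.

```python
# A simplified keyword screen over clinical notes (illustrative only; the
# phrases and note text are invented and this is not the MUSC NLP pipeline).
import re

# Phrases a reviewer might associate with social isolation.
ISOLATION_PATTERNS = [
    r"\blives alone\b",
    r"\bno (family|social) support\b",
    r"\bsocially isolated\b",
    r"\bno close friends\b",
]

def flag_note(note_text: str) -> bool:
    """Return True if the note contains any isolation-related phrase."""
    return any(re.search(p, note_text, flags=re.IGNORECASE)
               for p in ISOLATION_PATTERNS)

notes = {
    "patient_001": "Pt lives alone, reports no family support since spouse passed.",
    "patient_002": "Pt lives with daughter and attends a weekly senior center.",
}

flagged = [pid for pid, text in notes.items() if flag_note(text)]
print(flagged)  # ['patient_001']
```

In practice, trained NLP models go well beyond fixed keyword lists, which is why the MUSC team trained software on thousands of notes rather than relying on a hand-written list like this one.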
Recognizing the need to explore new technologies such as AI and NLP on EHR platforms, NIMHD issued a funding opportunity announcement (FOA), “Leveraging Health Information Technology (Health IT) to Address Minority Health and Health Disparities.” The FOA supports research that examines how health information technology can reduce disparities and improve health outcomes for minority health and health disparity populations. For more information, please click the following link: https://grants.nih.gov/grants/guide/pa-files/PAR-19-093.html.
References
1 Executive Office of the President, National Science and Technology Council, Committee on Technology. (2016). Preparing for the Future of Artificial Intelligence. Accessed June 21, 2019.
2 MITRE Corporation. (2017). Artificial Intelligence for Health and Health Care.
3 O’Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown.
4 Campolo, A., Sanfilippo, M., Whittaker, M., Crawford, K., Selbst, A., & Barocas, S. (2017). AI Now 2017 Report. Accessed June 21, 2019.
5 Zhang, X., Pérez-Stable, E. J., Bourne, P. E., Peprah, E., Duru, O. K., Breen, N., . . . Denny, J. (2017). Big data science: Opportunities and challenges to address minority health and health disparities in the 21st century. Ethnicity & Disease, 27(2), 95-106.