Hey, Siri! Remember Your Sisters!
Why artificial intelligence needs to consider the unique needs of older women
Artificial intelligence (AI) is making headlines everywhere. Yet AI applications and implications for older adults, particularly older women, have not been adequately contemplated.
It's no longer a moonshot idea from a science fiction movie. AI is already part of our daily lives — Apple's Siri, Amazon's Alexa and self-driving cars. Now comes ChatGPT, an AI chatbot that can have human-like conversations, compose music, create art and write essays. It has disrupted the world as we know it. Pundits who are not easily impressed describe these advancements as "scary good."
Many leaders have asked for a pause on AI development until we gain a better understanding of its impact. This is a good idea — but for reasons well beyond those often identified.
We need to ask: How can we ensure that AI considers the unique needs of different populations? For example, many countries are becoming super-aged societies in which women make up the majority of the older population. Is AI taking the needs of older adults into account?
Without thinking through these questions, we may leave older adults — particularly women, and other marginalized populations — open to discriminatory outcomes.
The needs of older women are often invisible to decision-makers. Older women are a unique population, and their needs are often neglected because of gendered ageism — discrimination based on their age and sex. Research has already demonstrated that older women are more likely to experience adverse health outcomes and to face poverty and discrimination.
AI perpetuates this discrimination in the virtual world by replicating discriminatory practices in the real world. What's worse is that AI automates this discrimination — speeds it up and makes the impact more widely felt.
AI models use historical data. In health care, large data sets composed of personal and biomedical information are currently being used to train AI, but these data have, in many cases, excluded older adults and women, making technologies exclusionary by design.
Bias In, Bias Out
For example, AI has a valuable use in drug research and development, which relies on massive data sets, or "big data." But AI is only as good as the data it gets, and much of the world has not collected drug data properly.
In the United States, until the 1990s, women and minorities were not required to be included in studies funded by the National Institutes of Health (NIH). And until 2019, older adults were not required to be included in NIH-funded studies, leaving a gap in our understanding of the health needs of older women in particular.
Excluding older women from drug data collection has been particularly detrimental because they are more likely to have chronic conditions that require medication. Because drugs have not been adequately tested on them, they are also more likely to experience harmful side effects.
Also, AI-powered systems are often designed based on ageist assumptions. Stereotypes — such as the notion that older adults are technophobes — result in their exclusion from participating in the design of advanced technologies.
For example, women make up the majority of residents in long-term care homes. A study found that biases held by technology developers toward older adults hindered the appropriate use of AI in long-term care.
There also needs to be further thought given to loss of autonomy and privacy, and the effects of limiting human companionship because of AI. Older women are more likely to experience loneliness, yet AI is already being used in the form of companion robots. Their impact on older women's wellbeing, especially loss of human contact, is not well studied.
This is how older women are left out of fully benefiting from advancements in technology.
How to Avoid Bias
The World Health Organization's (WHO) February 2022 policy brief "Ageism in Artificial Intelligence for Health" outlines eight important considerations to ensure that AI technologies for health address ageism. These include participatory design of AI technology with older people and age-inclusive data.
We would add the need to consider the differences between women and men throughout. All levels of government also need to think about how AI is affecting our lives and get innovative with policy and legal frameworks to prevent systemic discrimination.
Ethical guidelines and the routine evaluation of AI systems can help prevent the perpetuation of gendered ageism and promote fair and equitable outcomes.
It's time we rethink our approach and reimagine our practices, so that everyone can participate and take advantage of what AI has to offer.