Implications of Gendered AI in Elder Care

Feminist methodological literature review for HCI 450: Foundations of Human-Computer Interaction

This research review takes a methodological organizational approach: first, an overview of existing feminist HCI methodologies, followed by a discussion of the literatures on feminised AI, gendered caregiving, and feminised AI in elder care.

 

Defining an integrated feminist framework for Human-Computer Interaction (HCI)

The last two decades of HCI inquiry and research have uncovered the osmosis by which interaction researchers, analysts and designers affect the world around them and, in many cases, enact change - for better or for worse. These nuances create a challenging relationship to our science. If our discipline imposes a singular perspective or value system on others, then “designers have an ethical obligation to aspire toward a moral equivalent of scientific objectivity” (Bardzell & Bardzell, 2011).

Feminist social science research argues that science and socio-economic, political and cultural values are inextricably linked, and it supports a scientific objectivity that encompasses rather than eschews diverse experiences and vantage points. ‘Scientific objectivity’ is a worthwhile goal, but that objectivity should be relational and rooted in connection.

Bardzell and Bardzell (2011) describe several guiding principles for an integrated feminist HCI methodology. Those most relevant to this research review are: connection to feminist theory, a firm commitment to methodology, demonstrated empathy in the research and evaluation of subjects, disclosure of the researcher’s ideological commitments, diverse and mixed research methods, and reflexivity - the continuous self-questioning of one’s own research.


Artificial Intelligence and Social Robotics: Existing Feminist Critiques

Building on the implications of a singular scientific objectivity, Adam’s (1995) feminist epistemological critique of AI and robotics asks: should the subject, or knower, be seen as an individual or as a community? For example, a computer system with its creator’s expert knowledge hardwired in is an individual subject. Adam (1995, p. 361) describes this as knowledge engineering - the branch of artificial intelligence concerned with building ‘expert’ systems.

While such a system is marketed as a model of ‘consensus knowledge’, its expertise fails to account for ‘unacceptable’ points of view and lacks a faithful representation of the majority. This implicates the majority of AI design in systemically disempowering individuals outside the hegemony of white, middle-class men (Adam, 1995). Holders of knowledge become increasingly invisible, continually refracted through the mythological pretense to objectivity, excluding alternative paths and understandings.

Adam also addresses the distinction between ‘knowing that’ and ‘knowing how’, an epistemological difference that could have dramatic effects on the social role of caregiving. Propositional knowledge - ‘knowing that’ - is described by Adam as the logic typically imparted to systems (1995, p. 367). This ‘knowing that’, in turn, becomes all the knowledge such a system can understand, thereby excluding any knowledge that is not propositional.

Imagine the knowledge surrounding midwifery or child rearing. This knowledge is of ‘how’, not ‘that’ - and herein lies the danger of self-referential propositional knowledge as the basis of computing systems: a failure to recognize and understand different bodies of knowledge.

In this system of propositional knowledge, little room is left for ‘knowing how’, and the knowledge of women’s work is relegated to non-knowledge. The focus on ‘knowing that’ “invalidates the lived experience of women’s work and makes invisible the skilled bodily knowledge which that brings” (Adam, 1995, p. 371).

Weber (2005) presents a feminist critique of socially assistive robots, specifically investigating the ramifications of ‘feminised’ artificial intelligence in caregiving and calling attention to the issue of ‘sociality’ - the idea that humans have evolved to interact socially with entities that also behave in a social manner (Weber, 2005, p. 211).

Designers of social robots often reduce this sociality to prosaic relational patterns - that of mother to child, or owner to pet. These relational patterns are specifically designed to elicit a ‘nurturing response’ from the user, but in the case of increasingly popular pet robots for the elderly, one must consider the possible implications. Do social robots reinforce existing social norms? Are these social robots being designed to fill the labor gap? And perhaps most importantly, are they a substitute for the ‘knowing how’ of establishing and maintaining human connection?

Ubiquitous socially assistive technologies have also undergone a rapid feminisation. Bergen (2016) defines Siri as a cyborg, a “cybernetic system of parts” composed of the AI designers who script her responses. This is an evolution of Adam’s ‘propositional knowledge’, now encased in a woman’s voice - an invisible knower lacking ‘know-how’. This new evolution in knowledge engineering is problematic not only in its continued erasure of traditionally feminine knowledge, but in a new and profound disembodiment: using “stereotypical traits of femininity” to work against the agency of real women (Bergen, 2016, p. 100).

The author addresses the role of ‘emotional intelligence’, or the “ability to perform affective labor while maintaining composure, remaining unaffected by external stressors”, in occupations such as flight attendant, administrative assistant, and caregiver. The increase in socially assistive technologies within these domains highlights the divorce of exploited labor from the bodies that perform it (Bergen, 2016, p. 102) and underscores the increased bodily dissociation of knowledge.

Zdenek (2007) continues this discussion with an expansive review of ‘female’ AI assistants across platforms and search engines. Many of these ‘virtual women’ are deployed as tools, performing rote and mundane tasks. The author offers a nuanced critique of how these interfaces are designed, arguing that the connections between gender, service roles and desire are obfuscated as a result of designers’ and developers’ inattention to character development (Zdenek, 2007). Any discussion of gender is circumvented by appeals to ‘humanness’, ‘believability’, ‘giving users what they want’, or base return on investment and market forces (Zdenek, 2007, p. 409).

Implications of Gendered AI in Elder Care

Utilizing the Bardzell and Bardzell feminist HCI framework and synthesizing the existing feminist critiques above, we are empowered to evaluate an existing industry-based review of available assistive social robots. Robinson et al. (2014) provide a comprehensive review of existing healthcare robots and draw attention to an important industry dichotomy - that of rehabilitation robots and companion robots.

These technologies serve very different purposes in caretaking: rehabilitation robots address physical and physiological issues, while companion robots seek to mitigate the psychosocial burden of aging. In evaluating the available assistive and companion robots, this review fails to mention the gendering of companion robots or to discuss the consequences of transferring skilled caregiving to assistive robots (refer to the Appendix for more information on these available technologies).

This industry review makes no mention of knowledge erasure, displacement, reinforced social stratification, or systemic disempowerment for either human caregivers or the elderly, demonstrating a dire need for the critical application of feminist HCI methodology in both user research and interface/interaction design.

Feminist HCI critiques of AI and Social Robotics in Elder Care

Present discussions of AI and social robotics in elder care are twofold, addressing both the potential implications for users and critiques of ‘feminised’ or gendered AI in elder care. Concerns about elderly users of socially assistive robotics surfaced by Sharkey and Sharkey (2010) include a potential reduction in human connection and contact, increased feelings of loss and objectification, and transgressions of personal liberty. These themes connect to the critiques put forth by Weber (2005) and Adam (1995) - a lack of agency inherent within a technology reinforces a lack of agency for the user, especially for a disempowered group such as the elderly.


Boyer (2004) examines existing prototypes that demonstrate subservience and gendering specifically in the field of elder care. The author considers both who should be taking care of the elderly and where - adult children? Hired caregivers? In the space of the home or in a more systemized care facility? These questions deserve close attention before the design of a socially assistive AI or robot can begin. Boyer also distinguishes between gendered forms of labor - the ‘waged’ labor performed historically by men and the ‘affective’ labor performed by women (Boyer, 2004, p. 76).

The care of the elderly has long been considered ‘affective’ labor and is now often performed by the most marginalized group - women of color, who are undervalued and underpaid despite the emotional intelligence and ‘know-how’ knowledge the work requires. This raises the question: do socially assistive robots create competition for these already disempowered laborers? How do these feminised pieces of technology affect actual people?

Conclusions and Future Discussion

The continued feminisation of socially assistive robots and AI reinforces attitudes that ‘care-work’ is a lesser form of labor, less deserving of fair pay, benefits and protection. Additionally, emerging technologies create job instability for skilled caregivers. How can we, as HCI researchers and designers, continue to provide impactful research and design recommendations to AI interface designers and programmers?

First, the end goals of elderly users must be thoroughly researched to ensure that AI and socially assistive technologies are designed and built to address the needs of a diverse population and to support, rather than harm, users’ psychological and physical safety.

Second, we must ensure that these technologies employ a feminist epistemological design and shift from propositional knowledge to a ‘know-how’ knowledge base. This could include designing tools and assistive technologies for skilled caregivers that ease caregiver burden and bolster emotional connection, rather than simply replace it.

Utilizing a feminist HCI methodology in this research could help illuminate the existing gender dynamics in caregiving, drive creative approaches to designing outside of the established hegemony, and bring much-needed visibility to ‘know-how’ knowledge and skilled caregiving. Future directions for study could include an updated, comprehensive review of new and emerging technologies through this lens. User research with elderly people who use these products, conducted with a feminist HCI methodology, could also illuminate specific shortcomings and provide clearer design implications.

 

References

Adam, A. (1995). A Feminist Critique of Artificial Intelligence. European Journal of Women’s Studies, 2(3), 355–377. https://doi.org/10.1177/135050689500200305

Bardzell, S., & Bardzell, J. (2011). Towards a feminist HCI methodology. Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems (CHI '11). doi:10.1145/1978942.1979041

Bergen, H. (2016). ‘I’d Blush if I Could’: Digital Assistants, Disembodied Cyborgs and the Problem of Gender. Word and Text: A Journal of Literary Studies and Linguistics, 11, 95-113. Retrieved May 11, 2019, from http://jlsl.upg-ploiesti.ro/

Boyer, K. (2004). The Robot in the Kitchen: The Cultural Politics of Care-Work and the Development of In-Home Assistive Technology. Middle States Geographer, 37, 72-79. Retrieved May 11, 2019.

Robinson, H., MacDonald, B., & Broadbent, E. (2014). The Role of Healthcare Robots for Older People at Home: A Review. International Journal of Social Robotics. doi:10.1007/s12369-014-0242-2

Sharkey, A., & Sharkey, N. (2010). Granny and the robots: Ethical issues in robot care for the elderly. Ethics and Information Technology, 14(1), 27-40. doi:10.1007/s10676-010-9234-6

Weber, J. (2005). Helpless machines and true loving care givers: A feminist critique of recent trends in human-robot interaction. Journal of Information, Communication and Ethics in Society, 3(4), 209-218. doi:10.1108/14779960580000274

Zdenek, S. (2007). “Just Roll Your Mouse Over Me”: Designing Virtual Women for Customer Service on the Web. Technical Communication Quarterly, 16(4), 397-430. doi:10.1080/10572250701380766
