
How AI Can Be Implemented More Fairly in Home Health Care and Low-Wage Work Settings
Key Takeaways from New Research in Home Health Care
As AI becomes more common in the home and the workplace, research on its effects is still relatively rare. Specifically, little research examines AI’s impact on low-wage workers in the context of home health care.
New research from the Initiative on Home Care Work at Cornell University’s Center for Applied Research on Work (CAROW) points to how stakeholders can use AI more fairly in home health care and low-wage work settings.
Who are Home Care Workers (HCWs)?
HCWs are formal, paid caregivers who provide at-home care for patients with serious illnesses, such as heart disease, dementia, Alzheimer’s disease and more. HCWs include personal care aides, home health aides and certified nursing assistants.
In general, home care workers are a marginalized workforce: 85% are women, and many are immigrants and/or members of racial and ethnic minorities. HCWs are paid low wages for challenging, high-stakes work. Many home care patients have dementia and other serious conditions that require constant care and monitoring. HCWs work long hours, isolated within patients’ homes, where there is an unequal power dynamic between the workers, their patients and their patients’ families. As a result, the home care industry sees high rates of turnover and burnout among HCWs as well as other staff and management.
In this unique work setting, home care agencies and workers could benefit from AI. The problem is that AI’s benefits and burdens are not shared evenly among the agencies, the workers and the tech companies that develop it.
The Challenge of AI in the Home Health Care Setting
Naturally, there are pros and cons to AI in the home health care setting. AI can offer home care agencies significant efficiencies, for example in scheduling and in pairing HCWs with patients. It can also benefit individual HCWs by providing useful medical information and reminders on the job.
However, advances in technology don’t always benefit workers. For example, one technology commonly used in home health care tracks HCWs’ comings and goings via GPS. The study, “‘Who is running it?’ Towards Equitable AI Deployment in Home Care Work,” drew on extensive interviews with HCWs, management and senior leadership at home health care agencies, as well as worker advocates. Among its findings:
- AI can create additional, invisible labor for HCWs, like troubleshooting and regularly inputting data.
- AI can make mistakes, and when it does, those mistakes can negatively impact the patient and the worker responsible for their care.
- AI may further isolate HCWs—often already working by themselves at a patient’s house—by further reducing the contact they have with supervisors and peers.
- AI may fail to account for an HCW’s emotional intelligence and other soft skills that make them well-suited to a particular patient. As a result, the HCW can lose income by missing out on potential pairings.
- HCWs are often unaware that AI is already in use, how it works, or that it collects and stores data from their interactions with patients, even after they leave an employer.
Several participants in the study worried that they would be blamed for errors in the AI’s judgment. “I think it will be my fault because [the AI system] is only giving me suggestions, and I am the person who decides to do that suggestion or not. So, I will blame myself,” said one HCW interviewee.
These interviews revealed that AI in home health care reinforces power imbalances: it strengthens the tech companies that design it and the agencies that deploy it, while disempowering the HCWs who are surveilled and assessed and, in turn, threatening patient outcomes. Three main channels can level the playing field: policymakers, unions and advocacy organizations, and home care agencies.
Recommendations for Policymakers
- Ensure HCWs and their clients can opt out of AI use in home care.
- Establish a transparent and clear process for HCWs to contest an AI decision.
- Create a regular, transparent process to audit AI’s benefits and risks and how they are distributed among stakeholders.
- Partner with home health care unions and advocacy organizations to provide resources that support democratic governance structures.
- Go beyond broad governance principles to offer practical guidance for implementing AI regulations.
- Fund research and commission large-scale studies that examine AI in additional contexts (e.g., rural, international) and involve additional stakeholders (patients, families, clinicians, tech companies, policymakers, etc.).
Recommendations for Home Health Care Unions and Advocates
- Provide HCWs with data literacy training: Educate HCWs on how to use AI on the job, contest its outcomes, contribute to AI governance structures, and give fully informed consent to AI use and its data gathering.
- Form democratic governance structures: Data cooperatives bring stakeholders, including front-line workers, together to decide how AI is used in home care settings. The study found that unions and health care advocacy organizations are best positioned to advocate for less empowered stakeholders.
- Within these governance structures, educate and empower marginalized and vulnerable workers to participate actively. The burden of gaining additional AI expertise, however, should not fall solely on this overworked and underpaid workforce.
- Ensure genuine inclusion by incorporating frontline workers’ ideas, rather than tokenizing marginalized or vulnerable stakeholders.
Recommendations for Home Care Agencies
- Establish processes to assess an AI tool’s safety, reliability and fairness before purchasing it.
- Regularly audit the decisions of any AI tools being implemented and check for bias.
- In addition to audits, ensure there is a mechanism in place for ongoing monitoring of AI tool implementation and how data is used.
The study referenced in this explainer was funded by the Innovation Resource Center for Human Resources.
Written by Anne DeCecco
Edited by George Adanuty
Photo credit: Lindsay France, Cornell University