Care home operators are increasingly turning to AI for help - but why?
AI within the care industry
Care providers primarily use AI technologies to save time and to improve compliance, care outcomes and productivity. As well as helping operators reduce back-office processing costs (particularly when combined with automation services) and access real-time data, AI technologies can support processes such as night-time monitoring and fall detection, improving outcomes for both care providers and those receiving care.
AI can be used not only to monitor and record residents’ vital signs, such as heart rate and blood pressure, throughout the day and night, but also to interpret that data, flagging to care providers any potential medical issues that could harm the resident and suggesting improvements to care.
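For illustration only, the sketch below shows the kind of flagging step such a monitoring system performs. It is a minimal, rule-based example: the thresholds, field names and reading format are invented for this sketch and are not drawn from any particular product, and real systems go well beyond fixed thresholds by learning each resident’s personal baseline.

```python
from dataclasses import dataclass

# Hypothetical thresholds for illustration only; real systems would rely on
# clinically validated, personalised baselines rather than fixed ranges.
HEART_RATE_RANGE = (50, 110)   # beats per minute
SYSTOLIC_BP_RANGE = (90, 160)  # mmHg

@dataclass
class Reading:
    resident_id: str
    heart_rate: int
    systolic_bp: int

def flag_reading(reading: Reading) -> list[str]:
    """Return human-readable alerts for any out-of-range vital signs."""
    alerts = []
    lo, hi = HEART_RATE_RANGE
    if not lo <= reading.heart_rate <= hi:
        alerts.append(f"{reading.resident_id}: heart rate {reading.heart_rate} bpm outside {lo}-{hi}")
    lo, hi = SYSTOLIC_BP_RANGE
    if not lo <= reading.systolic_bp <= hi:
        alerts.append(f"{reading.resident_id}: systolic BP {reading.systolic_bp} mmHg outside {lo}-{hi}")
    return alerts

# Example: an abnormally low heart rate produces an alert for staff to review.
print(flag_reading(Reading("resident-42", heart_rate=38, systolic_bp=128)))
```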
Beyond detecting glaring issues, such as a resident stopping breathing or their heart stopping, and alerting care providers to them, AI technologies can pick up subtle changes or abnormal readings, suggesting what the issue might be and how it could be remedied. For example, an AI assistant detected that a resident was displaying potential indicators of diabetes, based on data it had collected, that had otherwise gone unnoticed. It suggested a diet plan to mitigate the issue and improve quality of life, which the resident followed. The resident’s readings subsequently improved substantially, evidencing the efficacy and capability of these technologies.
Potential issues of AI within care
There are, however, several limitations of AI that can be problematic in a care context because they result in unreliable information or misleading advice.
Generative AI models are prone to hallucinations: instances in which they produce incorrect, nonsensical or misleading information whilst, unhelpfully, presenting it in a convincing manner that appears plausible to the untrained eye. Hallucinations arise because these models generate statistically plausible text rather than verified facts, so gaps or ambiguities in the underlying data can be filled with fabricated detail. It is therefore essential that any AI output is checked before decisions are made or care plans are altered, to avoid negative repercussions.
Sycophancy, sometimes described as sycophantic alignment, is also a common issue. It refers to the tendency of generative AI models, particularly those trained with human feedback, to prioritise agreeing with users, even at the expense of accuracy or ethical considerations. In practice, this means that AI can be swayed by the user to some extent, adapting its responses to please and align with the user’s expressed views even where those views are incorrect. This clearly has the potential to be harmful within a care context and can manifest in various ways, such as providing inaccurate information, failing to challenge false premises, or offering unethical advice.
Also particularly problematic is stochastic fluctuation: generative AI models produce their outputs by sampling from probability distributions, so their behaviour involves an inherent element of randomness. This randomness can manifest in various ways, including in the outcomes of predictions, meaning that AI output may be unreliable, sometimes producing a different answer when asked the same question twice.
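The toy sketch below illustrates this sampling behaviour. The “model”, the question and the candidate answers are entirely invented for illustration; the point is simply that an answer drawn from a probability distribution can differ between runs of the same query.

```python
import random

# Toy illustration of stochastic output: the "model" samples its answer from a
# fixed probability distribution, so the same question can yield different
# answers on different runs. The answers and probabilities are invented.
ANSWERS = ["Increase fluid intake", "Review medication", "No action needed"]
PROBABILITIES = [0.6, 0.3, 0.1]

def toy_model(question: str) -> str:
    """Sample one answer according to the probabilities above (the toy model
    ignores the question's content entirely)."""
    return random.choices(ANSWERS, weights=PROBABILITIES, k=1)[0]

question = "How should we respond to this resident's readings?"
print(toy_model(question))  # may differ between runs
print(toy_model(question))  # asking the same question twice
```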
Although these issues are becoming less common as AI matures, any care provider considering implementing AI processes is strongly advised to engage a reputable AI service provider with the appropriate expertise to develop a private, bespoke AI system. Such systems can be trained and configured to mitigate these issues, enabling AI to be used to produce positive outcomes in a care context and improving efficiency, accuracy and, ultimately, the quality of care provided.
If you have any queries in relation to the care sector or AI, please don’t hesitate to contact the authors Rebecca Hughes and Phoebe Tulley in the Freeths Health & Social Care Team.
The content of this page is a summary of the law in force at the date of publication and is not exhaustive, nor does it contain definitive advice. Specialist legal advice should be sought in relation to any queries that may arise.