Professional Comment

ChatGPT and the Adult Care System – We Need To Be Aware Of The Limitations

Nourish Care’s Chief Product Officer Jeremy Baldwin on why the care sector should be wary of rushing to adopt AI

There are many great examples of AI being used to positive effect in health and social care, leading some care providers to look at open platforms like ChatGPT to generate care plans and to call on tech providers to build it into their systems. I get the attraction, but we shouldn’t rush in.

Yes, plans created through ChatGPT can read really well and appear person-centred, saving time and improving the perceived standard of the plans generated, but there are significant data privacy, clinical safety and quality issues to consider. These relate to how and when the AI is being used, who or what is making the decision, and who is responsible if something goes wrong.

A single care provider deciding to accept these risks in their own clinical safety cases is one thing; setting best practice by integrating AI into systems such as Nourish, which are used at scale across the sector, is quite another.

A common understanding of AI is technology performing human tasks and making human decisions. This ranges from, at the most basic level, asking the AI to perform a specific, usually administrative but still cognitive task like writing a report, through the AI automating decisions or tasks based on a set of rules, to autonomous applications where the AI does both the decision-making and the action without intervention, such as care-delivering robots. Which begs the question: will the human be replaced? Can you take the human out of health care? Or, put another way, can the AI care?

In reality, this is all a long way off in the care sector, if it happens at all. The careful, considered and responsible adoption of AI will reap benefits but, as always, will take longer to manifest than we think.

Automation in our digital systems is nothing new. It’s core to Nourish and how we are developing best practice to help ensure that the right things happen at the right time.

What’s changed is the availability of openly accessible large language models like ChatGPT. They are trained to understand natural language, intent and context for the action or decision, and can respond with human-style conversation. This makes them useful and attractive.

Using ChatGPT to give better structure to a care plan and make it easier for others to consume is fine, as is using it as a starting point for a personalised care plan, but not if it’s being used to generate the plan from scratch.

A chatbot that makes it easier to access and understand information from a defined source, such as an individual’s personalised care plan and record, is really powerful. But caution should be applied if this ends up being advice or recommendations based on data from multiple, open sources. Where is this data coming from? What happens if there is a conflict? Which fact (or, more likely, opinion) should I use?

Nourish is advancing rapidly, and one of the biggest areas of growth is integrations. What will revolutionise care is the use of devices and wearables, along with home automation, that allow us to improve care in community settings, particularly in people’s homes. It’s a huge technological leap. All of these things become part of the puzzle in building a picture of what normal looks like for each person, so that anomalies can be identified and monitored. This makes it easier to identify those at greatest risk or in most urgent need, and to manage resourcing accordingly.

AI will continue to grow, and Nourish is effectively creating a model of what good social care looks like. We’re ideally placed to do that as we’re instrumental in providing digital care support in almost a quarter (24%) of the market, which gives us access to a huge dataset that is growing all the time. AI modelling of these datasets over time will inform better practice, but we still need human beings to make those important care decisions.

Automations that reduce the burden of administrative, reporting and compliance tasks on care teams have to be a good, and safe, thing. Automated workflows that guide and nudge on the next best action will lead to faster and better interventions, but caution is needed if this strays into the generation and application of treatment plans or responses to an event.

This need to keep decision-making in human hands won’t hinder advancement. Those predicting that we will be cared for by robots and automated systems in a few years’ time will be proven wrong. It’s not going to be the futuristic care of the imagination; it will be slow, because the system is slow. There is still a very long way to go, and one thing I predict is that we’ll seriously underestimate the impact that AI will have on social care in the future.

About Nourish Care
Nourish is the leading provider of digital care management software in the UK. Nourish was one of the first digital social care record suppliers to be recognised as an NHS Transformation Directorate Assured Supplier at launch and is accredited by PRSB as a Quality Partner. The easy-to-use technology provides care teams with person-centred tools, timelines, assessments and more to drive outstanding care and improve outcomes for those with support needs. Nourish works with more than 3,500 care services in the UK and overseas within residential homes, nursing homes, learning disability services, mental health services, and other care settings.
www.nourishcare.com