
AI Tools Turn Children’s Accounts into Garbled Transcripts for Social Workers

AI tools are already causing significant problems in social work, with frontline workers reporting false alerts about suicidal thoughts and nonsensical outputs often described as “gibberish.”

Last year, Keir Starmer praised what he termed “incredible” AI technology aimed at saving time in social work documentation. However, a study involving 17 councils from England and Scotland, shared with the Guardian, identified a troubling trend of AI-generated errors known as hallucinations infiltrating official records.

As various local authorities adopt AI note-taking software to expedite the documentation and summarization of meetings with clients, an eight-month study conducted by the Ada Lovelace Institute uncovered instances where “potentially harmful misrepresentations” of individuals’ experiences were being recorded in care documentation.

One social worker explained that an AI transcription tool inaccurately “indicated that there was suicidal ideation”; however, they clarified that “at no point did the client actually … mention suicidal thoughts or planning.”

Another case highlighted that AI notes could mention unrelated terms like “fish fingers or flies or trees” while the conversation was about a child’s parents arguing. Experts in social work have expressed concern that such inaccuracies could lead to missed patterns of behavior that pose risks.

Some social workers also voiced concerns about how accurately the tools transcribe speakers with regional accents. One reported frequent “gibberish” in the transcriptions; another remarked, “It’s become a bit of a joke in the office.”

Numerous councils, from Croydon to Redcar and Cleveland, have started providing their social workers with AI transcription tools to assist in recording and summarizing case conversations. The allure of saving time is particularly appealing to local governments grappling with staff shortages.

One widely used AI system, known as Magic Notes, is offered to councils at a cost between £1.50 and £5 per hour of transcription. Most social workers surveyed relied on either the specialized Magic Notes AI or the general-use Microsoft Copilot AI.

The study found that AI transcription can yield notable time savings, allowing social workers to foster better relationships with their clients. “Our findings suggest these tools can enhance the relational components of care work and improve the quality of information documented by social workers,” stated the report following interviews with 39 unnamed professionals.

However, one social worker discovered that when attempting to use an AI tool to revise care documents into a more “person-centered” tone, the system inserted “all these words that have not been said.” Another expressed concern that the technology had “crossed the line between being your assessment and being AI’s assessment.”

The report concluded that “AI-generated inaccuracies that make their way into these records can have serious ramifications, such as social workers making incorrect decisions regarding a child’s welfare, potentially leading to harm and professional repercussions for the workers involved.”

The impact of AI errors is already being felt within the profession: according to the British Association of Social Workers (BASW), disciplinary action has been taken against workers who failed to properly check AI note-taker outputs and consequently missed obvious mistakes. The organization is urging social work regulators to issue clear guidance on the appropriate use of AI tools.

While there is “genuine enthusiasm” among some social workers about the potential of these tools, “they also present new risks to both social work practices and society, including possible biases in report summaries and inaccurate ‘hallucinations’ in transcripts,” noted Imogen Parker, associate director at the Ada Lovelace Institute. “These risks are not being thoroughly evaluated or addressed, leaving frontline workers to face these challenges alone.”

Social workers often receive minimal training on AI, sometimes as little as an hour. Some reported spending up to an hour reviewing AI-generated transcripts, while others spent as little as two minutes. One individual admitted to taking “five minutes to just quickly screen it […] and then cut and paste it onto the system.” Another said care plans cut and pasted from AI output could come across “horrifically.”

Some interviewees also said that colleagues were too overwhelmed, or too indifferent, to check the transcripts at all.

“The true danger lies in people failing to check what has been documented for them,” argued Andrew Reece, BASW strategic lead for England and Wales. “The time spent on writing helps you reflect on what you’ve heard. If a computer handles that for you, crucial aspects of reflective practice may be overlooked.”

Beam, the company behind Magic Notes, emphasized that its outputs are intended as initial drafts, not final records. “AI tools are being embraced by social workers for a good reason,” stated Seb Barker, co-founder of the company. “The services are stretched thin and hard to navigate; many social workers are at risk of burning out, and the demands for precise, compliant documentation are increasing.”

He added that a bias evaluation had shown Magic Notes performed “consistently and equitably,” and highlighted the tool’s unique features tailored for social work, including automated checks for potential hallucination risks. “Not all AI tools are created equal; specialized tools addressing the specific needs of the sector can differ from generic, low-quality alternatives,” he noted.

The UK government and Microsoft have been contacted for comment.
