Scotland Yard is using AI tools from the US tech firm Palantir to scrutinize the behavior of its personnel, aiming to identify underperforming officers, the Guardian has reported.
While the Metropolitan Police had previously declined to confirm its use of Palantir’s technology, it has now acknowledged employing the firm’s AI tools to analyze internal data, including sickness rates, duty absences, and overtime trends, in an effort to pinpoint potential deficiencies in professional conduct.
The Police Federation, which represents frontline officers, criticized the strategy as “automated suspicion,” adding: “Officers should not be subjected to opaque or untested tools that might wrongly interpret workload pressures, sickness, or overtime as signs of misconduct.”
The Metropolitan Police, which employs 46,000 personnel, is the largest police force in the UK and has been embroiled in numerous controversies, including lapses in officer vetting – most notably illustrated by Wayne Couzens’ murder of Sarah Everard – and the tolerance of discriminatory and misogynistic conduct.
According to the force, “There is evidence to suggest a correlation between high levels of sickness, increased absenteeism, or unusually frequent overtime, and shortfalls in standards, culture, and behavior.”
The time-limited pilot project utilizing Palantir’s technology aims to consolidate data from multiple existing internal databases to “help us identify these behavioral patterns among our officers and staff” and is “part of our broader effort to enhance standards and improve the culture within the Met.”
The force added: “While Palantir’s systems assist in identifying patterns, it is ultimately the officers who investigate further and make decisions regarding standards, performance, or other issues.”
A spokesperson for the Police Federation expressed concern: “Any system that profiles officers using algorithmic patterns must be approached with caution. Policing operates under some of the most rigorous scrutiny of any profession. If police forces are serious about enhancing standards and building public confidence, the emphasis must remain on effective supervision, fair processes, and human judgment, rather than the automation of suspicion.”
Palantir has also been drawn into the ongoing controversy surrounding Peter Mandelson’s relationship with Keir Starmer, after Mandelson was dismissed over his ties to Jeffrey Epstein. Global Counsel, a lobbying firm co-owned by Mandelson, works with Palantir, which was co-founded by the tech billionaire and Trump supporter Peter Thiel.
Last year, shortly after Mandelson’s appointment, he and Starmer visited Palantir’s technology showroom in Washington DC, where they met the company’s CEO, Alex Karp. Calls for greater transparency over Palantir’s public sector contracts in the UK have intensified, including a £330 million deal with the NHS in November 2023 for a federated data platform and a £240 million contract with the Ministry of Defence in December 2025.
In response to the Metropolitan Police’s collaboration with Palantir, Martin Wrigley MP, a Liberal Democrat member of the Commons science, innovation, and technology select committee, stated: “I am concerned about the rights of officers as employees. Surveillance by employers has been contentious even before the introduction of AI. Palantir appears to monitor every aspect of government. So, who is overseeing Palantir?”
Palantir’s AI is already used by several other police forces to support investigations, through services provided by two regional investigation units.
In its recent policing white paper, Labour affirmed its commitment to helping the police adopt AI responsibly and swiftly, pledging to invest over £115 million during the next three years “to support the rapid and responsible development, testing, and implementation of AI tools across all 43 forces in England and Wales.”
A representative from Palantir commented: “We take pride in the fact that our software is utilized to enhance public services in the UK. This includes improving police operations, facilitating more NHS procedures, and aiding Royal Navy ships in remaining operational for longer periods.”