UK Ministry of Justice has developed a computer programme to predict who might commit murder

by Rhoda Wilson, Expose News:

An algorithm has been developed by the UK’s Ministry of Justice to predict which criminals might later commit murder. The tool, originally named the “Homicide Prediction Project,” has since been renamed “Sharing Data to Improve Risk Assessment.”

Led by the Ministry of Justice in partnership with Greater Manchester Police, the Home Office and the Metropolitan Police, the project started in January 2023 and, according to a timeline obtained by Statewatch via the Freedom of Information Act, was completed in December 2024 but has yet to be deployed.

The project raises concerns about bias and the ethical implications of applying such predictive models to vast datasets, including data on people already facing structural discrimination in the UK, such as those within the white British ethnic group, particularly white men.


The following was originally published by The National Pulse.

The British Ministry of Justice is advancing an initiative to create an algorithmic tool aimed at predicting which individuals convicted of crimes might escalate to committing homicide. Known internally as the Homicide Prediction Project, the undertaking came to light through Freedom of Information requests from the civil liberties group Statewatch, which flagged the project as concerning.

The project builds upon risk-prediction frameworks already in place, such as the Offender Assessment System (“OASys”), which has been used since 2001 to forecast recidivism and inform legal decisions. However, the broad scope of data in this new model has raised red flags. The data utilised, sourced from various police and government bodies, potentially includes information on up to half a million people, some without any criminal history.

Despite officials’ assertions that the project remains in a research phase, uncovered documents allude to future deployment. Sources claim increased collaboration across government agencies and police forces, such as Greater Manchester Police and the Metropolitan Police, to enhance the dataset driving these predictions.

Statewatch has raised ethical concerns about the predictive model’s potential for systemic bias. The British state has already attempted to introduce guidelines that were explicitly two-tier and would have seen ethnic minorities prioritised for bail over white men in the country.

Statewatch’s Sofia Lyall described the algorithm project as “chilling and dystopian,” calling for an immediate cessation of its development. “Time and again, research shows that algorithmic systems for ‘predicting’ crime are inherently flawed,” she said. She highlighted the risk algorithms pose in creating profiles of potential criminals before any crime is committed.

Read More @ Expose-News.com