It is reported that the police AI system, after a long period of testing, is now ready to be put to work.

Police in Durham are preparing to go live with an artificial intelligence (AI) system designed to help officers decide whether or not to keep a suspect in custody. The system classifies suspects as being at low, medium, or high risk of offending and has been trained on the force's own data.
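To make the idea concrete, the snippet below is a minimal, purely illustrative sketch of a three-band risk classifier. The feature names and the choice of a random forest are assumptions for illustration, not details disclosed about Hart itself.

```python
# Illustrative sketch only (not Durham's actual code): a classifier that assigns
# one of three risk bands to a custody record. Features and model choice are
# hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

RISK_LABELS = ["low", "medium", "high"]

# Hypothetical custody features: age, number of prior offences, seriousness score
X_train = np.array([[34, 0, 1], [22, 5, 3], [45, 12, 8], [19, 1, 2]])
y_train = np.array([0, 1, 2, 0])  # indices into RISK_LABELS

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

new_suspect = np.array([[28, 3, 4]])
print(RISK_LABELS[model.predict(new_suspect)[0]])  # e.g. "medium"
```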
 

While some experts believe the system could be useful, others warn that it could make wrong decisions, so its output should be carefully assessed.

 

 

According to a report, data for the Harm Assessment Risk Tool (Hart) was collected by Durham police from 2008 to 2012. The system was then tested on data from 2013, and the results were monitored over the following two years.
 

The results were accurate most of the time: forecasts that a suspect was low risk were correct 98% of the time, while forecasts that a suspect was high risk were correct 88% of the time. During the trial period, Hart's accuracy was monitored, but the tool did not influence custody sergeants' decisions, according to Sheena Urwin, head of criminal justice at Durham Constabulary.
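The figures quoted are per-band forecast accuracies on a held-out period. The sketch below shows one way such per-band accuracy could be computed; the pandas workflow and column names are assumptions for illustration, not anything published about Hart.

```python
# Illustrative sketch: checking per-band forecast accuracy on a held-out period
# (e.g. train on 2008-2012 custody events, test on 2013). Data here is made up.
import pandas as pd

results = pd.DataFrame({
    "predicted_band": ["low", "low", "high", "high", "medium"],
    "observed_band":  ["low", "low", "high", "medium", "medium"],
})

# Share of forecasts in each predicted band that matched the observed outcome
per_band_accuracy = (
    results.assign(correct=results.predicted_band == results.observed_band)
           .groupby("predicted_band")["correct"]
           .mean()
)
print(per_band_accuracy)  # the article reports ~98% for low-risk and ~88% for high-risk forecasts
```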

“I imagine in the next two to three months we’ll probably make it a live tool to support officers’ decision-making,” she told the BBC.

 

She also explained that suspects with no offending history would be less likely to be classed as high risk by Hart, although being arrested on suspicion of a very serious crime such as murder, for example, would have an “impact” on the output.
 

It is reported that Prof Lawrence Sherman, director of the University of Cambridge Centre for Evidence-Based Policing, was involved in the tool’s development.
 

 

Some have expressed concerns over algorithms’ potential to bias decision-making in certain contexts.
 

Hart currently has limitations: it is based solely on offending data from Durham Constabulary and does not have access to information held on the police national computer, said Ms Urwin. Helen Ryan, head of law at the University of Winchester, called this a problem, adding: “Even without this system, [access to sufficient data is] a problem for the police.”

However, she thinks the system is interesting in principle and that it has the potential to be hugely beneficial following extensive piloting. “I think it is actually a very positive development,” she added. “I think, potentially, machines can be far more accurate, given the right data, than humans.”





"Can anyone tell me what MobileMe is supposed to do?... So why the f*** doesn't it do that?" -- Steve Jobs



Most Popular ArticlesThe Best 4K Monitors
July 15, 2017, 6:30 AM
HP 280 G2 MT Desktop PC
July 16, 2017, 6:47 AM
Dell XPS 27 – Large Screen PC AlO with High-End Performance
July 10, 2017, 7:22 AM
iPhone 8 – OLED Screen & 3D Laser Tech
July 14, 2017, 7:45 AM
Comparison: Rock64 vs Raspberry Pi 3
July 11, 2017, 6:53 AM

Latest Blog Posts






botimage
Copyright 2017 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki