
Police trial AI software to help process mobile phone evidence



Artificial intelligence software capable of interpreting images, matching faces and analysing patterns of communication is being piloted by UK police forces to speed up examination of mobile phones seized in crime investigations.

Cellebrite, the Israeli-founded and now Japanese-owned company behind some of the software, claims a wider rollout would solve problems over failures to disclose crucial digital evidence that have led to the collapse of a series of rape trials and other prosecutions in the past year. However, the move by police has prompted concerns over privacy and the potential for software to introduce bias into processing of criminal evidence.

As police and lawyers struggle to cope with the exponential rise in data volumes generated by phones and laptops in even routine crime cases, the hunt is on for a technological solution to handle increasingly unmanageable workloads. Some forces are understood to have backlogs of up to six months for examining downloaded mobile phone contents.

The use of AI and machine learning is slowly spreading into police work, though it remains controversial in areas such as predictive policing. Durham police have been experimenting with AI to assess the suitability of suspects for release on bail.

Earlier this year the chair of the National Police Chiefs’ Council, Sara Thornton, said her organisation was working with the Crown Prosecution Service on disclosure problems and could explore machine learning and AI solutions.

Cellebrite says it has been working with a dozen UK forces, including the Metropolitan police, trialling sophisticated software to help process digital evidence taken from mobile phones and computers. The company cannot name the other forces, it says, due to commercial nondisclosure agreements. The Met confirmed it has been exploring AI developments with Cellebrite.

Until now, extracted data has been routinely provided in the form of PDF documents sometimes running to tens of thousands of pages. By contrast, the latest system sold by Cellebrite, called Analytics Enterprise, is claimed to let officers carry out sophisticated filtering, visualise social networks and combine geotagged data from multiple phones to highlight when people were in the same place at the same time.
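As a rough illustration of the co-location idea, not of Cellebrite's actual implementation, the sketch below cross-references timestamped, geotagged records (for example from photo EXIF data) across devices. All names, thresholds and example values are illustrative:

```python
from datetime import datetime, timedelta
from itertools import combinations
from math import radians, sin, cos, asin, sqrt

# Illustrative record shape: (device_id, timestamp, latitude, longitude),
# e.g. as extracted from photo EXIF data or app location logs.
Record = tuple[str, datetime, float, float]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6371000 * 2 * asin(sqrt(a))

def co_locations(records: list[Record], max_metres=100, max_gap=timedelta(minutes=10)):
    """Yield pairs of records from *different* devices that fall within
    max_metres and max_gap of each other."""
    for a, b in combinations(records, 2):
        if a[0] == b[0]:
            continue  # skip records from the same device
        if abs(a[1] - b[1]) <= max_gap and haversine_m(a[2], a[3], b[2], b[3]) <= max_metres:
            yield a, b

# Hypothetical usage with two extracted records:
recs = [("phone_A", datetime(2018, 5, 1, 14, 0), 51.5007, -0.1246),
        ("phone_B", datetime(2018, 5, 1, 14, 5), 51.5010, -0.1240)]
for a, b in co_locations(recs):
    print(a[0], "and", b[0], "were within 100 m of each other around", a[1])
```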

Built-in AI algorithms enable images and videos to be tagged according to whether the content includes weapons, faces, cars, nudity, drugs, flags and other categories. The system can also extract text from screenshots.
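Extracting text from screenshots is essentially optical character recognition (OCR). As a rough illustration of the idea, not of Cellebrite's proprietary pipeline, the sketch below uses the open-source pytesseract library; the file paths and search term are hypothetical:

```python
from PIL import Image
import pytesseract  # requires the Tesseract OCR engine to be installed

def extract_text(path: str) -> str:
    """Run OCR over a screenshot and return any recognised text."""
    return pytesseract.image_to_string(Image.open(path))

# Hypothetical use: flag screenshots whose text mentions a search term.
for shot in ["extract/screenshot_001.png", "extract/screenshot_002.png"]:
    text = extract_text(shot)
    if "meet" in text.lower():
        print(shot, "->", text[:80])
```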

“Police officers are under mountains of cases they have to investigate and they don’t have the time or knowledge to go through everything,” said David Golding of Cellebrite. “If you present it in a more intelligible way, it will be much easier.”

While AI software may present material in a more intelligible and accessible form, it is unlikely, lawyers say, to provide a simple fix to disclosure problems. Police and prosecutors stress that they cannot hand over all downloaded data in an investigation to defendants’ lawyers but must screen evidence for relevance, confidentiality and a host of other legal issues.

Nick Baker, deputy chief constable of Staffordshire police and lead officer on digital forensics for the National Police Chiefs’ Council, confirmed that a number of forces, including his own, are working on AI systems.

“AI is the next bit we are exploring,” he said. “It’s early days in terms of its application. This is an area the police need to look at sensitively and proportionately.”

Simply hiring more officers for a “manual solution” to the vast quantities of digital data being produced is “not feasible”, Baker said. “AI is not a panacea but it’s part of the solution,” he said.

“It will have to be used in a set of standards that gives reassurance to the courts so that the reality of what the machine is doing is understood. There are many problems with disclosure but speed of processing data is one of them and AI will certainly help with that.”

Baker said he appreciated there were concerns about bias, reliability and privacy but that there would ultimately always be human control of AI investigative systems – which he expects to become “part of mainstream policing”.

He did not know whether AI has yet been used on active cases: “The trial phase is where it’s sat next to more traditional processes to ascertain its reliability. This is not just some magic solution where police sit back and let a robot sort it out.”

Millie Graham Wood, a solicitor at the campaign group Privacy International, said: “The use of AI and machine learning is hugely controversial. It’s so opaque. What implication does this have for people whose names come up in these communications? It will be like the gangs matrix used by police. There are huge issues with the databases the police hold already.”

Corey Stoughton, advocacy director at Liberty, said: “Once again police forces appear to be secretly adopting radical new technology that threatens our privacy and digital security, without any democratic oversight or debate.

“Powerful tools like this could mean that rape victims are doubly victimised by unnecessary incursions into their privacy, or that bias is built into decisions about what is relevant and what is not. The home secretary must stop allowing police forces to ‘trial’ potentially harmful technologies without first allowing parliament and the public a say.”

A spokesperson for the Metropolitan police said: “Over the past two years we have been proactively exploring more advanced artificial intelligence systems that will enhance our investigative quality and help us to better protect the most vulnerable in society while bringing offenders to justice more swiftly.

“At one stage, this did include liaising with Cellebrite for approximately six months last year – however, this was not contracted but as part of continuous research into business improvement and development. We are currently assessing the scope for another trial within this field.”


The Home Office declined to comment.

How Analytics Enterprise works

Cellebrite’s system features face-recognition software, meaning the police could feed in the photograph of a person of interest to see whether they crop up in mobile phone pictures. An entire database of images could also be fed through the software, according to Cellebrite.
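Face matching of this sort typically reduces each detected face to a numerical embedding and compares distances between embeddings. The sketch below, using the open-source face_recognition library and hypothetical file paths, illustrates only the general approach; Cellebrite's matcher is proprietary:

```python
import face_recognition  # open-source wrapper around dlib's face embeddings

# Embedding of the person of interest (photo path is hypothetical).
# Assumes the reference photo contains at least one detectable face.
ref_image = face_recognition.load_image_file("person_of_interest.jpg")
ref_encoding = face_recognition.face_encodings(ref_image)[0]

def photos_containing(ref_encoding, photo_paths, tolerance=0.6):
    """Return paths of photos with a face within `tolerance`
    (Euclidean distance on 128-d embeddings) of the reference face."""
    hits = []
    for path in photo_paths:
        image = face_recognition.load_image_file(path)
        for encoding in face_recognition.face_encodings(image):
            if face_recognition.face_distance([ref_encoding], encoding)[0] <= tolerance:
                hits.append(path)
                break
    return hits
```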

The software is understood to have been trained, using machine learning, to recognise images of child exploitation. Such a facility, the firm says, reduces the number of abuse images police officers must view themselves, since visual comparisons are automated. “[Police] have to see these images which are unbelievably horrible,” said Golding.

“Using a system like this, they don’t have to do that. If you have pictures of a room where a victim was and there was a poster, we can look for that poster on all the other pictures. It can save months of trying to go through this.”
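Automated visual comparison against known material is commonly built on perceptual hashing, which produces similar fingerprints for visually similar images so that a match survives resizing or re-encoding. Whether Cellebrite uses hashing, a learned classifier or both is not stated; the sketch below, using the open-source imagehash library and hypothetical file paths, illustrates only the general technique:

```python
from PIL import Image
import imagehash  # perceptual hashing library

# Hashes of known reference images (paths are hypothetical).
known_hashes = [imagehash.phash(Image.open(p))
                for p in ["known/img1.png", "known/img2.png"]]

def matches_known(path: str, max_distance: int = 6) -> bool:
    """True if the image's perceptual hash is within `max_distance` bits
    (Hamming distance) of any known hash."""
    h = imagehash.phash(Image.open(path))
    return any(h - k <= max_distance for k in known_hashes)
```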

The system, the firm claims, is a powerful tool for exposing links and patterns in gang-related crimes. It is already being used by some US police forces.

Cellebrite already supplies self-service kiosks to UK police forces that officers use to download the contents of mobile phones, laptops and other devices so they can pursue investigations.

The company also provides training and support to police high-tech crimes units and runs a service unlocking encrypted mobile phones for law enforcement agencies. Its warehouse in Israel reportedly contains 23,000 types of mobiles. Until January, when it opened a new unit in London, mobiles whose security was difficult to crack were sent to Cellebrite’s offices in Munich.

“The main issue is that there’s so much data and the police are under enormous pressure,” Golding said. “Manpower has been cut and, because of 28-day bail [limits], they need to solve cases much quicker.”

These types of technological capabilities have prompted criticism of the police's retention of custody images, regardless of whether individuals are subsequently charged or convicted.

A report published last week by the Commons science and technology committee found that in 2016 the Police National Database contained 19m facial images, 16.6m of which were searchable using facial recognition software.

“What’s of real concern is that these things are happening on the ground without any real debate about how much this infringes on our individual liberties,” said Norman Lamb, a Liberal Democrat MP and chair of the committee.

There are also concerns about the accuracy of such technology, with a facial recognition trial at the 2017 Notting Hill carnival incorrectly matching people 98% of the time.

There is also evidence of racial bias in some existing image recognition systems, which have been shown to correctly match white faces more frequently than black faces.

