Microsoft is building an app that can predict criminal behaviour


[Image: Inmates sit in a classroom at the Orange County jail in Santa Ana, California, May 24, 2011. REUTERS/Lucy Nicholson]

Microsoft is developing a programme that can predict whether criminals will re-offend within six months, according to a video discovered by Fusion.


The video, which is unlisted and has just 740 views, is a seminar from Jeff King, a senior programme manager at Microsoft. King talks about the company's efforts to create a computer programme that predicts human behaviour.

"It's all about software that has the ability to predict the future," said King.


To make its predictions, the software takes data on past behaviour - such as gang affiliation, whether the inmate attended rehab, their conduct in jail, and more - and predicts, with 91% accuracy, whether they will return to jail within six months.

The software is just a proof-of-concept at this stage, according to Microsoft. The fake data being fed into it was compiled to test whether the machine can accurately predict the future.
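Microsoft has not published how its model works, but the kind of risk scoring described above can be sketched in a few lines. The feature names, weights, and decision threshold below are purely illustrative assumptions, not anything from Microsoft's system:

```python
# Hypothetical sketch of a recidivism risk-scoring model.
# All feature names, weights, and the threshold are illustrative
# assumptions; Microsoft's actual system has not been published.

def recidivism_risk(record):
    """Return a risk score in [0, 1] from a dict of inmate features."""
    # Illustrative weights; a real system would learn these from data.
    weights = {
        "gang_affiliation": 0.35,    # 1 if affiliated, else 0
        "skipped_rehab": 0.25,       # 1 if rehab was not attended
        "infractions_in_jail": 0.10, # per recorded infraction
        "prior_convictions": 0.05,   # per prior conviction
    }
    score = sum(
        weight * record.get(feature, 0)
        for feature, weight in weights.items()
    )
    # Cap the score so it stays interpretable as a probability-like value.
    return min(score, 1.0)

def predict_reoffend(record, threshold=0.5):
    """Flag an inmate as likely to re-offend if the score crosses the threshold."""
    return recidivism_risk(record) >= threshold
```

For example, a record with `{"gang_affiliation": 1, "skipped_rehab": 1}` scores 0.6 and would be flagged, while `{"prior_convictions": 2}` scores 0.1 and would not. The hard threshold is exactly the policy question the critics below raise: the model outputs a probability, and someone must decide what action, if any, a given score justifies.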


Microsoft has invested heavily in developing predictive technology. Cortana, the virtual assistant in Windows 10, has been able to accurately predict the outcome of various NFL games based on previous data about the players and the conditions of each game.

[Image: Microsoft's Cortana has been able to accurately predict the outcome of NFL games. Otto Greule Jr/Getty Images]

The company is also working on figuring out how a user will react to a new Xbox game based on how they reacted to other games.

One of the issues raised with the software is what, exactly, a law enforcement agency would do with the information. As no crime has been committed, an arrest cannot be made, and at 91% accuracy the software would still be wrong about roughly one in ten inmates.

"My fear is that these programs are creating an environment where police can show up at anyone's door at any time for any reason," Hanni Fakhoury, staff attorney at the Electronic Frontier Foundation, told Fusion.

"We have to think very carefully about what the roles for this technology are," said Jay Stanley, a senior policy analyst at the American Civil Liberties Union. "The data that goes into these algorithms needs to be open and transparent, and the outcome of what gets done with it needs to be closely examined."

