On the heels of reports indicating that the distribution of online child sexual abuse material (CSAM) is reaching alarming proportions – with the number of cases in Canada continuing to double year over year, according to the RCMP – international eyes are focused on a group of Canadian researchers working to stop it.
Student researchers from the University of Manitoba are working with Two Hat Security Ltd. of Kelowna to develop cutting-edge artificial intelligence software that will be the first in the world to accurately identify previously undetected child sexual abuse images and prevent their distribution.
The student placements are part of a five-year program coordinated by Mitacs, a national government-funded agency working to bridge the gap between academic research and business.
“Of all the issues we’re solving to keep the Internet safe, this is probably the most important,” said Two Hat CEO Chris Priebe, noting that stopping CSAM is a challenge every child exploitation unit faces.
“Everyone would like to solve it, but it’s very challenging to tackle because it’s an extremely complex problem and it’s in the darkest corner of the Internet,” he said.
Whereas existing software tools search the Internet for known images previously reported to authorities as CSAM, Two Hat’s product will accurately scan for images that exploit children as they are uploaded, with the ultimate goal of stopping them from ever being posted – which is why global law enforcement and security agencies are watching closely, said Brad Leitch, Two Hat’s head of product development.
Current research indicates that as many as 22 per cent of teenage girls have sent inappropriate photos of themselves. In addition, statistics compiled by the RCMP show that child sexual abuse cases in Canada doubled in 2015 and again in 2016, highlighting the critical need for a tool to help tackle the issue.
“This is a rampant global problem,” said Sergeant Arnold Guerin of the RCMP. “The ability to successfully detect and categorize newly distributed child sexual abuse materials will be a game-changer in our fight against the online victimization of children.”
The first phase of Two Hat’s new product development involves students from the University of Manitoba who are at the leading edge of computer vision, deep learning and convolutional neural networks, the three main technologies being applied. Their work is particularly challenging because it is a criminal offence to view CSAM, meaning they are training computers to recognize images they themselves will never see.
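The article discloses nothing about Two Hat’s proprietary model, but the operation at the heart of any convolutional neural network – sliding a small filter across an image to produce a feature map – can be sketched in a few lines of plain Python. The toy image and edge-detecting kernel below are purely illustrative and have no connection to the actual system:

```python
def conv2d(image, kernel):
    """Slide a kernel over a 2-D image (lists of lists), summing
    elementwise products at each position (valid padding, stride 1)."""
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            out[i][j] = sum(
                image[i + a][j + b] * kernel[a][b]
                for a in range(kh)
                for b in range(kw)
            )
    return out

# Toy 4x4 "image": bright left half, dark right half.
image = [
    [1.0, 1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
]
# A vertical-edge detector: responds where brightness changes left to right.
edge_kernel = [
    [1.0, -1.0],
    [1.0, -1.0],
]
features = conv2d(image, edge_kernel)
# Each output row is [0.0, 2.0, 0.0]: the filter fires only
# at the bright/dark boundary in the middle of the image.
```

A real CNN learns thousands of such kernels from labelled examples rather than hand-coding them – which is precisely what the researchers cannot do in the usual way here, since the training imagery is illegal to view.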
“It would be impossible to do this without the support of Mitacs,” said Priebe. “By connecting our business with student interns, we’re tapping into researchers at the top of their respective fields who are not afraid to tackle the impossible.”
Over the next five years, the Mitacs interns will work to develop software that identifies sexually abusive images of children with a high degree of accuracy. The end goal is a tool that can be applied proactively to stop people from uploading CSAM and that law enforcement agents can also use to quickly identify and prioritize new cases. For example, the software could warn a teenager that an image they are about to upload from a phone, tablet or computer is inappropriate or illegal.
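The pre-upload warning described above amounts to gating an upload on a classifier’s risk score. The article names no API, so everything in this sketch – the function names, the threshold value, the score itself – is hypothetical:

```python
# Hypothetical threshold above which a warning is shown before upload.
WARN_THRESHOLD = 0.8

def check_upload(risk_score: float) -> str:
    """Decide what to do with a pending upload.

    `risk_score` stands in for the probability a model assigns to an
    image being inappropriate or illegal (a value between 0 and 1).
    """
    if risk_score >= WARN_THRESHOLD:
        # Remind the user of the consequences before the image is posted.
        return "warn"
    return "allow"

print(check_upload(0.95))  # -> warn
print(check_upload(0.10))  # -> allow
```

In a deployed system the score would come from the image model itself, and a warning could be followed by blocking or reporting depending on severity; this sketch shows only the decision point the article describes.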
“Studies have shown that if you can remind adolescents about the consequences of their actions, there’s a high likelihood they won’t do it,” said Priebe.
The second phase of the multi-year partnership with Mitacs will launch later this year and will involve researchers from Simon Fraser University and Laval University working to identify child sexual offenders on the Internet and prevent them from grooming child victims. Priebe emphasized that solving the problem of CSAM and child victimization on the Internet requires collaboration.
“We know that as soon as we can successfully place a CSAM detection system onto a network, people will stop using it for that purpose and we will have won that corner of the Internet,” he said.