NetClean is a social business providing solutions for detecting child sexual abuse material and safeguarding against this crime in the workplace. One of our technologies reacts when it detects the digital fingerprint of an image or video that law enforcement has classified as child sexual abuse material.
Another technology enables internet service providers (ISPs) to block URLs known to host child sexual abuse material, thereby limiting its spread and stopping the cycle of re-victimization.
NetClean develops its technology in conjunction with police authorities to address key legal issues and remain at the forefront of developing efficient technologies.
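The fingerprint detection described above amounts to hash-set matching: a file's digest is compared against a list of digests supplied by law enforcement, without inspecting or storing the content itself. Below is a minimal sketch in Python using a plain SHA-256 digest; real systems may use more robust fingerprints that survive re-encoding, and the hash value and names here are illustrative placeholders, not NetClean's actual implementation.

```python
import hashlib

# Placeholder hash set: in practice these digests would be supplied by
# law enforcement, who classify the underlying material. The value below
# is simply the SHA-256 of the bytes b"test", used for illustration.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 digest of a file's bytes as a hex string."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """Check a file's fingerprint against the classified-hash set.

    Only the digest is compared; nothing about the user's activity is
    monitored, which is the point made in the privacy discussion below.
    """
    return fingerprint(data) in KNOWN_HASHES
```

Because only digests are matched, the software stays silent for all other content, which is why the approach is framed as detection of known material rather than surveillance.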
In a report published in early 2017, INTERPOL identified 10 000 victims of child sexual abuse (CSA) content globally, referring to this figure as the ‘tip of the iceberg’. With such an enormous task ahead of you, what methods are you employing to ensure that NetClean is as effective as it needs to be?
We’re ensuring that our technology is as efficient as it can be, lobbying hard to get the word out about how important it is that all businesses and organizations with computers and networks take steps to safeguard them, and engaging with the debate in general.
Specifically, we need organizations to know that when software on their systems detects a ‘known’ image, it often leads to the discovery of previously unknown material, which, in turn, can lead to the discovery of abused children.
In short, we ensure that by following the trail of a detected image, new material can be found, offenders can be prosecuted and children can be rescued.
Do you know what percentage of children are trafficked for sexual exploitation or for the production of CSA material?
This is an interesting question, and one that I do not have the answer to. To begin with, it is important to try to establish what we mean when we talk about exploitation. There is human trafficking, which includes children who are used for sexual means; travelling offenders who engage in child sex tourism; and online child sexual abuse, which includes CSA material. Even though there is definitely a connection between CSA material and trafficking, I would say that CSA material probably mostly falls outside of trafficking. Therefore, it’s not always helpful to conflate these crimes. What we do know from the NetClean Report 2016 is that most CSA material in circulation depicts children from North America and Europe. Whether this is true for children being trafficked for the production of the material, I don’t know. But it is definitely something that could benefit from further research.
A current flashpoint in the world of new media is the online privacy debate. How do you balance the right to privacy with software that essentially monitors what content is being accessed from a given computer or network? Have you found organizations cooperative in this regard?
I think the question is, whose privacy are we talking about? What about the privacy of the children depicted in the images? That said, we don’t view privacy as a big problem, because our software doesn’t monitor what content is being accessed and doesn’t scan what the user is doing: it reacts only when material that law enforcement has classified and hashed as child sexual abuse content is being handled. If that happens, privacy is a secondary issue. In addition, organizations generally understand this issue better now than they did five or six years ago, and they often have their own policies and codes of conduct that set out the ethos of the organization, what IT equipment can be used for, and so on. They understand the importance of keeping computers and networks safe, and are starting to realize that a system that detects online child sexual abuse material is a hygiene factor, just as important as antivirus software. Today, Agenda 2030 is also helping to highlight the fact that fighting online child sexual abuse is an important area of corporate social responsibility.
Do you feel that the solution to mitigating and eventually eradicating the problem of CSA content distribution lies in working with corporate partners and developing preventative software, or with law enforcement and developing corrective software?
I don’t think that working with one over the other is better. The key to fighting this crime is cooperation between all stakeholders, locally and internationally: industry, law-enforcement agencies, businesses, healthcare providers, schools – every sector with a stake in this issue – needs to be aware of the problem and have a plan to fight it. I would say, however, that businesses have a big part to play in solving this problem and working preventatively. They may have to come to terms with fighting a problem they don’t know they have, but once they are aware of it and act on it responsibly, they will not only protect their current workforce and their families, but also safeguard the workforce of tomorrow. Cooperation between the police and industry is also key, because a multi-pronged, connected way of fighting online child sexual abuse is needed.
What challenges do ISPs face in seeking to mitigate the risk of CSA material?
The challenge for ISPs is to act quickly. If they provide hosting services, they need to work internationally and have a notice-and-takedown procedure with the police, at both a regional and a global level. ISPs must also diligently block access to known sites and work hard to detect and block new ones as soon as possible.
Can you describe one example of best practice that you feel has been helpful in tackling the risk of child exploitation?
Employers mirror society in general, so there is a strong likelihood that somebody in an organization is viewing or downloading child sexual abuse material. It is important, therefore, that the organization takes its responsibility as an employer seriously and acts accordingly.
A good example of business best practice happened quite recently. A company got an alarm from the software it had installed, and its IT security department found two incriminating images, which were reported to the police. The police investigated and searched the perpetrator’s home, where they found several devices with previously unknown material. They also found that the perpetrator had been assaulting his stepchildren. The images on his devices had been shared across the world, and because the police had the devices, they were able not only to arrest the man who had originally assaulted the children and shared the material, but also to work internationally and arrest others further down the supply chain. So, by installing software and reporting the crime to the police, children were protected and the perpetrators prosecuted.