While search engines might be the first place most of us go when we're looking for pictures of anything, child abuse isn't something you'd expect any search engine to offer up. Apparently they can lead to it, however, which is why both Google and Microsoft have pledged to block results for as many as 100,000 different search terms they deem related to child pornography.
Instead of seeing links to websites when searching for these terms, users will see messages such as “Warning! Child abuse material is illegal,” along with a link to help and advice on the matter. The algorithm adjustments that block these results will also be applied across 150 different languages, “so the impact will be truly global,” according to Google chairman Eric Schmidt.
However, no mention was made in the Daily Mail coverage of what any of the 100,000-odd phrases might be. Only 13,000 of them will show the searcher a warning that child abuse is illegal, suggesting that the other 87,000 could well be more ambiguously linked. In that case, this move could have a bigger impact than simply deterring those seeking child abuse material; it could potentially censor websites that have nothing to do with the distribution of underage pornography.
Similarly, critics of this move have suggested that it will have little impact on people's ability to find child abuse images, since the vast majority of them are trafficked on Tor-accessible networks and through hidden groups whose members contact each other directly. With that in mind, it seems unlikely that anyone with more than a passing or burgeoning interest in seeking out these images would do so through a publicly accessible platform like Google or Bing.
These thoughts were echoed by Jim Gamble, former head of the Child Exploitation and Online Protection Centre, who said (via BBC) that these measures wouldn't make any difference to how easily people could access illegal child abuse images.
“They don't go on to Google to search for images. They go on to the dark corners of the internet on peer-to-peer websites,” he said, suggesting instead that the government spend £1.5 million on hiring specialised child protection experts to track down online predators and prosecute them.
Certain allegations in recent months have made child abuse an even more sensitive topic than usual.
However, if the comment section of the Mail's story is anything to go by, it seems these companies feel compelled to do something to placate the less technically savvy, who believe that this will make an impact. The same attitude can be seen in the way the current government deals with pornography: David Cameron announced last year that he believed every household should have a compulsory, opt-out filtering system, rather than letting parents decide for themselves how to control their household's access to porn.
In this instance though, at least the companies involved in this filtering are doing a little more than spouting rhetoric and answering the Government's call for action with measures they surely know will have little to no effect. They're also using their technological know-how to track down servers and networks that host and distribute child pornography. Both Microsoft and Google have developed photo and video labelling software that assigns a unique key to each piece of content, allowing the companies to track that media as it reappears on the deep web and, sometimes, the surface web.
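To illustrate the principle behind that kind of labelling, here is a minimal sketch of content fingerprinting using an ordinary cryptographic hash. This is not Microsoft's or Google's actual system, which relies on more robust perceptual matching designed to survive resizing and re-encoding; the file handling and the known_hashes set below are purely hypothetical.

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Return a unique key (SHA-256 digest) for the file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        # Read in chunks so large video files don't need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical database of keys already labelled by investigators.
known_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_flagged(path: Path) -> bool:
    """Check whether a newly seen file matches previously labelled content."""
    return fingerprint(path) in known_hashes
```

The exact-hash approach shown here only matches byte-identical copies, which is why the real systems lean on perceptual fingerprints instead, but the tracking workflow is the same: label once, then compare every new upload against the database of keys.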
While this technology has been shared with the government's Child Exploitation and Online Protection Centre (CEOP), that organisation has faced severe criticism in recent years. In 2012, when Canadian police shared with CEOP the names of hundreds of British people who had bought child pornography from a Canadian company, no arrests were made in the UK, despite large numbers of similarly implicated Canadian citizens facing jail time.
KitGuru Says: The government and Mail commenters may have their hearts in the right place, but I can't see this filtering doing much good. People aren't looking for child porn on Google; they're looking where it can't be tracked. If anything, this will harm sites that have a legitimate reason for being associated with those terms. Perhaps charities or counsellors that cater to those who have suffered at the hands of child abusers?