Google and Microsoft tackle child abuse images with search and YouTube changes

Two search giants, Google and Microsoft, have agreed on measures that should make it harder to find child abuse images on the open internet, while Google has made a groundbreaking move to identify and ferret out videos made by paedophiles on its YouTube service.

YouTube engineers have created new technology to identify videos made by paedophiles, according to Google Executive Chairman Eric Schmidt, whose letter about the changes was posted in the Mail Online on Sunday.

As it is, Schmidt wrote, there’s “no quick technical fix” that enables search engines to detect child sexual abuse imagery, given that computers can’t reliably differentiate between innocent pictures of children at bath time and genuine abuse imagery.

That means Google has to rely on humans to review images. Those that are determined to be illegal are given a unique digital fingerprint.

Armed with those fingerprints, Google can then automatically identify matching copies of the illegal pictures.
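
For illustration only, here is a minimal sketch of that fingerprint-matching step in Python. Google’s production system, like Microsoft’s PhotoDNA, relies on perceptual hashes that survive resizing and re-encoding; the plain cryptographic hash below only matches byte-identical files, and every name in it is hypothetical.

import hashlib
from pathlib import Path

# Hypothetical store of fingerprints for images that human reviewers
# have already confirmed as illegal. A production system would use a
# perceptual hash (such as PhotoDNA) that survives resizing and
# re-encoding; plain SHA-256 only matches byte-identical files.
KNOWN_FINGERPRINTS = set()

def fingerprint(path):
    """Compute a digital fingerprint of an image file."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def is_known_illegal(path):
    """Check a file against the database of confirmed fingerprints."""
    return fingerprint(path) in KNOWN_FINGERPRINTS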

But paedophiles are increasingly filming their crimes, Schmidt said.

To address that source of child abuse imagery, Google is now testing new technology to identify such videos, and hopes to make it available to other internet companies and child safety organisations in the new year, he said.

But the work doesn’t stop at the open internet.

The YouTube announcement came the day before a Downing Street summit at which UK Prime Minister David Cameron was scheduled to announce that British and US law enforcement agencies will jointly target online child abuse by monitoring those who operate on the hidden internet.

A transatlantic taskforce will identify ways of targeting criminals and paedophiles who use secret encrypted networks to distribute abuse imagery.

Google and Microsoft also announced that “up to 100,000 search terms will now return no results that find illegal material”, the BBC reports.

Such searches will also trigger warnings that child abuse images are illegal.

Both companies have introduced new algorithms that will prevent Google Search and Microsoft’s Bing from delivering this type of illegal result.
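
Neither company has published its implementation, but the reported behaviour – a blocklisted query returns no results, plus a warning – can be sketched in a few lines of Python. The term list, warning text and helper names below are all assumptions for illustration, not details from either company.

BLOCKED_TERMS = set()  # up to 100,000 entries, supplied by the blacklist

WARNING = ("Child sexual abuse imagery is illegal. It can be reported "
           "to the Internet Watch Foundation at https://www.iwf.org.uk/")

def run_search(query):
    """Stand-in for the normal search backend."""
    return ["result for %r" % query]

def handle_query(query):
    """Return no results, plus a warning, for blocklisted queries."""
    normalised = " ".join(query.lower().split())
    if normalised in BLOCKED_TERMS:
        return {"results": [], "warning": WARNING}
    return {"results": run_search(normalised), "warning": None}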

According to The Guardian, Google’s Schmidt announced that a team of 200 had worked to clean Google Search of search terms that can lead to child sexual abuse images.

The restrictions will launch first in the UK, then be extended to other English-speaking countries and to 158 other languages over the next six months.

Google is also displaying warnings at the top of search results for 13,000 queries.

UK Prime Minister David Cameron welcomed the move.

In a speech in July, the PM had announced new measures to protect children and challenged outfits such as Google, Yahoo and Microsoft to do their part, not least by adopting a blacklist of “abhorrent” search queries that leave no doubt that a searcher’s intent is malevolent.

Google communications director Peter Barron said that the changes would make it “much, much more difficult to find this content online.”

More of what Barron said, from the BBC’s coverage:

We’re agreed that child sexual imagery is a case apart, it’s illegal everywhere in the world, there’s a consensus on that. It’s absolutely right that we identify this stuff, we remove it and we report it to the authorities.

Unfortunately, Google’s and Microsoft’s efforts to strip results from child abuse-related search terms, while well-meaning, may ultimately amount to, at best, a waste of time, effort and money and, at worst, censorship.

It is not on the open internet that paedophiles search for, and find, the images they’re after. Rather, it is on the so-called dark or hidden web where the trafficking in such images mostly occurs.

As a recent University of Massachusetts Amherst research paper on measuring and analysing child pornography trafficking on P2P networks points out, such networks are the most popular mechanism for acquiring and distributing this imagery.

It is there that such images are exchanged, largely in encrypted form.

A recent report from the Child Exploitation and Online Protection Centre (CEOP) in the UK backed this up:

The commercial distribution of IIOC [indecent images of children] on the open internet is estimated to account for a very small percentage of the transactions taking place. This low level is likely to be a result of the large volume of IIOC in free circulation, particularly over P2P, and widespread awareness of the traceability of conventional payment methods.

The tendency of paedophiles to use the dark web is increasing, according to the CEOP:

The use of the hidden internet by IIOC offenders remained a key threat during 2012 with the number of UK daily users connecting to it increasing by two-thirds during the year. This represents one of the largest annual increases globally, in a non-oppressive regime.

Technologies designed to scour the dark web for active paedophiles are likely to yield far better results than anything Google and Microsoft are doing with search on the open internet.

One such technology is the automated searching of P2P networks for query terms commonly associated with child abuse content.
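
The UMass tool itself isn’t public, but the general technique – watching the query strings that P2P clients broadcast and flagging peers whose queries match a watchlist – can be sketched as follows. The watchlist contents, data feed and function names here are assumptions, not details from the paper.

from collections import defaultdict

WATCHLIST = set()  # terms supplied by investigators; deliberately empty here

def flag_peers(observations):
    """Group watchlist-matching queries by the peer that issued them.

    `observations` is an iterable of (peer_address, query_string) pairs,
    as might be logged by a node listening on a P2P network.
    """
    flagged = defaultdict(list)
    for peer, query in observations:
        lowered = query.lower()
        if any(term in lowered for term in WATCHLIST):
            flagged[peer].append(query)
    return dict(flagged)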

This type of tool was used to collect evidence against three US defendants, who tried to get it dismissed, arguing that the automated computer search amounted to a warrantless search and thereby violated their Fourth Amendment rights against unreasonable searches.

A US federal court rejected that claim last week, saying that once the alleged paedophiles had posted abuse images to a P2P network, they surrendered their rights to claim those images were private files.

It is here, on the dark web, that technological advances and court decisions such as last week’s stand the best chance of battling child abuse.

Paedophiles live in the dark web. That’s where the battle must be waged.

Image of camera lens courtesy of Shutterstock.
