Privacy’s gone when posting child abuse images to a P2P network, US judge rules
A US court has turned the tables on child abusers who use technology to share images of the abuse.
Specifically, a federal district judge in the US state of Vermont on Friday ruled that putting data up on a peer-to-peer (P2P) file-sharing network means you’ve made it publicly available and can’t then turn around and claim it was private.
The case involves three men charged with possessing child pornography who had filed a motion to suppress the evidence collected from their computer systems, saying that the files were private and the searches violated their Fourth Amendment rights against unreasonable search.
As Computerworld’s Jaikumar Vijayan reports, District Court Judge Christina Reiss wrote in a decision released on Friday that the defendants had essentially given up privacy claims by making the data publicly available on the internet over a P2P network.
The three defendants – Derek Thomas, Douglas Neale and Stephan Leikert – had earlier this year asked that the evidence be suppressed, claiming it had been obtained illegally.
The men contended that law enforcement’s use of the automated P2P search tool that collected information on private files held on their computers constituted a warrantless search.
Police used information about the files to obtain probable cause warrants. The defendants were later charged with possession of child pornography.
To collect the information, investigators used a software suite known as the Child Protection System that automatically searches P2P networks for query terms commonly used with child abuse content.
The police didn’t need to open or download the files themselves.
As Vijayan explains it, if a query-hit message indicated that it had found a file matching the query term, the application recorded the IP address, the files’ hash values, the actual file names, date and time of response, and other computer details.
The hit message identified the files on a particular computer that matched the query terms and were available for download by other users on the same P2P network.
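The kind of record the article describes a query-hit yielding can be sketched as a simple data structure. This is purely illustrative; the field names and values below are assumptions, not the Child Protection System’s actual schema.

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative record of the details the article says a query-hit
# response yields: IP address, file name, hash value, and timestamp.
@dataclass
class QueryHit:
    ip_address: str
    file_name: str
    file_hash: str        # e.g. a SHA-1 value advertised on the network
    responded_at: datetime

# Hypothetical example values (documentation-range IP, placeholder hash).
hit = QueryHit(
    ip_address="203.0.113.7",
    file_name="example.avi",
    file_hash="da39a3ee5e6b4b0d3255bfef95601890afd80709",
    responded_at=datetime(2013, 11, 8, 14, 30),
)
```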
The searches found that the three defendants’ computers contained files whose hash values exactly matched those of files known to contain images depicting child abuse.
When rejecting the defendants’ motion to suppress evidence collected in this manner, Judge Reiss noted that the police’s automated search hadn’t opened or downloaded anything.
All the tool did was to point out files that the defendants themselves had made publicly available for download via a P2P network.
"The evidence overwhelmingly demonstrates that the only information accessed was made publicly available by the IP address or the software it was using. Accordingly, either intentionally or inadvertently, through the use of peer-to-peer file sharing software, Defendants exposed to the public the information they now claim was private."
The court’s finding that privacy can’t be expected when using a P2P network is nothing new; it only reiterates what many other courts have found, as a search on the legal blog FourthAmendment.com clearly shows.
The case in question was originally highlighted on the site, which is kept by John Wesley Hall, a criminal defense lawyer.
When I asked him about this finding, he said that it’s “the same as probably 50 other cases.”
"The only thing that’s surprising to me is that people still raise that issue. It’s a settled issue beyond peradventure as far as I’m concerned."
But while the P2P privacy ruling isn’t ground-breaking, the increasingly sophisticated use of internet technologies to catch child predators is at the very least ground-altering.
As pointed out in a recent University of Massachusetts/Amherst research paper on measuring and analysing child porn on P2P networks, such networks are the most popular mechanism for acquiring and distributing such imagery.
It’s a relief to find that the courts aren’t allowing child predators to hide their P2P tracks behind claims of Fourth Amendment violations.
Likewise, it’s encouraging that researchers are using sophisticated animation technology to build predator-detection tools such as Sweetie: a lifelike virtual character whose convincing live-action motion, seeded into 19 public online chat forums, allowed researchers to identify 1,000 child webcam sex tourists.
Child predators are sophisticated users of technology. It’s enabled them to carry out their abuse to a disheartening degree.
Now, thanks to the use of technologies to ferret them out, and thanks to the courts refusing to let P2P technology be used as a smokescreen, we can hope that the tide is turning.