Source: https://www.bloomberg.com
by Christie Smythe
Last year a 13-year-old Illinois boy learned an old, painful lesson about online relationships. After swapping messages with someone who claimed to be a 19-year-old woman in search of nude photos, he eventually sent an indecent video of himself. When he had second thoughts and tried to break off contact, “she” threatened to post the video publicly. He told his parents instead, and they called the police. The culprit turned out to be a 24-year-old man in California who had used similar “sextortion” tactics to exploit five other boys.
While pedophiles have been taking advantage of kids online for about as long as there’s been an internet, law enforcement officials say tactics employed by the 24-year-old Californian are becoming more common in at least one way: He found the 13-year-old on Snapchat, which he told police had become his preferred venue for obtaining child porn.
Across the U.S. there are at least six similar criminal cases against pedophiles who used Snapchat to exploit teens for sexual gratification or, in one case, against a mother who exploited her own children for money. Although all social networks struggle to varying degrees with this issue, the incidents have led investigators to single out Snapchat as a danger because, they say, the service is especially popular among young people, and its disappearing messages make evidence tougher to find.
“Predators know that they can contact kids on Snapchat,” says Melissa Travis-Neal, an investigator with the Oklahoma attorney general’s office. “They know this is the popular app that kids are on. They both use it to contact kids and they use it to send and receive child pornography, and they think they’re not going to be tracked.” An investigator with the Idaho attorney general’s office, who asked not to be named because of the sensitive nature of his work, says his team cringes when they hear Snapchat is part of a case, because of the added difficulty of catching suspects.
Snapchat parent Snap Inc. says it uses both staffers and automated systems to protect children—and all 178 million users—from unwanted messages, but it wouldn’t provide details. The company, which is just beginning to shake Snapchat’s early image as a sexting app, says it’s also simplifying its forms for reporting abuse. “We will continue to develop new tools and practices to help keep the community safe,” says spokesman Noah Edwardsen. While such incidents are rare, he says, they’re “no less evil and abhorrent than when they happen anywhere else,” and the company works closely with law enforcement to address them.
Social networks such as Facebook and Instagram accounted for 54 percent of the 1,631 such cases surveyed last year by the University of New Hampshire’s Crimes Against Children Research Center, but more-private messaging apps such as Snapchat and Kik were the next-likeliest avenue for contact, at 41 percent. “All of the digital technologies can be abused by the bad guys,” says Parry Aftab, a cybersecurity lawyer who advises companies and government agencies. “Snapchat tends to be used by single traffickers, small neighborhood groups of child molesters. The creeps.”
In April a tipster led law enforcement to a Snapchat account distributing child porn. They traced the IP address to the Guthrie, Okla., home of a 36-year-old man, where they discovered hundreds of explicit images of children on his electronic devices. His case is pending. In another case, a 12-year-old New Hampshire girl told investigators she’d used Snapchat and Kik to send numerous explicit photos of herself and her 10-year-old sister to a 33-year-old man in Fort Wayne, Ind., at his request.
In August sheriff’s deputies in Maricopa County, Ariz., charged a 28-year-old mother with multiple child sex crimes for allegedly using Snapchat to broadcast herself molesting her 3-year-old son and 6-year-old daughter in exchange for a viewer’s money. Other cases involving pedophiles using Snapchat have been filed in Idaho, Missouri, and Wisconsin.
Facebook, while hardly free of child porn or solicitors, has stronger channels for reporting abuse, says Aftab, the cybersecurity lawyer, who advised the company on creating some of those protocols in 2005. Facebook Inc. says it uses artificial intelligence software to flag worrisome material based on words and phrases associated with child exploitation and works with law enforcement and its own professional investigators to track down people searching for such terms.
Photo and video evidence often disappears from Snapchat before police can find it, says Adam Wandt, a professor of public policy at the John Jay College of Criminal Justice in New York, who advises a program on digital forensics. Snap says that, at investigators' request, it will preserve files sent to and from particular accounts. Kik Interactive Inc., which offers users messaging that's as anonymous as an old-school AOL chat room, says it also assists law enforcement as needed. Company spokesman Rod McLeod says Kik is expanding its safety team and investing in software that can help better identify and catalog troubling material. —With Sarah Frier