Emily Kennedy spent her undergrad years reading child sex-trafficking ads.
She wanted to understand their tics: Why was this ad formatted that way? Why did the same ads often have different phone numbers? Kennedy knew that this kind of analysis could unravel at least a portion of the sex-trafficking business. And after she graduated from Carnegie Mellon University, she built a system to do just that.
Traffic Jam, which was developed by Kennedy’s company, Marinus Analytics, has for years detected patterns in sex-trafficking ads and used them to help police find trafficked children and arrest traffickers. The system took a big step up on Tuesday, though.
That’s when the company announced FaceSearch, which allows the police to match a photo of a child’s face to sex-trafficking ads on the internet. The initial photo can come from Facebook or other social media, from a “missing child” ad, or anywhere else. FaceSearch can then scan online photos and “quickly determine whether this potential victim has been advertised online for commercial sex,” according to a company release online.
“Anything they can upload from their computer can be searched,” Kennedy said.
The technology is now available to any law enforcement agency that wants it. Marinus built FaceSearch into Traffic Jam, so law enforcement clients who already use Traffic Jam, including agencies in California, Wisconsin, and New Jersey, have access to it now.
Facial recognition technology and law enforcement are becoming more closely linked with each passing week. Around 25% of police departments in the United States have access to facial recognition technology. Customs and Border Protection is using it at select U.S. airports to determine whether your face might one day serve as your boarding pass. And police in Berlin are using it at train stations to "recognize and report detected users or persons from whom a danger could arise or emerge." This growing use of facial recognition has prompted a growing body of privacy concerns, but even researchers who have raised those worries see FaceSearch as something different.
“We definitely can’t control the way every single user uses our software, but I’m not too worried about misuse, because it’s so focused,” Kennedy said.
Other companies, she said, provide law enforcement agencies with much broader access to facial recognition technology, and those companies will have to confront the ethical questions therein.