For $29.99 a month, a website called PimEyes offers a potentially dangerous superpower from the world of science fiction: The ability to search for a face, finding obscure photos that would otherwise have been as safe as the proverbial needle in the vast digital haystack of the Internet.
A search takes mere seconds. You upload a photo of a face, check a box agreeing to the terms of service and then get a grid of photos of faces deemed similar, with links to where they appear on the internet. The New York Times used PimEyes on the faces of a dozen Times journalists, with their consent, to test its powers.
PimEyes found photos of every person, some that the journalists had never seen before, even when they were wearing sunglasses or a mask, or had turned their face away from the camera, in the image used to conduct the search.
PimEyes found one reporter dancing at an art museum event a decade ago, and crying after being proposed to, a photo that she didn’t particularly like but that the photographer had decided to use to advertise his business on Yelp. A tech reporter’s younger self was spotted in an awkward crush of fans at the Coachella music festival in 2011. A foreign correspondent appeared in countless wedding photos, evidently the life of every party, and in the blurry background of a photo taken of someone else at a Greek airport in 2019. A journalist’s past life in a rock band was unearthed, as was another’s preferred summer camp getaway.
Unlike Clearview AI, a similar facial recognition tool available only to law enforcement, PimEyes does not include results from social media sites. The sometimes surprising images that PimEyes surfaced came instead from news articles, wedding photography pages, review sites, blogs and pornography sites. Most of the matches for the dozen journalists’ faces were correct.
For the women, the incorrect matches often came from pornography sites, unsettling in their suggestion that the images could be of them. (To be clear, they were not.)
A tech executive who asked not to be identified said he used PimEyes fairly regularly, primarily to identify people who harass him on Twitter and use their real photos on their accounts but not their real names. Another PimEyes user who asked to stay anonymous said he used the tool to find the real identities of actresses from pornographic films, and to search for explicit photos of his Facebook friends.
The new owner of PimEyes is Giorgi Gobronidze, a 34-year-old academic who says his interest in advanced technology was sparked by Russian cyberattacks on his home country, Georgia.
Gobronidze said he believed that PimEyes could be a tool for good, helping people keep tabs on their online reputation.
“It’s stalkerware by design no matter what they say,” said Ella Jakubowska, a policy adviser at European Digital Rights, a privacy advocacy group.
A few months back, Cher Scarlett, a computer engineer, tried out PimEyes for the first time and was confronted with a chapter of her life that she had tried hard to forget.
In 2005, when Scarlett was 19 and broke, she considered working in pornography. She traveled to New York City for an audition that was so abusive that she abandoned the idea.
PimEyes unearthed the trauma, with links to where exactly the explicit photos could be found on the web. “I had no idea up until that point that those images were on the internet,” she said.
When she clicked on one of the explicit photos on PimEyes, a menu popped up offering a link to the image, a link to the website where it appeared and an option to “exclude from public results” on PimEyes.
But exclusion, Scarlett quickly discovered, was available only to subscribers who paid for “PROtect plans,” which cost from $89.99 to $299.99 per month. “It’s essentially extortion,” said Scarlett, who eventually signed up for the most expensive plan.
Yet when The Times ran a PimEyes search of Scarlett’s face with her permission a month later, there were more than 100 results, including the explicit ones.
Gobronidze called this a “sad story.” He explained that opting out does not remove images from the web; instead, PimEyes blocks from its search results any photos of faces “with a high similarity level” at the time of the opt-out, meaning people need to regularly opt out, with multiple photos of themselves. Gobronidze said he wanted “ethical usage” of PimEyes. But PimEyes does little to enforce this, beyond a box that a searcher must click asserting that the face being uploaded is their own.
There are users Gobronidze doesn’t want. He recently blocked people in Russia from the site, in solidarity with Ukraine. He said PimEyes was willing to offer its service free to organizations if it could help in the search for missing persons.
A German data protection agency announced an investigation into PimEyes.
Gobronidze said he had not heard from any German authorities. “I am eager to answer all of the questions they might have,” he said. He is not concerned about privacy regulators, he said, because PimEyes operates differently.
He described it as being almost like a digital card catalog, saying the company stores not photos or individual face templates but URLs for individual images, associated with the facial features they contain.