Clearview Stole My Face and the EU Can’t Do Anything About It


    One man’s battle to reclaim his face shows regulators across the bloc are failing to reprimand the US face search engine.

     

    MATTHIAS MARX SAYS his face has been stolen. The German activist’s visage is pale and wide, topped with messy, blond hair. So far, these features have been mapped and monetized by three companies without his permission. As has happened to billions of others, his face has been turned into a search term without his consent.

     

    In 2020 Marx read about Clearview AI, a company that says it has scraped billions of photos from the internet to create a huge database of faces. By uploading a single photo, Clearview’s clients, which include law enforcement agencies, can use the company’s facial recognition technology to unearth other online photos featuring the same face.
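
    The underlying mechanics of such a search are well understood: a model converts each face into a numeric embedding, and a query photo is matched against stored embeddings by distance. As a rough illustration only, here is a minimal sketch using the open source Python face_recognition library as a stand-in for Clearview’s proprietary models; the file names and the tiny in-memory index are hypothetical, and a real system would match one query against billions of stored embeddings.

    import face_recognition  # open source library built on dlib's face embedding model

    # Hypothetical "database": embeddings precomputed from a handful of scraped photos.
    scraped_photos = ["scraped_1.jpg", "scraped_2.jpg", "scraped_3.jpg"]
    index = []
    for path in scraped_photos:
        image = face_recognition.load_image_file(path)
        for encoding in face_recognition.face_encodings(image):  # one 128-d vector per detected face
            index.append((path, encoding))

    # A client uploads a single query photo (assumed to contain exactly one face).
    query_image = face_recognition.load_image_file("query.jpg")
    query_encoding = face_recognition.face_encodings(query_image)[0]

    # Rank stored faces by distance to the query; smaller distances mean likelier matches.
    distances = face_recognition.face_distance([enc for _, enc in index], query_encoding)
    for (path, _), distance in sorted(zip(index, distances), key=lambda pair: pair[1]):
        if distance < 0.6:  # the library's common default threshold for "same person"
            print(f"possible match: {path} (distance {distance:.2f})")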

     

    Marx wanted to know if the company had any photos of his face in its database, so he emailed Clearview to ask. A month later, he received a reply with two screenshots attached. The pictures were around a decade old but both showed Marx, looking fresh faced in a blue T-shirt, taking part in a Google competition for engineers. Marx knew the pictures existed. But unlike Clearview, he did not know a photographer was selling them on stock photo website Alamy without his permission.

     

    Marx says Clearview’s revelation was a wake-up call. “I’m no longer in control of what people do with my data,” he says. To him, it was obvious that Clearview was violating Europe’s privacy law, the GDPR, by using his face, which counts as biometric data, without his knowledge or permission. So in February 2020 he filed a complaint with his local privacy regulator in Hamburg. That complaint was the first filed against Clearview in Europe, but it’s still unclear whether the case has been resolved. A spokesperson for the regulator told WIRED that the case had been closed, but Marx says he has not been notified of the outcome. “It’s almost been two and a half years since I complained about Clearview AI, and the case is still open,” says Marx, who works as a security researcher at the IT security company Security Research Labs. “That is too slow, even if you take into account that it’s the first case of its kind.”

     

    Across Europe, millions of people’s faces are appearing in search engines operated by companies like Clearview. The region might boast the world's strictest privacy laws, but European regulators, including in Hamburg, are struggling to enforce them.

    Since Marx filed his complaint, other people and privacy groups across Europe have done the same. In October, the French data protection authority became the third EU regulator to fine Clearview 20 million euros ($19 million) for violating European privacy rules. Yet Clearview has not removed EU faces from its platform, and similar fines issued by regulators in Italy and Greece remain unpaid. (France said it could not disclose details about the payment because of privacy rules.) But as Europe’s regulators grapple with how to make the company heed their reprimands, the problem is mushrooming: Clearview is no longer the only company monetizing people’s faces.

     

    Like other privacy activists, Marx does not believe it’s technically possible for Clearview to permanently delete a face. He believes that Clearview’s technology, which is constantly crawling the internet for faces, would simply find and catalog him all over again. Clearview did not reply to a request to comment on whether it is able to permanently delete people from its database.

     

    “It will happen again if my face shows up somewhere on the internet,” says Marx. “They [Clearview's algorithms] will not stop crawling.” The company has been telling investors it is on track to have 100 billion photos in its database this year, which averages out to around 14 photos for every one of the 7 billion people on the planet.
     

    The way Clearview works—by sending bots to search the internet for faces and then storing them in a database—makes it impossible to keep EU faces from appearing on the platform, according to CEO Hoan Ton-That. “There is no way to determine if a person resides in the EU, purely from a public photo from the internet, and therefore it is impossible to delete data from EU residents,” he says, comparing his product to others on the market. “Clearview AI only collects publicly available information from the internet, just like any other search engine, like Google, Bing, or DuckDuckGo.”
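
    Ton-That’s description maps onto a simple crawl-and-index loop: fetch a page, pull out its images, detect faces, and store the embeddings together with their source URLs. The sketch below, using the Python requests, BeautifulSoup, and face_recognition libraries, is only an illustration of that architecture, not Clearview’s actual code, and the seed URL is hypothetical. Nothing in the loop records where the pictured person lives, which is the crux of the company’s argument that EU faces cannot be filtered out at collection time.

    import io
    from urllib.parse import urljoin

    import face_recognition
    import requests
    from bs4 import BeautifulSoup

    def crawl_page_for_faces(page_url, index):
        """Download one page, pull every image, and store any face embeddings found."""
        html = requests.get(page_url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        for img in soup.find_all("img", src=True):
            img_url = urljoin(page_url, img["src"])
            image_bytes = requests.get(img_url, timeout=10).content
            try:
                image = face_recognition.load_image_file(io.BytesIO(image_bytes))
            except Exception:
                continue  # skip anything that is not a readable image
            for encoding in face_recognition.face_encodings(image):
                # Only the embedding and the source URL are kept; nothing here says
                # whether the person in the photo is an EU resident.
                index.append({"source": img_url, "embedding": encoding})

    index = []
    crawl_page_for_faces("https://example.com/some-public-page", index)  # hypothetical seed URL
    print(f"indexed {len(index)} faces")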

     

    But the difference between searching with a name and searching with a face is crucial, argue privacy activists. “A name is not a unique identifier. A name is something you can hide in public,” says Lucie Audibert, a lawyer at Privacy International. “A face is not something you can possibly hide in public, unless you walk out of your house with a bag on your head.”

     

    Frustration is growing in Europe that face search engines can keep operating in blatant defiance of regulators’ orders to stop processing EU faces. Ton-That argues that Clearview is not subject to the GDPR because it has no clients or offices in the EU, something regulators dispute. “It’s very hard to enforce a regulatory decision from Europe on a US company if the company is not willing to cooperate,” says Audibert, who wants EU regulators to be more aggressive in their enforcement. “This is really a test case to see what kind of restrictive power the GDPR has.” She does not expect sweeping new EU tech rules to affect the dispute.

     

    The case against Clearview was supposed to act as a warning to other companies that face search engines were illegal within the EU. “Once there’s one case, it’s easy for the authorities to use this as a precedent,” says Felix Mikolasch, a data protection lawyer at NOYB, a privacy group that has been supporting Marx in his case. Instead, the opposite has happened. “There’s such a lack of enforcement, it’s difficult to convince companies they have to stop doing this, because they know they can get away with it,” says Audibert.

     

    Since 2020, Marx has discovered that pictures of his face are spreading. When he searched another facial recognition platform, Pimeyes, for his face, it unearthed even more pictures than Clearview had. One showed him, ironically, giving a speech about privacy. Another showed a local newspaper clipping from 2014, in which he was pictured providing free Wi-Fi to refugees. Another showed him at an event hosted by a political party, where he says he was discussing local issues such as bike paths.

     
    Pimeyes is technically different from Clearview because it does not store faces in a database, but instead searches the internet for faces when a user uploads a picture, according to privacy experts. The platform is also much more open; anyone can search the site for free, although to see the links where photos are found, they have to pay a monthly fee starting at $36.
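
    That architectural difference is easy to see in code. Continuing with the same illustrative Python libraries, the sketch below compares an uploaded photo against images fetched at query time and keeps nothing afterwards; the URLs and file name are hypothetical, and this is an interpretation of the experts’ description rather than Pimeyes’ actual design.

    import io

    import face_recognition
    import requests

    def live_search(query_photo_path, candidate_image_urls):
        """Compare an uploaded photo against images fetched on demand; no stored face database."""
        query_image = face_recognition.load_image_file(query_photo_path)
        query_encoding = face_recognition.face_encodings(query_image)[0]  # assumes one face in the query

        hits = []
        for url in candidate_image_urls:
            image = face_recognition.load_image_file(io.BytesIO(requests.get(url, timeout=10).content))
            for encoding in face_recognition.face_encodings(image):
                if face_recognition.compare_faces([encoding], query_encoding, tolerance=0.6)[0]:
                    hits.append(url)
        return hits  # embeddings are discarded once the request is answered

    print(live_search("uploaded.jpg", ["https://example.com/photo.jpg"]))  # hypothetical inputs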
     
    The company’s CEO, a professor named Giorgi Gobronidze, also stresses that unlike Clearview, Pimeyes does not crawl social media platforms, such as Facebook, Twitter, or VKontakte (VK). “The fact that theoretically we can crawl social media does not mean that we should,” says Gobronidze, who bought the platform at the end of last year. Instead Gobronidze says Pimeyes makes the internet more transparent. “There are thousands of people who do not know their pictures are being used by different online sources,” he says. “And actually, they have a right to know.”
     

    For people who don’t want to know, Gobronidze says it’s easy to remove their face from his site. “[People] can submit opt-out requests, or they can order a certain picture be removed and blocked from further processing with one click, under each free search result.” Even though Pimeyes is officially based outside the EU, in Belize, the company should never have used his picture in the first place, says Marx. “This company would only be allowed to use your biometric data with explicit consent.”

     

    Pimeyes has attracted controversy before. After a series of news articles criticized its privacy policies in 2020, its previous owners, entrepreneurs Łukasz Kowalczyk and Denis Tatina, decided to sell. But the two men did not disappear from the industry. Instead, according to company records in Poland, they resurfaced as owners of a new face search engine called Public Mirror, which is targeted at the public relations industry. One thing Pimeyes and Public Mirror have in common is Marx’s face.

     

    In March of this year, Marx found that Public Mirror had four images of his face in its files. Like other face search engines, it’s not only the pictures themselves that reveal information about Marx, but the online links that accompany them. Public Mirror’s links act like a directory to the media articles that have been written about Marx or the conferences where he has spoken.

     

    Each of these platforms reveals deeply personal information. “You can tell where I study, which political party I like,” Marx says. Together, the pictures these companies have collected of him point to an industry that reveals vastly more information than any social media profile.

     

    When Marx started pulling on this thread back in 2020, all he wanted was for one company to stop collecting pictures of his face. Now it’s bigger than that. Today, he’s calling for regulators to stop the industry from collecting pictures of Europeans altogether. For that to happen, regulators will have to make an example of Clearview. The question is, can they?

     
