Putting facial recognition in the Ukraine war was a bad idea


A startup’s controversial tech donation to Ukraine could also help the company steer clear of its own broader problems.

Perhaps the saying “there is no such thing as bad publicity” holds true for controversial tech companies. New York-based Clearview AI has been criticized by privacy advocates for years over the way it scrapes billions of images from social media networks to build a face search engine used by police departments. The practice was exposed in a New York Times investigation, and several countries, including France and Canada, have banned the company’s activities.

Still, at least 600 law enforcement agencies have used its technology, and this week Clearview revealed it had given the Ukrainian government free access to its face search engine to help thwart the Russian invasion.

Ukraine’s Defense Ministry has not said how it will use the technology, according to Reuters. The Ukrainian government also has not confirmed that it is using Clearview, but Reuters reports that its troops are likely using the technology to identify suspected Russian operatives at checkpoints. Of Clearview’s database of 10 billion faces, more than 2 billion come from Russia’s most popular social network, VKontakte, which in theory allows the company to match many Russian faces to their social profiles.

Ukraine has received several offers of help from the tech world, including from Elon Musk and satellite operator MDA Ltd. But Clearview’s offer has instead sparked outrage among privacy advocates. Chief among the concerns is that facial recognition is faulty: at its worst, it has led police to make wrongful arrests. In a war zone, the consequences are even more a matter of life and death.

There is evidence that people using facial recognition often fail to operate it properly. A British study of how London police use the technology to spot suspected criminals on the streets found that officers tend to defer to the algorithm. In other words, they go along with whatever the software suggests. Even if officers aren’t sure whether a face captured on camera matches a photo, they will assume the match is correct if the software says so. And while officers sometimes challenged colleagues who disagreed with the face-matching software, they never challenged those who agreed with it, according to the 2019 research.

It’s hard to imagine soldiers taking a more nuanced approach amid pressure to defend besieged cities, and with little or no training in how to use such software.

Clearview’s offer ends up feeling like a publicity stunt. The company denies this, saying: “Hoan Ton-That has seen the suffering in Ukraine, and like people and companies around the US and the world, wants to do what they can to help.”

In something of a reverse Streisand effect, Clearview’s seemingly inextricable link to controversy hasn’t hurt its business. Even after its data-collection tactics drew cease-and-desist notices from Meta Platforms Inc.’s Facebook, Alphabet Inc.’s Google and Twitter Inc., the company pressed ahead as if nothing were wrong. It recently told investors that its face database would grow tenfold to 100 billion images, and that it was expanding into the private sector, helping companies vet gig-economy workers, according to a report in the Washington Post.

But Clearview’s legal problems won’t go away. The company has been plagued by lawsuits in federal court and in several states, including in New York, California, Illinois and Virginia, and faces a wave of regulatory investigations in the UK, France, Italy and some other European countries.

Clearview says legal problems are normal in the startup world. “Just like Airbnb, Uber, PayPal and other iconic innovative startups, there is a key legal component to [our] operating very early on,” a Clearview spokesperson said. The spokesperson added that privacy laws around the world tend to carve out exemptions for law enforcement and national security.

War tends to reveal both opportunism and generosity, and Clearview’s offer to Ukraine may well fall into the latter category. More broadly, though, bringing facial recognition technology into a war zone is dangerous, and if doing so becomes the norm, so much the worse.

Clearview similarly built its police business by offering free trials to law enforcement officers.

This column does not necessarily reflect the opinions of the editorial board or Bloomberg LP and its owners.

Parmy Olson is a Technology columnist for Bloomberg Opinion. She has previously reported for the Wall Street Journal and Forbes and is the author of ‘We Are Anonymous.’
