News

Researchers explain why they believe Facebook mishandles political ads: NPR

Facebook has worked for years to improve its handling of political ads – but researchers who have conducted a comprehensive examination of millions of ads say the social media company’s efforts have yielded uneven results.

Problems include flagging too many ads as political in the United States, they say – and missing political ads in other countries.

And despite Facebook’s ban on political ads around the time of last year’s U.S. election, the platform still allowed more than 70,000 political ads to run, according to the research team at NYU Cybersecurity for Democracy and the Belgian university KU Leuven.

Their study was published early Thursday. They also plan to present their findings at a security conference in August.

After analyzing more than 4.2 million political ads and 29.6 million non-political ads from more than 215,000 advertisers, the researchers say Facebook’s enforcement efforts skew toward the United States: worldwide, 61% more political ads are missed than detected, while in the United States, 55% of the ads flagged as political are in fact apolitical.

Researcher criticizes Facebook’s use of ‘rudimentary’ methods

Laura Edelson of NYU, lead author of the study, said two things emerged from the study that surprised her.

“One is the very high rate of false positives in the US,” says Edelson.

Part of the surprise there, she added, is due to what she calls the “rudimentary” way Facebook appears to use keyword models to categorize ads and content.

“We could do a lot better,” Edelson said. “This isn’t the state-of-the-art of content moderation or problematic content detection. There are more sophisticated methods that could be used here that Facebook doesn’t seem to use.”

“Facebook does involve humans in some parts of its ad and content moderation process, but it’s definitely automation-first,” she said. “That approach simply has problems with accuracy.”

Another surprise, she said, was the trouble Facebook had enforcing its ban on political ads in the US. After the policy was announced, many political advertisers simply stopped running ads. But not everyone followed the ban, Edelson said: “A lot of them keep running ads and just stop declaring that they’re political.”

Spotting a known political advertiser violating the ban should have been easy, she said.

“The errors here are not trivial,” Edelson said. “They just really reflect a lack of investment.”

Facebook responds to the researchers’ findings

Responding to a request for comment on the pending research, a spokesperson for Meta, Facebook’s parent company, told NPR:

“The vast majority of political ads they studied were disclosed and labeled just as they were supposed to be. In fact, their findings revealed potential problems with less than 5% of total political ads.

“If this were an exhaustive view, the report would also note that we provide more transparency into political advertising than TV, radio or any other digital advertising platform.”

In the US, many topics have become politicized

As for what counts as a political ad, the researchers note that Facebook itself says its political advertising policy applies to “advertising about social issues, elections or politics.” Facebook then configures its systems to enforce rules based on that definition.

In recent years, the definition of what can be construed as a political message has become broader, as language around social and health issues becomes increasingly politicized. The researchers linked that trend to the tendency to mislabel non-political ads as political in the US.

Edelson points to how Facebook handles COVID-19 information.

“A lot of content related to the pandemic and COVID has been politicized,” she said. “A lot of vaccine-related content is politicized. But the way Facebook manages that isn’t subtle or nuanced.”

A big part of the problem, says Edelson, is that Facebook relies on automated detection mechanisms that, in her view, simply aren’t very accurate.

“Ads featuring a person wearing a mask have been flagged as political. Ads that mention or talk about vaccines or COVID have been flagged as political,” she said.
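A toy example illustrates why blunt keyword matching over-flags health content. This is a minimal sketch of the kind of keyword model the researchers describe – the keyword list and logic here are purely hypothetical, not Facebook’s actual system, which is not public:

```python
# Hypothetical keyword list -- illustrative only, not Facebook's real list.
POLITICAL_KEYWORDS = {"election", "vote", "senator", "vaccine", "covid", "mask"}

def is_political(ad_text: str) -> bool:
    """Flag an ad as political if any keyword appears anywhere in its text."""
    words = ad_text.lower().split()
    return any(word.strip(".,!?\"'") in POLITICAL_KEYWORDS for word in words)

# A genuinely political ad is caught:
print(is_political("Vote for Senator Smith this November"))    # True
# But an informational public-health ad is flagged too -- a false positive:
print(is_political("Free COVID vaccine clinic this weekend"))  # True
```

Because the match fires on any occurrence of a word like “vaccine,” a filter built this way cannot distinguish a vaccination-clinic notice from a political message about vaccine mandates – the kind of false positive Edelson describes.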

By mislabeling health messages as political ads, Facebook created new problems it had to solve, researchers say.

“Facebook has created an explicit carve-out for government health agencies,” Edelson said, “so that they can be exempt from these policies on political speech. Because they don’t seem to know how to apply them – to catch things that are political about COVID without catching things like, ‘this is where you can go get vaccinated.’”

In that case, if a community organization wants to run an ad saying it’s hosting a weekend vaccination drive, “Facebook is likely to flag that ad as a political ad,” said Victor Le Pochat of KU Leuven, another lead author of the study. From there, he said, the ad could be taken down.

“If Facebook does this kind of inaccurate detection, it may well deter these community organizations from publicizing their vaccine drives,” he said.

The researchers also found some improvements

Given what is known about Facebook’s handling of political ads during the 2016 election, have the researchers seen any improvement since then?

“I can say it’s gotten a little better,” Edelson said.

Le Pochat added: “We see that in our data.”

“We’ve found that Facebook may be catching more improperly declared ads than it was a few years ago,” he said.

Le Pochat said the majority of the ads were properly declared.

Research corroborates recently leaked Facebook documents

The study comes weeks after a trove of Facebook documents from whistleblower Frances Haugen described the social media giant as failing to address some political and social problems, especially in countries where people post content in Arabic, Hindi and other widely used languages.

In some cases, the company inadvertently banned everyday words, according to the documents. In other cases, Facebook’s screening systems are said to have allowed offensive language to spread.

“Our findings confirm that Facebook really doesn’t pay much attention to ensuring that communities outside of the US … are also protected from harm caused by misleading political advertising,” Le Pochat said.

“We see a very high rate of false positives in the United States,” Edelson said. “It actually looks like this is because Facebook seems to use a keyword model to detect political content in the US. And we don’t see that pattern nearly as much in other countries.”

It’s a reflection of where Facebook has chosen to invest its money and time, she said.

“To use such a keyword model, you need to have some knowledge of that country’s politics to be able to build that keyword list,” says Edelson. “And it looks like Facebook isn’t invested in understanding the politics of all the countries it runs political ads in. It couldn’t do that kind of detection in Malaysia or Macedonia or Argentina without spending the money to understand the political landscape in those countries.”
