Understanding the Instagram Oversight Board’s decision to allow nipples


On January 17, 2023, the Oversight Board issued a decision advising Meta to amend its policies on nudity. Here’s what that decision really says.

The Guardian, the New York Post, Glamour magazine, and a host of other news outlets are reporting that Instagram may finally adjust its policy to allow the posting of topless photos regardless of gender. The news came after the Oversight Board, an advisory board for Instagram and Facebook, decided an appeal over two deleted posts and recommended that Meta adjust its nudity policy to allow topless photos.

What is the Oversight Board?

I wrote about the Oversight Board in a previous article. In some cases, if your post is removed, you may receive a reference number, and with that reference number you are allowed to appeal your case to the board. The Oversight Board is an independent committee made up of academics, politicians (such as a former Prime Minister of Denmark), ethics experts (including a Nobel Peace Prize winner), and others who independently advise Meta on its content moderation policies. The board primarily deals with issues of global significance, such as Meta’s responsibility to handle election- or pandemic-related misinformation and the like.

Almost everyone is wrong about what prompted this decision

The January 17, 2023 decision is available in its entirety online. It ran about 30 pages when I printed it to PDF and highlighted the relevant points, so here’s a quick recap.

I have seen countless posts and shared stories with clickbait headlines about how Instagram will now allow nipples, citing this decision. Essentially, the version you’ll see shared is that a topless photo posted by a non-binary couple was taken down for violating the nudity policy, the Oversight Board ordered it restored, and Instagram now allows everyone to post photos with exposed breasts. That’s about 20% correct.

It is true that the images in question feature a non-binary, transgender couple who posted two images with captions about trying to raise money on GoFundMe for surgery to flatten their breasts because of problems with their insurance. The images were flagged by AI and determined to be non-violating. They were then reported by users, reviewed by human moderators, and again determined to be non-violating. The images were reported once more, and this time a human moderator decided they violated community guidelines and removed them.

This is where people get it wrong. First, the breasts in the images in question were completely covered.

Second, and more seriously, everyone’s mistake is that the photos weren’t removed for violating the nudity policy; they were removed because Instagram thought the couple was attempting to solicit prostitution!

Third, the Oversight Board did not order Instagram to restore the photos. Instagram did that on its own before the case was decided, and it acknowledged that the posts were removed by mistake.

An extremely in-depth analysis of Instagram’s moderation policies

The decision offers insight into how and why Instagram implements its moderation policies. It was just as bad as I suspected, and the Oversight Board tore it apart. Here’s what was discovered.

Instagram has secret moderation policies in addition to its public-facing guidelines

You can find the Instagram Community Guidelines here. That’s what the public has access to, but it’s just the tip of the iceberg of the full rule set. The decision evaluated and referenced these hidden rules. For example, there are 18 additional rules about nipples that are available only to human reviewers.

For sexual solicitation, there are additional guidelines that include a list of poses supposedly used by prostitutes to implicitly offer sex in exchange for money.

“Known Questions” refers to a list of internal-only, confidential guidelines that reviewers use to moderate content.

The Oversight Board found these hidden rules completely ridiculous

The Oversight Board asked Meta why the AI system flagged these images as violating, and Meta didn’t know. The board asked why a human reviewer thought this was sexual solicitation, and Meta didn’t know. The board pointed out that the rules are not only vague but also inconsistent.

Meta’s policies disproportionately affect women and the LGBTQI+ community

So a woman’s exposed breasts aren’t allowed under the nudity policy, and a woman’s covered breasts aren’t allowed under the sexual solicitation policy. Basically, just don’t have breasts, covered or uncovered, in any photos. The Oversight Board cited several other instances where the rules were too broad and disproportionately affected women and the LGBTQI+ community.

Meta has a website where it tracks statistics regarding enforcement of its nudity and sexual activity policies. The decision showed there is significant error in applying these overly broad and arbitrary rules to policing women’s bodies, with only 21% of appealed images restored. The decision also cited a study that found 22% of images of women’s bodies removed from Instagram were false positives.

The rationale behind Instagram’s nudity and solicitation policies

The board examined Meta’s rationale behind its nudity moderation policies. Meta states there are two main problems it wants to prevent: 1) the sharing of nude photos of underage girls, and 2) the non-consensual sharing of nude images. Since it’s impossible to look at a bare chest and determine whether it belongs to a 17- or 18-year-old, Meta decided not to allow female breasts at all. Likewise, it has no way of knowing whether consent was given to share an image, so it doesn’t allow those images to be published. Meta cites a survey that found “90% of victims of the non-consensual distribution of digital intimate images are women.” Because the issue affects women’s nudity roughly nine times as often as men’s, Meta bans women’s nipples while leaving men’s nipples unregulated.

Recommendations of the Oversight Board

The Oversight Board did not ask Meta to amend its policies to allow nipples. It made three recommendations. First, Meta should establish clear criteria to ensure consistent treatment. Second, Meta should provide users with a better explanation of the sexual solicitation criteria. Third, Meta should revise its guidelines for internal reviewers as they relate to sexual solicitation.

This decision has almost nothing to do with displaying nipples, as that wasn’t even at issue in the images in question. It has far more to do with Meta’s practice of scanning images and deciding, at its own discretion, whether the women pictured are offering sex in exchange for money based on how they pose or what clothes they’re wearing. This is an absolutely disgusting policy that should never have existed in the first place. It’s great to see it put under a microscope and taken apart.
