Instagram's AI scans photos and labels women as prostitutes based on what they're wearing


Instagram’s censorship policy makes no sense. But the least they can do is not call women prostitutes based on what they’re wearing.

Instagram’s policy doesn’t make sense

As I mentioned in a previous article, Instagram's Community Guidelines about nudity are confusing. Check out the butt rule on their list of things you shouldn't post:

Visible anus and/or fully nude close-ups of buttocks unless photoshopped on a public figure.

For some reason, the world is only safe if fully nude butt shots are photoshopped onto celebrities, or maybe only photoshopped visible-anus shots of famous people; it depends on how you read that sentence. Having a rule about how far you can zoom in on a fully nude butt is one thing, but what about more exotic things like bestiality? You would think that would be an easy call, but you'd be wrong. On the day after Thanksgiving 2021, while we were all slowly digesting the turkey, the nudity-guidelines committee at Instagram changed the policy to add a new category that is allowed, though age-restricted:

Imagery depicting bestiality in real-world art, as long as the image is shared neutrally or in condemnation and the person or animal depicted is not real.

Luckily, they keep a change log, so you can see what they added, what they took away, and when. That language appeared in the November 24, 2021 update to the nudity policy.

Instagram uses AI to scan for posts that violate policies

Instagram describes the process on its blog like this:

Artificial intelligence (AI) technology is at the heart of our content review process. AI can detect and remove content that violates our Community Guidelines before anyone reports it. Other times, our technology sends content to human review teams to take a closer look and make decisions about it. These thousands of reviewers around the world focus on the most harmful content for Instagram users.

And here:

How technology detects violations

UPDATE JANUARY 19, 2022

People on Facebook and Instagram post billions of pieces of content every day. Even with thousands of reviewers around the world, it is not possible for them to review it all themselves. That's where Meta's artificial intelligence comes in.

The system is then verified and trained by humans:

When we first build new technology to enforce our content policies, we train that technology to look for certain signals. For example, some technologies look for nudity in images, while others learn to understand text. At first, a new type of technology may have low confidence in whether a piece of content violates our policies.

The review teams can then make the final decision, and our technology can learn from each human decision. Over time, after learning from thousands of human decisions, the technology becomes more accurate.
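The triage loop Meta describes boils down to: high-confidence violations are removed automatically, uncertain cases go to human reviewers, and each human verdict becomes a new training label. Here is a minimal sketch of that loop. The thresholds, function names, and feature names are my own assumptions for illustration; Meta publishes none of these details.

```python
# Hypothetical sketch of Meta's confidence-based triage, as described in
# its blog post. Thresholds and names are invented; Meta publishes neither.

AUTO_REMOVE = 0.95    # assumed: high confidence -> removed before any report
SEND_TO_HUMAN = 0.50  # assumed: uncertain -> queued for human review

def triage(violation_score: float) -> str:
    """Route a post based on the model's estimated violation probability."""
    if violation_score >= AUTO_REMOVE:
        return "auto_remove"
    if violation_score >= SEND_TO_HUMAN:
        return "human_review"
    return "allow"

def record_human_decision(training_set: list, post_features: dict, verdict: str) -> None:
    """Each human decision is kept as a labeled example for retraining,
    which is how the model 'becomes more accurate' over time."""
    training_set.append((post_features, verdict))

# Example: an uncertain post goes to reviewers; their verdict becomes a label.
training_set = []
route = triage(0.62)
if route == "human_review":
    record_human_decision(training_set, {"has_lingerie": True}, "no_violation")
```

The feedback step is the crux of the author's complaint below: if human reviewers confirm wrong calls, the model is trained on those wrong calls.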

Instagram’s policy on sexual solicitation seems reasonable on the face of it

This is the policy from the Community Guidelines page:

Do not post: Content that offers or asks for adult commercial services, such as requesting, offering, or asking for rates for escort services and paid sexual, fetish, or domination services.

Content that implicitly or indirectly offers or asks for sexual solicitation and meets both of the following criteria. If both criteria are not met, it is not considered a violation. For example, if the content is a hand-drawn image depicting sexual activity but does not offer or ask for sexual solicitation, it is not violating:

Criterion 1: Offer or request: Content that implicitly or indirectly (usually through providing a contact method) offers or requests sexual solicitation.

Criterion 2: Provocative elements: Content that makes the aforementioned offer or request using one or more of the following sexually suggestive elements…

In short, do not post things that directly or indirectly seek to arrange prostitution. An indirect sexual solicitation violation requires a post to meet both criteria: 1) an offer of or request for sex in exchange for money, and 2) sexually suggestive elements. If both elements are not present, there is no violation.
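As written, the rule is a simple conjunction: a post violates the implicit-solicitation policy only if criterion 1 AND criterion 2 are both met. This sketch is my paraphrase of the written policy, not Instagram's actual code; the function and parameter names are invented.

```python
# The two-criteria rule as a conjunction. A post is an implicit sexual
# solicitation violation only if it BOTH makes an offer/ask (criterion 1)
# AND contains sexually suggestive elements (criterion 2).
# This mirrors the policy text; it is not Instagram's implementation.

def is_implicit_solicitation(offers_or_asks: bool,
                             has_suggestive_elements: bool) -> bool:
    return offers_or_asks and has_suggestive_elements

# The policy's own example: a hand-drawn image of sexual activity that
# contains no offer or ask is not a violation.
drawing_violates = is_implicit_solicitation(False, True)  # False
```

The author's point in the next section is that Instagram's enforcement ignores the first half of this conjunction: suggestive elements alone are being treated as a violation.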

Instagram’s AI scans women’s posts and decides whether they’re offering sex based on what they’re wearing

Scanning a post for nudity is one thing. It is relatively simple to train an AI to check whether a fully zoomed-in nude butt was photoshopped onto a public figure, or whether a nipple belongs to a man or a woman (women’s nipples are not allowed, but men’s are). But that’s not what Instagram is doing here. It’s reading photos and captions and deciding whether the woman in the photo looks like she’s soliciting sex based on the way she’s standing or the clothes she’s wearing. It will do this no matter what the caption says. I know because it has happened to me more than 20 times. It has also happened to many of my friends and colleagues.

This is an image I posted in my stories that was removed because the women supposedly appeared to be offering sex in exchange for money. A lingerie company I work with has been posting my photos on their feed every day for about two weeks, and I’ve been posting screenshots of that feed. Instagram was having none of it:

The deleted image showed a woman standing by a window. I can’t show you the picture because it’s been deleted, but I can assure you she wasn’t asking for sex. Here’s the caption of that photo, which I can still see in my account settings, where my deleted content is listed:

The important thing about this particular image is that I requested a review, and someone who works for Instagram reviewed it and still determined that the subject in the photo was somehow seeking to exchange sex for money. Twice now, a person at Instagram has reviewed a deleted image of mine and determined that the subject was in fact attempting to engage in prostitution.

My story was deleted despite having no caption other than my bragging about “owning that feed.” My other deleted images had no captions that come anywhere near violating any guideline. If captions have nothing to do with it, then the decision about whether or not a woman is asking for sex is based solely on what the model is wearing or how she’s standing. Keep in mind, this is not a nudity violation. This is Instagram specifically determining that these women are trying to get people to pay them for sex.

Your options for pursuing the matter further are limited

It’s almost impossible to reach a real human at Instagram. However, you can always use the Bug Report feature in the help menu.

Three times, I received a message from Instagram saying I could take my case to the Oversight Board. The Oversight Board is a committee made up of human rights workers, politicians, educators, and company executives that reviews cases and determines whether a violation has actually occurred. You can only appeal decisions that have reference numbers, and not all decisions get one. Instagram does not say how it determines which cases receive a reference number for review by the Oversight Board.

When you go to the Oversight Board site and enter your reference number, you have the opportunity to fill in information about your appeal and the issues you think are relevant to your case.

They select only a handful of cases each year. Mine was not selected.
