
Facebook has a child predation problem


Surely due diligence requires proactive steps to prevent the creation of such groups, backed by swift action to remove any that evade detection once they are flagged and reported. Right? I thought so. Until I came across these groups and tried, with growing disbelief, and failed to get them taken down.

Children who share profile pictures and contact information in these digital spaces are groomed and drawn into private groups or conversations where further images and acts are solicited and exchanged.

Even as debates in Congress call attention to the use of digital channels to distribute pornographic material, we are losing ground on how child sexual abuse material is created in the first place. Forty-five percent of U.S. children between the ages of 9 and 12 report using Facebook daily. (That fact alone makes a mockery of Facebook’s claim that it works hard to keep children under 13 off the platform.) In recent research, more than a quarter of children aged 9 to 12 reported having been sexually solicited online. One in eight reported being asked to send nude photos or videos; one in 10 reported being asked to join a pornographic live stream. Smartphones, internet access and Facebook have put these spaces into children’s hands and homes, and they create new hunting grounds. At scale.

Of course I reported the group, which I had discovered by accident. I used Facebook’s own on-platform reporting system, flagging it as containing “nudity or sexual activity” (next menu) “involving children.” An automated response came back a few days later: the group had been reviewed and did not violate any “Community Standards.” If I continued to encounter content that I found “offensive” (so the problem here is mine?), I should report that specific content, not the entire group.

“Buscando novi@ de 9,10,11,12,13 años” (“Looking for a boyfriend or girlfriend aged 9, 10, 11, 12, 13”) had 7,900 members when I reported it. By the time Facebook responded that it didn’t violate community standards, it had 9,000.

So I tweeted at Facebook and the Facebook Newsroom. I messaged people I don’t know but thought might have access to people inside Facebook. I tagged journalists. And I reported, through the platform’s own protocol, dozens more groups, some with thousands of members: groups I found not through pornographic search terms, but simply by typing “11 12 13” into the Groups search bar.

What became clearer than ever as I struggled to get action was that the limits of the technology were not the issue. The full power of AI-driven algorithms was on display, but it was working to extend, not reduce, the danger to children. Because even as replies denying grounds for action were hitting my inbox, fledgling child sexual abuse groups were being suggested to me as “Groups You Might Like.”

Each of the new groups suggested to me featured the same mix of cartoons, emotional grooming and coaxing invitations to share sexual material as the groups I had reported. Some were in Spanish, some in English, others in Tagalog. When I searched for a translation of “hanap jowa” (“looking for a partner”), the name of a series of such groups, I was led to an article from the Philippines reporting on Reddit users’ attempts to get Facebook groups that put children at risk there taken down.


