
The Fate of Section 230 Belongs to Congress, Not the Supreme Court of the United States


In the nearly 27 years since Congress passed Section 230 of the Communications Decency Act, courts have interpreted it broadly to protect online services from liability for user content, laying the foundation for the business models of Facebook, Yelp, Glassdoor, Wikipedia, community bulletin boards, and many other websites that rely on content they did not create.

Some of those protections are at risk next year, as the Supreme Court has agreed to hear the first case to examine the scope of Section 230’s protections. In Gonzalez v. Google, the plaintiffs ask the court to rule that Section 230 does not immunize platforms when they make “targeted recommendations” of third-party content.

Section 230, written in 1995 and adopted in early 1996, unsurprisingly does not explicitly mention algorithmic targeting or personalization. However, a review of the statute’s history reveals that its proponents and authors intended the law to promote a wide range of technologies for displaying, filtering, and prioritizing user content. That means removing Section 230 protections for targeted recommendations or other personalization technologies would require Congress to change the law.

Like many Section 230 cases, Gonzalez v. Google arises from tragic circumstances. The plaintiffs are the family members and estate of Nohemi Gonzalez, a California State University student who was studying abroad in Paris when she was killed, along with 128 others, in the 2015 ISIS attacks. The lawsuit against Google alleges that its YouTube subsidiary violated the Anti-Terrorism Act by providing substantial support to terrorists. At the heart of the dispute is not merely YouTube’s hosting of ISIS videos but, as the plaintiffs wrote in their legal filings, YouTube’s targeted recommendations of ISIS videos. “Google has selected the users to whom it will recommend ISIS videos based on what Google knows about each of its millions of YouTube viewers, targeting users whose characteristics suggest they would be interested in ISIS videos,” the plaintiffs wrote. In other words, YouTube allegedly showed ISIS videos to the people most likely to be radicalized.

Last year, the United States Court of Appeals for the Ninth Circuit rejected this argument under Section 230. The court did not rule against the Gonzalez family enthusiastically, however, with Judge Morgan Christen writing for the majority that, despite the ruling, “we agree the Internet has evolved into a complex and powerful global tool that the drafters of § 230 could not have foreseen.” Nor was the court unanimous, with Judge Ronald Gould asserting that Section 230 did not immunize Google because its amplification of ISIS videos contributed to the group’s message (Section 230 does not apply if a platform contributes, even in part, to the development of the content). “In short, I do not believe that Section 230 wholly immunizes a social media company’s role as a channel of communication for terrorists in their recruiting campaigns and as an intensifier of the violent and hateful messages they convey,” Gould wrote. After the Ninth Circuit largely ruled against the Gonzalez family, the Supreme Court agreed this year to review the case.

Section 230 was a little-noticed part of the 1996 overhaul of US telecommunications law. The House added Section 230 to its telecommunications bill largely in response to two developments. First, the Senate’s version of the bill imposed penalties for the transmission of indecent content. Section 230 was touted as an alternative to the Senate’s heavy-handed approach, and as a compromise, both the House’s Section 230 and the Senate’s anti-indecency provisions were included in the bill that President Clinton signed into law. (The Supreme Court would rule the Senate’s portion unconstitutional the following year.)

Second, Section 230 attempted to address concerns raised by a 1995 ruling in a $200 million defamation lawsuit against Prodigy, brought by a plaintiff who claimed he was libeled on a Prodigy message board. A New York trial court judge ruled that because Prodigy reviewed users’ messages before posting, used technology that pre-screened user content for “offensive language,” and engaged in other moderation, its “editorial control” exposed it to as much liability as the author of the posts. A few years earlier, a federal judge in New York had reasoned that because CompuServe did not exercise sufficient “editorial control,” it was a mere “distributor,” liable only if it knew or had reason to know of the allegedly libelous content.
