Who Will Be the Gatekeepers in 2023?

Technology Policy Brief #78 | By: Mindy Spatt | January 11, 2023

Header photo taken from: Sitra


Supreme Court Will Consider Whether Social Media Companies Should Be Liable for Damage Caused Through Their Platforms and More Closely Regulated. Would the Government Be a Better Gatekeeper Than Elon Musk?


Once upon a time, journalists were the gatekeepers to the audience and from power. Then flacks took over the gatekeeping from power (they control access to the famous and powerful and fame is the fuel that fires media today). Now the Web — and weblogs and interactivity — are taking over the gatekeeping to the audience, allowing the famous and powerful to bypass the gatekeepers and vice versa. – Jeff Jarvis, Death of the Gatekeeper

Photo taken from: Dado Ruvic | Reuters, BuzzMachine


If anything is clear from Elon Musk’s takeover of Twitter, it is that he is ill-equipped to be the arbiter of who or what should be banned from the platform. He appears to be the last person on earth anyone would choose for the job. Former CEO Jack Dorsey had no particular qualifications to do so either, but his decisions didn’t garner the publicity or public dismay Musk’s have. And if not Musk, or Mark Zuckerberg, who was notoriously late to the Trump/Russia party on Facebook, then who?

Social media companies currently operate with no responsibility for the content being distributed on their platforms.  A federal law, Section 230 of the Communications Decency Act of 1996, shields them from liability for third party content.

It also allows them to take “any good faith action to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.” So whatever efforts Zuckerberg, Musk, and other social media giants have made to keep hate speech or other harmful content off their platforms have been purely voluntary. And not enough, judging by recent lawsuits filed by the Social Media Victims Law Center, which describes itself as “a legal resource for parents of children harmed by social media addiction and abuse.”


The Center is suing TikTok for wrongful death on behalf of the families of 8-year-old Lalani Erika Walton of Temple, Texas, and 9-year-old Arriani Jaileen Arroyo of Milwaukee, Wisconsin, who both died after participating in TikTok’s “Blackout Challenge,” which encourages users to choke themselves to the point of unconsciousness.

The group has also filed a suit against Snap, Inc. on behalf of the families of eight teenagers and young adults across six states, all of whom died after taking fentanyl-laced pills purchased from drug dealers they connected with on Snapchat. The suit alleges that Snapchat’s unique disappearing-message features “encourage, enable, and facilitate illegal and deadly drug sales of counterfeit pills containing lethal doses of fentanyl to minors and young adults.”

Although the Center notes on its website that Section 230 needs to be changed, its lawsuits are based on a products liability theory, alleging that the algorithms or other elements of the platforms’ delivery systems are dangerous and defective.

Nothing talks like money, so if these actions are successful, they are sure to spark major changes in the industry. And two cases currently before the Supreme Court may signal an end to the broad protections of Section 230.


The Supreme Court’s decision to hear cases challenging the legal shield for social media platforms puts the justices in the middle of a politically fraught debate over whether some of the world’s most powerful companies should be protected as neutral forums for speech or held accountable for their content.

Photo taken from: Bloomberg

Both cases blame social media for ISIS’s ability to recruit and enable terrorists. In Gonzalez v. Google, the family of an ISIS terrorism victim alleges that YouTube’s algorithms make it easy for ISIS to find new recruits, and that YouTube should be held liable for the results of its algorithms regardless of Section 230’s protections. Lower courts have issued conflicting opinions on these questions.

In Twitter, Inc. v. Taamneh, the court will consider whether “a defendant that provides generic, widely available services to all its numerous users and ‘regularly’ works to detect and prevent terrorists from using those services knowingly provided substantial assistance” under anti-terrorism laws by not taking action to prevent such use.

The court could resolve these cases by deciding to treat social media companies as common carriers. Lower courts and Justice Thomas have indicated support for this approach, which would treat the companies similarly to public utilities providing communications services.

In that case, the FCC, more experienced in stopping obscenity than viral suicide videos, would provide guidance, and it would likely not go as far as some advocates would like. The nonprofit media advocacy group Media Justice publishes a set of recommended guidelines for social media companies to follow to “stop amplifying the worst content on their platforms” at https://www.changetheterms.org.

Engagement Resources



Robert Reich, “Do We Want Social Media Companies to Decide Whether Kanye West Gets a Platform?” The Guardian, Oct. 12, 2022.



John Villasenor, “Social Media Companies and Common Carrier Status: A Primer,” Brookings.edu, Oct. 27, 2022.


