"If you wait for these guys to solve the problem, we're going to die waiting," US Senator and Senate Judiciary Ranking Member Lindsey Graham said of a group of leading social media CEOs at a heated hearing on January 31, 2024. The CEOs of TikTok and Meta joined the CEOs of Discord, Snap, and X, who had been subpoenaed, as witnesses on combatting online child sexual abuse material (CSAM).

During the Judiciary Committee hearing, senators urged Senate Majority Leader Chuck Schumer to bring to the floor bills that had already passed through committee. These include legislation that would allow individuals to sue social media platforms more easily for hosting CSAM, increase law enforcement capacity on CSAM cases, and raise reporting and transparency requirements for social media companies. Some senators also argued for repealing Section 230 of the Communications Decency Act, a law that generally shields online platforms from civil liability for user-generated content, including most CSAM.

GMF Digital has argued that targeted solutions would be more effective in tackling harmful content online than repealing Section 230 entirely. The proposed bills would instead create case-by-case enforcement, leaving courts to play whack-a-mole in pursuit of individual offenders. They could also pressure platforms to severely limit encryption protections and take down any content that might invite a lawsuit, curtailing lawful speech.

In an unconventional alliance, hundreds of civil society organizations and LGBTQ+ advocates have joined industry in opposing the bills that would increase platform liability, on the grounds that they threaten free speech, privacy, and encryption. Industry groups such as NetChoice, which represents Meta, Snap, TikTok, and X, oppose the bills on both cybersecurity and constitutional grounds.

Amendments to Section 230 that condition its liability protections could serve as a strong enforcement tool. They should, however, be at most narrow carveouts and should be paired with legislation, developed in consultation with civil society advocates, that incentivizes the creation of safe online spaces for children and protects their privacy and cybersecurity.

Given the challenge that wide-scale moderation poses for new and smaller platforms, creating a tiered system of obligations based on platform size, as in the EU Digital Services Act, could also incentivize market-leading companies to set industry-wide trust and safety standards. This would be more productive and sustainable than imposing a one-size-fits-all regulatory framework that could have unintended and harmful consequences. An increased threat of litigation would be particularly challenging for small startups, which cannot amortize compliance costs as easily as industry giants, and could ultimately stifle market competition.

Passing any of these bills in an election year will be difficult, with Senate Republicans unlikely to compromise on legislation that would give President Biden an election-year win. Senate leadership has historically hesitated to put technology bills to a floor vote despite bipartisan support. Even after heart-wrenching testimony supporting these bills, they may still die without ever reaching the Senate floor.