Document Type

Article

Publication Date

2025

Abstract

Two truths coexist: the Internet has brought tremendous changes for learning, connection, and business; and the Internet and other digital platforms have enabled the exploitation of children on a scale never before imagined. This is due in large part to §230 of the Communications Decency Act – the law that tech platforms have perverted to immunize themselves from liability for activity that causes extreme harm. This duality has fueled a vigorous debate about whether this 1996 law has any value in the 21st Century.

This article answers that question with a resounding no by focusing on the issues surrounding child exploitation. It corrects the false narrative advanced by the tech industry in its attempt to redefine §230’s origin as one singularly focused on Internet freedom – a narrative that ignores the actual context in which §230 became law: child protection. It then makes the case for reforming §230 and returning it to its original intent, updated for the 21st Century.

Its in-depth review of legislative history, contemporaneous media coverage from 1996, and tech litigation strategy reveals two facts: (1) §230 was intended in large part to provide limited immunity in order to encourage the protection of children from exploitation, and (2) tech platforms have systematically litigated throughout the country to expand that immunity into de facto near-absolute immunity, causing massive harm to children.

The article then compares the intentions and promises of the law to the present-day climate of child exploitation on the Internet, focusing specifically on the problem of Child Sexual Abuse Material (CSAM) – also known as child pornography. Observing the cavernous fissure between one of the main purposes of §230 and the reality of online child exploitation, it argues that the time to reform §230 and return it to one of its original purposes is now. That need is pressing not only because of the grave reality of CSAM online, but also because of one of the very intentions behind §230 – to protect children.

The article examines recent legislative proposals to address the problem of CSAM, and proposes a new solution that returns §230 to its origins, reverses tech platforms’ false narrative, updates §230 for the current world, and offers a path forward toward the protection of children.
