Recent jury verdicts against Meta and YouTube, which awarded hundreds of millions of dollars for alleged harms from “addictive design,” threaten to dismantle Section 230 protections for online platforms. The core argument shifts liability from user-generated content to platform “design choices” such as infinite scroll and algorithmic recommendations. This “design liability” theory, which echoes the pre-Section 230 legal landscape, allows lawsuits to proceed without proof that specific content caused the harm, turning even minor editorial decisions into potential legal hazards. The high cost of litigation, borne even by companies like TikTok and Snap that settled before trial, poses an existential threat to smaller platforms. The precedent could also discourage essential privacy features such as encryption, as the New Mexico case demonstrates, and chill internal safety discussions, ultimately jeopardizing the open internet.