Curbing Online Censorship: New Liability Rules for Dominant Tech Platforms
This bill amends Section 230 of the Communications Act of 1934, targeting the content moderation practices of dominant interactive computer services. Platforms that actively suppress or promote specific viewpoints risk losing their current legal immunity, making it easier for users to sue them. The changes also aim to increase transparency around algorithmic promotion and content management, giving users better insight into how their posts are handled.
Key points
Loss of Legal Immunity: Dominant internet platforms may lose legal protection if their content moderation expresses, promotes, or suppresses a discernible viewpoint on grounds the bill does not protect.
Algorithmic Accountability: Platforms are treated as content creators when they engage in 'targeted algorithmic amplification', actively pushing information to users who did not request it.
Increased Transparency: Providers must publicly disclose accurate information about their content moderation, promotion, and curation practices, enabling consumers to make informed choices.
Religious Liberty Exception: Civil liability protections do not apply to moderation actions taken against religious material where those actions burden the exercise of religion.
Status: Expired
Additional Information
Print number: 118_S_921
Sponsor: Sen. Rubio, Marco [R-FL]
Process start date: 2023-03-22