
Online Safety: Equal Content Moderation and Transparency for Non-English Users

This bill would require large online platforms (those with over 10 million users) to moderate content equally across all languages in which they monetize, with the goal of ensuring safety and equitable access to digital spaces. In practice, the detection and removal of harmful content, such as fraud or harassment, must be as effective for non-English users as for English users. Platforms would also be required to publish detailed annual reports on moderation staffing and algorithm effectiveness across languages.
Key points
Platforms must ensure that processes for detecting and removing illegal or harmful content are applied consistently across all languages where the platform engages in monetization practices.
Users would gain greater protection against online fraud, harassment, and misinformation in their native languages, driven by the required investment in non-English moderation.
Large platforms must publish annual public reports detailing the number of moderators, their language proficiency, and the performance of automated systems in moderating non-English content.
Content-reporting tools and platform policies must be equally accessible in every language the platform offers.
Additional Information
Print number: 118 S. 1801
Sponsor: Sen. Ben Ray Luján [D-NM]
Process start date: 2023-06-01