What is KOSA?
Senators Blackburn and Blumenthal have introduced a new version of the Kids Online Safety Act (KOSA), which seeks to protect minors from online harms by requiring social media companies to prioritize children’s safety in product design and to offer more robust parental control tools. Garnering bipartisan support with 62 Senate cosponsors in the wake of a significant hearing with Big Tech CEOs, the bill emphasizes accountability for tech companies, transparency in algorithms, and enhanced safety measures. The legislation has been refined following extensive discussions with various stakeholders, including tech companies, advocacy groups, and parents, to ensure its effectiveness and alignment with the goal of safeguarding young internet users from bullying, harassment, and other online risks.
Critics of the bill argue that KOSA, despite amendments, remains a threat to constitutional rights, effectively censoring online content and empowering state officials to target undesirable services and speech. See, e.g., the EFF’s blog post about the legislation. They contend that KOSA mandates extensive filtering and blocking of legal speech across numerous websites, apps, and platforms, likely leading to age verification requirements. Concerns are raised about the potential harm to minors’ access to important information, particularly for groups such as LGBTQ+ youth, those seeking health and reproductive information, and activists. The modifications in the 2024 version, including the removal of the authority for state attorneys general to sue for non-compliance with the “duty of care” provision, are seen as insufficient to address the core issues related to free speech and censorship. Critics urge opposition to KOSA, highlighting its impact not just on minors but on all internet users, who could be subjected to a “second-class internet” due to restricted access to information.
What does the proposed law actually say? Below are some key facts about the contents of the legislation:
Who Would Be Subject to the Law
The bill would place various obligations on “covered platforms”:
- A “covered platform” encompasses online platforms, video games, messaging applications, and video streaming services accessible via the internet and used or likely to be used by minors.
- Exclusions from the definition of “covered platform” include common carrier services, broadband internet access services, email services, specific teleconferencing or video conferencing services, and direct wireless messaging services not linked to an online platform.
- Nonprofit entities, educational institutions, libraries, news or sports news websites and apps that meet specific criteria, business-to-business software, and cloud services not functioning as online platforms are also excluded.
- Virtual private networks and similar services that solely route internet traffic are not considered “covered platforms.”
Design and Implementation Requirements
- Covered platforms are required to exercise reasonable care in designing and implementing features to prevent and mitigate harms to minors, including mental health disorders, addiction-like behaviors, physical violence, bullying, harassment, sexual exploitation, and certain types of harmful marketing.
- The prevention of harm includes addressing issues such as anxiety, depression, eating disorders, substance abuse, suicidal behaviors, online bullying, sexual abuse, and the promotion of narcotics, tobacco, gambling, and alcohol to minors.
- Notwithstanding these duties, platforms are not required to block minors from intentionally seeking content or from accessing resources aimed at preventing or mitigating these harms, including evidence-informed information and clinical resources.
Required Safeguards for Minors
- Covered platforms must provide minors with safeguards to limit communication from others, restrict access to their personal data, limit design features that encourage compulsive use, manage personalized recommendation systems, and protect their geolocation data. (One has to consider whether these would pass First Amendment scrutiny, particularly in light of recent decisions such as the one in NetChoice v. Yost.)
- Platforms are required to offer options for minors to delete their accounts and personal data and to limit their time on the platform, with the most protective privacy and safety settings enabled by default (a rough configuration sketch follows this list).
- Parental tools must be accessible and easy to use, allowing parents to manage their child’s privacy, account settings, and platform usage, including the ability to restrict purchases and to view and limit time spent on the platform.
- A reporting mechanism for harms to minors must be established, with platforms required to respond substantively within specified time frames, and immediate action required for reports involving imminent threats to minors’ safety.
- Advertising products that are illegal for minors, such as narcotics, tobacco, gambling, and alcohol, is strictly prohibited.
- Safeguards and parental tools must be clear, accessible, and designed without “dark patterns” that could impair user autonomy or choice, with considerations for uninterrupted gameplay and offline device or account updates.
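To make the default-settings requirement concrete, here is a minimal sketch of what “most protective by default” could look like in practice. It is purely illustrative: the `MinorAccountSettings` type, the field names, and the `initialSettings` function are hypothetical, and the bill does not prescribe any particular data model or implementation.

```typescript
// Hypothetical illustration only: the bill does not specify a data model.
// It requires that the most protective settings be the default for minors,
// with safeguards and parental tools layered on top.
interface MinorAccountSettings {
  allowMessagesFrom: "no_one" | "approved_contacts" | "everyone";
  profileVisibility: "private" | "friends" | "public";
  personalizedRecommendations: boolean; // minors or parents must be able to opt out
  shareGeolocation: boolean;            // geolocation data must be protected
  autoplayAndStreakFeatures: boolean;   // features that encourage compulsive use
  dailyTimeLimitMinutes: number | null; // minors must be able to limit time spent
}

// Most protective configuration, applied by default when the user is a minor.
const DEFAULT_MINOR_SETTINGS: MinorAccountSettings = {
  allowMessagesFrom: "approved_contacts",
  profileVisibility: "private",
  personalizedRecommendations: false,
  shareGeolocation: false,
  autoplayAndStreakFeatures: false,
  dailyTimeLimitMinutes: null, // the limit control exists; the minor or parent sets it
};

function initialSettings(isMinor: boolean): MinorAccountSettings {
  // Adults could start from looser defaults; minors start fully locked down.
  return isMinor
    ? { ...DEFAULT_MINOR_SETTINGS }
    : {
        ...DEFAULT_MINOR_SETTINGS,
        allowMessagesFrom: "everyone",
        profileVisibility: "public",
        personalizedRecommendations: true,
      };
}
```

The point of the sketch is the direction of the defaults: a minor’s account starts in the most restrictive state, and any loosening happens through the safeguards and parental tools rather than the other way around.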
Disclosure Requirements
- Before a minor registers or purchases on a platform, clear notices about data policies, safeguards for minors, and risks associated with certain features must be provided.
- Platforms must inform parents about safeguards and parental tools for their children and obtain verifiable parental consent before a child uses the platform (a simplified sketch of this gating appears after this list).
- Platforms may consolidate notice and consent processes with existing obligations under the Children’s Online Privacy Protection Act (COPPA). (As under COPPA, a “child” under the act is a person under 13 years of age.)
- Platforms using personalized recommendation systems must clearly explain their operation, including data usage, and offer opt-out options for minors or their parents.
- Advertising targeted at minors must be clearly labeled, explaining why ads are shown to them and distinguishing between content and commercial endorsements.
- Platforms are required to provide accessible information to minors and parents about data policies and access to safeguards, ensuring resources are available in relevant languages.
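One way to picture the notice-and-consent sequencing described above is the following sketch. The `AgeBracket` type, the `mayRegister` function, and the consent-verification callback are all hypothetical; the bill does not specify how notice must be presented or how verifiable parental consent must be collected.

```typescript
// Hypothetical sketch of gating registration on pre-registration notice and,
// for children under 13 (the COPPA threshold noted above), verifiable
// parental consent.
type AgeBracket = "child" | "minor" | "adult";

interface RegistrationRequest {
  ageBracket: AgeBracket;
  acknowledgedNoticeOfDataPolicies: boolean; // notice shown before registration
  parentalConsentToken?: string;             // produced by some consent flow, if any
}

async function mayRegister(
  req: RegistrationRequest,
  verifyParentalConsent: (token: string) => Promise<boolean>,
): Promise<boolean> {
  // Clear notice about data policies, safeguards, and feature risks must be
  // provided before a minor registers or makes a purchase.
  if (!req.acknowledgedNoticeOfDataPolicies) return false;

  // Children require verifiable parental consent before using the platform.
  if (req.ageBracket === "child") {
    if (!req.parentalConsentToken) return false;
    return verifyParentalConsent(req.parentalConsentToken);
  }
  return true;
}
```

In this sketch, the notice check applies to every registrant, while the verifiable-consent step applies only to children under 13, mirroring the COPPA threshold noted above.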
Reporting Requirements
- Covered platforms must annually publish a report, based on an independent audit, detailing the risks of harm to minors and the effectiveness of prevention and mitigation measures. (Providing these audit services is no doubt a good business opportunity for firms with such capabilities; unfortunately this will increase the cost of operating a covered platform.)
- This requirement applies to platforms with more than 10 million monthly active users in the U.S. that primarily host user-generated content and discussions, such as social media and virtual environments (a simplified applicability check is sketched after this list).
- Reports must assess platform accessibility by minors, describe commercial interests related to minor usage, and provide data on minor users’ engagement, including time spent and content accessed.
- The reports should identify foreseeable risks of harm to minors, evaluate the platform’s design features that could affect minor usage, and detail the personal data of minors collected or processed.
- Platforms are required to describe safeguards and parental tools, interventions for potential harms, and plans for addressing identified risks and circumvention of safeguards.
- Independent auditors conducting the risk assessment must consult with parents and youth experts and consider research and industry best practices, ensuring privacy safeguards are in place for the reported data.
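As a rough illustration of the threshold described above, the sketch below checks whether a platform would fall within the audited-report obligation. The `PlatformProfile` type and its field names are hypothetical, and the bill’s actual applicability test involves more detail than this simplification captures.

```typescript
// Hypothetical applicability check for the annual audited-report requirement,
// based on the thresholds summarized above (simplified for illustration).
interface PlatformProfile {
  monthlyActiveUsersUS: number;
  primarilyHostsUserGeneratedContent: boolean; // e.g., social media, virtual environments
}

const REPORTING_THRESHOLD_MAU_US = 10_000_000;

function mustPublishAuditedReport(p: PlatformProfile): boolean {
  return (
    p.monthlyActiveUsersUS > REPORTING_THRESHOLD_MAU_US &&
    p.primarilyHostsUserGeneratedContent
  );
}
```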
Keep an eye out to see if Congress passes this legislation in the spirit of “for the children.”