Big Tech sues California, claims child-safety law violates First Amendment
    Suit claims "guessing wrong" about young user harms could cost companies $20B.

    In the last half of 2022 alone, many services—from game platforms designed with kids in mind to popular apps like TikTok or Twitter catering to all ages—were accused of endangering young users, exposing minors to self-harm and financial and sexual exploitation. Some kids died, their parents sued, and some tech companies were shielded from those legal challenges by Section 230. As regulators and parents alike continue scrutinizing how kids get hooked on favorite web destinations that could put them at risk of serious harm, increasingly inescapable pressure has mounted on tech companies to take more responsibility for protecting child safety online.


    In the United States, shielding kids from online dangers is still a duty largely left up to parents, and some tech companies would prefer to keep it that way. But by 2024, a first-of-its-kind California online child-safety law is supposed to take effect, designed to shift some of that responsibility onto tech companies. California’s Age-Appropriate Design Code Act (AB 2273) will force tech companies to design products and services with child safety in mind, requiring age verification and limiting features like autoplay or minor account discoverability via friend-finding tools. That won’t happen, however, if NetChoice gets its way.


    The tech industry trade association—with members including Meta, TikTok, and Google—sued this week to block the law, arguing in a complaint that the law not only is potentially unconstitutional but also poses its own, allegedly overlooked, harms to minors.


    Some tech companies don’t like the California law, NetChoice said in a statement, alleging that it “violates the First Amendment” many times over and grants California “unchecked power to coerce moderation decisions the government prefers.” Because the law keeps its terms purposefully vague and never really defines what’s considered “harmful,” even companies attempting to comply in good faith could find themselves charged with unforeseeable violations, the complaint alleges.


    Some tech companies have already taken steps this year to tighten up online protections for young users. AB 2273 is based on a British online child-safety law passed last year that prompted many tech companies, including Google, Instagram, Facebook, Pinterest, TikTok, Snap, and YouTube, to change their policies, The New York Times reported. None of these tech companies immediately responded to Ars’ request for comment.


    California’s law goes further, however, by requiring tech companies to submit “Data Protection Impact Assessments” detailing child-safety risks and provisions before launching any new features. Under the law, all online companies must submit these DPIAs before AB 2273 goes into effect in July 2024 and then submit to biennial reviews.


    These DPIAs are intended to increase accountability by prompting companies to consider how product features could cause harm to young users, then create timelines for mitigation efforts to prevent any harm identified. They also work to ensure that companies are actually enforcing their own posted policies, which NetChoice’s complaint specifically claims is unreasonable without the state defining the law in more concrete terms:

    “AB 2273 unconstitutionally deputizes online service providers to act as roving Internet censors at the State’s behest. Providers must (i) assess the undefined risks their services and content ‘could’ pose to the ‘well-being’ and ‘best interests’ of children; (ii) devise a plan to prevent or mitigate any such risks; and (iii) develop, publish, and enforce terms of service and ‘community standards.’”

    According to the complaint, NetChoice views the law as an improper attempt by the state to censor tech companies while threatening “crushing financial penalties” for any perceived violations—alleging that what constitutes a violation will be determined fully at the government’s discretion.


    “Guessing wrong about what these provisions proscribe is prohibitively expensive—penalties for even negligent errors could exceed $20 billion,” the complaint said.

    Experts still debating what harms young users

    Signed into law this past September, AB 2273 covers not only online products and services directed at children, but also any products and services that minors are “likely to access.” This threatens to “fundamentally” change the Internet, NetChoice warned in its complaint, with companies incentivized to self-censor more often, cutting everyone online off from harmless information flows just to avoid fines.


    Because AB 2273 would place “overwhelming pressure” on online businesses to “over-moderate content,” NetChoice’s complaint said that young users would suffer from less accessible information, including “life-saving” resources. Other harm to children could come from the age-verification requirement, which the complaint alleges would require that companies collect more data on minors. The more data collected on kids, the more at risk they are for hacks, data leaks, and exploitation, the complaint said.


    “The problem is that the law actually threatens the safety and privacy of children online by forcing all websites to track, verify, and store information on minors,” NetChoice counsel Chris Marchese told Ars. “By forcing all websites to identify children, every digital service will need to collect more information on their users. At a time of increased cybersecurity threats online, this would make some businesses a honeypot for online child predators and hackers looking to get ahold of children’s information.”


    Not all tech-focused organizations oppose AB 2273. A spokesperson for Common Sense—a nonprofit focused on guiding kid-friendly technologies and one of the lead sponsors of the bipartisan California law—provided a statement to Ars denouncing NetChoice’s lawsuit as “desperate.”


    "Many big tech companies have been making intentional design choices with their platforms for years in the name of profits and engagement, without regard for children's well-being,” a Common Sense spokesperson said. “The California design code law is one step toward ending that practice.”


    NetChoice has previously sued to block similar child-safety laws with less-sweeping provisions in Florida and Texas. Common Sense described NetChoice’s latest lawsuit as sending a message to parents that their kids’ well-being is not tech companies’ concern.


    "This desperate lawsuit filed by big tech's lobbyists is a slap in the face to parents everywhere, particularly those who have tragically lost children to the harms associated with social media,” Common Sense’s spokesperson said. Common Sense claims that what is “more important” to tech companies is enlarging profits by maintaining “their ability to collect young users' data and manipulate it to amplify harmful content such as videos promoting eating disorders and self-harm without any accountability.”


    Marchese told Ars that NetChoice’s position is that the power to protect kids online should remain with parents, pointing to the 1998 federal Children’s Online Privacy Protection Act (COPPA) as adequate legislation safeguarding young users’ privacy.

    NetChoice's complaint also said that COPPA prevents states from passing any conflicting laws and alleged that, in addition to constitutional violations, AB 2273 conflicts with COPPA and other federal laws.


    “AB 2273 also cuts parents and guardians out of child protection online, and in doing so, conflicts with federal rules that rightly enshrine a role for guardians to keep their kids safe on the Internet,” Marchese said.


    NetChoice is asking the US District Court for the Northern District of California to declare sections of the California law unconstitutional and block California Attorney General Rob Bonta from enforcing it. According to NetChoice, it's the only way to keep the entire Internet from being reduced to products and services based on whatever's deemed appropriate for each platform's youngest identified user.


    “The well-being of children is undisputedly of great importance,” NetChoice’s complaint said, but AB 2273 allegedly “regulates far beyond privacy” and “is not confined to children.”


    Bonta’s office did not immediately respond to Ars’ request for comment.


    Source

