California has adopted new rules that will require many online services to increase protections for children. The rules could change how popular social media and game platforms treat minors.
Social media and game platforms often use recommendation algorithms, find-a-friend tools, smartphone notifications and other enticements to keep people glued online. But these same techniques may pose risks to the many children who have flocked to online services that were not designed specifically for them.
California lawmakers have now passed the first statute in the United States that requires apps and sites to install guardrails for users under 18.
The new rules would compel many online services to curb the risks that certain popular features may pose to child users, such as tools that allow adult strangers to message children.
The California Age-Appropriate Design Code Act could herald a shift in how lawmakers regulate the tech industry. Rather than wade into heated political battles over online content, the legislation takes a practical, product-safety approach. It aims to hold online services to the same kinds of basic safety standards as the automobile industry — essentially requiring apps and sites to install the digital equivalent of seatbelts and airbags for younger users.
The State Senate passed the bill in a unanimous vote of 33 to 0. The State Assembly previously approved a version of the bill. Now it goes to Gov. Gavin Newsom to sign.
The new rules tap into a national debate over the potentially deleterious effects that social media platforms may have on the mental health and body images of some young people. Instagram in particular has come under heightened scrutiny over the past year.
Last fall, members of Congress examined how the social network’s automated recommendation engine had served graphic images of self-harm to teenage girls as well as content promoting eating disorders to younger users.
Soon after, President Biden called for greater child safety on social media.
Companies including Google and TikTok have also been criticized for exploiting children’s data; each agreed to pay a multimillion-dollar federal fine to settle charges that it had illegally collected personal information from children without parental consent.