California lawmakers OK bills aimed at content regulation and child safety in social media

California lawmakers on Tuesday sent Gov. Gavin Newsom two groundbreaking bills intended to limit the downside of social media, as they faulted Congress for failing to act on the problem.

A first-of-its-kind measure would require social media companies to make public their policies for removing disturbing content and provide details on how and when they remove it.

The second bill would require companies that provide online services attractive to children to follow age-appropriate design code principles aimed at keeping kids safe. That includes not profiling a child or using the child's personal information in a way that could harm their physical or mental health or well-being.

Around eight in 10 parents of children aged 11 or younger say their kids have used or interacted with a tablet computer — a number that has been on the rise in recent years, according to the Pew Research Center. A recent survey by Common Sense Media found that 81% of teenagers aged 13 to 17 use social media, with around 70% using the platforms multiple times a day.

"The online world has created tremendous opportunities, but also real and proximate threats to kids, to vulnerable communities and to American democracy as we know it," said Democratic Assemblyman Jesse Gabriel, author of the first bill.

"We believe that California has a special obligation and a special opportunity to lead on these issues," Gabriel added during a news conference Tuesday. "We're proud of our technology economy, and we know that many of the companies that these bills would regulate are homegrown California companies. But with dysfunction in Washington, D.C., we believe that California must step up and lead."

Policy transparency

Gabriel's measure would require companies to say how they regulate their own content under their social media terms of service. It stalled last year over free speech issues before clearing the Senate on a 33-3 vote and the Assembly, 48-0. It says it is "the intent of the Legislature" that the state attorney general or a city attorney take civil action against violators.

While the measure had bipartisan support, Republican Sen. Melissa Melendez, who opposed it, worried that it could be used to punish legitimate but unpopular content, particularly given that Attorney General Rob Bonta is a progressive Democrat.

"I can't help but wonder if this is not in fact an attempt for the attorney general to perhaps harass the citizens of California, particularly those who have an opposing viewpoint, and I don't think it is appropriate that the state attorney general get involved in any attempt to censor speech," she said during a debate Monday night.

But Democratic Sen. Thomas Umberg, who carried the bill, said the measure "does not basically censor content ... If they have no policy, they have nothing to report. If they do have a policy then they need to report how they're implementing that policy."

Democratic Sen. Scott Wiener said the bill, sought by the Anti-Defamation League, is particularly important to the Legislature's Jewish Caucus, "given the rampant anti-Semitism on social media."

Opponents include the California Chamber of Commerce, Computer and Communications Industry Association, Consumer Technology Association, Internet Coalition, Netchoice and TechNet.

A coalition of the opponents said companies already must make public their content moderation policies, and the bill goes too far by requiring them to disclose to the attorney general "sensitive information about how we implement policies, detect activity, train employees and use technology to detect content in need of moderation."

In May, a federal appeals court upheld a Texas law restricting how social media sites can moderate their platforms. The law makes it illegal for any social media platform with 50 million or more U.S. monthly users to "block, ban, remove, deplatform, demonetize, de-boost, restrict, deny equal access or visibility to, or otherwise discriminate against expression."

Residents in the Lone Star State can now sue Facebook, Twitter and YouTube if they believe their content was censored.

Bipartisan vote to protect children

The second bill, intended specifically to protect children from inappropriate online content, also drew bipartisan support, clearing the Senate 33-0, though seven Republicans did not vote. It cleared the Assembly 60-0.

The measure "will represent a major positive step toward creating a global standard for the protection of youth online. That's an aspiration about which I think we can all agree," said Democratic Sen. Josh Newman, who carried the bill in the Senate.

It is modeled after a similar measure in the United Kingdom. It also is opposed by the Chamber of Commerce and some of the tech industry associations. A coalition including the Entertainment Software Association said the bill includes "an over-inclusive standard and would capture far more websites and platforms than necessary."

A third measure working its way through the California Legislature would require large social media platforms, starting in mid-2023, to publicly disclose statistics on content that violated their policies and on content that was recommended or otherwise amplified by their algorithms.

Another more controversial measure failed in the gatekeeper Senate Appropriations Committee earlier this month after it was heavily opposed by the influential tech industry. It would have subjected some popular social media platforms like Instagram and TikTok to fines for using features they know can endanger children.

In addition to content issues, how user data is being collected has also led to growing public concern and outrage from parents and officials. Federal regulators are looking at drafting rules to crack down on what they call "harmful commercial surveillance."

The Federal Trade Commission on August 11 announced its initiative seeking public comment on the effects of companies' data collection and the potential benefit of new rules to protect consumers' privacy.

Other states are drafting new regulations aimed at mitigating the harmful effects of social media as well.

A proposal advancing in the Minnesota Legislature with bipartisan support would prohibit social media companies from targeting children under 18 with algorithms, a move that supporters say will help mitigate the harmful effect of certain content on kids.
