Washington — Seventeen-year-old Julienne Pagulayan started using social media when she was in the fifth grade.
"It was getting on and it was, like, seeing what other people are doing," Pagulayan told CBS News.
However, under a bipartisan bill introduced this week, children under the age of 13 would be barred from using social media, while those between the ages of 13 and 17 would need parental consent to create an account. Social media companies would also be prohibited from recommending content using algorithms to users under 18.
The Protecting Kids on Social Media Act is co-sponsored by Republican Sen. Tom Cotton of Arkansas and Democratic Sen. Brian Schatz of Hawaii, both of whom are parents.
"My kids are young enough that it's not a concern yet, but I do worry very much about it," Cotton told CBS News.
Both Cotton and Schatz believe such a bill could be successfully enforced.
"There are lots of mechanisms for a more robust age verification system," Cotton said. "The age verification that they're doing now is essentially asking a 12-year-old to say, 'Are you 18?' And they click, 'I'm 18,' and now they're online."
Schatz argues that the bill would give the Federal Trade Commission and state attorneys general the authority to enforce the age limit.
"We've made a decision, as a society, that you should have to wait to a certain age to say, buy alcohol or buy tobacco," Schatz said. "We're not so naive that we don't think teenagers have never smoked a cigarette or never drank a beer. But that doesn't mean you should just throw up your hands, that there's no solution at all."
The two senators point to several studies that suggest a potential link between social media use and declining mental health, including a survey released in February by the U.S. Centers for Disease Control and Prevention which found that 57% of high school girls and 29% of high school boys feel persistently sad. The survey also found that 22% of all high schoolers reported they had seriously considered suicide.
Pagulayan believes kids her age should be able to make their own decision about social media usage.
"It's so relevant now," Pagulayan said. "And if a parent doesn't see that, I feel like them not permitting their child kind of becomes a block in that opportunity for them."
Some social media platforms told CBS News they are reviewing the legislation, and note they already have safeguards in place.
Antigone Davis, global head of safety for Meta, the parent company of Facebook and Instagram, told CBS News in a statement that the company has "developed more than 30 tools to support teens and families."
When teens create an Instagram account, according to Davis, it is automatically set to private, and teens receive "notifications encouraging them to take regular breaks."
"We don't allow content that promotes suicide, self-harm or eating disorders, and of the content we remove or take action on, we identify over 99% of it before it's reported to us," Davis said. "We'll continue to work closely with experts, policymakers and parents on these important issues."
A spokesperson for Snap, the parent company of Snapchat, told CBS News in a statement that it has "built safety and privacy into the architecture of our platform and have extra protections for 13-17-year-olds."
"We are already working with industry peers, regulators, and third-party technology providers on possible solutions and look forward to continuing these productive conversations with the cosponsors of this legislation," the spokesperson said.
TikTok pointed to its privacy and parental controls, including restricting features such as direct messaging for younger teens and barring accounts of users under 18 from sending or receiving virtual gifts or livestreams.