Editor’s Note: A previous version of this story incorrectly attributed quotes from Rep. Gale Mastrofrancesco to Rep. Anne Dauphinais. It has since been corrected.

The Connecticut House has passed a bill that would make sweeping changes to social media in an effort to protect minors from some of its potentially harmful effects, in line with Attorney General Tong’s recommendations. 

Tong first recommended the legislation in February, calling social media “addictive by design.” In 2022, Tong opened an investigation into TikTok’s algorithm, and in 2023, he sued Meta, alleging that the company knowingly pushed harmful content to minors via its algorithm. 

“Our teenagers spend hours each day glued to social media, with dire consequences for their learning, relationships and mental health,” said Tong. “We need to give parents back some control, and families can’t afford to wait for the federal government or the tech giants to do it.”

Social media apps, like search engines, use algorithms that track the posts users spend time on, like, and interact with, then tailor future content to those interests. While this creates a personalized experience, critics say it can also create a negative feedback loop that is especially harmful to minors. The bill would bar social media companies from using these algorithmic content delivery systems on accounts used by minors unless a minor’s parent approves. 

“The bill requires the operator to use age verification that is commercially reasonable and technically feasible,” reads the bill’s analysis. “If an operator has used commercially reasonable and technically feasible methods to verify a user’s age and cannot determine if a user is a minor, the operator may presume the user is not a minor under the bill’s provisions.”

Essentially, if an app determines a user to be a minor, it would default to settings that would require parental consent to be disabled. These default settings would prevent minors from accessing or receiving any algorithmic content for more than one hour per day, allow them only to communicate with accounts of other minors, and prevent notifications from appearing outside the hours of 8 a.m. to 9 p.m. 

Starting March 1, 2027, the bill would require all social media companies to annually disclose the total number of platform users, the percentage of users who have provided “verifiable consent” from a parent or guardian to access algorithmic content, and the percentage of users who did or did not have the default settings enabled. Companies would also have to disclose their users’ average daily usage time, broken down by user age and hour of day. 

Rep. Liz Linehan (D-Cheshire) explained that she and other lawmakers first started considering such a bill in 2021, when she co-chaired the Children’s Committee. She said that in 2022, a constituent shared with her that her teen daughter developed an eating disorder after researching diets via social media.

“It got deeper and deeper and darker, and she went from looking at things like eating paleo and eating keto, to now looking at pictures of significantly and severely underweight individuals that were promoting anorexia and bulimia, and this led to an eating disorder,” said Linehan. “When we think about eating disorders, when we think about body image, when we think about suicidal ideology, all of those things get deeper and darker, and AI doesn’t necessarily say, ‘Hey, this kid’s getting too deep.’”

Rep. Gary Turco (D-Newington) said that the apps’ algorithms value consumer engagement over the mental health of their users, and use more and more “sensationalized content” to keep users online.

“These algorithms, and numerous studies have shown, are used in a way to display content back, to back, to back, to a minor, to make them addicted to the platform, to make them stay on the platform a lot longer than they would want,” said Turco. 

While most lawmakers supported the bill’s intent, there was considerable confusion about how it could actually be enforced; some expressed confusion and support in the same breath.

“I appreciate the work that was done by the committee moving this legislation,” said Rep. Tim Ackert (R-Bolton). “I have no idea how you implement it, that’s going to be something. But this is a laudable goal, and I’m in full support of it.”

Turco answered a barrage of questions regarding the bill’s mechanisms of implementation and enforcement. What counts as social media? Turco explained that exemptions were made for streaming apps, such as Netflix and Hulu, or educational apps used in schools. How could age be verified? Turco seemed uncertain, repeatedly referring back to the bill’s language of “commercially reasonable and technically feasible” methods.

“The bill was crafted to be purposely, in that section, broad, allowing the platform some discretion on what they believe is commercially reasonable and technologically feasible,” said Turco.

On enforcement, Turco explained that the state would treat potential violations of the bill’s provisions as a consumer protection matter. Those found to be in violation would be liable to fines under the Connecticut Unfair Trade Practices Act (CUTPA). He said the bill’s data disclosure provisions are intended to aid enforcement.

Turco also explained that the bill only intends to regulate the method by which apps deliver content, not police the nature of the content itself. Under the bill, minors would not be restricted from accessing adult content if they search for it, unless the app itself has its own internal controls to age-restrict such content. This was a point of contention for critics of the bill, such as Rep. Gale Mastrofrancesco (R-Wolcott).

“So, we’re not trying to restrict minors from accessing pornography, anything that’s filthy like that, but we’re trying to restrict minors from, maybe targeted ads, based on how they look and feel?” asked Mastrofrancesco. 

Mastrofrancesco proved the bill’s most vocal opponent, questioning not only its enforceability but also whether it constituted government overreach. She argued that the government cannot legislate behavior, and that the bill could open the state up to lawsuits while unfairly restricting social media companies.

“There could be many lawsuits going forward, [saying that it] infringes on people’s constitutional rights, which I think particularly this bill does,” said Mastrofrancesco. “When we legislate things like this, it’s very, very concerning, not only for – as a parent, as a citizen, but as a restriction that we’re putting on a company that has to monitor this.”

Linehan argued that the prior passage of similar bills in other states has allowed the legislature the opportunity to craft a bill that “can pass constitutional muster,” and that the bill has the “benefit of hindsight.”

Despite lawmakers’ questions about the bill’s implementation and enforcement, it passed 121-26, with four representatives either absent or abstaining. After its passage, Tong released a statement in support.

“Today’s strong bipartisan vote sends an important message—Connecticut is done waiting for the federal government and tech giants to do right by our kids,” said Tong. “I look forward to working with the Senate to advance this important legislation.” 


A Rochester, NY native, Brandon graduated with his BA in Journalism from SUNY New Paltz in 2021. He has three years of experience working as a reporter in Central New York and the Hudson Valley, writing...
