A Connecticut bill seeks to add the state to a growing list of those looking to require age verification for the social media accounts of minors.

The requirement that social media platforms get parental consent before allowing minors to open accounts is one of seven objectives in SB 3, An Act Concerning Online Privacy, Data and Safety Protections.

The bill establishes standards for how consumer health data can be accessed and shared and prohibits the use of geofencing around certain health data. The third section of the bill creates new requirements related to the use of minors’ personal data and social media platforms.

Additionally, the bill would revise disclosure requirements for warrants directed to providers of electronic communication services and remote computing services, create a duty of care for online dating service operators related to users’ potential criminal activity, and create the Connecticut Internet Crimes Against Children Task Force. Finally, the bill would require employers to disclose known instances of sexual harassment and assault when making employment recommendations about former employees.

The bill’s third section defines a minor as any consumer under the age of eighteen and would require social media platforms to delete a minor’s account within ten days of receiving a deletion request, either from the minor, if they are younger than sixteen, or from their parent or legal guardian. Within the same time frame, a social media platform must stop processing the data of a minor whose account it has been asked to delete. The bill would also require social media platforms to establish “one or more secure and reliable means for submitting a request” and to describe how to do so in a privacy notice.

Further, SB 3 would prohibit social media platforms from creating accounts for minors under the age of sixteen unless the platform has received consent from the minor’s parent or guardian. Violating these provisions would be considered an unfair trade practice. The requirements in section three would take effect on July 1, 2025.

The bill does not include any guidelines for how social media platforms would establish that a user is a minor or verify their age.

Several other sections of the bill also touch on minors’ use of the internet. Section four of SB 3 contains regulations for controllers of online services, defined as any person who determines the “purposes and means of processing personal data.” Controllers offering online services, products or features to consumers would be required to use “reasonable care to avoid any heightened risk of harm to minors proximately caused by such online [service], product or feature” if they have knowledge their users are minors or willfully disregard the possibility that their users could be minors.

This section of the bill also contains a consent requirement that prevents controllers of online services from processing minors’ personal data or collecting their precise geolocation data without first obtaining the minor’s consent if they are thirteen or older, or the consent of a parent or guardian if they are under thirteen.

Controllers that comply with the verifiable consent requirements in the Children’s Online Privacy Protection Act of 1998 are deemed to have satisfied the parental consent requirement.

Further, the bill prohibits controllers of online services from using user interfaces that manipulate or subvert user autonomy, or system design features that increase use of an online service, when the user is a minor.

As of July 1, 2025, controllers of online services used by minors would be required to conduct data protection assessments at least biennially and maintain documentation of the assessments for as long as the service is offered. Each assessment would have to address the purpose of the service, the categories of data on minors collected, the purposes for which those data are collected, and any heightened risk of harm to minors “that is a reasonably foreseeable result of offering” an online service. If a controller finds through the assessment that its service poses a heightened risk of harm to minors, it would be required to establish and implement a plan to eliminate the risk.

Testimony given during the public hearing largely focused on other sections of the bill, but testimony that did focus on the section of the bill related to the use of social media by minors was largely negative.

Sen. Martin Looney, D-New Haven, a co-sponsor of the bill, submitted testimony in favor of the bill. Senate Democrats also submitted identical testimony.

Their testimony links social media use by teens to a rise in negative feelings. “One explanation for the rapid deterioration of our children’s wellbeing is social media. Facebook, Instagram, and TikTok have become standard for our youth. Social media can foster friendships and expand connections, but it has also been determined to be the cause of loneliness. Tech companies push out content that is not age appropriate to tweens, and yet 38 percent of those 8 to 12 years old report using social media daily, an increase from 31 percent in 2019,” the testimony states.

Further, they argue sections three through eight of the bill “will protect the privacy of minors and prohibit the use of their personal data from being used in ways that cause them harm,” and they reference the Age-Appropriate Design Code adopted by the United Kingdom, which, according to the testimony, led Google to make SafeSearch a default for minors, YouTube to turn off autoplay, and TikTok to disable messaging between minors and unknown adults. “While tech companies are adopting these and other protections for minors, not all are applied in the US,” their testimony continues.

But other testimony disagrees that those portions of SB 3 will insulate minors from harm. Testimony from a number of tech-related advocacy groups suggests that instead, the bill could cause more harm not just to minors but to all residents of Connecticut by requiring the collection of more personal data.

According to Christopher Gilrein, the executive director of Northeast TechNet, SB 3’s definition of a minor as anyone under the age of 18 would require “nearly every website or app” to verify the age of every visitor.

“Age verification is a complex challenge for our industry to address, requiring consideration of how to properly balance the interests of privacy and security, and efforts are ongoing to develop more privacy-protective ways to verify age online. At this point, however, compliance with SB 3 would almost certainly result in a substantial increase in the amount of personal information businesses would need to collect on individuals, including birth dates, addresses, or government-issued IDs,” Gilrein continued in public testimony.

Gilrein’s concerns were echoed in testimony from Andrew Kingman of the State Privacy and Security Coalition and Alexander Spryopoulos from the Northeast Computer and Communications Industry Association.

“Businesses may be forced to collect personal information to track a ‘minors’ activity online, their physical location, and who they communicate with (either to ensure they are a minor or verify that they ‘allow any adult to contact any minor through any messaging apparatus unless such adult previously established and maintains an ongoing lawful relationship with such minor’) in order to comply with the proposed legislation. This puts consumers and businesses in the tough position of sharing and collecting sensitive information that consumers may not want to share, putting in place additional risks for everyone, particularly children and members of other vulnerable communities,” Spryopoulos wrote.

A coalition of groups from the advertising industry also expressed concern that the breadth of the bill’s language related to controllers of online services would require age verification for all users and hamper residents’ ability to navigate the internet.

“For example, the bill could be read to apply to the online offerings of clothing retailers, professional sports organizations, and restaurants, simply because minors access their websites. To help ensure businesses do not ‘willfully disregard’ minors who access an online service, product, or feature, controllers may require all online visitors to pass through ‘age gates’ for access, which would mandate anyone attempting to access the website to provide specific age information to the site owner before reading the webpage’s contents,” stated the group, which included the Association of National Advertisers, the Interactive Advertising Bureau, the Digital Advertising Alliance, the American Association of Advertising Agencies, and the American Advertising Federation, in public testimony.

SB 3 was voted out of the Judiciary Committee by a party-line vote of 24 to 13, with Democrats voting in favor of the bill and Republicans voting against it, on March 30. It was placed on the Senate calendar on April 17 and awaits a vote there. In total, the bill has 23 co-sponsors from both legislative chambers. All are Democrats.


An advocate for transparency and accountability, Katherine has over a decade of experience covering government. She has degrees in journalism and political science from the University of Maine and her...
