When news first broke of a draft Supreme Court opinion potentially overturning the landmark Roe v. Wade decision, a host of new concerns emerged. With abortion access suddenly uncertain in dozens of states, online paranoia turned to the frightening ways personal data could be used to enforce laws, often without our even realizing it.
Overnight, Twitter was ablaze with warnings about period-tracking apps, which could show a pregnancy beginning and ending; about location data revealing when a user travels to a clinic known to provide abortion services; and about other data that could be used to prosecute alleged offenders.
“Search histories, browsing histories, text messages, location data, payment data, information from period-tracking apps—prosecutors can examine all of it if they believe that the loss of a pregnancy may have been deliberate,” writes Jia Tolentino in The New Yorker. “Even if prosecutors fail to prove that an abortion took place, those who are investigated will be punished by the process, liable for whatever might be found.”
With apps and websites collecting, storing, and selling dozens of data points for every user every time we log on, we are at increasing risk of otherwise innocuous activity potentially being used against us. (You can check what data traces you’re leaving with this tool.)
New laws across the country, though, are working to prevent that, or, at least, to provide consumers with a tiny bit of control over their digital lives. That includes here in Connecticut, where a new data privacy law flew through the state government in early 2022.
How much information do you share online? The answer is probably “a lot more than you think.” Unless you’ve purposefully avoided any kind of social media, online shopping, or Google searches, and never owned a smartphone, there is information out there about you. Even if you have been extremely careful, some website somewhere has likely collected information about your location, devices, and preferences.
Data collection is built into the internet at this point, and it has become crucial to the way an increasingly modular and interactive web functions. Websites capture information about your browser, your monitor, your operating system, and other details to make the site easier to use. You want a website to know when you’re viewing it on your phone instead of a laptop or desktop computer. You want it to know when to change font sizes, or whether to load images and video when you’re on a data plan versus Wi-Fi. It makes the internet more user-friendly.
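As a rough sketch of how that works in practice, a page script might check the browser’s Network Information API before loading heavy media. The `navigator.connection` object and its `effectiveType` and `saveData` fields are real (if not universally supported) browser features; the `chooseImageVariant` helper below is hypothetical, written purely for illustration:

```javascript
// Illustrative sketch only: how a site might pick media quality from the
// connection hints a browser exposes. chooseImageVariant is a hypothetical
// helper, not taken from any real site's code.
function chooseImageVariant(connection) {
  // No hints available (older browser): default to full-quality assets.
  if (!connection) return "high";
  // Respect an explicit data-saver preference first.
  if (connection.saveData) return "low";
  // Otherwise fall back to the estimated connection speed.
  return connection.effectiveType === "4g" ? "high" : "low";
}

// In a browser this would be called as chooseImageVariant(navigator.connection).
console.log(chooseImageVariant({ effectiveType: "3g", saveData: false })); // "low"
```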
Similarly, cookies, which were created during the rise of online shopping to save your cart when you left a website, have evolved to store additional data that allow websites you visit frequently to remember you and your preferences. Now, cookies help websites serve ever more targeted advertisements to consumers, sometimes doing their job so well it’s as if the internet were listening to our thoughts or conversations. At this point, everything you do online can be captured, processed, packaged, and sold. You are data, and that data, for many companies, is money.
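The mechanism itself is simple: the browser stores small name–value pairs and sends them back with every request, and the site reads them to restore your cart or preferences. Here is a minimal sketch of the reading side; `parseCookies` is a hypothetical helper written for illustration:

```javascript
// Illustrative sketch only: turning a cookie string like the one a browser
// sends with every request back into stored preferences.
function parseCookies(cookieHeader) {
  const jar = {};
  for (const pair of cookieHeader.split("; ")) {
    if (!pair) continue;
    const eq = pair.indexOf("=");
    if (eq === -1) continue;
    // Cookie values are typically percent-encoded when set; decode on the way out.
    jar[pair.slice(0, eq)] = decodeURIComponent(pair.slice(eq + 1));
  }
  return jar;
}

// A site setting a preference cookie might write:
//   document.cookie = "theme=dark; max-age=31536000";
// and later read it back:
const prefs = parseCookies("theme=dark; cart_id=abc123");
console.log(prefs.theme); // "dark"
```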
Even publications like ours collect data on users to help us do our jobs better. We know where our readers live (broadly, not addresses), what pages you visit, how long you spend there, where you came from, and where you go next. We use that information to adjust headlines, zero in on the best imagery, and figure out what our readers respond to and how best to keep you reading. We use it to make your experience better so we can grow and expand our work.
Data collection has made the internet faster and more convenient in an increasingly connected world, but it has also opened us up to bigger and more pervasive violations of trust and privacy. Data breaches have become so common that, at last check, my personal email address had been exposed in at least three dozen of them over the last 10 years, about a third of those in just the last three.
It’s these challenges that legislators, security researchers, and law enforcement are trying to address.
On June 17th, Governor Lamont signed Public Act 22-15 into state law. The bill, which won’t go into effect for a year, provides a host of new protections for Connecticut consumers doing business online.
“Digital commerce is now a way of life for nearly all of us, and every time we stream a television show or movie online, every time we go for a walk while wearing a fitness tracking device, and every time we purchase something from our favorite website, our actions are being logged and frequently sold and shared with others,” said the governor at the time of signing.
The bill flew through the General Assembly during the 2022 legislative session, gaining broad bipartisan support in both the House and Senate. Only five legislators in the House of Representatives voted against the bill. Of them, only Rep. Craig Fishbein, R-Wallingford, spoke during the debate on the House floor. His opposition stemmed largely from a section of the bill that gives control over setting up a task force to the chairman of the General Law Committee, rather than to a bipartisan group.
Some groups, including the Digital Advertising Alliance, opposed the bill citing concerns that state-level laws governing data privacy only serve to make the marketplace more confusing. Instead, they said in their public testimony that they would support legislation at the federal level, or, barring that, they called for states to harmonize their laws with each other rather than enact differing legislation across state lines.
Getting a final bill that would pass through state government, though, was a long time coming. Sen. Maroney, one of the primary co-sponsors of the bill, worked on it for three years as part of the General Law Committee. He says that the process has made him more aware of how personal information is being tracked online.
“It’s definitely a growing concern,” says Sen. Maroney. “And as we have more of our devices become connected to the internet that are generating data that could be tracked… It just becomes more and more of a concern.”
Maroney says he also understands it from a business perspective. He runs a small online marketing business when not serving in the State Senate and says the amount of information available to companies is staggering.
“As a marketer, the tools that are available to you are fantastic,” he explains. “But sometimes, as a human, it’s frightening exactly how much information you can get about people.”
That’s where the new law comes in. When it finally goes into effect in July of 2023, it will require any business collecting personal data from at least 100,000 Connecticut residents, or at least 25,000 if the business makes a quarter of its income from the sale of that data, to provide certain notices. Businesses will be required to inform consumers when they collect data, tell them what data they collect, and inform them when they plan to sell that data to third parties. They must also provide a way for consumers to view that data, correct it if there are errors, and delete it if they do not want it stored. Finally, there must be a way for consumers to opt out of the sale of their data, and businesses cannot punish consumers for choosing to do so.
Businesses will have 18 months to come into compliance with the law once it goes into effect. During that curing period, they’ll have an opportunity to correct any issues that come up without facing prosecution from the government. Once that year and a half is up, though, they’ll be answering to the state Attorney General’s Office, which has sole enforcement power for the law.
Connecticut’s AG’s office is already a step ahead of most when it comes to data privacy. It has had a team in place for years pursuing companies whose lax security allegedly contributed to data breaches. Now, the office is making big changes to get ready for a brand-new kind of privacy protection.
First, that means figuring out what resources they’ll need. The bill allows for the hiring of additional lawyers and other staff members to help prosecute cases.
“Throughout the fall, we’ll take a look at the law, make any recommendations, including whether the protections in the law should be expanded,” explains Michele Lucan, who heads up the privacy section at the Attorney General’s Office.
The AG’s office will also be tasked with educating both businesses and consumers.
“We want companies to be able to comply with the law. We want to make sure we’re helping in that effort,” she explains. “But then on the flip side of things, we also really want to make sure Connecticut residents understand what rights they have under the law, how to effectuate them, so we are gearing up to do that.”
Despite everything the new law will change, there are still major gaps in our data privacy armor.
For one, while it forces companies to increase transparency, it won’t stop them from collecting or selling data altogether. Every protection that will go into place will still put a majority of that responsibility onto the consumers themselves.
It will be the responsibility of the consumer to opt out every time they decide they don’t want a business selling their data to unknown third parties. A consumer-first model might work the other way around, forcing companies to ask customers to opt-in to the collection or sale of their personal data, rather than out.
Experts say they fear apathy on the part of the consumer and the difficulty in separating ourselves from a connected internet will make laws like this almost moot.
“Some users, they have kind of [resigned],” explains Sebastian Zimmeck, a professor and researcher at Wesleyan University. “They say, ‘it’s nothing I can do … I cannot prevent anybody from collecting this data on me, and so I might as well go ahead and not care anymore.’”
“All this information is processed and it’s ultimately to try to make your life a little easier,” says Tom McDonald, CEO of NSI, a cyber security firm in Naugatuck. “And then trying to disconnect … It’s very difficult. It’s very hard to get away from it.”
The new law also doesn’t account for every type of data collection, and it allows for exceptions in specific cases. For example, small businesses that collect data solely for the purpose of completing transactions won’t be affected. This exempts businesses like gas stations and convenience stores, which capture credit card information for purchases and see a high volume of local customers.
It also doesn’t apply to government agencies, a fact that came up as a point of contention for a select few representatives during the debate over the bill.
“How would we possibly draft a bill that says ‘well private entities need to be restricted in how they handle individuals’ data but the government does not’?” asked Rep. Doug Dubitsky, R-Chaplin, during debate in the Judiciary Committee. He called it a “glaring hole” in the bill.
Then there are the cookie-consent pop-ups that already greet users on nearly every website. Those pop-ups are their own problem. How many times do you expand them and make individual selections? How many times do you just click “accept all” without a second thought?
Part of the problem with data privacy as it currently stands is that the internet wasn’t designed for the way we use it today. As the internet and our experience with it continue to evolve, our protections are still a step behind.
That’s where Professor Zimmeck’s work comes in. Along with his students and colleagues, Zimmeck spends his days researching data privacy and working on tools to make privacy more accessible and easier to use for everyday consumers. He says the future of data privacy will be built into the experience itself.
“Oftentimes, what we find is that there is a big interest in protecting one’s data, but there’s also a lack of usable tools to do that in a really efficient and effective way,” he says.
Zimmeck is part of a project called Global Privacy Control, a specification supported by a collection of browsers and extensions that allows you to set your preferences once and extend them to any website that accepts the GPC signal.
“The idea of Global Privacy Control is that you have this, let’s say switch… you can imagine like a light switch, and it essentially says privacy on or off,” explains Zimmeck. “So, it’s very simple. It’s a binary signal, and it can be for all sites, for all websites that you visit, or can be for individual sites.”
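Under the hood, the proposal really is that simple: a browser with the setting turned on attaches a `Sec-GPC: 1` header to its requests and exposes a `navigator.globalPrivacyControl` flag to page scripts. The server-side helper below is a hypothetical sketch of how a site might honor the signal; `respectsOptOut` is not part of any real framework:

```javascript
// Illustrative sketch only, based on the public GPC proposal: a browser with
// the signal enabled sends a "Sec-GPC: 1" request header. respectsOptOut is a
// hypothetical server-side helper showing how a site might read it.
function respectsOptOut(headers) {
  // Per the proposal, the header carries the literal value "1" when enabled.
  return headers["sec-gpc"] === "1";
}

// A site honoring the signal might then skip loading ad trackers:
if (respectsOptOut({ "sec-gpc": "1" })) {
  console.log("GPC detected: treating this visitor as opted out of sale");
}
```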
Zimmeck says, ultimately, users should be able to support sites or companies they trust by allowing them to track or sell their data. The GPC signal could also be interpreted differently in different jurisdictions as more states enact piecemeal privacy laws.
Perfect data privacy probably doesn’t exist. At least, not if we also want a convenient and interconnected world. But experts and lawmakers do seem to believe that a better, more secure internet is possible if everyone works together and keeps privacy in mind.
“I’m actually quite optimistic,” says Zimmeck. “I mean, it’s hard work and it will take time, but we’re actually getting there. The train is moving and it’s moving in the right direction.”