Transcript

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: Good morning, everyone. This is a joint hearing between the House Commerce and Economic Development Committee and the Senate Economic Development, Housing and General Affairs Committee. We're here this morning to take testimony, more to educate ourselves on what's happening nationally on the privacy level. We're not here to discuss any bills, not to have a discussion of pros or cons. We're just here to listen. So I appreciate everyone that's here today to testify. Senator Clarkson, I don't know if you want to add anything more.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: No. I would just add that most of the data bills at the moment are in the House. And I know this is maybe a little premature before crossover for us to be joining, but seldom are this many experts in the field gathered in one place, and so we really took advantage. This is a consumer protection issue that we care enormously about, all of us, and so this was an opportunity we couldn't turn down. We apologize for only being able to join for the first hour and a half, but we only have three hours of work a day, so we have to turn to our other bills this afternoon. Thank you.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: Appreciate you joining us this morning.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: Oh, we're delighted. Thanks.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: So I think our first presenter will be Neil Richards. Good morning, Neil.

[Unidentified participant (brief acknowledgments/aside comments)]: Good morning.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: So if you would state your name, affiliation, and continue on with your testimony.

[Prof. Neil Richards (Koch Distinguished Professor of Law, Washington University in St. Louis)]: Absolutely. It'll be my great pleasure. So good morning, members of the Vermont General Assembly. My name is Neil Richards, and I'm delighted to have the opportunity to talk to you this morning about privacy law and privacy reform, which are topics that I've studied extensively for the past twenty five years. I'm currently the Koch Distinguished Professor of Law at Washington University in St. Louis, where I've worked for almost twenty five years. There I co-direct the Cordell Institute for Policy in Medicine and Law, which is a center that's focused on the issues of health and human information policy that I understand are on the legislative agenda in the great state of Vermont this session. I'm the author of dozens of articles and two books on privacy, including Why Privacy Matters, published in 2022 by Oxford University Press. And I've also testified or otherwise offered my expertise on privacy in many high profile cases and hearings, including serving this summer as the court-appointed consumer privacy ombudsman in the closely watched 23andMe genetic data bankruptcy. My role there, appointed by the bankruptcy court and the Justice Department, was to articulate the public interest in data protection. I am not a lobbyist. I am not being paid to be here today, and the views that I express are my own; they're not necessarily those of my university or of the center that I direct. What I'm here to do is to share what I've learned studying privacy for the last thirty years, since the early days of the internet, and from deep study of the interdisciplinary academic literature on privacy, which is a field that I helped to create. My views are also shaped by the fact that my 19 year old son Declan is a Vermonter. He's a sophomore at Middlebury, and his plans are to remain in Vermont and teach English in a high school after graduation. I love this state, and I wanna help you make good decisions to protect the privacy of your constituents in ways that also build the consumer trust that Vermont businesses need to thrive in the long term. These are hard and important questions, and I'm grateful this body is interested in exploring them, in doing something meaningful about them, and in doing the right thing, which is not necessarily the easy thing, which would be to do nothing or to follow the herd of other states. So I wanna try to help this morning by offering some high level observations based upon my study and research about privacy, about privacy law, and about privacy reform. Now, I understand that my colleagues who are following me this morning, and you have a world class group of thoughtful scholars and other experts speaking to you today, will be going into greater detail, but I wanna offer three points to kick us off. First, I'd like to explain why, in my opinion, privacy matters. Second, I'll explain why the research suggests that we need better privacy laws. And third, I'll offer what I think are the two essential elements of any privacy law that is gonna protect consumers in a meaningful way. And those elements are, first, substantive consumer protections, and second, meaningful remedies. So first, why does privacy matter? As I explain at length in my book and some of my other work, privacy matters because information about people, you and me, confers power over those people.
We need privacy over our personal information to develop our identities, to figure out who we are, to participate as free citizens in a democratic society, and, importantly to this hearing, to be able to participate as consumers in a digital economy free from manipulation, exploitation, and exposure. Now, people will tell you that human data is necessary for our economy, for innovation, whatever that means, or to train AI models. And that might be true to some degree, but the people of Vermont do not deserve to have their lives be open books to marketers and data scientists, AI developers, police, masked federal immigration agents, or other entities who seek information about them. A right to privacy is a fundamental right necessary for any decent democratic society, and that right should be extended to corporate uses of human data as well as to government uses. Importantly, the burden of protecting privacy should not be placed solely on busy ordinary people, who would have to guard against all of the entities that are seeking their data for whatever purposes they want. Moreover, as I have explored in a series of articles with Professor Hartzog, who I believe you're gonna hear from shortly, good privacy rules are necessary to build the trust that businesses need to thrive. Just as with doctors and lawyers, and I say this as a lawyer myself, if we have real trust in the entities we share our data with, the entities we expose ourselves to, we will share more data and everyone becomes better off. But importantly, real privacy protections are necessary if this is going to happen. So that's why privacy matters. The second thing I wanna talk about this morning is why we need better laws to protect our privacy. I was in law school when the internet came into ordinary people's lives, so I've been with the digital revolution from the beginning. During the dot-com era, the basic regulatory idea was that people might have varying privacy preferences. So companies could write down their privacy practices in a privacy policy, people could read that policy, and they could make informed privacy choices as a result. This has been the default approach to privacy regulation in the US, and importantly, both the overwhelming weight of scholarly evidence and common sense reveal that this default approach has been a colossal failure. People don't read privacy policies because they simply cannot. There are too many of them from too many companies that we encounter, hundreds and hundreds of pages just to read them. Ask yourself: have you read the privacy policy for your ISP, your email provider, your search engine, your cloud provider, your AI assistant, your car's operating system, your bank, your grocery store loyalty program, your child's learning management system, or any of the other dozens and dozens of businesses that hold your data? Do you know who they share it with and when? There are more of these documents than any consumer can read, much less understand. And if you do read one of these policies, as I do with my students every fall at WashU where I teach information privacy law, you'll find it to be simultaneously dense, vague, and unhelpful. It takes an hour with a class of upper level graduate students just to begin to make sense of Amazon's policy. People struggle to remember all of their passwords. How can we expect them to remember dozens of sets of terms of service, and what their answers are to all of those constantly changing settings?
And the answer, of course, is that we cannot. This form of regulation is called, in the privacy community, notice and choice, and notice and choice has been a catastrophic failure, offering the fiction of protection over a practical reality that gives companies the ability to run roughshod over the privacy of every Vermonter. Now, other states have passed so-called comprehensive privacy laws, but most of the ones that have been passed to date have made the problem worse by merely rubber stamping industry practice and rubber stamping the failed regime of notice and choice. The obligation to protect privacy should be placed on the powerful companies that collect, process, and often sell vast amounts of personal data, rather than on the harried individual consumers who are just trying to live their lives and get on with their days. Vermont, I think, can do better. In fact, I think Vermont can be a leader in this area. What we need is a privacy law that sets reasonable rules of the road for businesses using our data. And we don't have that yet in the United States, which is shameful. We are the only advanced democracy in the world that doesn't have a comprehensive privacy law. So let me be clear: if the past twenty five years of one privacy and security scandal after another have taught us anything, it's that companies cannot be trusted to safeguard our data without some reasonable rules of the road. Just as we don't trust motorists to drive cars safely without speed limits and traffic signs, we don't trust lawyers with unregulated access to our secrets, we don't trust doctors or direct to consumer genetic testing companies with unregulated access to our medical data, and we don't trust banks with unregulated access to our money. In addition, and importantly, good regulation and innovation in the public interest are not inconsistent. While we can certainly debate how much regulation and what kind is appropriate, having no new regulations at a time of rapid change like the one we are in right now would be a disaster. And if innovation is as magical as industry says it is, it could still do great things while respecting the policy choices of the people's elected representatives. In this way, the necessity required by reasonable regulation has been, and should continue to be, the mother of invention, rather than the unregulated pursuit of profit for its own sake. Good regulations take dangerous or disloyal business practices off the table, and they give companies incentives to compete by serving their customers better rather than by competing on who can make the most money exploiting them and their data. And this brings me to my third and final point this morning: what are the essential elements of a good privacy bill? I wanna be clear, and I appreciate the reminder earlier, that I'm not here to endorse any one bill over another. But I think it should be clear by now that my view, as well as the view of the large community of privacy scholars, is that any bill that just doubles down on notice and choice is not remotely enough. Any bill that relies on people having to read dozens and dozens of privacy policies, and then, in order to protect that privacy, makes them take affirmative steps with dozens and dozens of companies, isn't remotely enough to empower consumers, much less to protect them and their families. So I think there are two elements that a good privacy bill should have.
They are substantive restrictions on dangerous data practices, and meaningful enforcement by both the government and private citizens. So first, a good privacy bill needs substantive restrictions on what companies can do with personal data, rather than acting merely as a recipe for extracting permission from confused consumers. One essential substantive restriction is what we privacy lawyers call data minimization. Data minimization means you only collect the information you need to provide a service to your customers, and you delete unnecessary data when you're done with it. Data that isn't needed shouldn't be collected, and data that isn't collected cannot be misused, it cannot be taken by law enforcement, and it cannot be leaked in an inevitable data breach. Another, different kind of substantive restriction is a duty of data loyalty, which requires that while human data can be used to provide a service, it cannot be used to betray consumers, and it cannot be used to act contrary to their best interests. These are the kinds of obligations that our law has placed for centuries on doctors, lawyers, corporate officers, agents, and trustees. And they would work wonders if we wanted to be serious as a society about protecting people from the powerful companies that process our data and increasingly come to shape our very lives. Substantive restrictions like data minimization and data loyalty are essential because they take some of the most dangerous data practices off the table, and they require companies to use consumer data for good purposes, serving their human customers rather than exploiting them or betraying them. Besides substantive restrictions, the second essential feature of a good privacy law is meaningful enforcement, and this also has two elements. The first of these is enforcement by the government, such as the Vermont Attorney General's Office, which I know has a long and proud history of consumer protection, and which should be allowed to continue that work with appropriate resources. But the second of these is a private right of action that lets people, in this case Vermonters, whose rights have been violated sue to enforce those rights. Now, tech companies hate private rights of action, and they devote literally millions of dollars to pay lobbyists who will tell you just how awful they are. But if you oppose private rights of action, what you're essentially saying is that when companies break the law and harm people, there should be no recourse other than to complain to the government. What this really means is that tech companies want to be able to break the law and have no consequences for doing so. It's the opposite of regulation. In fact, it's an invitation to misbehave. If you wanna make a law ineffective, just make it unenforceable. That's when it will be ignored. And that's essentially the gospel that the tech lobbyists are preaching. It's a recipe for lawlessness, the same kind of under-regulated wild west of data exploitation that has gotten us into this mess in the first place and diminished the trust that people have in the companies that use their data. By contrast, a law that creates protections, in this case for privacy, needs to have some mechanism that makes sure those protections are respected. This is what private rights of action do, and it's what they've done in our law for centuries.
Indeed, a foundation of American law itself, Marbury versus Madison, famously restated the ancient legal principle that for every right, there should be a remedy. That's all private rights of action are, and they're only controversial in privacy reform because they are the essential ingredient that makes law work. A privacy law protecting the right to privacy without a private right of action is likely to be a toothless law, as state attorneys general's offices are already small and overworked, protecting consumers across the whole range of consumer protection issues. So let me conclude by thanking you for your time and offering one final observation. When it comes to privacy, I think the worst thing that this body could do would not be to do nothing. Doing nothing would be the second worst thing you could do. The first worst thing you could do would be to pass a weak privacy bill, one that lacks substantive protections or that lacks meaningful enforcement, like a private right of action. A law of this kind would create the illusion that privacy had been protected, but it would merely rubber stamp the kinds of exploitative and unfair trades in personal data that the privacy law had purported to fix. It would create the illusion that privacy reform had happened and that the legislature could move on, even though the problem hadn't been fixed. And this is particularly a problem in the tech sector, because privacy is today's legislative problem, but tomorrow's, and maybe also today's, is going to be artificial intelligence. A good privacy law will help to address the problems that AI presents to our communities and to our families, but a bad privacy law will make things worse. A bad privacy law will mean that the problem doesn't get fixed. It'll be bad for the legislature because it failed to fix the problem. It'll be bad for your constituents because they won't be properly protected. And it'll be bad for business, because exposed consumers are afraid to give those businesses the trust, and then the data, they need to thrive. That's why privacy matters, and it's why a privacy bill with substantive protections and meaningful enforcement is essential in Vermont and throughout the country. Thank you for your time.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: Thank you, Neil. Jordan, okay, I believe you're up next.

[Jordan Francis (Senior Policy Counsel, Future of Privacy Forum)]: Good morning. I have some slides prepared today, and I will get those shared. Are you seeing my slides?

[Unidentified participant (brief acknowledgments/aside comments)]: Yeah. Yep.

[Jordan Francis (Senior Policy Counsel, Future of Privacy Forum)]: Terrific. All right. Good morning, everyone. Thank you to the members of the House Committee on Commerce and Economic Development and the Senate Committee on Economic Development, Housing and General Affairs for the invitation to be here today. For a quick word about myself: my name is Jordan Francis, and I'm a senior policy counsel on the US legislation team at the Future of Privacy Forum. FPF is a global nonprofit organization that advances principled and pragmatic data protection, AI, and digital governance practices. We convene leaders across industry, academia, and the public sector to provide expert analysis, benchmarking, and best practices that support responsible innovation and regulatory compliance. In my role, I support expert independent analysis of federal, state, and local consumer privacy legislation and regulation. My work primarily focuses on tracking consumer privacy legislation, specifically comprehensive consumer privacy bills, and doing comparative analysis to help stakeholders like yourselves make sense of the legislative landscape and trends. Now, ten minutes is far too little time for me to cover a complex topic like data minimization, but I hope that my testimony will provide a balanced and nuanced overview of data minimization requirements as they have historically been understood and integrated into privacy and data protection frameworks, as well as some emerging legislative trends. For more in-depth coverage of this issue, please refer to my data minimization report published last June on the FPF website. With all that said, let's get into what data minimization is. Part of what makes it difficult to talk about data minimization is that when this topic comes up in debates around privacy legislation these days, people are often talking not just about data minimization but about a broader set of interrelated concepts. Modern information privacy law and data protection is primarily based on a set of principles developed in the seventies and early eighties, known as the fair information practice principles, or FIPPs. These are a set of core principles and baseline rules for the collection and use of personal information that have proven extremely influential over the decades. Among those principles are collection limitation, purpose specification, and purpose limitation. These ideas are fairly straightforward. To generalize them: you should not collect or retain personal information beyond what is in some sense necessary for your purpose, you have to notify someone of the purpose for using their personal information when you collect that information, and you have to get that person's consent to use their information for reasons that are incompatible with the purposes you disclosed to them. That is data minimization in a nutshell. Don't collect more data than necessary to accomplish your identified purpose; it connects back to these other principles. This is beneficial to consumers because it protects them from excessive, purposeless data collection, and it also helps businesses by reducing their risk exposure. Now, different articulations of these principles have appeared in a variety of data protection frameworks across the globe in the last fifty years.
Just to quickly highlight a few international examples: the OECD's privacy guidelines, which were developed in 1980 to support cross-border data flows, include these principles and proved very influential in incorporating them into privacy frameworks around the globe. More recently, the Asia-Pacific Economic Cooperation Privacy Framework from 2015 also includes articulations of these principles. I don't wanna focus on these; they're just examples of how these principles have spread around the world into numerous privacy frameworks. The European General Data Protection Regulation also includes data minimization as one of several core principles under the law, and the GDPR provides that personal data shall be adequate, relevant, and limited to what is necessary in relation to the purposes for which they are processed. I wanna emphasize that all of this may sound very straightforward, but a lot of significant compliance work has to happen behind the scenes to make data minimization work. Data minimization is an ongoing process that requires mature data governance, including audits, mapping, and oversight. In practice, organizations have to maintain current inventories of the personal data that they collect, use, retain, and share, and continuously assess whether that data remains adequate, relevant, and reasonably necessary for the specific disclosed processing purposes. This process also requires data mapping to identify the systems that process personal data, document internal and external data flows, and support accountability through documentation of processing purposes, retention schedules, data classification by sensitivity, and impact assessments that identify and weigh risks. Data minimization also extends to your vendor relationships, where controllers have to ensure through contracts and oversight that vendors process data only as instructed and do not retain or repurpose data beyond what is necessary. Technical safeguards also play a role in all of this.
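
To make those retention-schedule mechanics concrete, here is a minimal sketch, in Python, of the kind of sweep a data governance program might run. The purposes, field names, and retention windows are illustrative assumptions, not values drawn from any of the frameworks discussed.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Record:
    field: str           # e.g. "email", "gps_trace"
    purpose: str         # the disclosed purpose it was collected for
    collected: datetime

# Illustrative retention schedule: disclosed purpose -> maximum retention.
RETENTION = {
    "order_fulfillment": timedelta(days=90),
    "fraud_prevention": timedelta(days=365),
}

def sweep(records: list[Record], now: datetime) -> list[Record]:
    """Keep only records still necessary for a disclosed purpose.

    Records whose purpose is not on the schedule, or whose retention
    window has lapsed, are dropped. Data that is deleted cannot be
    breached, subpoenaed, or repurposed.
    """
    kept = []
    for r in records:
        limit = RETENTION.get(r.purpose)
        if limit is not None and now - r.collected <= limit:
            kept.append(r)
    return kept
```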

[Jordan Francis (Senior Policy Counsel, Future of Privacy Forum)]: Some people might tell you that privacy just means collecting less data, but responsible data use benefits many people, and data minimization is fundamentally about whether the collection and retention of data extend beyond what is necessary to accomplish a legitimate purpose. And there are a number of technical safeguards that can factor into that calculus, including things like differential privacy, use of synthetic data, homomorphic encryption, or different anonymization, pseudonymization, and de-identification practices. In short, data minimization is a continuous, behind-the-scenes accountability process, and one that must remain responsive to evolving technologies, business practices, and shifting risks. Looking now specifically at the states, because this is the focus of my work, data minimization has become a very hot topic in privacy legislative debates in recent years. I think that if you're someone who's a privacy advocate, data minimization is very much a natural rallying cry. Data minimization means less data collection; that's very intuitive. And if you're on the opposite end of the spectrum and you're concerned about potentially disrupting innovation and economic activity, it might sound like a threat to long-standing business practices. I'm here to provide a nuanced analysis that contrasts the different proposals floating around, so that hopefully you are better equipped to understand data minimization's role in privacy if and when people come to you in support of one framework or another. This is an interesting and fast-developing area of state privacy law, so I'll be covering how the states are approaching this issue. Now, since 2018, 19 states have enacted comprehensive consumer privacy laws, and this has led to what I would argue are three distinct data minimization standards across the states. First up, we have what most states have done. And I'll note quickly that several states, including, I believe, Iowa, Rhode Island, and Utah, do not have general data minimization requirements in their comprehensive privacy laws, but 14 of the states have enacted some version of the rule that I have on the slide now: a controller must limit its collection of personal data to what is adequate, relevant, and reasonably necessary in relation to the purposes for which that data is processed, as disclosed to the consumer. And this language is actually very similar to what we saw earlier in the GDPR. Now, the operative requirement here is a disclosed purpose. What you can collect is based on what you're telling the consumer. I've taken to calling this standard procedural data minimization because it fundamentally is about the steps that a controller has taken, more so than about the data or the purpose itself. And for data controllers, the standard provides a lot of certainty and control. If data processing is necessary for your business, you can collect and process the data as long as you're adequately disclosing it. The standard, I will note, has come under some criticism as being less privacy protective, because there isn't a substantive backstop preventing a controller from over-disclosing and collecting data that a consumer may not necessarily expect in the context of their interaction. There are some natural checks and balances to that, but I won't dig deep into it in the interest of time. Now, that is what 14 states have done. California has taken a slightly different approach.
The statutory language under the California Consumer Privacy Act is actually fairly similar to what I just described, but I'd like to move right into the CCPA regulations that were published in 2023. Under the regulations, a business's collection and use of personal information has to be reasonably necessary and proportionate to achieve the purposes for which the personal information was collected or processed. That sounds like data minimization as it's been understood and applied for decades: be consistent with your stated purpose at collection. But the regulations further provide that the purposes for which personal information was collected or processed must be consistent with consumers' reasonable expectations. And this is a very interesting evolution to me. How do you figure out what a consumer would expect? Well, the regulations provide a handful of factors, including the relationship between the consumer and the business; the type, nature, and amount of the personal data in question; the sources of the personal information; the disclosures that are being made to the consumer; and the degree to which a consumer would be aware of the involvement of other parties in the data processing, including service providers. This is distinct from the procedural data minimization standard that I shared just a few slides ago, because the disclosures that a business makes to a consumer are relevant but not dispositive. There's also an inquiry into the nature of the data itself, the relationship between the parties, and what a consumer would perceive. And I will note that this was interpreted for the first time in an enforcement action last year, in a case involving a health website sharing data about the articles consumers viewed for advertising purposes. Moving on to our final data minimization framework in the states: Maryland added yet another standard when it enacted the Maryland Online Data Privacy Act in 2024. That law shifts data minimization requirements away from focusing on the disclosures being made to the consumer and instead looks at the nature of the product or service being offered. A controller must limit their collection of personal data to what is reasonably necessary and proportionate to provide or maintain a specific product or service requested by the consumer to whom the data pertains, and for sensitive data it's even stricter: a controller has to limit not just their collection, but also their processing and sharing of sensitive data, to what's strictly necessary to provide or maintain a specific product or service requested by the consumer. Now, depending on how these requirements are interpreted, this could be a significant departure from what a majority of states have done so far. Like with California's reasonable expectations standard, it appears to be an intentional shift towards a privacy framework that tries to align data collection more with consumer expectations than with the disclosures and notices being made. I've taken to calling this Maryland-style rule substantive data minimization, and I wanna be clear that that's not a value judgment. I'm not here to say that substantive data minimization is better than procedural data minimization. I think it's a helpful description for understanding how the rule functions, because the key consideration here is the purpose for which the data will be used in the context of that consumer relationship.
And I do think that this is sort of different from how data minimization has been understood historically, because this requirement is actually telling you the purposes for which you can collect personal data. Now, Maryland's law takes a novel approach to data minimization, and it introduces a number of important questions that I explore in depth in the report that I highlighted at the start of my slides. When is personal data reasonably necessary to provide a product or service? What about strictly necessary? What's the difference between those standards? How much deference does a controller get to define its own product or service and assert what it believes to be necessary? Does necessary mean essential? Does it account for business need, like profitability? What makes a product or service requested? Does clicking an I agree box count? If so, have you just reinvented consent from the ground up? How does this affect advertising? What does it mean for product development research? What does it mean for AI model development? I think these are all interesting questions, and we have to wait to see how this is going to play out. I will flag that the Maryland Attorney General's Office recently released FAQs on the law, and it included a question about how to interpret these requirements. Their answer is that this will be based on the expectations of the reasonable consumer about how that data will be collected and used. While that's maybe not a significant clarification on a number of the questions that I'm exploring with this standard, it does suggest that this approach might be converging with California's, such that you can at least try to approach them similarly in a compliance program. Finally, I won't linger on this point, but I do think it's important to say that Maryland's isn't the only law to impose this kind of data minimization standard. There are similar requirements in sectoral laws, including Washington State's My Health My Data Act and the New York Child Data Protection Act. Neither of these laws, however, has been enforced yet, so we're still awaiting more detailed guidance on how to interpret necessity in practice. And so, to conclude: data minimization is a decades-old concept that is common to privacy and data protection frameworks, but it may mean different things to different people. Three distinct models have emerged in the states, and they all offer different things; different positives and different potential criticisms apply to each. Hopefully, with my presentation today in mind, when stakeholders come to you to discuss data minimization, you can ask them for specifics on which model they prefer, why, and how that requirement will fit with the rest of the statutory framework of whatever privacy legislation you consider. With that, I'm happy to pass the mic to Professor Hartzog.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: Jordan, thank you very much. Professor Hartzog?

[Prof. Woodrow Hartzog (Andrew R. Randall Professor of Law, Boston University)]: Hello, can everyone hear me okay?

[Unidentified participant (brief acknowledgments/aside comments)]: Yeah.

[Prof. Woodrow Hartzog (Andrew R. Randall Professor of Law, Boston University)]: Excellent, thank you so much. Vermont legislators, thank you for the opportunity to speak with you today. My name is Woodrow Hartzog, and I'm the Andrew R. Randall Professor of Law at Boston University School of Law. My job doesn't depend upon me taking any sort of position, and everything I say here today is the result of my independent research on privacy issues over the past twenty years. I'm not here to advocate for any bill, but I would like to talk to you today about the importance of substantive data minimization. I tell my information privacy law students, many of whom I hope are watching now, that if an evil wizard were to come and take away all the privacy rules in the world and I could keep only one privacy rule, I would keep substantive, not procedural, data minimization. That's because substantive data minimization, if done correctly, is one of our very few rules that actually works to keep companies from abusing our personal data. Other rules, like requiring transparency or giving people control over their data, are actually preferred by Silicon Valley, because they don't actually hold those companies accountable or interfere with the business models that rely upon the exploitation of our data. In my testimony today, I'd like to highlight three virtues of data minimization. First, data minimization is probably our most effective defense against hackers and careless data breaches. Every day, a story comes out that some company has been hacked and our personal data has been exposed on the dark web. AI is only going to make this worse, as companies are marketing chatbots as girlfriends, therapists, and personal shoppers, creating an ever larger stream of data. Big tech companies are going to want access to every part of our lives and our minds. Without data minimization rules, all of that data is just going to sit there, creating the most tempting honeypot for hackers. And that's a huge problem, because data breaches will happen. A former director of the FBI once famously said there are only two types of companies: those that have been hacked and those that will be hacked. But a great thing about data minimization is that it reduces the amount of data that is collected and stored, which is great, because data that does not exist cannot be hacked. Data minimization embraces this very simple truth. It is a foundational pillar of every data security framework on Earth. A second virtue of data minimization is that it saves us from the impossible burden of protecting our own privacy, as Professor Richards said, and gives us a way out of the I agree button hell that no doubt you encounter every single time you visit a website or download an app. And don't even get me started on the cookie banners. Silicon Valley loves to make protecting privacy our problem. Of course, they dress it up a little by saying that they're putting us in control of our data, but that's just another way of dodging responsibility for their own data practices. Putting customers in control of their privacy is a fancy way of saying that if a company violates your privacy, then it's your own fault for letting them do it, because you were the one that agreed to it. Now, I don't have to tell you that no one can possibly ever read all of the terms in privacy policies.
One famous study estimated that if you were to read every privacy policy on every website that you encounter throughout the year, it would take you seventy six workdays. And this was in 2012. We're much more connected now than we have ever been. And even if you were to somehow take the time to read all the terms in the privacy notices, which you shouldn't, there is simply no conceivable way that we could understand all the innumerable risks inherent in our jaw-droppingly complex data ecosystem. There's simply too much information from too many sources using data for too many purposes that change too quickly for any one person to ever keep it all straight. People should be protected no matter what they choose about their data. Data minimization puts the burden of privacy protection where it belongs, and where it can most easily be borne: on the company collecting and using our personal information, to use only what is necessary for a service, make sure that use is consistent with our expectations, and forget the rest. Now, a final and maybe the most foundational virtue of data minimization is that it cuts straight to the heart of what is at stake in this debate: the freedom to live our lives in a world where the deck is massively stacked against us by large tech companies. As you've likely noticed, Silicon Valley is working to turn everything that we do and everything that we think into data that they can then collect and exploit for profit. And I don't have to be the one to tell you this; just look at the trends. First, big tech came for our internet browsing habits, and then they came for everything that we do in our day to day lives: how long we linger in the grocery store, where we're going when we walk down the street, and a complete history of our friends, lovers, and coworkers. Next, they're coming for what's inside our heads. And if you don't believe me, I would love to show you the patents that have already been filed to monitor our brain waves, and that are already in the pipeline to make it into our earbuds and our eyeglasses. Silicon Valley isn't even trying to hide the fact that their end goal is to read every thought and quantify every emotion that you'll ever have again for the rest of your life. The only thing that will slow this ominous march towards total surveillance is strong rules that limit what data can be collected and what that data can be used for. And here's the thing: big tech is not going to stop unless lawmakers make them. After this hearing today, I imagine that Silicon Valley lobbyists and the organizations that they are bankrolling are gonna come to you, and they're gonna repeat the myth that data minimization rules will harm innovation. This is Silicon Valley's favorite move, and it has always been a false choice between human progress and corporate accountability. You're also likely going to hear the argument that data minimization will harm the small businesses of Vermont. That is also a myth. In my experience, a lot of the noise coming from small businesses is really big tech in disguise. I'm hoping later today that you will hear more about the fact that surveillance-based targeted advertising doesn't really work as promised, and on the whole, it isn't good for either consumers or small businesses. The price of being an ordinary member of society should not be having all of your online activity tracked and assembled and used against you.
More importantly, even if it is true that small businesses can make marginal gains by using behavioral targeting, which I think is debatable, these meager efficiencies stand to be wiped out by the massive advantage that big tech companies gain with behavioral advertising, which they use to target and thus siphon off those same consumers elsewhere online. It all comes down to this: the businesses of Vermont will never have more data or a greater ability to target users than large tech companies. And if it is really true that it's data that determines the fight for our attention and for our money, then Silicon Valley is going to win every time. In other words, when it comes to behavioral advertising, Silicon Valley gets the meal, and the small businesses of Vermont get the table scraps. The citizens of Vermont and its local businesses deserve more than table scraps. If lawmakers were to pass a privacy law without strong substantive data minimization rules, what they would be doing is letting big tech keep the tool that it uses to undercut the local businesses of Vermont at every turn. The false narrative that meaningful privacy rules would hurt innovation and economic growth is what led lawmakers to ignore privacy for three decades. That abdication of responsibility allowed a devastating, data-hungry business model to thrive, giving us a version of the Internet that has ravaged small businesses in Vermont and everywhere else. And I have to be honest, I think it takes some nerve for Silicon Valley to come out and claim that they're fighting for small businesses, when what they are really doing is fighting to keep the tool that ensures that local businesses will never be able to compete with big platforms. My whole twenty-year journey researching privacy has consistently reinforced the fact that big tech companies will always collect and use as much data as they are allowed. And it's not just the local businesses that have been put at a disadvantage here. The lack of strong data minimization rules has resulted in a data free-for-all that endangers the citizens of Vermont, deprives them of their autonomy, fractures their attention, and interferes with their ability to make meaningful life choices. So in conclusion, data minimization is probably the single most important data privacy rule, because it protects our data from hackers, it puts the burden of protecting data where it belongs, which is on the data collector, and it makes it harder for big tech to consistently undercut local Vermont businesses. In short, the people and businesses of Vermont will never be free in a digital world without strong, substantive data minimization. Thank you very much for the opportunity to speak with you today, and I look forward to your questions.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: Thank you, Professor. So next up is Bob Hedges. Oh, in person.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: In person.

[Unidentified participant (brief acknowledgments/aside comments)]: Oh, how exciting.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: I'm just excited.

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: Good morning. My name is Bob Hedges, and I live in Orford, New Hampshire. If you're familiar with Orford, we don't have a grocery store, nor do we have a gas station. So every time we need food or fuel, I come to Vermont. Actually, when I prepared my testimony for today, I was sitting at my kitchen table looking out the window across the Connecticut River at Vermont.

So I very much appreciate the opportunity to be here today to contribute to the discussion about data privacy. Before getting to my specific comments, just a little bit of my own background. Long ago and far away, I graduated from Dartmouth College with a major in economics and computer programming, and went to graduate school at MIT, where I had a master's concentration in finance. I spent my career in financial services, working both as an operating executive and as a management consultant around the planet. But most importantly, in terms of the context of today's conversation, for the last eight-plus years until just this past September, I worked at Visa. And for the last five years of that period, I served as Global Chief Data Officer for Visa, with global responsibility for how we collected, used, and managed data, and with global responsibility for all our AI initiatives. So that's the experience I hope to bring to the meeting today. Having retired from Visa, I'm presently serving as a research fellow at MIT's Initiative on the Digital Economy, where I teach data science to graduate students and work on a whole series of research projects regarding how to make the data ecosystem work better for consumers. Now, my experience as Chief Data Officer taught me a lot about the critical imperative of managing data responsibly. Visa is a global business built on all the parties in commerce trusting each other. Visa serves more than 4.9 billion consumers and more than 175 million merchants, and operates in more than 180 countries, each with distinct data privacy regulatory obligations. So my life has been about how to get this right: both as a corporate executive trying to make sure we did the right thing and stayed in compliance with the law, but also doing the right thing for consumers, so that they would have confidence in the system in which they engage using their cards and sharing their data. Now, with that backdrop, I want to address four topics today to try to make this whole conversation more concrete. First, I want to talk about how artificial intelligence has evolved rapidly over the last half decade, and how that changes the whole game regarding how we think about data. Regulations that were developed a decade ago are obsolete in the context of what's possible today. Secondly, I want to define and describe the different types of data that are collected and used in commercial applications. Everyone keeps talking about data; I wanna talk a little bit about the specific things that actually matter to consumers and to businesses. Thirdly, I wanna describe how the data is actually used to develop consumer profiles that are then used in commercial applications. So what are people doing with all this data that they so zealously want to collect? And finally, and this has been coming out a little bit already, I want to discuss the role of consumer consent: did consumers actually give permission for how their data is being used or not? So, to begin with, quickly reviewing the evolution of artificial intelligence, or AI. At the beginning of my career in the 1980s, data was often difficult to access and difficult to assemble. Computer power was actually quite limited. Processing time was long; you only ran jobs overnight, and it would take twenty four, forty eight, sometimes seventy two hours to complete an analysis. So you worked in the context of very slow and complex mainframe systems.
Over the decades, computers, telecommunications, real time connectivity, and database management tools have advanced significantly. Processing power today enables large scale computations almost instantaneously. Telecommunications allows instant delivery of the output, either to the business or directly to the consumer. These capabilities, speed, integration, and computational power, are what enable artificial intelligence. It's sort of doing the same things we always tried to do in the past, but faster and more powerfully. Now, with this much stronger technological and analytical capability, data has become the key fuel. What data you have, or can get access to, drives the scope of the issues and opportunities that you can examine, analyze, and model. So let me take you through six categories of the types of data that people working with data in the business space try to get their arms around. First, identifiable data: data that is unique to you, and it's used to establish who you actually are. So your social security number, your credit card number, your phone number, your email address. Personally identifiable data is protected under most regulatory approaches because it's how we know you are you. The second category is sensitive personal data: data elements whose misuse could significantly impact a person's civil liberties. Examples of sensitive personal data include data regarding an individual's racial or ethnic origin, a person's religious or philosophical beliefs, a person's sexual orientation, or biometric data, such as fingerprints. Sensitive personal data generally receives special treatment under most regulatory frameworks. Third, in terms of trying to use data, transactional data becomes really important: data that defines a specific financial transaction that you engage in, like making a purchase at a store. The relevant data is the dollar amount, the store name, the store location, the time of day. All that data is part of what's used. Fourth, geolocation data: data that establishes where you physically were at a specific time. It's collected from your mobile device, from your automobile, or from the most recent transaction that occurred with your credit card: data that gives the collector your GPS coordinates. Fifth, shopping data: data that captures the history of the effort you put into searching for a product or service to purchase. The web searches you conducted with a tool like Google, the websites or mobile apps you visited, where you picked up cookies or cross-site tracking, the geographic locations of the stores that you've been in: that's all data that comes through your mobile device and is available for analysis. And then finally, intention data: data that's developed to infer your intentions, an assessment of what you're going to purchase next. What you had in your online shopping cart, Google's review of your Gmail, searches that you've done, what you post and the interests you articulate on Facebook or TikTok: all of that is collected and used to infer your intentions. And of course there are many other categories of data, health data, wealth data, personal interest data, and we could go on. But the data that we produce keeps growing as our lives become more digital.
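
As a rough illustration of this taxonomy, a data governance program might tag each stored field with one of these categories and attach different handling rules to each. The categories below mirror Mr. Hedges' list; the field names and the handling rule at the end are assumptions made for the sketch, not anything a particular regulator or company prescribes.

```python
from enum import Enum, auto

class DataCategory(Enum):
    IDENTIFIABLE = auto()   # SSN, card number, email: establishes who you are
    SENSITIVE = auto()      # race, religion, orientation, biometrics
    TRANSACTIONAL = auto()  # amount, merchant, location, time of purchase
    GEOLOCATION = auto()    # GPS coordinates from phone, car, or card swipe
    SHOPPING = auto()       # searches, sites visited, cookies, stores entered
    INTENTION = auto()      # inferred from carts, emails, posts

# Illustrative mapping from stored fields to categories, so handling rules
# (encryption, consent level, retention) can key off the category.
FIELD_CATEGORY = {
    "ssn": DataCategory.IDENTIFIABLE,
    "fingerprint_hash": DataCategory.SENSITIVE,
    "purchase_amount": DataCategory.TRANSACTIONAL,
    "gps_trace": DataCategory.GEOLOCATION,
    "search_history": DataCategory.SHOPPING,
    "cart_contents": DataCategory.INTENTION,
}

# Hypothetical policy choice: categories that get the strictest treatment.
STRICT_CATEGORIES = {DataCategory.SENSITIVE, DataCategory.GEOLOCATION}
```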
This data is sometimes referred to as digital exhaust, and it can be collected, assembled, and curated for whatever analysis you want to do on modeling consumer behavior. Now, for commercial application purposes, consumer data is assembled and analyzed to create profiles, or personas, or signals. A profile or persona or signal is used by marketers and businesses to do two things: one, determine the likelihood of a consumer taking a certain action, for example, purchasing a specific product or service; and two, assess what tactics might best influence the consumer to take that action. So profiles are extremely important to businesses in marketing and digital advertising. Profiling, whether or not it influences the consumer's ultimate shopping and buying decision, has indeed become a big business in its own right. Now, to make this concrete, we can review a series of examples applying consumer data to profiling and signaling. The purpose of the examples is to frame, one, what data is valuable, and two, how creative data scientists can assemble datasets, profiles, and models to provide insights into consumer behavior. We commonly refer to these models or applications as use cases: the specific consumer behavior or decision that we're seeking to analytically understand, model, and ultimately influence. I'll start with three examples that probably everyone would agree are good examples in the best interest of the consumer. Example one: a credit card fraud alert, where an uncharacteristic purchase is made with your credit card and your bank notifies you of potential fraud. The data is the historical record of all your past credit card purchases: the store, the amount, the location, the time of day. Example two: a recommendation to you from a streaming service like Netflix, based on what shows you've watched in the past. The data being used is the historical record of what shows and movies you previously watched. Example three: a prompt from a restaurant or delivery app to consider reordering items that you've ordered in the past, to speed up the ordering process. The data is the record of all your past purchases from that provider. These examples or use cases are generally beneficial to consumers. You understand where the data is coming from and how it is being used. A second set of use cases might feel a bit more edgy. You may or may not think, how did they know that? So, three examples. Example four: your mobile phone always knows and can report where you are. In a case like waiting for an Uber driver to find you, the value is clear and immediate. You can also, however, be delivered targeted advertising for retailers or restaurants based on the specific street address that you're near. The data being used is your mobile phone's location. Example five: you receive marketing offers in social media or emails about topics from conversations at home. The data being leveraged is the in-home conversation history collected by digital assistants like Siri or Alexa, devices that may sit in your living room at home, listening and tracking your evolving interests and preferences. Example six: you receive personalized recommendations about your general health. The data being leveraged in this case is the continuous tracking of body metrics, sleep, movement, exercise, and heart rate, by various health and activity apps, combined with your purchase history around health and dietary supplements. Leveraging personal data for the purposes of fine-tuning a marketing message is called personalization.
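
The fraud-alert use case in example one is simple enough to sketch in a few lines. This toy version flags a purchase that is a statistical outlier against the consumer's own purchase history; real fraud models also weigh merchant, location, and time-of-day signals, and the threshold here is an invented assumption.

```python
from statistics import mean, stdev

def fraud_alert(history: list[float], new_amount: float,
                z_threshold: float = 3.0) -> bool:
    """Flag a purchase whose amount is an outlier against this
    consumer's own history (a toy stand-in for production fraud
    models, which use far richer features)."""
    if len(history) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return new_amount != mu
    return abs(new_amount - mu) / sigma > z_threshold

# A $2,400 charge against a history of small purchases trips the alert:
assert fraud_alert([12.50, 40.00, 23.99, 8.75, 31.20], 2400.00)
```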
Business marketers will trumpet the benefits: the time-saving convenience and the value of anticipating consumer needs. Some consumers will certainly think how great it is that they knew I was going to need that. Another group of consumers will wonder, how did they know that about me, potentially complaining that it's creepy. Let me go to a final set of three use cases, ones that most consumers will probably view as both intrusive and negatively distracting. Example seven: you start receiving age-specific advertising for products for your children, even when you've made a point of not sharing any detailed information about them online. The data being used is the output of an inference model based on your online shopping behavior, historical purchases, and mobile phone location data, suggesting that you have children in specific age groups. It can all be modeled and inferred. Example eight: advertising that you receive via email and social media feeds reflecting the app usage and online search activity of your teenage children on their own mobile devices. The data being used is your children's inferred interests, with it being inferred from your own data that you're actually their parent. And finally, example nine: the price you pay for items you purchase online varies versus what others pay for the same item. The data being used is an inferred price sensitivity indicator, based on your overall shopping history and past purchases, indicating that you are or are not prepared to pay more for specific products. Retailers may be inferring your shopping savviness, your price sensitivity, or even your intelligence based on past shopping behavior, and pricing their products accordingly.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: the Senate on surveillance pricing.

[Unidentified participant (brief acknowledgments/aside comments)]: Okay.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: So this will ring a bell for those of us who have heard about it. Great.

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: So you may well have seen the December New York Times article on a recent study conducted by Groundwork Collaborative and Consumer Reports on how Instacart, the online grocery shopping service, was using dynamic pricing models to vary the price charged to consumers for grocery items. In this specific study, eggs were being priced differently across consumers based on profile signals developed regarding those consumers' price sensitivity. If consumers were viewed as price sensitive, Instacart would charge the standard rate. If consumers were thought to be price insensitive, Instacart would charge more. In other words, Instacart varied the price charged to consumers based on data-informed profiles of how price sensitive any individual consumer was believed to be. As a result of this data-informed consumer profiling, the price charged for eggs could vary upwards of 10 to 20% for non-price-sensitive consumers: an explicit increase in the cost of the item based on the consumer's past behavioral data. The practice of varying price based on real-time characteristics of the market or consumer is called dynamic pricing. If dynamic pricing can be applied to eggs, just imagine if it could be applied across your full grocery shopping list. Now, importantly, the ability to apply dynamic pricing is a direct result of consumers not being able to control who gets their shopping and purchase data and what they're allowed to do with it. The idea that one person's grocery bill is higher than another's because AI models are suggesting to retailers that they can charge more is a bit troubling. Now, until recently, and we talked about this up front, the technological capacity and real-time processing required to do this didn't exist. But in the era of AI, you're definitely able to make those real-time calculations and adjustments. So if you haven't yet read the New York Times article about this study, or the study itself, you should definitely take the time to do so. Now, today's reality is that with the advancements in computing, real-time connectivity, and data accessibility, the use cases to which data can be applied for insights into consumer behavior are essentially unbounded. In sharp contrast, the constraints from regulation on how data can be collected and used remain very limited. The existing regulatory limitation on the use of consumer data is framed by whether or not consumers give permission, or consent as it's frequently referred to, for their data to be collected and used in specific use cases. Unfortunately, what should be a simple concept, have you given your permission or not for a specific use case, has here in the United States been allowed to become complicated with unfriendly jargon and legal concepts. The predominant global approach to consumer consent is opt-in, where the consumer is explicitly asked to affirm that they want to share their data, and for a stated purpose. Opt-in is used in Europe, the UK, Canada, Brazil, Japan, South Africa, South Korea, India, and even China, as well as many other countries. In contrast, the US primarily follows an opt-out approach, where data collection is permitted by default as long as the consumer is somehow notified and given some means to stop it. Under opt-out, rather than asking for the consumer's permission, permission is presumed to have been given.
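To put rough numbers on the dynamic-pricing pattern in the Instacart study described above, here is a small hedged sketch: a base price adjusted upward by a per-consumer price-sensitivity signal. The base price, function name, and markup range are illustrative assumptions, not Instacart's actual model.

    BASE_PRICE = 4.00  # assumed standard price for a carton of eggs

    def quoted_price(price_sensitivity, max_markup=0.20):
        """Return the price shown to one consumer.

        price_sensitivity: 1.0 = highly price sensitive, 0.0 = highly
        insensitive, as inferred from past shopping behavior.
        """
        # Sensitive shoppers see the standard rate; insensitive shoppers see
        # up to max_markup more (the 10 to 20% range cited above).
        markup = max_markup * (1.0 - price_sensitivity)
        return round(BASE_PRICE * (1.0 + markup), 2)

    print(quoted_price(1.0))  # 4.00 -- treated as price sensitive
    print(quoted_price(0.5))  # 4.40 -- 10% above the standard rate
    print(quoted_price(0.0))  # 4.80 -- 20% above the standard rate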
Businesses need only, one, to inform the consumer that the data is being collected, and two, to establish a mechanism to stop the collection if so desired. And we're all familiar with the link deep in the disclosure document where you go to try and figure out how to stop it. The current opt-out framework in the US places considerable cognitive and administrative burden on consumers. Rather than clearly framing the yes-or-no data decision, the general practice has been to bury the very definition of what data is being collected and for what purpose deep in the terms and conditions of a product, often wrapped in broad legal language, rendering it almost impossible for consumers to make an informed decision. In one experience, literally deep in the terms and conditions, the consumer was granting power of attorney to the data collector to use the data any way the collector wanted. But it was so deeply buried in the terms and conditions that consumers didn't even know. Unlike the opt-in approaches used in many countries around the world, the US opt-out approach requires individual consumers to be their own private investigators and attorneys. The opt-out approach is both inefficient and ineffective for consumers. In consumer research we published at Visa in 2024, 55% of US consumers reported believing that companies' data use terms, terms of consent, and privacy policies were primarily written to protect companies' legal interests rather than actually help consumers. What is needed is a system of informed consent using an opt-in approach for all nonessential data collection.
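The contrast between the two consent defaults described above reduces to a pair of one-line rules. A minimal sketch with hypothetical function names; real statutes define these conditions far more precisely:

    def may_collect_opt_in(consumer_affirmed):
        # Opt-in (Europe, the UK, Canada, and others): collection is
        # forbidden unless the consumer affirmatively agreed to the
        # stated purpose.
        return consumer_affirmed

    def may_collect_opt_out(consumer_objected):
        # Opt-out (the predominant US approach): collection is permitted
        # by default and stops only if the consumer finds and uses the
        # opt-out mechanism.
        return not consumer_objected

    # The default state, where the consumer has taken no action at all:
    print(may_collect_opt_in(False))   # False -- silence means no collection
    print(may_collect_opt_out(False))  # True  -- silence is treated as consent

The asymmetry in that default state is the entire policy debate in miniature.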

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: Did Visa choose to do that?

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: Yeah. We had a very detailed consumer consent process that we applied to every use case that came through, looking for documentation that the specific use case had been approved by the consumer. So yes.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: That was, or would have been, a huge marketing advantage, I think, for many consumers, if they knew that you were opt in rather than opt out.

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: Yes. I mean, the

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: Just saying. Yeah.

[Sen. Randy Brock (Vice Chair, Senate Economic Development, Housing & General Affairs)]: Did they know that? I ask anybody in this room: did they know that Visa had that particular approach? Would anybody know it?

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: If you ask. But I mean, Visa is not a direct-to-consumer business. So we work through I mean, certainly

[Sen. Randy Brock (Vice Chair, Senate Economic Development, Housing & General Affairs)]: But you work through other people and other companies. And those companies don't have that requirement, do they?

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: They don't have it unless they impose it on themselves.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: Yeah. So you impose it on yourselves. And I think what we're curious about is how a consumer would have known that, because I think many of us in this room would choose to have a Visa as a result of that. But anyway. Yeah. I have questions, I think. Yeah, yeah, you tell

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: me. There's also not a shared, common ethic of what is appropriate in terms of the use of data and what is not. Unlike many professions, in the field of data science there are no institutions that establish conduct standards for data scientists. Contrast how we govern data scientists and the use of data with how other professions govern their members' conduct. All medical doctors commit to the Hippocratic Oath: do no harm. In the financial industry, investment advisers are governed by fiduciary responsibility: one must always act in the best interest of clients. In the legal profession, there's formal professional training and a strict code of conduct. And various trades, electricians, plumbers, commercial truck drivers, all have formal licensing requirements. In data science, however, there's no established framework to inform and guide data science professionals about how they collect and use data in the best interest of consumers. Strengthening informed consent and giving consumers real decision rights over how their data is used would be an enormous step forward, but that's a project for another day. Now, in closing, what is encouraging on this front is two things. One is that there's a whole wave of technological innovation being done, lagging a bit behind what's happened with AI, but nevertheless creating more granular data controls, creating tools that consumers can use, creating chatbots that allow efficient review of corporate policies and data practices. Right now there's little economic incentive for companies to implement these stricter controls, but with the right regulatory guidance and incentives, there are reasons to think we can make the world better. The second reason to be encouraged is that we're having a hearing like this today, where we're just talking about how things actually work and what types of anecdotes and changes will be most helpful for the data ecosystem to benefit its participants. Now, I come over here from northwest New Hampshire. It's been home for twenty-five years now, and I spend all my time working on these data issues. I'm pleased to have had the chance to be here today. And it's just Route 5 to 25 to 302 to 62 to get over here. So I'm happy to help in any way as you move forward. Thank you.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: Thank you very much. We have one more presenter, and then committees.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: We have time for questions.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: If there's time, we'll open it up to questions for any of our presenters. Bob?

[Sen. Randy Brock (Vice Chair, Senate Economic Development, Housing & General Affairs)]: Hey,

[Bob Sullivan (Journalist; Visiting Scholar, Duke University; Host, AARP’s 'The Perfect Scam')]: here I am. Hang on one second, please. I'm having a little bit of video trouble all of a sudden. I'd be the one who would have the video problem, right? Yeah, I'm not able to appear on video. I apologize. But I'm glad to be here. Is it alright if I continue like this?

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: Yes.

[Bob Sullivan (Journalist; Visiting Scholar, Duke University; Host, AARP’s 'The Perfect Scam')]: Okay. Great. Well, I just wanna explain very quickly who I am and what I think my role is here. My name is Bob Sullivan, and I was a reporter at NBC News for about twenty years. I've been covering technology and society issues for almost thirty years. I've spent the last ten years of that time working at Duke University as a visiting scholar and also continuing my work as an independent journalist. And I wanna go back to my time at NBC News, because I have two stories I wanna tell today. I feel like that's my role on this panel: to tell a couple of stories. And the first one is about the very first data breach, ChoicePoint. If you look at my Wikipedia page, if you have spare time today, you'll find that's what I'm known for. It's now twenty years old, so I'm telling old tales. But the important thing about ChoicePoint is that it was the first data breach that we know about. And the reason it's the first data breach that we know about is because the California state legislature passed the Data Breach Notification Act. And I happened to be lucky enough that I was following that law as a journalist. And so when these crazy letters started going out to California residents, I knew what they meant, and I was able to do the story, and I had that great experience of everybody in the country following my story for a couple of days. But the reason I bring that up is because I think it's important for everyone to realize how important state legislatures can be. I don't think it's an overstatement to say that the original California Data Breach Notification Act is probably the most important law ever passed in the US to enhance cybersecurity. We can debate what's happened over the past twenty years with data breach notifications, but without that law, I don't think we would have pushed cybersecurity forward anywhere near as intensely as we have since then. The work that you are all doing today is incredibly important, and I honor it. I'm very glad to be here. The main hat that I wear these days is as the host of a podcast called The Perfect Scam, AARP's weekly podcast where we highlight individual cybercrimes. I'm very proud of the work that we do there. Instead of the traditional true crime stories, where you tend to glamorize criminals and say a huckster was clever and stole and got away with millions of dollars worth of crimes, we actually highlight the victims. And so every week I interview the victim of a terrible internet crime, a woman who thought she was in love with someone and sent a million dollars to a criminal or a criminal gang overseas, for example. And the point of these stories is to humanize them. We all have that human reaction of, how could someone do this? But when you actually talk to the human beings involved in these crimes, you find out that this woman was married for fifty-two years. She was signing 50 papers the day that her husband died, and somebody called with the fifty-first paper, saying they were from the insurance company. These stories always have a backstory. And the reason why this is relevant to your work here today is because so many of these crimes are fueled by data collection in one way or another.
The radioactive waste of our time is what happens to this data once it is either sitting in a place where criminals can steal it, or when it is sold by data brokers or other entities to criminals who intend to do harm. Now, that seems far-fetched, but just a simple Google search will yield dozens and dozens of stories where this is relevant. Just last month, the California Privacy Protection Agency fined a data broker named Data Masters and kicked them out of California because they were selling lists like this: they had 400,000 postal addresses for Alzheimer's patients. They had 2,000,000 addresses for blind or visually impaired people. They had 100,000 addiction sufferers and a million people with bladder control issues. Now, I just want all of you to sit for just a moment and think about what criminals could do, or anyone who just wanted to do ill, with a list of Alzheimer's patients, with a list of people suffering from cognitive decline. It makes my skin crawl to think that companies collect this kind of data. And of course, the way that they assemble these lists is up for grabs, and that's an interesting topic of discussion. Most of these are inferences. These aren't check boxes, and they're not buying this data from hospitals. They've inferred this from merging databases or from other behavioral observations. This is not a one-off. My colleagues at Duke University have been studying data brokers for a while now, and Justin Sherman wrote a piece about a year ago for Lawfare where he highlighted the criminal activity at three companies, Epsilon, Macromark, and KBM, who collected and sold lists for years that were essentially lead lists for criminals running things like sweepstakes scams. You probably have heard that these companies categorize us with labels like urban scramblers or fit and fun. They give us names. Well, in this case, they had a name for people who were likely to be victims of a lottery scam solicitation. They were known as opportunity seekers. So they had lists of opportunity seekers that they would sell for years. And there are emails that the Justice Department found, when they filed criminal charges against these companies, showing the executives at the company knew what they were doing. There's one email from a worker who said, who responds to this stuff? Well, obviously, we have those people. So these companies were expert at identifying who would likely be willing to respond to, say, a lottery mail solicitation, and there was a devilish feedback loop to this. So not only did they sell lists for years of who would be a likely victim, but they were able to hone these lists, and their algorithms got better and better at finding victims for the criminals. And eventually, all three of these companies had to settle with the Justice Department. But I just wanna point out that when we talk about data collection, what we're talking about is the unintended consequences that come, again, the radioactive waste, when companies are allowed to collect data, when companies are allowed to assemble data, when companies are allowed to sell data. And what we haven't even broached, which I know all of you are aware of, is what happens when criminals can access this data, and that's what happened in the ChoicePoint story originally.
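To illustrate the inference-by-merging mechanism described here, a small hypothetical Python sketch: two individually innocuous datasets, joined on a shared identifier, yield a sensitive list no one ever consented to be on. All records and field names are invented.

    # Retail loyalty-card data: household_id -> bought a pill organizer.
    loyalty_signal = {101: True, 102: False, 103: True}

    # Ad-network browsing segments: household_id -> reads memory-loss articles.
    browsing_signal = {101: True, 102: True, 103: False}

    # The broker's "inference": two weak signals together become a sensitive
    # label, with no check box and no hospital record involved.
    inferred_cognitive_decline = [
        hid for hid, flag in loyalty_signal.items()
        if flag and browsing_signal.get(hid, False)
    ]
    print(inferred_cognitive_decline)  # [101]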
And let me close with this comment about ChoicePoint. I was lucky enough to get a couple of emails from California consumers saying, I got this letter from this company I've never heard of, ChoicePoint, and they tell me that somebody has accessed my data and that I should take action to protect myself, and I don't know what that means. I knew what it meant, because I had followed the law. So I called up ChoicePoint as a reporter and I said, explain to me what happened here. And the head of PR from ChoicePoint said to me that criminals had stolen data, but they had only stolen it from California residents. This hack didn't affect anybody else. And this is the only time I've ever done this in my career. I said, wait a minute. You're telling me that when the criminals broke into these systems, they specifically selected California residents? And he said, yes, that's right. They only stole data from California residents. And I said, I'm going to pause here and tell you that's ridiculous, and I'm going to have to put what you're telling me into my story, where we're immediately going to say it's ridiculous. And he said, that's our statement. That's our story. They only stole data from California residents. Now, the only reason he said that was because that was all they had to do to comply with the law at the time. So notifications only went out to California residents, and twenty-four hours later, they had to confess that, yes, of course, criminals stole data from the entire country. But I say that to you because that's how literal companies are when we have regulations. They will do the absolute minimum required by whatever it is that you're considering. And so whatever legislation you're considering passing, I want you to think about the original ChoicePoint story and the idea that criminals might only steal data from, say, Vermont residents. I'm happy to take questions from you, and I'm very thankful for the opportunity to speak to all of you, and I honor the work that you're doing. So thank you.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: Thank you, Bob.

[Unidentified participant (brief acknowledgments/aside comments)]: Thanks. Questions?

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: Yeah, we have time if any

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: of our presenters are still here. Anyone have questions?

[Unidentified participant (brief acknowledgments/aside comments)]: I guess I could just

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: be right

[Sen. Thomas Chittenden (Member, Senate Economic Development, Housing & General Affairs)]: about logistically. I just heard a

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: lot of a lot of

[Sen. Thomas Chittenden (Member, Senate Economic Development, Housing & General Affairs)]: strong advocacy for different aspects of things that we're considering here today, and I don't see a lot of the testimony on the website, but I did see a lot of the advocates reading scripts. If you'd be willing to provide what you read, just for follow-up reference, there are some things I'd certainly like to follow up on, especially, Professor Richards, your advocacy for certain aspects of the bill before us. It'd be really helpful to just be able to look back and cross-reference the statements that

[Prof. Woodrow Hartzog (Andrew R. Randall Professor of Law, Boston University)]: you made in that field.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: Yeah, thanks, Thomas. Yeah. That's

[Prof. Neil Richards (Koch Distinguished Professor of Law, Washington University in St. Louis)]: I'd be happy to do it, even to put it in complete sentences in a way that's more accessible. We can work that out and get that to you, probably within a week, if that's an okay time frame.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: Yep. That would be very helpful. We do have one article, I think, that we're posting on our website, the Legislating Data Loyalty article that you and Woodrow wrote together. I think we have a copy of that, but I think that's it.

[Prof. Neil Richards (Koch Distinguished Professor of Law, Washington University in St. Louis)]: It's dangerous to ask an academic for sources, but we'll put a curated list together for you that we hope will be

[Sen. Thomas Chittenden (Member, Senate Economic Development, Housing & General Affairs)]: of the statement.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: Yeah. I think what Thomas is asking for is your testimony. I think what we'd ask of every one of you is your testimony, if you could send that electronically to our committee assistants, because that's what we'll be referring to. Randy. Randy has a question.

[Sen. Randy Brock (Vice Chair, Senate Economic Development, Housing & General Affairs)]: Just one question. Every time I read a privacy law or a disclaimer or a privacy regulation, it is, in many cases, so comprehensive and lengthy that it puts people to sleep. How can you convey what's really important to consumers in a way they'll actually read and can understand? How can you do something that perhaps does not have to be read in detail, that accomplishes what we want to accomplish about real privacy protection and choice, and that, given all of the regulations we have at both the federal and state level, is sufficiently comprehensive and perhaps not as legalistic as what we do now? Is there a better way

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: to do it? Professor?

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: And Bob, I think, has an input.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: Yeah. Professor Hartzog?

[Prof. Woodrow Hartzog (Andrew R. Randall Professor of Law, Boston University)]: So thank you so much for this question. I don't think there's a way to do that, because there's a trap that's so easy to fall into. You either have all the information in the world and no one can ever read it and possibly understand it, or you abstract away all of the risk and you get a statement like, our company collects data for stuff. Right? And that's not helpful at all. And it again forces the obligation back onto the user. This is why data minimization is so important: people should be protected no matter what they choose. People just want to make choices and live their lives. The idea that we should empower people to have control over their data is a broken way of thinking about privacy, because it results in the trap that there's either too much or too little information. There is no happy medium, because it'll never work at scale. Even if you were to tell someone about this one business, the web of hundreds of thousands of companies that are using your data for hundreds of thousands of purposes will never be explainable. And so what we need are strong rules that don't depend on what we'd have to know to make these choices. The right question is: what are the rules necessary so that people can make meaningful life choices unburdened by data privacy risk? That's what I hope you take away.
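One way to see Professor Hartzog's point is that under data minimization the rule lives in the system rather than in a disclosure the user must read. A minimal sketch, with hypothetical purposes and field names:

    # Purpose-bound allow-lists: the system retains only the fields needed
    # for the declared purpose, regardless of what the user clicked.
    ALLOWED_FIELDS = {
        "fraud_detection": {"card_hash", "amount", "merchant", "timestamp"},
        "order_fulfillment": {"name", "shipping_address", "items"},
    }

    def minimize(record, purpose):
        """Drop every field not strictly needed for the declared purpose."""
        allowed = ALLOWED_FIELDS.get(purpose, set())
        return {k: v for k, v in record.items() if k in allowed}

    raw = {
        "name": "A. Consumer", "shipping_address": "1 Main St",
        "items": ["eggs"], "browsing_history": ["..."], "location": "...",
    }
    print(minimize(raw, "order_fulfillment"))
    # {'name': 'A. Consumer', 'shipping_address': '1 Main St', 'items': ['eggs']}

The consumer is protected by what the code refuses to keep, not by what the policy discloses.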

[Unidentified participant (brief acknowledgments/aside comments)]: And Bob, if you'd come

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: to the table to answer, because I think opt-in is the

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: debate with another presenter. Feel free. At Visa, we've done a comprehensive review of corporate data policy language. We actually ran a series of focus groups where we gave consumers disclosures and asked them to rewrite them, and we workshopped this on three different continents over a period of time. Consumers are looking for the answer to five issues: what data element, for what purpose, for how long, who's going to get it, and can you sell it to a third party or not? And if you can boil down the answer to those five things, it's easy enough to

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: say yes, no.

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: To say yes, no, yes, no. We actually designed, developed, and prototyped a user interface on a mobile device so that you could answer those questions and decide whether or not the data should be used. Outside the United States, there's lots of appetite for that, because it aligns with regulatory obligations. Inside the United States, there's much less interest, because the corporate view is that it's not necessary to do that since it's not mandated by any law. But the UX consumer communication problem is not a hard one to solve.
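A minimal sketch of the per-use-case record behind the kind of interface described here, with assumed field names: one plain-language answer for each question the focus groups surfaced, and a single yes-or-no decision for the consumer.

    from dataclasses import dataclass

    @dataclass
    class ConsentRequest:
        data_element: str     # what data, exactly
        purpose: str          # for what use
        retention: str        # for how long
        recipients: str       # who is going to get it
        resale_allowed: bool  # can it be sold to a third party?

    def render(req):
        """Format the request as a short prompt answerable with yes or no."""
        resale = "may" if req.resale_allowed else "may not"
        return (f"Share {req.data_element} for {req.purpose}, kept for "
                f"{req.retention}, seen by {req.recipients}; it {resale} "
                f"be sold to third parties. Yes or no?")

    print(render(ConsentRequest(
        "your purchase history", "personalized offers", "12 months",
        "this retailer only", resale_allowed=False,
    )))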

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: Yeah, Craig.

[Bob Sullivan (Journalist; Visiting Scholar, Duke University; Host, AARP’s 'The Perfect Scam')]: Could I jump in here for just a second, especially now that you can see me? I apologize for that. I just wanna say two quick things. One is, we don't ask people to decide whether or not their toaster is safe or their car is safe, and I feel like we have the question backwards. But the other thing I really think is often ignored in this discussion is the lack of competition. If there were 50 different Instagrams, then parents could decide from among them based on what privacy choices they do or don't make. But for the most part, there's so much concentration in every one of these products that we buy that we just have to take what's given to us. That's another issue.

[Unidentified legislator]: I have a lot to say about what you just said, and I'm going to let that go because our time is pretty short. And my question is really for Jordan, I think. You had a lot of information about data minimization and the amount of work that companies do when they're trying to keep collection to a minimum, to just what they need. And so we're a small state. We have small businesses. We have not a lot of And we're trying to figure out how to have a good law that works for our consumers and for our businesses. And so I'm wondering if you know what the costs are to business when you put minimization requirements

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: on them. Good question.

[Jordan Francis (Senior Policy Counsel, Future of Privacy Forum)]: Thank you for the question. I can't speak to any specific costs. I think it's going to depend a lot on the size of the business, how mature its privacy compliance program is, how many people are doing that work, whether or not it's in a heavily regulated sector, and what kind of data governance practices it has in place to begin with. I do think it's extremely contextual. And I will say that my point in raising those ongoing compliance efforts is just to say that all three of the data minimization frameworks that I highlighted do offer benefits to consumers in terms of reducing data collection and minimizing exposure to the risks of unnecessary data collection. That might look different across those frameworks and matter differently in different contexts, but there is significant compliance work that goes into it. And I encourage you to speak to as many people as you can who are actually doing that compliance work, to get a sense of what that effort takes, how many people are doing the work, and what technological limitations or challenges they may have in, say, adequate data mapping to make sure that they're appropriately engaging in data minimization.

[Unidentified legislator]: Oh. Yeah, and I don't know if this is a question for Bob or Professor Hartzog. Professor Hartzog, I know we're going to have a segment on targeted advertising, but you kept referencing the ways in which big tech undermines small businesses' ability to basically operate and advertise. I was just wondering if you could share a source, which I think Bob also has a relationship with as far as Acquisti's work. But I was wondering if you could explain that a little bit.

[Prof. Woodrow Hartzog (Andrew R. Randall Professor of Law, Boston University)]: Sure, absolutely. So the source that I'm referring to is a professor, one of the most well-respected professors in our field, named Alessandro Acquisti, who is now at MIT's business school. And he has done a lot of research recently about the ways in which the stated benefits of targeted advertising get canceled out by the incredible advantage that large platforms gain in being able to siphon off those same customers, because they get the same data. And so when there are no restrictions on what gets collected, these incredibly powerful platforms are far better at targeting and siphoning off customers. And so I'm happy to make a collection of those sources and submit them along with my testimony, if you're interested.

[Unidentified participant (brief acknowledgments/aside comments)]: That'd be great.

[Unidentified legislator]: Thank you.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: Just please.

[Sen. Randy Brock (Vice Chair, Senate Economic Development, Housing & General Affairs)]: Just Bob.

[Unidentified legislator]: I didn't know if it was fair to ask. I knew you were working with

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: us all.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: I saw his head. At

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: MIT, he is the foremost academic researcher on decision making. I think the other thing to consider, relative to this question of the effectiveness of the advertising and whether or not it has deteriorated the competitiveness of small businesses, is what the issue becomes in a world of agentic commerce, where today more than 60% of all purchases in the United States begin with a search. And if we head towards a world where it's not just a search but it's fulfilled through an AI agent, it's not a level playing field for small businesses, period. And I think the advantage that the agents have is access to all this data and the ability to proactively come at you with an offering or a discussion of a need, and to be able to immediately fulfill on it, as opposed to your shopping purchase decision being triggered by

[Prof. Neil Richards (Koch Distinguished Professor of Law, Washington University in St. Louis)]: the fact you happen to

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: be in town that day, and the general store has all this. And if needs are modeled and anticipated and proactively marketed through what one can do with data and agents, it dramatically disadvantages the business that is dependent on the hours in which it's open and on live foot traffic.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: So is that 60% of all purchases, in person and online both, or just 60% of online purchases? Wow.

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: Because that's where the search process begins. There's been a whole lot of consumer research done on the issue of how people shop and the role of digital tools in that process.

[Unidentified legislator]: The number of pages you have to go through to get a different answer is ridiculous.

[Bob Hedges (Former Global Chief Data Officer, Visa; Research Fellow, MIT Initiative on the Digital Economy)]: Yes, and even if you end up going to the store to make the purchase, it usually begins with a search. And a lot of that research is public. I'd be happy to help you get access to it.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: You can stay there for a second. We'll have one more question. Jonathan?

[Unidentified legislator ('Jonathan')]: Good morning. I had long assumed that part of the trade with respect to free services was that the data was essentially taking the place of a financial exchange. Is there a meaningful distinction between the matters that we're discussing now that pertain to a carton of eggs or something that someone buys, versus the sort of experience of trading data for free software as a service or things of that nature? Does the introduction of a financial transaction with currency make a difference in who a consumer is, and therefore the extent to which they can be protected?

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: Neil, good question.

[Prof. Neil Richards (Koch Distinguished Professor of Law, Washington University in St. Louis)]: That's a great question. Yeah. I would say to some extent, but substantively, no. A lot of the so-called free services on the internet were justified on the basis not of the data but of the advertising, right? And I think if companies had come out at the beginning and said, we are a search engine, we are a social network, give us all your data and we'll give you a free service, consumers would have overwhelmingly rejected that bargain. Empirical evidence going back three decades on consumer expectations about privacy on the internet backs that up enormously. I think when Mark Zuckerberg testified after the Cambridge Analytica scandal, he was asked by Senator Hatch, how do you make money? And Zuckerberg's response was not, we give free services in exchange for data, or we give services in exchange for payment in data. He said, direct quote, Senator, we serve ads. And I think the whole scope, as Professor Hartzog and, if I can call them this, the Bobs got into in their testimony, the backend data processing, data collection, and data analysis is so far beyond the comprehension of consumers that it cannot be seen as part of the transaction, for two reasons. One, it is complicated and too hard for the ordinary consumer to understand, just like we don't understand how our toasters work, to use an example someone used recently. And second, companies go to great efforts to hide what they are doing behind design, behind public pronouncements, behind vague language in advertising and at the top level of privacy policies. And so there's a lot of obfuscation that goes on, and I don't think you could reasonably say that in today's digital economy there is a knowing and fair transaction of, quote, free services in exchange for data.

[Sen. Alison Clarkson (Chair, Senate Economic Development, Housing & General Affairs)]: Thank you. With that, we are gonna have to depart. Senate members, ten forty back in our committee room. And thank you so much. This has been terrific, great updates and new information. We will see you after the House sees you, in terms of follow-up. Okay, great.

[Rep. Michael Marcotte (Chair, House Commerce & Economic Development)]: Thank you to all our presenters. Just want to be sure that people following us on YouTube know that we'll be taking a short break. When we come back, we'll be on the House Commerce YouTube channel; right now, we're on the Senate Economic Development YouTube channel. So with that, Kira, I think we can go off live, and we'll be back at 10:45. Right. Okay.