
[Michael Marcotte (Chair)]: This is the House Committee on Commerce and Economic Development. We are back from the floor. It is 02/27/2026 at 11:21 AM. And now we're going to move into receiving some more testimony on House Bill 211, an act relating to data brokers and personal information. And first up, we'd like to hear from Ryan.

[Ryan Kriger (Former Vermont Assistant Attorney General; Deputy Chief, Massachusetts AG’s Privacy & Responsible Technology Division)]: Good morning, everybody. My name is Ryan Kriger. I have been a resident of Montpelier, Vermont for fifteen years. For most of that time, I worked as an assistant attorney general in the Vermont Attorney General's Public Protection Division, doing consumer protection and antitrust enforcement with a focus on privacy and data security. After that, I worked for the Federal Trade Commission in their Division of Privacy and Identity Protection.

[Ryan Kriger (Former Vermont Assistant Attorney General; Deputy Chief, Massachusetts AG’s Privacy & Responsible Technology Division)]: I also teach about privacy law and policy, consumer protection, and the First Amendment at the University of Vermont. I've been teaching there for nine years. Currently, I am the deputy chief of the Massachusetts attorney general's privacy and responsible technology division. Today, I am speaking in my personal capacity as a Vermonter who cares about his own privacy and the privacy and safety of his wife, his friends and children, and his fellow Vermonters. My comments today are mine alone and do not reflect any positions of the Massachusetts Attorney General or those of her office. When I was a Vermont AAG, I drafted the data broker report that led to the original data broker law. I was also very involved in the drafting and negotiation of that law. I worked with Chairman Marcotte — I think he was ranking member at the time — on the original data broker law. In the course of advocating for that law and negotiating with stakeholders, I heard incredible pushback from industry and their lobbyists. Years later, I was told that one major data broker had made killing that bill their primary focus for the year. At the time, I didn't understand why the pushback was so ferocious, given that the bill didn't actually require any businesses to change their practices other than documenting the fact of their existence. Over the course of that experience, I came to realize that data brokers really don't seem to want anyone to know they exist. They are the only industry I've encountered in my two decades of legal practice that doesn't want people to know about them. Most companies spend a lot of money announcing their existence to the world in the form of advertising and marketing. One lobbyist told me in a moment of candor that businesses don't want to be in the registry because being called a data broker was like having a scarlet letter.
I posit that if you are actually ashamed to acknowledge your line of business, that says a lot about the ethics of the work you do and the value it has to society. Despite that resistance, the fact that this law was enacted — the first in the nation, state or federal, to actually name data brokers — should be a point of pride for this committee and the state of Vermont. Were it not for this law, I do not believe California would have passed its data broker registry, nor would it have passed the DELETE Act, which creates a method for Californians to easily remove themselves from most data broker databases. The DELETE Act isn't a perfect law. It has more exemptions than I'd prefer, and, like many recent privacy laws, it requires consumers to take affirmative steps to protect their privacy rather than putting the onus on the data brokers to justify invading their privacy. But it is a major step forward. This law has also received criticism from some privacy advocates, namely that it doesn't actually do much. I'm talking about the original, not the amendment. The term "light touch" was used so often during the negotiations of this bill — light touch legislation — that at one point, it seemed that if it was any more light touch, it would be practicing Reiki. Finally, someone laughs at that. That was good. That light touch was, however, a feature and not a bug. We had two goals at the time: to put data brokers on the regulatory map, and to pass a law that would stand up to constitutional scrutiny. That first goal is an important one. When it comes to passing legislation, as I'm sure you all know, it is far easier to stop something from happening than to build something. And one compelling argument is "it's never been done before." This law eliminated that argument. The second goal was equally important because, at the time, the Supreme Court case of Sorrell v. IMS Health was still fresh in the minds of many legislators.
That case nullified a Vermont law that attempted to permit doctors to opt out of having pharmacies sell their prescribing information to data brokers. Many scholars have criticized that opinion, but it's the Supreme Court and it is current law. I want to note that the Supreme Court did not rule in that case that attempting to regulate data brokers was unconstitutional. The problem was that the law permitted academics to use the information, but not commercial entities. And the court held that this was a viewpoint-discriminatory law, because it limited the ability of pharmaceutical marketers to communicate, but not academics. That's viewpoint discrimination. My hope at the time was that, having laid the foundation, this body would return to the law and build on it. I'm thankful to see that happening now. In particular, I applaud the following improvements in the current amendments. In the original bill, I had hoped to see a strong credentialing requirement to address the problem of data brokers supplying information directly to fraudsters and other bad actors, which was a known issue. This bill corrects that omission with section 2431(b). Once we started enforcing the law, we quickly realized that while the law imposes fines for failing to register, it did not impose fines for filing incorrect or overly vague information. This bill mostly corrects that in section 2446(b). It imposes appropriate fines for filing materially incorrect information. I suggest expanding that to include omitting sufficiently detailed information, because companies tend to provide extremely vague information that is not actually actionable, but is not necessarily incorrect. If you were to add that requirement, you could provide a cure period. In other words: please provide something more specific. We're not fining you, but if you don't provide something more specific in sixty days, then we may impose fines. Something like that.
The amended bill includes a number of additional reportable elements in section 2446(a), which I believe are extremely beneficial to enforcers, researchers, advocates, and journalists, and, through their efforts, the public. I would recommend the deletion of what is labeled in the original law as 2446(a)(4)(D) and (E). Those two sections, which require reporting on whether a data broker imposes a credentialing process and how many data broker data breaches it experienced, are no longer necessary, because this law requires a credentialing process and requires direct notice of those breaches. Those were compromises because we didn't get those into the original bill. They don't need to be reported anymore. Finally, I have long argued in other contexts that the exemption of public information from covered data is a huge loophole and a fatal flaw for many privacy laws. I could go on at length about that, but I won't here. This bill fixes that problem in its definition of publicly available information, section 2430(16)(B). I hope that y'all fight like heck to keep that section intact, and it will serve as a model for other states. All that being said, I do have four suggestions as to how this bill could be improved. Most critically, fix the data broker data breach language, which currently doesn't accomplish what you seemingly want it to do, and I will try to briefly explain why that is. There is a prohibition on fraudulent acquisition of data, which I think is very valuable. I'm glad we got that into the original, but I think it could be improved to address certain current enforcement gaps. You might consider expanding the definition of data broker, which currently covers third party data brokers, to include certain major first party data brokers. We're talking Google, Amazon, Meta. Not everybody, but the big guys. And there is a jurisdictional issue that may benefit from some help.
So, I'm going to try to talk about the data breach notification part briefly, but I do think it's important. Data brokers have experienced a lot of data breaches, and they have lost a lot of data when that happened. We already have a Data Breach Notification Act, which requires all businesses to notify consumers of data breaches. And I enforced this act for fifteen years — sorry, eleven years. The Data Breach Notification Act requires all companies to notify consumers of breaches if those breaches involve personally identifiable information, PII. PII is a very specific definition. The key thing about PII is that if any one data field of PII is lost, that is very likely to be harmful to consumers. We're talking Social Security numbers. We're talking financial information, passwords. Right? If they lose your data, you're gonna wanna know about it. You're gonna wanna be notified. That's what the Data Breach Notification Act is all about. Data brokers currently also have those same obligations to notify in the event that PII is lost. Okay? So it's not like they're exempted. The current bill just adds data broker to the definition of security breach, which, again, doesn't add anything. They're already considered data collectors for the purpose of that law. The original law had a different definition, called data broker data breach, which has been deleted from this version. And it had a separate reporting requirement: data brokers would have to report data broker data breaches. Okay? So the reason we have this narrow definition of PII is because we don't want businesses to have to put the highest level of security on all the data that they collect. If a company loses everybody's T-shirt size, we don't care. It's very unlikely that T-shirt size is gonna cause harm if it gets out. So we don't require them to report a breach of T-shirt sizes or encrypt T-shirt sizes.
If a data broker loses your T-shirt size and the 4,000 other data points about you that they may be collecting — which together allow them to create a profile of you that is incredibly detailed and incredibly valuable to fraudsters and scammers — there is a chance, a good chance, that consumers are gonna be harmed by that. And they should have a right to know that their data has gotten out there, and the AG should have a right to know that they lost their data. Historically, if you were a con artist, the first stage in your con was to research your target, research the mark, find out information so you could exploit and manipulate them. Data brokers allow con artists to outsource that. Now they just have to buy the data off a data broker, or find it when it gets out. You know, the credentialing should stop the first from happening, if it works. The data breach notification would address the second, if the data gets put on the dark web and people collect it that way. So my point is, in a data broker data breach, enormous amounts of data may be lost. Any one of those data points is not PII, but the aggregate, the conglomeration of it, can be very dangerous. That's what we were attempting to address in the first version. The data broker data breach notification dropped out of the bill, and we just require companies to say how many they had in the last year. I haven't done the research, but I would be shocked if any data broker has reported any more than zero data broker data breaches in the past year. I actually engaged in an investigation of a data broker over a massive data breach when I was in the office. They had filed with the AG's office, and I noticed they didn't even report the data breach that we were doing a 50-state multistate investigation of. And, of course, because we didn't have the extra penalties, we couldn't really do much about that. Okay? So I think it's important to keep that in. And that's all.
I'll move off of that. Second, I suggest you expand the prohibition on fraudulent acquisition of brokered personal information in 2431(a) to change "acquired by fraudulent means" to "acquired by deception." Deception is a statutory cause of action created in the Consumer Protection Act. Because it is actually very difficult to prosecute fraud, this gives the attorney general much more leeway to prosecute deception. I would also add an additional prohibition on the use of brokered personal information in a way that violates the terms under which the person acquired it. I will give you an example. Anyone can acquire the Vermont statewide voter checklist. It's publicly available. All they have to do is fill out a form and certify that they, quote, "will not use the information in the statewide checklist for commercial purposes," under penalty of perjury. You go online, and you will see that there are a lot of data brokers commercially selling our voter checklist, because they got it, they signed the document, and then they ignored that. The only enforceable remedy would be to find a criminal prosecutor who wants to bring a perjury claim against this company, which I can almost guarantee you is never gonna happen — which is why they do it. So put some more teeth in the law, so that when companies acquire data and say, "Oh, sure, sure, we'll use it for the reason you're telling us," and then go and use it for totally different reasons, the AG has the ability to do something about that. David?

[David "Dave" Bosch (Member)]: Can I clarify just a quick point? You're talking about 2431(a), and adding a (d) or something along those lines, on page 14?

[Ryan Kriger (Former Vermont Assistant Attorney General; Deputy Chief, Massachusetts AG’s Privacy & Responsible Technology Division)]: Yes. So a person shall not acquire brokered information through fraudulent means — yes — and then something along the lines of "a person shall only use brokered personal information according to the terms under which they acquired it." In privacy speak, that's called use specification, and most of the comprehensive laws have a use specification requirement in them. Unfortunately, all of those laws are very vague as to how they apply to data brokers, which is why having it in this law would be very helpful. Thanks. Third, when the law was passed, some critics on the privacy side didn't like that it only covered third party data brokers.

[Ryan Kriger (Former Vermont Assistant Attorney General; Deputy Chief, Massachusetts AG’s Privacy & Responsible Technology Division)]: Now, expanding it beyond third party data brokers would have expanded it to almost everybody, and we weren't trying to regulate almost everybody. We wanted data brokers. The current bill does expand the definition to include first party data brokers who obtain data through third party means. So, for example, if you have a Facebook account, Facebook is collecting tons of information about you not directly from you, but through pixels they put all over the Internet. This would bring that within the definition, but they'd still get to use your first party data however they want and not be considered data brokers. I propose putting in something that adds first party data brokers, but only if they collected, say, over 200,000,000 records in the previous year. Okay? Really, we don't wanna pull in the small businesses. We don't even wanna pull in the Home Depots. Just the really big guys — Google, Amazon, and Meta — who are collecting this. And Google doesn't collect third party data. They only collect first party data, but they're so huge that it's essentially the same danger. And finally, recently, the Superior Court of Washington County dismissed the Vermont attorney general's lawsuit against Clearview AI, the company that screenscraped billions of images, including of Vermonters, for an invasive facial recognition tool. The court found a lack of personal jurisdiction, essentially finding that Clearview AI lacked sufficient contacts with the state of Vermont. I don't have exact language to suggest here, but I ask that this committee consider a way to address jurisdictional issues when it comes to data, and to include language that makes it clear that if a company is collecting the personal information of Vermonters, that gives Vermont courts jurisdiction over them. That concludes my remarks, and I'm happy to respond to questions.

[Unidentified Committee Member]: Thank you, Ryan. I'm curious — there are two questions that have come up that I was wondering if you could address, especially given all of your experience. One is that we're basically hearing from industry interests that maybe we don't need this, since we're already working on a comprehensive bill. They said they support the updates to definitions and things like that, but that the right to delete isn't something that maybe we do at this time, because we're working on a comprehensive bill. That's one. And then the other: you started to talk about the fact that when this bill was originally started and passed, it set a precedent for everybody else — it was just a start. And so there was a question about things like the exemptions, since we don't have a comprehensive privacy bill — just working off something like what California has, which I think in a lot of ways is what people working in this space, legislators across the country, are also trying to do: constantly build upon the efforts of other states. So could you speak to whether we should, or shouldn't, base something purely off what another state has done, versus trying to evolve and tighten things up?

[Ryan Kriger (Former Vermont Assistant Attorney General; Deputy Chief, Massachusetts AG’s Privacy & Responsible Technology Division)]: Sure. Okay. So I don't think that this law should take away from the comprehensive bill, but I think that this covers issues that the comprehensive bill doesn't cover. The thing about all these comprehensive bills — and I have read all 19 states' worth of them; I can't quote them back to you, but, you know, it's my nighttime reading — is that they are all modeled on this idea that a company is collecting data directly from an individual, and then that individual has these affirmative rights over what the company can do with the data. They can ask if the company has their data. They can ask to correct the data, and things like that. The laws are very poorly modeled for the third party data broker — the situation where there's a data broker who has no connection to the individual. Technically, they fall under the law. How the law would actually apply to them in practice is unclear. Okay? And the idea that a consumer is expected to individually reach out to the 3,000 data brokers — who they don't even know who they are, and they're constantly changing their names — and email them or write to them and ask them to delete their data: it's ridiculous. It's not gonna happen. So what this does, and I think what it's teeing up if it goes in the California direction, is that one-stop shop to say to data brokers, "I just don't want you having my information. Please delete that information." It's a different problem — very related, but a different problem and a different solution. So I think that, you know, that's why I would suggest going forward with this. As for exemptions that can be found in other laws, I do think you have to be careful about exemptions. Exemptions can become major loopholes. I think that there are good examples. So, for example, credit reporting agencies are covered by the Fair Credit Reporting Act.
And if we tried to give people a right to contact credit reporting agencies and ask them to delete all their data, I think that would probably be preempted by the Fair Credit Reporting Act. And I'm not sure that's good policy in the first place. But to say that any type of data that is covered by the Fair Credit Reporting Act — which I think is what this currently does — is exempted, is so broad, because the Fair Credit Reporting Act covers any data that can go to a person's character. So you're kind of throwing out the baby with the bathwater if you include that exemption. I think what you need is a combination of data-level and entity-level exemption, because all of the credit reporting agencies have both a credit reporting arm, which is FCRA-covered, and a data broker arm, which is non-FCRA-covered. So if you say "if you're covered by the Fair Credit Reporting Act, you're exempt," then that allows all of these companies to keep running their data broker side with no oversight, and it encourages all the data brokers to affiliate themselves with what they can call a credit reporting agency in order to get around the law. So you gotta be very careful about that sort of thing. And lastly — and I don't wanna take up more time, so please cut me off if you want — I know that you have heard from a number of entities who have come in and said, you know, we could really use this data for this; we'd rather you not take our data away for that. There are all of these different uses. Okay? And those uses that they wanna use it for are valid. Privacy is not an absolute right. Privacy is always in balance with some other valid purpose. Okay? There's always a tension between the two. It could be national security. It could be policing. It could be credit reporting or marketing or public health. There's always a reason. The problem is that with no laws in place — no laws ahead of time — the person who wants to invade your privacy always has a good reason to do it. It's usually their business.
It's their life mission, like law enforcement, to, you know, surveil — to invade the privacy. So they're always gonna say, "Yes, I think our narrow use case is more important than the privacy impact." They're always looking at their narrow area. There is never a privacy advocate in the room saying, "Yeah, but you gotta think about the abstract big concepts. Do you really wanna do this?" That's not how any of that works. The only people who can stand in for the people of Vermont — to talk about the value of their privacy, when they're not in the room — are the people sitting in this room here, through the laws, through statutes, to say: look, in X situation, we think privacy does trump your valid use. Okay? And just to be very clear — because, you know, you could hear someone come in and say, "Look, if you pass this law, we're gonna fight less crime. We're gonna catch fewer criminals." It's very hard to argue against that — that more criminals will get away. Right? It's a very compelling argument. But you have to remember, first off, that's a truism. The more surveillance you have, the more criminals you're gonna catch, period. Right? If you really wanna catch all the criminals, just make everybody turn on the microphones and cameras on their phones at all times, and then you'll catch a lot of criminals. I don't think anyone wants that. And the second thing to remember is: we have had a policy that we will let the criminals get away if it violates people's privacy since the founding of our nation. It's called the Fourth Amendment. If police collect evidence in a way that violates our rights against unreasonable search and seizure, that evidence gets thrown out. That's the exclusionary rule. So the Supreme Court has been trying to figure out the balance of that really narrow thing for two hundred fifty years, but we acknowledge that balance. So when you hear someone come in and say, "Oh, but we have this really good reason" — sure, you do.
But does that reason justify a massive industry of shadowy companies — who are unaccountable, and have shown themselves to be irresponsible in the past — collecting all of your data and then also being able to do whatever else they want with it, so that you can have your specific purpose? That's what's going on. That's the question. It's not the purpose. It's the way that the data ends up getting to them, through this massive industry.

[Unidentified Committee Member]: So thanks very much. Number one, you had written remarks — I don't know if they're posted yet. I was curious about what your recommendations were around the specific statutory provisions, and about seeing those, and also the court case that you mentioned about personal jurisdiction. That would be helpful, if those are in your written remarks.

[Ryan Kriger (Former Vermont Assistant Attorney General; Deputy Chief, Massachusetts AG’s Privacy & Responsible Technology Division)]: suggestions are in it and I can attach the court case. I have a copy of it.

[Unidentified Committee Member]: Okay, thank you. A more specific issue that I think came up in some testimony was, we're request to exempt from the statute in some way. And I haven't seen the actual language, but somehow to deal with insurance companies and financial institutions and the data that they, I think the argument was that they need data that data brokers have, and so there should be some sort of exemption. Again, I'm not sure how you frame that, but there should be some exemption. Have you dealt with those kinds of issues? What do you think about that?

[Ryan Kriger (Former Vermont Assistant Attorney General; Deputy Chief, Massachusetts AG’s Privacy & Responsible Technology Division)]: Okay, so there was a fairly famous instance that happened. It was reported in the New York Times, and it led to a Federal Trade Commission investigation involving General Motors. It turned out GM was collecting enormous amounts of data from their smart cars — your braking patterns, how fast you were going, geolocation — and selling that to auto insurers, who were then using that information to increase people's premiums, because they said you're braking too much, even though, you know, it could be that you just have a pothole in your driveway and so you brake. It was completely disconnected from any actual use. This is an example of the insurance companies saying, "We need extra data because it will help us." Okay? Yes — the more data insurance companies have to surveil you, the more they're gonna use it to micro-target your premiums and generally raise your premiums. Okay? Banks are gonna use this for marketing purposes; for credit, the Fair Credit Reporting Act already exists. Okay? They already have a system that has been in place since 1970 to collect this information. This bill is not gonna make that go away. What this bill might harm is their marketing opportunities, and their uses of data that fall outside the Fair Credit Reporting Act — to which you ask: well, what are those uses of data, and why are you using them? Okay? I think if they want an exemption, they should have to have very specific, compelling reasons for that exemption, and then you consider it and decide if it's worth the risk that this current system puts on people to get that information.

[Unidentified Committee Member]: So I think the case they were making — and I would like to see the actual language too, clearly, so that it's specific and tailored to the legitimate purpose — but I think the rationale they were offering is: well, we need this information in order to do our job as a regulated industry, in terms of assessing risk in the case of an insurance company. And I guess in terms of banks, with the lending. And I'm wondering — I'm imagining you might be able to frame something that was targeted to some legitimate purposes, but didn't include other purposes. What do you think about that?

[Ryan Kriger (Former Vermont Assistant Attorney General; Deputy Chief, Massachusetts AG’s Privacy & Responsible Technology Division)]: Well, the problem is that this isn't about their necessary purpose. It's about what happens when you ask the data broker to delete your data. Right? Correct. And so the data broker is gonna say, "Oh, no. We use some of this data to sell to the bank sometimes, so we're not gonna delete it, because it falls under that exemption." That's the issue: you're kind of creating these broad exemptions. Like I said, if you draft it so it doesn't conflict with the Fair Credit Reporting Act — which you really should do, because it'll probably be an invalid law otherwise — the Fair Credit Reporting Act covers insurance companies. It covers banks. They will still be able to get your credit report. And that is the highly regulated, well-known, transparent report that Congress regulated fifty years ago and that we all know about. The question is whether the banks should also get to use some additional shadowy information, where we have no idea what's being collected or why it's being used. Under the Fair Credit Reporting Act, by the way, if they give you an adverse decision — if they deny credit — they have to show you your report. And if it's wrong, you get a chance to correct the report. It's transparent. If they make an adverse decision based on data broker data, they don't even have to tell you they were using data broker data, where the data came from, what the report said, any of that stuff. So why not keep them within the bounds of the federal legislation that we've had all this time? And, you know, again, it's a truism to say that if you have a bunch more data coming from surveillance, you're gonna be able to do cool things with it. But that doesn't take into account the risk that this system puts on the individuals in getting their data.

[Michael Marcotte (Chair)]: She's rolling.

[Ryan Kriger (Former Vermont Assistant Attorney General; Deputy Chief, Massachusetts AG’s Privacy & Responsible Technology Division)]: See you again. You as well. He's coming in. We appreciate it. Thank you for the opportunity to speak on this bill, and I'm very excited. I hope you guys do a good job with answering the bill.

[Unidentified Committee Member]: Great. Thank you.

[Zach Tomanelli (Vermont Public Interest Research Group)]: Thank you. For the record, I am Zach Tomanelli. I'm the consumer protection advocate for VPIRG, the Vermont Public Interest Research Group. We are the state's largest environmental and consumer protection advocacy organization. I'm here in that latter capacity, weighing in on consumer protection. Ryan thankfully stole a fair amount of my thunder, but that's good news for you all, because I can probably keep this fairly brief and get people to lunch. I won't repeat a lot of what was said there. I think what you heard — especially with regard to being careful around exemptions, because I know that's been a discussion the last couple of days on this — is really important. So I just want to give VPIRG's high-level perspective, and then I'm happy to take questions. And I did submit written testimony, which covers most of this. Just by way of a little background: in 2018, VPIRG was one of the leading proponents of Act 171. That was the groundbreaking data broker law that Ryan just summarized for you all. We advocated for that at the time. That was actually the first bill that I ever testified on in this building. I was sitting here, Chair Marcotte was one seat over, and Ryan was here, but I think everybody else in this room was different. And at that time, as you just heard, we wanted to go further. We wanted to do what this bill does, which is add the credentialing and give people the right to delete their information. But no jurisdiction in the world at that time had even set up a data broker registry. So this body, I think, prudently saw fit to say: let's take that step. Let's set up a registry. Let's have data brokers report to us, tell us about their business practices; we'll learn a little bit more, and we'll take it from there. Well, it's been eight years. We've learned more.
In that time, California, as you've heard, has followed in our stead, and they've gone even a step further in establishing both the right to delete and now a full mechanism where consumers, in a pretty seamless one-step process, can go and request their information be deleted. So at a high level, VPIRG supports this bill because we believe that Vermont consumers should have the same rights as their California counterparts. We have the registry in place, and we think giving that right to delete and getting a mechanism set up is the next right step. Now, the most recent draft of the bill, as we've heard in testimony, takes that universal opt-out mechanism and moves it to a study. We would like to see that move forward more rapidly, but we understand why the study is being asked for; this is not something you can just build at the drop of a hat. So provided that the committee and the legislature recognize that, and I've heard the terms "interim step" and "half step," we support it. It is a half step, a meaningful half step, but a half step nonetheless. So we support the idea of giving folks the right to delete their information with data brokers and giving the Secretary of State the ability to find out what it would take to actually effectuate this and get this universal opt-out set up. Just really quickly on why this bill is necessary, and Ryan talked a little bit about this, I want to reiterate this point, especially because it comes in the context of also discussing a comprehensive consumer data privacy bill and how this might interact with that, or whether it is necessary if that moves forward. Going back to 2018 when we testified on this bill, the nature of data brokers, their third-party nature, the fact that they do not have a direct relationship with consumers, that was why we even started to look at this issue and this industry in the first place. And it really is distinct.
When we think, and we're a consumer protection advocacy organization, when we think about consumers, where their power in the marketplace comes from is the ability to, quote, vote with their wallet. Typically, we can decide what retailer to go to, what website to visit, what organization we might want to give information to. And data brokers are unique: we don't even know that they exist. We do not make any decision about what data broker to give our information to or not. So we see this as actually correcting a market imbalance. We are restoring some power back to the consumer by telling them who the data brokers are that have their data, because they don't even know, and then giving them the right to delete. That just makes a whole lot of sense, because otherwise they don't have that vote-with-your-wallet ability when it comes to these third-party companies. So we think it makes sense to give them the right to delete. And then, if you're going to give that right, it only really matters if it is practically possible to exercise. As you've heard, there are, I think, over 300, maybe even over 400, data brokers currently registered with the state of Vermont. There are another 700 on the registry that are marked as unregistered, which means that they were maybe registered at some time. So we're talking hundreds, if not thousands, of these companies. It's not practical, as you've heard, for a consumer to go and research the opt-out policies of 400, 500, 600 individual data brokers and opt out. So I think the case has been made, but I just wanted to, as we're heading toward the Friday break and lunchtime, reiterate: you're not really granting this right in a practical manner unless you couple it with the universal opt-out mechanism. So we're going to study that, see how we can make that happen. We support that, but the job is not done until that happens, as far as we're concerned.

And then the last little point I wanted to make, and I think maybe something to keep in mind should you decide to go forward with this, is: what are the actual harms that we're trying to stop? What are we protecting people from? Because sometimes we get into some of these discussions or debates, and we lose a little bit of sight of the tangible: what are we trying to do here? Again, we're not supporting this just for privacy for privacy's sake, although there is a valid argument to make there. Ryan just articulated really well that tension between privacy and legitimate needs. But I've provided a few examples here that you can follow up on; I linked them. In the past decade, we've seen countless examples of real harms from the over-collection and misuse of consumer information by data brokers. In 2014, according to a complaint by the FTC, the data broker LeapLab bought payday loan applications of financially strapped consumers and then sold that information to marketers it knew had no legitimate need for it. At least one marketer, Ideal Financial Solutions, used the information to withdraw millions of dollars from consumers' accounts without their authorization. That's a pretty brazen example, but one that happened. In 2020 and 2021, the Department of Justice charged three data brokers: Epsilon, Macromark, and KBM. Again, sidebar: these are companies that have information about us. You've never heard of them. You don't know who they are. You would have no way of interacting with them unless we had a registry and, perhaps, this law.
But anyway, these three data brokers were charged with conspiracy to commit mail and wire fraud for knowingly selling, for roughly a decade each, lists of vulnerable Americans, including elderly Americans and people with Alzheimer's, to criminal scammers. The scammers then used that broker data to steal millions of dollars from these people. And in 2022, the FTC sued the data broker Kochava for selling geolocation data from hundreds of millions of mobile devices that could be used to track individuals to sensitive locations, including reproductive health clinics, places of worship, homeless and domestic violence shelters, and addiction recovery facilities. The data could reveal where people live by tracking devices at night and combining location information with property records to identify individuals. In all these cases, you may be saying to yourself, well, the government was able to bring action against these bad actors, and that's true. But it could only do so after the harm had occurred, after the fraud had been done. If you were to move forward with this and give people the right to delete their data, and then eventually a universal opt-out to make that happen, you might be protecting future consumers from these sorts of harms before they even happen. People get the right, they exercise the right. So again, at a high level, I just want you to keep that in mind: you are making consumers safer. You are protecting consumers if you advance this bill. So VPIRG supports it. We supported it very strongly as introduced, but even the current draft, we think, is a meaningful step, and we hope you advance it. Thanks for the time to testify today. And we're right at 12:00. I probably could take a question, but people are starving, I get it, and you could catch me some other time. That's fine too.

[Michael Marcotte (Chair)]: Questions for Zach?

[Unidentified Committee Member]: Just one sec. I think yesterday we heard appreciation for the exemptions that we're trying to get at, the use cases of processing for the business cases and stuff like that. And also, I think most witnesses in this case were saying that while they appreciated the exemptions, they still would like further exemptions, but they can't actually provide evidence to justify those. So I was just curious if you have reflections on that.

[Zach Tomanelli (Vermont Public Interest Research Group)]: My reflections are somewhat similar to what you just heard from Ryan, and that's to say, I think what was described yesterday, use-case exemptions, if we want to call them that, are very appropriate for this type of bill. Because the issue is, if you broaden it to say a consumer cannot delete any data that is XYZ, and that XYZ is too broad, now you effectively have an exemption that swallows the rule. And that's always the concern we have here. So again, and this was the discussion you were just having, I guess I would like to hear, with more specificity, what are those uses? What are the uses of data that you want to make sure you have access to? And we should also just recall, this is not a ban on the usage of data brokers. It is giving individual consumers the right to delete. So if a company, a bank or insurance company, what have you, is currently working with data brokers and getting information, if this law were to go into effect, it doesn't cut off that data from day one. It doesn't prevent them from continuing to use that business. It might give some individual consumer, should they opt to make use of it, the ability to have their information removed from certain data brokers. So maybe instead of you getting 100,000 records, you get 90,000 records, because 10,000 Vermonters decided that they wanted to opt out. We think it should be their right. So does that answer the question?

[Unidentified Committee Member]: Thanks, Zach.

[Michael Marcotte (Chair)]: Thank you. Enjoy your lunch. Well earned. Thank you, Mindy.

[Michael Marcotte (Chair)]: Back here at one, we do have a look at an amendment to H674, which is a sister state bill, and then some more testimony on three eighty five, per the agenda.

[Ryan Kriger (Former Vermont Assistant Attorney General; Deputy Chief, Massachusetts AG’s Privacy & Responsible Technology Division)]: discussions before we