
[Speaker 0]: Good afternoon, everyone. This is the Vermont House Committee on Commerce and Economic Development. It is Friday, 02/06/2026, at 01:04 in the afternoon. And so to finish up our day, we're going to have a walkthrough and take some testimony on H650, an act relating to educational technology products. First, we have Rick Siegel from the Office of Legislative Counsel, who is going to give us a walkthrough.

[Rick Siegel, Office of Legislative Counsel]: I can make it. Rick Siegel with the Office of Legislative Counsel. I did a walkthrough Monday or Tuesday. Mhmm. So I'm happy to

[Speaker 0]: Walk through soon? Yes. Right now, you're.

[Rick Siegel, Office of Legislative Counsel]: You were here physically, sir.

[Unidentified Committee Member]: All due respect. You

[Rick Siegel, Office of Legislative Counsel]: were in that chair.

[Jeff Wallace, Director of Technology, Mount Mansfield Unified Union SD]: I'm in the

[Speaker 0]: vice chair without you. Maybe that's good. My brain was gone.

[Rick Siegel, Office of Legislative Counsel]: See how it is. But yes, happy to maybe go high level instead of details. All right, let me

[Speaker 0]: Yeah, think it's the hospital. Sure.

[Rick Siegel, Office of Legislative Counsel]: So I will join the Zoom. And so, yeah, H650, as introduced, no amendment here, is a bill that would regulate educational technology products. And it looks similar to the data broker registration, because it is based on the language of how data brokers currently register, not on how this committee is discussing changing that registration. So I am getting into Zoom here; it's going to help me. I'm going to request to share my screen. Is Kenny here? All right. Enough of my small talk. All right. So the first page of the bill, as I mentioned last time, there's, in my opinion, a technical correction to our student privacy subchapter: the current language in the enforcement section indicates a violation of "this chapter" would be a deceptive act, and that is the entire chapter 62. So I think the intent was for it to be the subchapter. I'm going to find a way to correct that somehow this session, either through this bill or another bill, but I don't think that was the intent when we passed that bill several years ago. So the main part of the bill is on page two, where we have definitions. These are pretty self-explanatory of what a product is and what a provider is. Fairly broad: it is a student-facing software application or platform that collects, processes, or transmits student data and that is used for teaching and learning. So it has to be a product that does this one thing with student data, but it also must be used for teaching and learning in a school. A provider is a person that operates one of these products that is in use at a school, with or without a contract. I understand that some of these products are just used as free; there's no contract with a free tech tool, so it would cover those products as well. And then we have a definition of school cross-referenced in Title 16. The requirement to register is on line 12: annually, on or before January 31 of each year. Again, this is the current data broker language; feel free to amend this as you will. They would register as a provider of an EdTech product with the Secretary of State. They pay an annual fee of $100. They will provide their name; the primary physical, email, and Internet addresses of their company; the most recent version of the privacy policy and terms and conditions in use by the product; and a list of the names of all the products operated by the provider and which products, if any, have been certified by the Secretary of State pursuant to the subchapter. A provider that fails to register is liable for a civil penalty of $50 for each day, not to exceed a total of $10,000, in addition to the fees that would be due under the section and other penalties imposed by law. This does not otherwise limit the provider's responsibility to comply with the provisions of the subchapter. So if an EdTech product or provider does not register, they are still governed by this subchapter, just like a data broker. The AG may maintain an action in civil court to collect the penalties imposed in the section and to seek appropriate injunctive relief. Product certification: the Secretary of State shall have the sole authority to certify an educational technology product and to create a form on its website where a provider of an educational technology product can apply for the product to be reviewed by the Secretary of State for certification. No school shall use an educational technology product that has not been certified by the Secretary of State. The Secretary shall develop, publish, and annually review the standards for certification. In developing these standards, the Secretary shall consider the following: the product's compliance with state curriculum standards; the advantages of using the product compared with nondigital methods; whether the product was explicitly designed for educational use; design features of the product, including any geolocation tracking, use of artificial intelligence, targeted advertising, personalized recommendation systems, access to adults unknown to a student, and features that would lead to compulsive use; also the data privacy practices of the provider; and any other factor the Secretary believes is relevant to the education, privacy, and safety of students. Notwithstanding certification, a certified product shall be compliant with all federal and state privacy laws, including COPPA, and include clear and easy-to-understand product information.
Provide the following to a parent or guardian of the student: what personal information of the student is collected by the product; how the personal information collected pursuant to subdivision (A) of (3) is maintained, used, and shared by the product; and the ability to access, correct, and delete the personal information of the student. They shall not collect student data that is not essential for the product to function effectively for the purpose for which it is being utilized; demographic data of a student, except for the name and grade level of the student; or behavioral, interactional, or sensitive health data of a student. And they shall not use any data collected to sell or share with a third party or to create a student profile for noneducational uses, which includes targeted advertising and disciplinary action. The Secretary shall post on its website and update a list of EdTech products that have been certified by the Secretary, and may list products that are under active consideration and products that have failed certification. Nothing in the section shall be construed to limit or alter obligations under the Individuals with Disabilities Education Act or the Americans with Disabilities Act; schools shall provide reasonable modifications and necessary assistive technology to ensure a free appropriate public education and equal access. Enforcement: a provider that violates the section commits an unfair and deceptive act in commerce in violation of section 2453, the Vermont Consumer Protection Act. The Attorney General shall have the same authority under the subchapter to make rules, conduct civil investigations, bring civil actions, et cetera. Last page, section two: this is the transition period for certification. A school shall submit a list of educational technology products that are currently in use at the school to the Secretary of State on or before 12/15/2026. A school may use an educational technology product that has not been certified by the Secretary on or before 06/30/2027. The act would take effect 07/01/2026, except the certified-product requirement would take effect a year later to give the Secretary time to develop that process.
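(A reader's aid, not part of the testimony: a minimal sketch of the registration penalty arithmetic Siegel describes, using only the figures from the walkthrough, the $100 annual fee, the $50-per-day penalty, and the $10,000 cap. The function and names are illustrative, not language from the bill.)

```python
# Penalty arithmetic as described in the walkthrough: $50 for each day of
# failing to register, capped at $10,000 total, in addition to the $100
# annual fee that is still owed.

ANNUAL_FEE = 100       # dollars, due with registration
DAILY_PENALTY = 50     # dollars per day of noncompliance
PENALTY_CAP = 10_000   # dollars, maximum total penalty

def amount_owed(days_unregistered: int) -> int:
    """Total owed by a provider that failed to register for the given days."""
    penalty = min(days_unregistered * DAILY_PENALTY, PENALTY_CAP)
    return penalty + ANNUAL_FEE

# The cap is reached after 200 days: 200 * $50 = $10,000.
assert amount_owed(200) == 10_100
assert amount_owed(365) == 10_100  # still capped
```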

[Unidentified Committee Member]: Just trying to get an idea of the scope of some of these things. In work environments, you'll have, like, internal notes that aren't customer facing. And there must be some way of communication between teachers, and I'm wondering if, in those communications, there could be behavioral issues, perhaps ways to deal with them or something like that. And with the fact that they're talking about some of those things here, would that be the same thing, or would that be a separate category of educational software?

[Vice Chair Edye Graning]: Yeah. So what the bill is intended to do is to set up a system to make sure that only products that actually can provide quality educational services are ever in our schools. And so this is the first system of vetting. And I'm sure the Secretary of State's office is going to say that they might not be the best to vet these products. And we've already been in conversation with the Agency of Education to talk about that. And there are outside organizations that are not part of the giant tech companies that could also help us with that. And that's not written in this bill yet. But that's the goal. The goal is just to take it off the plate of teachers, off the plate of our principals and schools, who can't both make sure that quality education is happening in their buildings and continue to vet these hundreds of products that are coming out so fast. There's so much money to be made in this. And so we're hopefully gonna flip the script on this and say, if you wanna do business in Vermont, you have to make sure that you do these things. I mean, the Secretary of State's office holds a registry for us for lots of different businesses. And so the idea is we could have them register there, that they've met these minimum requirements, and have it vetted in a different way, but they would just hold that information and certification. We have a whole list of people who are experts in it. And so I would love to give them as much time as we have so that we can learn from them, because I have a lot to hear from them too. Thanks.

[Speaker 0]: Anything else for Rick? Good. I'd be really surprised to see you on the agenda wanting to testify about it.

[Unidentified Committee Member]: It says on here Jill at 01:15. Is that Jill Anderson? So that means she has two months. Specific civil body change.

[Speaker 0]: It's a multiple cases.

[Lauren Hibbert, Deputy Secretary of State]: Good afternoon. For the record, Lauren Hibbert, Deputy Secretary of State. Really nice to be with you today talking about this bill. The Vice Chair kind of gave away my testimony, because we have been in communication. You know, I really applaud the effort and goal of this bill. However, I am concerned about the placement of all of this activity within the Secretary of State's office. As you know, it includes a registry, and you've heard us talk about the registry. We can do registries. It's not a functional problem for us to do a registry. But I think even with the registry, I would ask this committee to think about whether or not a better place to have approved products would be through the Agency of Education, perhaps even contracts through the Agency of Education that schools and school districts could use for economies of scale, so that it was overseen at the state level, because there certainly is efficiency in joining contracts as one entity as opposed to multiple districts. Because things are shifting so much right now, I always think that's a great thing to think about. And that would be an approved list, essentially. And I'm very concerned about our office doing any certification of educational technology products. I understand in drafting this bill that this was a good solution for the initial draft. But just to put a really fine point on it, we don't have a single person on our staff who is an expert in any of the factors that are outlined in this bill, really important factors that are outlined in this bill. We don't have anyone who is aware of what the curriculum is in the state. We don't have anyone who is an expert on technology and its use in the educational arena. We don't have anyone who's an expert on privacy. You know, we have the data broker registry, but we are not going into the privacy practices of data brokers. And if that is a space that we were to get into, I would need to have a lot more conversation about what that would entail and who we would need on our team. I would expect that this would be at least a full-time FTE, because this would be a person who was an expert in this. This is not something we could train a member of our staff on, and I don't think that would be the intent of the bill. And, you know, with data privacy, if we were the ones who were responsible for it, we'd want to get it accurate and use a lot of our colleagues across the state, including ADS and the Agency of Education and the Attorney General's Office. This would be a large lift for us. And so, you know, I'm happy to hear, Madam Vice Chair, that you're open to other places, and I would love to be part of the conversation about other places. But I respectfully really don't think this is a Secretary of State function. I understand why the registry was put with us. But even that, I would encourage this committee to think about other solutions beyond a registry, including statewide contracts, a requirement that products be in a statewide contract approved by the Agency of Education, solely because that would mean the agency was overseeing these contracts. And also, it could save the state money.

[Speaker 0]: Questions for Lauren? Thank you, Lauren.

[Lauren Hibbert, Deputy Secretary of State]: Thank you so much. I'm going to listen to the rest of the testimony remotely and go testify about lobbying. So thank you very much for getting me in so efficiently.

[Speaker 0]: Jill, did you have a hard stop? You're muted.

[Jill Anderson, Elementary School Teacher (Westchester, NY)]: Okay. Hi. Yeah. My class is at specials till 01:45, but I do have someone covering for me if I need a little longer. Thank you.

[Speaker 0]: Go ahead. You can testify now.

[Jill Anderson, Elementary School Teacher (Westchester, NY)]: Oh, okay. Great. Thank you so much. All right. So good afternoon, Chair Marcotte, Vice Chair Graning, and members of the committee. Thank you for the opportunity to be here today and for your attention to this important issue. My name is Jill Anderson. I'm a veteran public school teacher with twenty years of experience, and I currently teach third grade in Westchester, New York. I'm testifying from my classroom, as you can see, this afternoon, and that feels especially meaningful given the topic before you. I hold a master's degree in educational psychology. Over the course of my career, I've witnessed a dramatic increase in the amount of time students spend engaged on devices, especially since COVID, when one-to-one devices and EdTech platforms quickly became the classroom norm. Simultaneously, there have been significant changes in students' attention, emotional regulation, social development, independence, and ability to engage in pretend and creative play. These shifts did not happen overnight, but they have become increasingly difficult to ignore. After reading The Anxious Generation, which connected what I was seeing daily with a growing body of research, I knew families needed more than awareness. That realization led me to create Mindful Tech Lessons, where I facilitate hands-on, in-person workshops for caregivers and educators with practical tools, strategies, and real books to support healthier technology practices at home. These workshops are intentionally designed around human engagement, collaboration, and discussion. My goal is to educate and empower the adults in children's lives to make informed, developmentally appropriate decisions around technology in a world that is changing faster than our policies and systems often can. Through this work, I came to another realization. These changes cannot only happen at home. Schools must be selective and critical about the screen experiences we provide children during the school day. What I'm now seeing is a shift where many families who have intentionally avoided iPads at home are seeing their child's first exposure occur in kindergarten. An iPad is essentially an oversized iPhone, often filled with highly stimulating, gamified applications. It is not developmentally appropriate for children to spend significant instructional time tapping and scrolling instead of squeezing, building, writing, turning pages, drawing, and playing. The more programs and platforms students are exposed to at school, the more schools unintentionally signal to families that these products are educational, safe, and appropriate for home use. Schools are not only educating children; we're also helping to shape cultural norms. I've always taught in large, diverse districts where students' needs vary widely, and educators want to do what's best for children. When presented with tools that promise differentiation, improved outcomes, or added support for high-need learners, it is understandable to be hopeful. The problem is that in many cases, these tools are not delivering on those promises and are creating new challenges instead. Many of these products are highly distracting and heavily gamified. Students spend time customizing avatars, rushing through tasks to earn digital badges, pretending to read books to mark them complete, or focusing on flashy rewards rather than comprehension.
I have seen typing programs so gamified that children use one finger simply to unlock and get to the next incentive, and coding programs that include unsafe chats or embedded marketing. I'm fortunate enough to teach in a district that does not mandate specific EdTech platforms or required usage minutes. However, EdTech shows up very differently across districts and even across classrooms. Parents are often unaware how frequently devices are being used. While some districts require daily minutes on programs, others frame these tools as supplemental. A supplement, however, is only as strong as the core curriculum. When physical text, manipulatives, and hands-on materials are lacking, teachers are more likely to rely on screens to fill those gaps. We then need to ask some basic questions about function and purpose. Why are children practicing letter formation on screens instead of developing proper pencil grip on paper? Why are students playing math games alone on devices instead of collaborating with peers using physical materials? Why are children reading aloud to an app instead of to a teacher or classmate, or learning measurement on screen instead of using real rulers and real objects? These products are often replacing real books, decodable texts, math manipulatives, and hands-on learning experiences. As a result, many children are struggling to sustain attention and engage with longer, more complex texts. Regarding this bill, I believe holding these products to a high standard is essential. There are simply too many apps and platforms for educators and districts to adequately vet. I served on an interview committee for a director of technology position in 2016, before one-to-one devices became widespread, and even then, it was clear how complex and demanding this role is. These professionals must manage privacy compliance, data protection, IT systems, budgets, communications, and instructional needs simultaneously. Not every district has the capacity or expertise to thoroughly vet every product, and even well-intended leaders can be influenced by aggressive marketing. When mistakes are made, an entire district of children is affected. I would urge that any group responsible for vetting products be free of conflicts of interest. If fewer products meet strong criteria, districts will be encouraged to invest in higher-quality curricula, collaborate more intentionally, and prioritize resources that do require this level of oversight. That benefits students, educators, families, and schools. Speaking as a parent, I strongly support this bill. I requested to opt my kindergarten child out of one-to-one iPad use, but the district refused, a concern echoed by many other parents. I believe many teachers want to honor these requests but feel constrained by district-level decisions. One of my personal greatest fears is that public schools will continue to lose students to families seeking more balanced, less screen-heavy educational environments. This is already happening. Many private, religious, and Montessori schools are intentionally tech light. Public schools are pillars of our communities, and we must protect them by ensuring they remain places where children can thrive. Thank you for your time and for taking this issue seriously. I would be happy to answer any questions.

[Unidentified Committee Member]: Thank you, Jill. Michael? I'm assuming you have a script there. Can you forward that to us?

[Jill Anderson, Elementary School Teacher (Westchester, NY)]: Absolutely.

[Unidentified Committee Member]: Thank you so very much.

[Jill Anderson, Elementary School Teacher (Westchester, NY)]: Do you know where I would forward it to? The same person who emailed me the link?

[Todd Daloz, Assistant Attorney General]: Yes.

[Jill Anderson, Elementary School Teacher (Westchester, NY)]: Absolutely.

[Unidentified Committee Member]: Thank you so very much.

[Jill Anderson, Elementary School Teacher (Westchester, NY)]: Of course.

[Speaker 0]: Any questions for Jill? Jill, thank you for your time, and we'll let you get back to your class.

[Jill Anderson, Elementary School Teacher (Westchester, NY)]: Thank you.

[Speaker 0]: Todd? Good afternoon. Good afternoon.

[Jeff Wallace, Director of Technology, Mount Mansfield Unified Union SD]: Been a long time, Todd.

[Todd Daloz, Assistant Attorney General]: It's good to see you all. Looking fresh today. Todd Daloz, Assistant Attorney General, for the AG's office. Happy Friday. Maybe it'll be the last time in front of you this week.

[Unidentified Committee Member]: A lot

[Todd Daloz, Assistant Attorney General]: of reasons I hope that's nice. We don't have a lot to say on age six fifty. I think it's, as has been laid out, presents a compelling case. The AGO doesn't operate a lot in the educational space. We do operate a lot in the consumer protection space. And I think the way that this bill the ends that this bill is focused on bring those two pieces of work together. I think this bill also represents continued work by this committee and by the legislature generally to focus on the health and well-being of children, especially in the face of significant incursions into all aspects of our lives by technology. And I think a deliberate approach to figuring out how to vet technology and ensure that appropriate use of technology is happening in educational spaces, especially public educational spaces, is really vital. All of that said, I think there are and and we are happy to enforce in that space as the bill moves forward. Obviously, we'll need to see, given the secretary of state's testimony, how that plays out. But as a general matter, we don't have concerns on that front. The the main point I would raise, and I'm happy to talk to ledge counsel about some little questions I have. I'm sure he will set me right. The main issue I wanted to raise as you think about moving this forward and another entity potentially stepping into the place of the secretary of state in terms of providing what the certification standards are is is ensuring that we have congruence between this and the data breach notification law. What we wouldn't wanna see is a scenario where an entity was compliant with this law and not compliant with the data breach notification law, and we we might therefore have trouble enforcing one in the face of the other. So I think that's pretty easy fix. I think we can certainly work with ledge council on that and present any of that to the committee. But otherwise, we're in support of where the where the bill is going and and voice our strong support for overall concepts of really protecting kids in the space. So happy to answer questions. I imagine as things get fleshed out a little bit more, we'll have it may be valuable for us to chime in.

[Vice Chair Edye Graning]: I have a statement, really. I don't have a question. I appreciate your framing, because the reason it's here, we're not the education committee. The reason it's here is the consumer protection angle and protecting our kids in a time when there's just so much proliferation in the market that nobody can keep up with it and say, how do we do that? And so we didn't really set that stage at the beginning, and I'm glad that you just did that.

[Todd Daloz, Assistant Attorney General]: Well, I think Jill from Westchester just brought that to the fore too. Right? I mean, you all know as well as anyone how overworked our public school system is, how challenging it is. My mother was a school nurse for twenty years in Saint Jay, and, you know, it was twenty years ago that she retired. But the amount of work that gets put in the bucket keeps increasing, and the bucket is a limited size. And it makes complete sense that our public school professionals are looking for ways to offload work, are looking for ways to support students, when they're also trying to ensure folks have winter coats, ensure the kids are getting food, and ensure that they have a classroom that is functional for them. You all know that. I don't need to opine in that space. All I would say is we support the bill, and we think helping that work move forward from a consumer protection angle makes a lot of sense.

[Unidentified Committee Member]: And I actually have a question. I'm curious about where the obligations or the rights of school districts to make selections about educational content end, and where what's under Title 9, chapter 62 begins. And is that super clear? Or what are your thoughts on that?

[Todd Daloz, Assistant Attorney General]: Yeah. One, I think this process is how to define where those lines lie. Right? I mean, you will craft the statutes and determine the congruence around that. I think a vital partner in this is the Agency of Education, and it's so valuable to hear that they're engaged in this process too in terms of how the legislation may move forward. I would say, and maybe this is the way to think about it, but you're getting it right off the top of my head: consumer protection is thinking about what is prohibited conduct. Educational attainment is about what we are trying to do for kids. Right? So one is the kind of baseline: you can't, as the bill is currently laid out, collect certain types of data, create certain types of engagement or content, and use algorithms that are made to increase the addictive qualities of the product. That is the appropriate level of consumer protection. And then, you know, how the educational content fits into that structure is a space, I would say, for the education professionals.

[Speaker 0]: Thanks. Doctor Horvath?

[Dr. Jared Cooney Horvath, Cognitive Neuroscientist]: Yes, sir. Thank you all so much, and hello. It's lovely to e-meet you all. Wish I could be there in person, but thank you for letting me do this over Zoom. My name is Doctor Jared Cooney Horvath. I'm a former teacher turned cognitive neuroscientist, and my focus is on human learning and development. And I'd just like to make clear at the outset that I do not now, nor have I ever, received funding from any tech company. I just wanna point out something real quick looking at this bill. So I worked in Boston for a long time. I was at a hospital, and we would develop drugs for different disorders there. Imagine if I invented a new drug and someone said, "Hey, what does your drug do?" and my answer was, "Why don't you give it to kids and we'll find out?" No one in the world would actually do that, yet that is exactly what we are doing every single day when it comes to EdTech. The vast majority, and I'm talking well over 95% of EdTech products, whether that's programs or tools, has absolutely zero evidence base behind it. And something as deep now as AI, when that came out, what was the first thing we did? We shoved it in front of kids' faces and said, let them figure out if it's gonna work. Well, bad news. The data has been coming in for decades. It does not work. Not only have we not been helping learning, we have actively been harming learning with all of this influx of EdTech. In fact, the data is now quite clear that Gen Z is the first generation in history to underperform their parents on basically every single measure of cognition. Since we've been measuring development, every generation has outperformed their parents, and we've always had school to thank for that. Every generation goes to more school. We cut our teeth at school. All of a sudden, we're cognitively stronger than our parents. Until Gen Z comes along. They are now lower than us in every basic measure. I'm not just talking the high stuff like literacy, numeracy, creativity, critical thinking; they're definitely lower in all of that. But even the basic psychological things that for a long time we thought were unmovable, we thought they were biologically fixed, things like working memory, basic attentional control, executive function, they are underperforming us significantly in all of that stuff. Now why? The big trick is, if school made us smarter than our parents, how is it that they're in school more than we were and no longer getting the same cognitive boost we were? It's because of the tools we've been using in schools. In any state you look at, as soon as that state widely introduces digital technology into education, cognitive development goes down. So I just want to give you an idea. I've taken a look at Vermont itself. So in Vermont, I'm going to pick your year as 2014. 2013 was when you all introduced what was called Act 77, and that was when the first kind of digital pathway was introduced into public schools. And 2014, that is when you took up the Smarter Balanced testing system. That's when all testing moved online. So we're going to say 2014 was kind of your demarcation year between analog and digital. Just taking a look at NAEP data, which is the best data we have for comparing years, because that test has never been renormed. When we compare scores on NAEP data, that is a hard-core comparison. We know exactly what's going on here. If we look at eighth grade and fourth grade reading, between the year 2000 and 2014, you guys grew three points. Nice.
From 2015 to today, you guys have dropped 12 points. Your kids are now performing nine points lower than we were back in 1992. But that's okay, because let's take a look at fourth and eighth grade math. Between the year 2000 and the year 2014, the demarcation point, you guys grew 14.5 points on math. Awesome. Since 2015, you have dropped 16 points. You are now lower than we were doing back two decades ago. This is in every state you see. When you align performance with when that state adopted tech, in all 50 states I've now looked at, it is exactly the same picture. Scores had always gone up since 1992, until that state adopts tech; then everything goes down, because we do not learn from tech. The trick is, EdTech companies know this. That's why they do not do research, because they know once we actually look at research, there is no viable evidence to prove why this should be in front of kids, let alone across all classrooms. So there's no data argument to be made for this. I do want to say this, kind of referencing the points that were being made earlier: what office can handle this? This is a much easier system than you think. No one in the world right now is vetting EdTech, but we're vetting everything else. Here's how easy this process is. All you need is one or two experts in the field of learning. You don't have to pay them anything. You then find 10 research groups at universities around Vermont that are willing to be academic research groups. You don't pay them anything. Now any EdTech tool that wants to come in, on their dime, they must pay for an independent research study using one of those 10 groups. So all the pay to those groups, the funding and the research, comes from the company itself. You want your tool in? Put your money where your mouth is. Any research done by those 10 groups is then vetted by the one or two people that you have now appointed as your head reviewers. That pay also comes from the EdTech group. It's not that these companies lack money. They made more money last year than probably every other industry has made in history. So their dime, their time. If they wanna shove something in front of our kids: use one of our approved independent groups, allow our person to vet it, pay for that process, clear the board, and, congratulations, let schools choose. If you can't even do that basic stuff, and we're not asking for miracles, baby, just show me you can do better than a pen and paper, and most of them can't show that, then sorry. You have no business being in this state. And this is where I think your bill is the first to actually say this out loud: prove it. And I think you're gonna see a lot of EdTech companies drop out when you ask that question, and I hope everyone else follows. So that's all I got. Thank you all.

[Speaker 0]: Any questions for Doctor Horvath? Great, thank you. Faith?

[Faith Boninger, Research Faculty, University of Colorado Boulder (NEPC)]: Good afternoon. Good afternoon. Chair and members of this committee, I'd like to thank you for this invitation to testify today about H650. My name is Faith Boninger, and I'm testifying today in my personal capacity, but for identification purposes, I'm a research faculty member at the University of Colorado Boulder, in its School of Education's National Education Policy Center. I've studied marketing in schools for nearly twenty years, including marketing of tech products, and for the past ten years, my research has focused on the educational and privacy impacts of digital technologies used in schools. And I also don't accept funding from tech companies for my work. I very much support this bill as essential to protecting Vermont's children, including the integrity of their education, the content they're exposed to online through their schooling, and their data privacy. And as you've heard, schools are so different than when we or even our children were kids. Digital education products are ubiquitous now in American classrooms, and for those of us who are regularly in schools, their ubiquity can make it hard to remember that it wasn't always like this, or that it doesn't have to be. For those of us not regularly in schools, it's hard to comprehend all the many functions that digital products serve, and also shape. Any given district may use hundreds of digital products or more. Teachers and students use them to organize and provide curriculum content, to structure classroom teaching and student collaborations, to assess and track student learning, and to communicate with parents and guardians. Administrators use them to make staffing and procurement decisions and for reporting purposes. And just a handful of student-facing examples are Google Workspace for Education, Kahoot, Zearn, Khan Academy, MagicSchool, Nearpod, and PowerSchool. Some people argue that under the supervision of teachers, for educational purposes, so-called EdTech is different and better than other forms of digital technology platforms like social media or video games, but my research indicates that this isn't true. In many ways, EdTech is worse. And I know that this committee has spent a lot of time thinking about big tech, data privacy, and age-appropriate design of tech products, and everything that you know from those deliberations is relevant to thinking about the digital products used in schools. Essentially, EdTech is still big tech, complete with many of the design concerns associated with social media and gaming platforms that kids use outside of schools. And then some, because EdTech products are mediating between teachers and students, they're delivering educational content, they're making educational decisions, and through all of this, they're collecting huge amounts of sensitive data from children as they learn and grow. And what's particularly important, as Jill mentioned, children are required to use these products in their schooling. Digital products influence the nature of teaching and learning in a variety of ways, and all of them point to the importance of the state knowing which products are used in its schools, establishing a means of understanding what their characteristics are, and laying out ground rules for companies that want to do business in Vermont and have access to its children. In particular, the pedagogical theories embedded in digital platforms and learning programs shape the student learning environment.
In other words, the algorithms embedded in these products shape teaching, curriculum, and assessments. They tend to narrow the curriculum to competency-based approaches that are amenable to digital delivery and assessment. They may also embed cultural and other biases in curriculum and assessments. Further, digital educational products may expose students to marketing and behavioral tracking, and this is especially the case for students in low-income districts, which are more likely to choose less costly products or options. Assessments in digital educational products that use predictive analytics, artificial intelligence, and machine learning can harm students in hard-to-identify ways. As a general rule, the economics of bringing EdTech products to market incentivizes opacity and discourages adequate testing of product algorithms. EdTech products also collect vast amounts of data. They do this partly to fulfill their educational functions, but also because more data allows for additional uses, including interactivity with other products and the development of new features and complementary products. Importantly, data privacy policies consistent with most state and federal law distinguish between student data that's clearly associated with a student and de-identified data that no longer has that student's identifying information attached to it. There are no retention limits on so-called de-identified data. Providers can save, share, and use these data in perpetuity for all sorts of commercial and other purposes, like predicting the likelihood that a student might engage in risky behaviors or commit a crime. And whether the predictions are accurate matters less than that they're made and used, for things like determining insurance rates, or police surveillance, or guiding students toward different academic tracks. In short, providers are enabled to collect, retain, and use data extracted from students in all aspects of their state-required schooling for their own undisclosed purposes, in perpetuity, with virtually no limits. And AI now amplifies all of these concerns. In addition to standalone AI products for schools, other EdTech products are increasingly incorporating generative AI features. There's a lot of money supporting the integration of AI into public schools, and it's happening at dizzying speed. Products that incorporate artificial intelligence are particularly opaque, as the mathematical calculations embedded in them are unknowable even to their own developers. These products threaten to corrupt curriculum with misinformation, degrade the relationships between teachers and students, bias consequential decisions about student performance, exacerbate violations of student privacy, increase surveillance, and further reduce the transparency and accountability of educational decision making. All of these, of course, increase the need for the registry provided by the bill that you're considering, and for annual registration, which is included in the bill, to address the fact that products are continually changing, with many of these changes currently in the direction of more AI. Now, in theory, districts carefully choose the best EdTech products, negotiate contracts with providers, and directly control the way that products work. But that's not the reality.
More often than not, and Jill also addressed this, teachers and administrators are flooded with marketing for tech products, and districts lack the personnel, expertise, and power to clarify contract clauses and negotiate effectively with providers. And although they may try products before they adopt them, they can't legally examine the programming of proprietary products, including the programming that determines how a product makes educational decisions and how it processes student data. In many cases, districts, schools, or teachers adopt products via click-through agreements without any negotiation at all. Google, which is a major provider worldwide, as a matter of course dictates terms and conditions to districts that districts have no recourse but to accept. And as with any other digital product, when EdTech products are updated, schools must either accept the changes or absorb the costs of finding alternatives. And then it's very difficult, if not impossible, for a parent to know which products are used or may be used by their child, or how those products have been vetted. Smaller, under-resourced districts have no money to hire staff to review and vet products or pay for adequate data protection. And the more products that are used, the more opportunity there is for data misuse, both by outside bad actors and by the providers and subcontractors with which they share data. And again, many districts are currently using hundreds of these products. While many districts try to vet their products for data privacy concerns, they're limited in their ability to do so. School leaders, and the children and families affected directly by the EdTech products that they adopt, need high-level policy like that provided by this bill to support them by establishing oversight and accountability mechanisms. The registry proposed in H650 would free districts of the expense and effort required to vet platforms and negotiate with providers. It would also reduce inequities among districts and leverage the power of the state to ensure the quality and safety of the products that students use. The registry serves both as an assurance of the pedagogical quality of products that can be used in the state and, essentially, as a privacy agreement between the state and providers. It provides a way for the public to know about the products that enter their schools, and it provides a way for the state to influence the nature of those products. And as such, it's an important, important step in improving the lives of Vermont's children and families. The bill as it's written addresses almost everything that my research suggests it should in order to adequately protect Vermont's children. I do, however, have some suggestions, and these are the kinds of things that you've already been considering in this meeting. One is that, as written, the Secretary of State is fully responsible for developing, publishing, and annually reviewing the standards for product certification. As you've said, that may not be appropriate, and it may be more practical to create an independent entity, perhaps under the supervision of the Secretary of State or together with the Agency of Education, to conduct these activities. I was also intrigued by the suggestions that Jared mentioned. It'll be important to include the expertise of educators, who can address products' pedagogical aspects, and developers, who can define and evaluate issues associated with programming.
Another issue that I saw is that, as written, a certified product will not sell or share data with third parties. In many cases, a product must share data with subcontractors in order to function. I recommend adding a provision for this kind of sharing into the bill that also holds subcontractors accountable for the student data that comes into their hands. The bill could require providers to list their subcontractors and then have the subcontractors register as well. And there will be overlap in the subcontractors that are used by different providers. And finally, as written, the bill doesn't define student data. I recommend including a definition that explicitly includes de-identified student data as student data, because it is. Overall, H650 is an important step in recognizing and reducing the threats posed to Vermont's children by the technology they use in schools. I support it wholeheartedly, and thank you again for inviting my testimony.
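(A reader's aid, not part of the testimony: Boninger's subcontractor suggestion, sketched as a toy data model. A registry could cross-check the subcontractors providers list against its own rolls; all names and structures here are hypothetical, not bill language.)

```python
# Toy model of "providers list their subcontractors, and subcontractors
# register too," so student data stays traceable through the chain.

from dataclasses import dataclass

@dataclass
class Registrant:
    name: str
    subcontractors: list[str]  # registrants this one shares student data with

registry = {
    "LearnCo": Registrant("LearnCo", subcontractors=["CloudHost", "AnalyticsCo"]),
    "CloudHost": Registrant("CloudHost", subcontractors=[]),
    # "AnalyticsCo" is listed as receiving data but never registered itself.
}

def unregistered_recipients(registry: dict[str, Registrant]) -> set[str]:
    """Subcontractors receiving student data without their own registration."""
    listed = {s for r in registry.values() for s in r.subcontractors}
    return listed - registry.keys()

print(unregistered_recipients(registry))  # -> {'AnalyticsCo'}
```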

[Speaker 0]: Thank you, Faith. Any questions? Thanks very much. As with previous requests, do you have some written comments that you could forward to us?

[Faith Boninger, Research Faculty, University of Colorado Boulder (NEPC)]: Yeah, I already did.

[Speaker 0]: Oh, it is already? I'm sorry. Any other questions? Great. Thanks again.

[Lisa LeVasseur, Founder & Research Director, Internet Safety Labs]: Hello. Is it okay if I share screen?

[Speaker 0]: Yep.

[Lisa LeVasseur, Founder & Research Director, Internet Safety Labs]: Okay. Let's see.

[Unidentified Committee Member]: Remind that.

[Lisa LeVasseur, Founder & Research Director, Internet Safety Labs]: All right. You seeing this?

[Speaker 0]: Yes, we are.

[Lisa LeVasseur, Founder & Research Director, Internet Safety Labs]: Okay. Thank you. Chair Marcotte, Vice Chair Graning, and committee members, thank you so much for this opportunity to provide testimony. I did provide a written thing, and I'm gonna go completely off script, because so much is coming up during these fruitful and rich testimonies that I'm hearing. I'm Lisa LeVasseur, the founder and research director of Internet Safety Labs. We are a nonprofit, nonpartisan, independent product safety testing organization. We do not and have not taken any money from technology companies at all. It's a bad look when you're doing independent product safety testing to take money from those people, so we do not. I'm here today to speak in support of the proposed bill. And I'm really grateful for the opportunity to speak with you, because we have amassed really quite a sizable body of work around EdTech safety and privacy since our start in 2019. As a product safety testing organization, we look under the hood of technologies to assess the riskiness of the app's behavior. We do not assess the company's behavior or what it says about the product in the privacy policy. We assess what we call programmatic harms. These are the harms that are baked into the product when you use it as it was designed to be used. These are behaviors that you can't avoid. They're just there. And our whole mission is about exposing these risks, performing empirical measurement of these risks, exposing them at scale, in order to drive safer products for everybody. We are software people. We are geeks. We are like 100% software people. So forgive me, communication is not, like, my number one forte here, and I also have a lot of dense information, so apologies in advance. Some of you may be familiar with our work around EdTech. In 2022, we performed a first-of-its-kind safety benchmark to identify privacy risks in EdTech used by students in 663 K-12 schools across the US. We published three in-depth reports on the findings, as well as more than 1,700 mobile app safety labels that are viewable in our App Microscope. They look a little something like this. Our safety labels assign an overall privacy risk score and identify third parties observed to be in communication with the app. I pulled this particular label; this is available at appmicroscope.org. This was one of the apps that came up. We looked at 13 schools in each state, so 13 in Vermont. This was from South Burlington High School. And there are two things to note about this. This is a really good example of an app that is not education related but is absolutely, like, sort of viable for being recommended for school usage. It's Audible, audiobooks. And this one received a critical risk score. That's the overall privacy: like, how leaky is this by design? How much data is being shared by design? And you can see what we have, and this is just a partial part of the label. And what I wanna highlight is that we identify whether there are data brokers in the network traffic. We also tell you exactly which third parties we see communicating back and forth with the app itself. This isn't conjecture. This isn't, you know, hypothetical. This is observed communication to these third parties. And then we assess the third parties based on how much they monetize personal information. Critical-risk entities are entities that are data brokers or are, like, grossly monetizing bulk personal information.
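(A reader's aid, not part of the testimony: a toy sketch of the kind of roll-up LeVasseur's safety labels describe, classifying each third party observed in an app's network traffic and taking the worst level as the overall label. The categories, example domains, and roll-up rule are illustrative assumptions, not ISL's actual methodology.)

```python
# Classify observed third parties and roll the classifications up into an
# overall privacy risk rating for the app.

OBSERVED_THIRD_PARTIES = {            # domains seen in captured traffic (example data)
    "firebase.googleapis.com": "platform",
    "doubleclick.net": "adtech",
    "example-databroker.com": "data_broker",
}

RISK_BY_CATEGORY = {
    "platform": "some_risk",
    "adtech": "high_risk",
    "data_broker": "critical_risk",   # data brokers / bulk monetizers of personal info
}

def overall_risk(observed: dict[str, str]) -> str:
    """Overall label = the worst risk level among observed third parties."""
    order = ["some_risk", "high_risk", "critical_risk"]
    levels = [RISK_BY_CATEGORY[cat] for cat in observed.values()]
    return max(levels, key=order.index) if levels else "low_risk"

print(overall_risk(OBSERVED_THIRD_PARTIES))  # -> critical_risk
```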
I'm gonna say a little bit more about that as we go. In 2023 and '24, we worked with Brigham Young University on a project for the Utah State Board of Education to compare the privacy behavior of apps, again, the apps used in their schools, against what their data privacy agreements, or DPAs, or privacy policies promised. We continue to systematically measure safety and privacy risks in both web and mobile apps to produce new and improved safety labels as a public good. We are in strong support of H650 as a much needed step towards EdTech product safety accountability. Here are some of our research findings that support the need for greater scrutiny of the technology K-12 students are using. And again, we're not education experts at all.

[Jeff Wallace, Director of Technology, Mount Mansfield Unified Union SD]: Mhmm.

[Lisa LeVasseur, Founder & Research Director, Internet Safety Labs]: But you've got some great ones providing testimony, and that's terrific. But we are geeks, and we will tell you what these things are really doing. So in our 2022 benchmark, we were able to fully inspect 1,357 apps. 96% of the apps were in communication with third-party entities. And this isn't a particularly alarming or surprising result. It reflects the reality that virtually all apps include third-party software components, which means they're sharing student data with third parties. And as Doctor Boninger (I think you're a doctor? Yeah) mentioned, the statement in 2444c(c)(5)(A) of not selling or sharing with a third party, that's really not gonna work. That needs some revision, because they absolutely share data with third parties. So adding some refinement there is gonna be crucial. In the benchmark, we observed the following: 78% of the apps studied communicated with advertising or marketing platforms, allowing these risky third parties to glean data about children. And nearly 70% of the apps were in communication with Google servers, even though our sample was nearly a fifty-fifty split of Android and Apple apps. So what that means is that we were seeing Apple apps use Google components. And in fact, Firebase was the most popular component used in the apps, appearing in 908 apps. So a very sizable chunk of these apps is using this Google technology. 13% of the apps had targeted ads, which, while expressly prohibited by COPPA, still appeared in apps recommended or required by schools for school-aged children. And I would be remiss not to share some of our findings around COPPA Safe Harbor certification. COPPA did the one thing it was supposed to do well, which is to remove retargeting ads. We didn't find any evidence of retargeting ads in the apps that were COPPA Safe Harbor certified. However, you can see from this chart that the set of apps that were Safe Harbor certified performed worse in terms of overall ads. They were more likely to have an ad than the overall dataset, which included a bunch of off-the-shelf, not-for-children technology. So this is, in my opinion, quite disheartening. Right? It's disturbing. Moreover, 74% of these COPPA-certified apps were rated our riskiest score at the time, meaning that they were sharing with marketing and ad-tech-related platforms. And that's significantly higher than the 55% of the overall dataset. So just a kind of an awareness: you will get the behavior that you regulate for. So the regulation needs to be careful and mindful. Okay. It's also super important to recognize that schools legitimately recommend websites and apps that aren't expressly for children. This is a breakdown of the apps that we audited, and you can see here that 28% were NES. That means non-education-specific. This means things like Audible and the New York Times, Wikipedia, the zoos, the museums, all of these kinds of things, which, you know, teachers recommend, and they will, and they should. But these are not, strictly speaking, educational apps. And then we had "educational other," which was, like, games and things like that. When we looked further at this combination of NES and "educational other" apps and teased apart the ones that were strictly for children, it turned out that 28% of them were not for children at all.
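(A reader's aid, not part of the testimony: a toy aggregation in the spirit of the benchmark numbers quoted above, computing the share of apps talking to ad/marketing platforms and the share embedding a given SDK from per-app endpoint observations. App names and endpoint sets are invented for illustration.)

```python
# Aggregate per-app third-party observations into benchmark-style percentages.

APPS = {  # app -> third-party endpoints observed in its network traffic
    "reading_app": {"firebase.googleapis.com", "doubleclick.net"},
    "math_app": {"firebase.googleapis.com"},
    "typing_app": {"adsystem.example.com", "firebase.googleapis.com"},
}

ADTECH_DOMAINS = {"doubleclick.net", "adsystem.example.com"}

def share(predicate) -> float:
    """Percentage of apps whose endpoint set satisfies the predicate."""
    hits = sum(1 for endpoints in APPS.values() if predicate(endpoints))
    return 100 * hits / len(APPS)

pct_adtech = share(lambda e: e & ADTECH_DOMAINS)                 # talks to ad/marketing
pct_firebase = share(lambda e: "firebase.googleapis.com" in e)   # embeds Firebase
print(f"{pct_adtech:.0f}% adtech, {pct_firebase:.0f}% Firebase") # -> 67% adtech, 100% Firebase
```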
They weren't designed for children at all. Again, my point in sharing this is, this is not likely to change. Schools will likely recommend tools to children that aren't for children. So how do we manage that? With respect to permissions, 79% of all the apps requested location information, and 65% of the apps requested access to the camera or the microphone. And 100% of the Android apps requested location information. Pretty disturbing. On average, schools recommended or required 58 apps for students, but only 29% of schools had evidence of vetting these apps. And we found that only 14% of the schools were affording an opportunity to consent to the apps. To bring this closer to home, of the 13 Vermont schools we audited, closer to 40% performed technology vetting. That's good. And 20% provided an opportunity to consent. Nine of the 13 schools sampled in Vermont used the Student Data Privacy Consortium tools to manage their software, their technologies. This is a good thing. The SDPC is a good service. One point: Bennington Elementary School, one such school that used SDPC, was found to have 124 approved technologies listed for the school. We did find in our research that there's a correlation between whether or not you have vetting and how much technology gets recommended to students. Schools that had vetting were more likely to recommend more technology. I think they felt a sense of confidence in what they were doing and were therefore promoting more technologies to be used by students. In the Utah research, we found 44% of the 100 studied apps collected data that was not identified in the DPA or the privacy policy. We found 11% of the apps sent data to third parties not mentioned in the DPAs. And 36% communicated with ad tech platforms even without the visible presence of ads. All of which is to say, it's high time we systematically screen EdTech apps for risky behaviors. H650 can go a long way to help mitigate the risks of these typical app behaviors. I wanna highlight a particularly good inclusion in the bill's scope definition section, namely 2448(2), the definition for a provider of an educational technology product. The scope includes products that are, quote, "in use at a school with or without a contract." And this is super important. Someone might think that this is overreach, but it's quite important. In our 2022 benchmark, we found that only 29% of educational technologies in our sample were licensed by the school or the district, and 71% were off the shelf. So to keep students safe from risky software behaviors, we have to be concerned with that whole set of licensed and unlicensed software. Secondly, there is a naturally occurring power and information asymmetry between technology manufacturers and licensees of technology products. Yet we have had EdTech manufacturers tell us outright that they are not the data controller and therefore not responsible. They assert that they are merely providing a platform and that the schools alone are the data controller, citing Section 230 of the Communications Decency Act. We refer to this as deliberate data controller confusion. And since 2023, we have advocated that manufacturers of educational technology licensed by local education agencies, or LEAs, be regarded as joint data controllers along with the LEAs.
The LEAs did not write the software, select, or even approve the variety of third party data processors baked into the product. And as our research has confirmed, often the manufacturer doesn't accurately disclose those data processors. In no way can LEAs be conceived of as sole data controllers. Thirdly, the kind of certification defined in section 2444c(b)(4) of the bill, relating to design features of the product, is very feasible. This is just that one piece of it, the technical behavior piece. You may hear from opponents of the bill that such an assessment is too onerous and too expensive, or worse, not possible. And I'm here to preempt those arguments with several years of hands-on experience. These kinds of certifications are possible and practical, and our current and forthcoming safety labels will address everything that's listed in this section of the bill and more. And I want to underscore Dr. Horvath's point: yes, let the companies pay for these things. This is another place where there are agencies around that can perform this under-the-hood auditing, and it can be done very practically, efficiently, and cost effectively (see the endpoint-classification sketch following this testimony). There are some further considerations. You cover this in the language about not selling data to a third party, but there is a whole world of commercial surveillance happening, and I do think it's worth addressing. I'm not going to read the whole thing; you have the written comments on this. But what we see is that there are ad tech and marketing platforms that routinely and uniquely identify the person, i.e., the student in this case, associated with all incoming data packets. Virtually every network communication from a mobile app or unprotected browser allows student identification. It's a fact. And from our 2024 research on this, we can definitively state that the magnitude of commercial surveillance is staggering. We published a report, The World Wide Web of Human Surveillance, in which we identified and researched the beating heart of the infrastructure that enables this commercial surveillance at scale. It's a global, decentralized network of advertising and marketing platforms: customer data platforms, like Adobe's, and identity resolution platforms, like LiveRamp, that are architected to ingest customer data from disparate sources, associating it to a unique person through identity resolution techniques. These platforms aggregate personal information in bulk via APIs and transactionally through digital advertising, by the inclusion of proprietary personal identifiers conveyed in the real time bidding process. We constantly assess marketing and ad tech platforms, and their sites proudly assert cookieless tracking and personalized experiences for visitors. This is a deliberate word choice: not customers, but visitors. We are not anonymous online. Of the combined total of 360 CDPs and IDRPs that we studied in 2024, only 16.4% of them were registered data brokers, while many more, and arguably all of them, should be considered data brokers. There are two types of commercial surveillance infrastructures. There's the decentralized one that I described, and then there's the other one, which is the proprietary infrastructure from big tech, like Google and Apple and Meta. Both of these types of infrastructures knit together disparate data sources to develop increasingly invasive and comprehensive profiles of people.
Worse, the mechanisms for knitting this data together, especially in the case of the decentralized entities, indiscriminately hoover up the data of everyone, including children. We found that 35% of the apps in our 2022 benchmark, 539 apps, sent data to these types of platforms, CDPs and identity resolution platforms. In our audits over the past few months, we're seeing more and more of these appear in the network traffic. We're also seeing an increase of ad tech and martech platforms internally integrating these functions into their own services. So it's starting to look like a worldwide, user-identifying, customer-database-sharing palooza. In our next version of safety labels, we will be indicating these types of platforms that receive data. Okay. New point. We're also seeing an increase in the presence of third party screen and session recording tools. This can be for debugging purposes; it can be for marketing purposes. We're starting to flag these because, in conjunction with the unique user identification, it gets worrisome to have the entire session recorded, and we're seeing a lot of that happen. Finally, the section on artificial intelligence. "Artificial intelligence" is marketing language. It doesn't map neatly onto specific, measurable technology behaviors, so we suggest more specificity there. Use of machine learning and large language or other media models: who is the third party provider of the model? Is student data being used to train the model? Does the app offer a way to disable this behavior? Use of chatbots: who is the third party chatbot provider? What third parties have access to student prompts? What kinds of guardrails are installed? The Young People's Alliance has recently proposed banning the use of human-like chatbots for children entirely, and they've created a strong list of empirically measurable chatbot behaviors which constitute human-like behavior. We are adding this into our next version of labels. So in conclusion, we are really happy to see this novel approach to holding educational technology manufacturers accountable for their products that's described in H650, and we stand ready to support the state of Vermont in any way we can. So thank you very much.
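
The under-the-hood auditing described in this testimony amounts to observing which network endpoints an app contacts and classifying them against known ad tech, CDP, and identity-resolution platforms. A minimal sketch of that classification step, assuming illustrative placeholder domain lists rather than Internet Safety Labs' actual classification data:

```python
# Sketch of endpoint classification for an app audit. The category
# lists are illustrative placeholders, not ISL's real classification data.
from collections import Counter

RISKY_ENDPOINTS = {
    "adtech_or_marketing": {"doubleclick.net", "adcolony.com"},
    "cdp_or_idrp": {"liveramp.com", "demdex.net"},  # CDPs / identity resolution
    "analytics": {"firebase.googleapis.com"},
}

def classify(domain: str) -> set[str]:
    """Return every risk category a contacted domain falls into."""
    return {
        category
        for category, domains in RISKY_ENDPOINTS.items()
        if any(domain == d or domain.endswith("." + d) for d in domains)
    }

def audit(observed_traffic: dict[str, list[str]]) -> Counter:
    """Count how many apps contacted at least one endpoint per category."""
    hits = Counter()
    for app, domains in observed_traffic.items():
        categories: set[str] = set()
        for d in domains:
            categories |= classify(d)
        hits.update(categories)
    return hits

# Two hypothetical apps and the domains observed in their traffic.
traffic = {
    "reading-app": ["firebase.googleapis.com", "doubleclick.net"],
    "math-app": ["school-district.example.org"],
}
print(audit(traffic))  # e.g. Counter({'analytics': 1, 'adtech_or_marketing': 1})
```

Run over every audited app's observed traffic, the same pattern yields benchmark-style percentages, such as the share of apps contacting advertising platforms.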

[Unidentified Committee Member]: Thank you, Lisa. Any questions? Hey, Lisa. I'm just curious if you'd support adding tracking of sensitive data and data brokers to the product certification list, specifically thinking of your scorecard system. Oh, you're on mute.

[Lisa LeVasseur, Founder & Research Director, Internet Safety Labs]: Yeah. Of course, we agree with that, and you will see this year, in our next version of safety labels, that we'll be tracking that. The one thing I will say about sensitive information is that we don't have a unified definition. It varies from state to state, and we've done our own definition on top of that, because we don't actually think any state has it quite right. We think that there are omissions. We don't think GDPR has it quite right either. So we cast a wide net, and we will provide both. We have an omnibus version of sensitive data based on all of the US states: if any one of those categories gets sent, we're going to flag it as sensitive data. And then we have our own ISL definition, and if any of those get sent, we'll flag that.
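
The two-tier flagging she describes reduces to a pair of set-membership checks: one against an omnibus union of US state definitions of sensitive data, one against ISL's own broader definition. A minimal sketch; the field names are invented for illustration and are far fewer than either real definition:

```python
# Two overlapping definitions of "sensitive data": an omnibus union of
# US state definitions, plus ISL's own broader list. A data field is
# flagged under whichever definition(s) it matches. Field names here
# are illustrative, not the actual ISL taxonomy.

OMNIBUS_US_STATES = {"precise_geolocation", "health", "biometric", "race_ethnicity"}
ISL_DEFINITION = OMNIBUS_US_STATES | {"browsing_history", "device_contacts"}

def flag_sensitive(fields_sent: set[str]) -> dict[str, set[str]]:
    """Report which transmitted fields match each sensitive-data definition."""
    return {
        "omnibus": fields_sent & OMNIBUS_US_STATES,
        "isl": fields_sent & ISL_DEFINITION,
    }

print(flag_sensitive({"precise_geolocation", "browsing_history", "grade_level"}))
# omnibus flags precise_geolocation; the broader ISL definition also
# flags browsing_history; grade_level matches neither.
```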

[Speaker 0]: Thanks.

[Unidentified Committee Member]: Thank you very much, Lisa. My question pertains to the Vermont summary that you provided in your slides, and I have the slides you sent to the committee here. Can you just go over for me one more time what the school composite app score is indicating?

[Lisa LeVasseur, Founder & Research Director, Internet Safety Labs]: Yeah, it's kind of tricky. You can find the whole definition in our Findings Report 1 online. It takes into account the average app score of all the apps that were found in the school, and it also gives a weight to the number of apps that are in the school. For example, we found one school in Colorado that had over 1,400 technologies. Even if they were all, say, medium risk, they would get a high composite score, because that's too much. That's too much technology. So it's a combination of the average score and the number of apps that are there.
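
The findings report holds the exact weighting, but the structure described, an average per-app score scaled up by the sheer number of apps, might look like the following sketch. The logarithmic weight is an assumption for illustration, not ISL's published formula:

```python
# Sketch of a school-level composite score combining the mean per-app
# risk score with a weight for how many apps the school uses. The exact
# weighting lives in ISL's findings report; this particular form is assumed.
import math

def composite_score(app_scores: list[float]) -> float:
    if not app_scores:
        return 0.0
    mean_risk = sum(app_scores) / len(app_scores)
    # Volume penalty: more apps raises the composite, so 1,400
    # medium-risk apps still score worse than 50 of the same apps.
    volume_weight = 1 + math.log10(len(app_scores))
    return mean_risk * volume_weight

print(composite_score([2.0] * 50))    # ~5.4
print(composite_score([2.0] * 1400))  # ~8.3
```

With this shape, a school carrying 1,400 medium-risk apps scores markedly worse than one carrying 50 of them, matching the intuition that volume alone should raise the composite.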

[Unidentified Committee Member]: And one of the takeaways, I think, was that where schools had a review team or something like it, a sense of confidence may have set in that led to there being more and more of these tools. Did I hear that right?

[Lisa LeVasseur, Founder & Research Director, Internet Safety Labs]: Yeah, we saw a correlation with the presence of auditing. And I should remind you that we did this completely independently of the schools: we looked on the school and district websites for evidence of broad tech vetting. Virtually every school district or school will vet the licensed technology; we were looking for vetting of the unlicensed technology. And when we saw that, we definitely saw a tendency to have more recommended technologies in those schools.

[Unidentified Committee Member]: My last follow-up for that one. The auditing groups, by and large, as you looked at those audit teams, were those teachers? Were those PTO members? I mean, who was doing it? Volunteers?

[Lisa LeVasseur, Founder & Research Director, Internet Safety Labs]: So these were processes that were established mostly at the district level. I can't say for sure what the genesis and composition of these were, but I will say that they usually came out of the IT organization, and it was their procurement and vetting process that was established.

[Unidentified Committee Member]: I see. Thank you very much.

[Speaker 0]: Yep.

[Vice Chair Edye Graning]: Other questions? Thank you so much, Lisa. Emily Carris?

[Emily Carris, Parent/Author/Speaker/Teacher]: Hello, good afternoon. Thank you so much, and I appreciate you having us here today to testify. The people on this call are all people I respect tremendously, and I'm actually honored to conclude this group of testimony, because I want to end with an illustration of what education and childhood look like today in 2026. My name is Emily Carris. I'm a parent, author, speaker, and teacher. I also do not accept any funding from tech companies for my work. I'm here to testify in support of H650, and I'm here with an urgent warning. Technology is fundamentally changing childhood and, in the process, undermining parents, destroying education, and threatening the very health of our democracy. I want to start with an overview of what childhood in America looks like today. Children between eight and 18 are averaging over seven and a half hours a day on screens outside of school hours. Sixty-two percent of children under the age of two are watching YouTube. Forty percent of two-year-olds have a personal tablet. Nearly 90% of American public schools provide children with internet-connected devices for learning. Occupational therapists tell me that they have to teach young children how to turn the pages of a book. Preschool teachers report that toddlers don't like getting their hands dirty. I know of one teen so addicted to his phone, he puts it in a Ziploc bag and brings it in the shower with him. Twenty-six percent of 13- to 17-year-olds use ChatGPT to do their homework, which they access on laptops given to them by their schools. Elementary school children are literally falling out of their chairs in classrooms because they lack the core strength to sit for long periods of time. One child viewed more than 13,000 YouTube videos in less than three months at school on his school-issued laptop. I heard of another child just recently who spent seventy-two hours in one weekend on YouTube on his school-issued laptop. Just this week, a teacher told me that while assessing kids for kindergarten readiness, children could correctly identify the numerals one through 10 and the letters A, B, C, etcetera. But when they got to the number 11, the children identified it as "pause," as in the pause button. Children are setting their alarms for two o'clock in the morning to get up and play on their school-issued laptops while their parents sleep unaware. And finally, one anecdote that grabs a lot of attention: a teacher shared with me that while her students are still losing their baby teeth as middle schoolers, they are also making and imitating the sex noises they hear from videos on the internet. Mounting evidence shows that smartphones and social media harm children, and after nearly a decade of advocacy work, lawmakers and governments around the world are enacting changes. I wholeheartedly support these efforts, but I wish to stress that further protection of children is needed during the school day, where nearly every child in America is provided with an internet-connected device in the name of education. To protect children's cognitive, mental, and emotional health, we must do more than ban phones from classrooms and remove screens from childhood. We must get rid of EdTech too. Unless you are currently the parent of a school-aged child, know that school looks very different today, not just from when we were students, but even from ten years ago.
Today in a school classroom, children spend the school day hunched over individual laptops and iPads, and teachers' eyes are directed to a screen of screens to monitor their students. Chalkboards are digitized whiteboards. Schools use hundreds, if not thousands, of unique EdTech products and apps per school. Curriculum is online learning, and grades are stored in digital grading portals. Physical textbooks and workbooks rarely exist, but ebooks and note-taking apps do. Teachers upload lessons and homework to online learning management systems. Digital curricula use persuasive design techniques that emphasize rewards and engagement over learning. And finally, human teachers are being replaced by AI tutors, even as the harms of children's use of AI products fill the headlines. It's okay if this is new information to you. The onslaught of digitized EdTech products into schools happened quickly and relatively quietly while other crises dominated our headlines. That distraction has served EdTech companies well. Yet many of the companies who build these EdTech products, whose names you've probably never heard of, are as powerful and wealthy as the companies whose names you do know: Meta, Snap, Google. In reality, simply putting "Ed" before the word "tech" doesn't make it effective or safe or legal. EdTech is Big Tech. So it's critically important that lawmakers seek to hold these unheard-of yet still very powerful companies as accountable as we are holding the ones whose names we know. Because if you remember nothing else I say today, please let it be this: at its very core, the business model of EdTech is no different from the business model of Big Tech, and both are fundamentally at odds with child development. Big Tech has already co-opted the social lives of our children; we cannot let them co-opt their education too. I am pleased to see that H650 would establish a registry requiring EdTech companies to register in order to do business with schools in Vermont. This filter would be a first step in preventing health-harming products from getting into children's hands. And I agree with everything my colleagues have said previously today. Some of you might be wondering whether it makes sense to restrict educational technology in schools. After all, you've probably been told that children need these products to be successful digital citizens of the future. But here is the problem. Such claims stem from industry-funded marketing that benefits technology companies, not children or teachers. This propaganda has unfortunately fueled a wholesale restructuring of childhood around screens, both at home and in school, and is catastrophic for families, children, and schools. There are several myths propagated by the industry and repeated by those who do not know they are being manipulated, and I would like to share those with you today. Myth number one is that EdTech improves learning. This is false, and Dr. Horvath's entire book debunks it. Children today are less digitally literate and less cognitively developed than their parents were, a reversal that coincides with the introduction of EdTech products in schools, as Dr. Horvath noted today. Just a few examples from the research: overwhelming evidence exists to show that reading and writing on screens harms cognition. Another study, based on over 300,000 primary students, found that even thirty minutes of digital device use in class had a negative impact on reading comprehension scores.
Finally, one study found that investing in air conditioning yields a 30% improvement in learning outcomes over giving children Chromebooks. Another myth, one that EdTech companies claim and market heavily to school districts, is that their products can help teachers, when in fact the very premise of scaling teaching through technology, as EdTech companies intend to do, means displacing the human educators at the heart of the learning experience. Success, according to the EdTech industry, means fewer teachers serving more students, increasing teacher workload and class sizes while turning teachers into IT administrators instead of mentors and instructors. It's no wonder teachers are quitting or retiring early and schools are resorting to AI tutors, which are far cheaper even as they are more dangerous. All good educators know learning is rooted in human relationships, and the EdTech business model is fundamentally at odds with that fact. EdTech tools don't help teachers; they help schools hire fewer teachers while generating greater profits for EdTech companies, some of which are worth billions of dollars. Third, you may have been told that EdTech will democratize education by giving underrepresented children access to products. But far from improving equity, EdTech products create new digital divides: a digital safety divide and a digital learning divide. As you heard earlier, safer versions of EdTech cost more. Monetizing safety and privacy means under-resourced schools receive less safe versions of the product. EdTech is offered as a solution to ballooning class sizes, but EdTech solutions in under-resourced schools mean privileged children will get human teachers while poor children will get technology and chatbots. This is deeply inequitable. Just look at where technology executives themselves send their children: to nature-based, low-tech schools. You can see that those who build and market these products for our children make starkly different choices for their own families. Another myth concerns online safety. Those concerns are growing, and rightly so; in spite of claims to the contrary and efforts to filter or block content, EdTech is not safe for children, because these very products rely on the internet to deliver their services, and the internet is not a safe place for children. Via school-issued devices, children are accessing pornography, pedophiles, suicide videos, and extremist content, even when filters and blocking software are in place. It is vital that we remove student access to phones during the school day, but doing so without also removing internet-connected devices from students' backpacks simply transfers the risk of harm from an iPhone to a Chromebook. Just like school administrators and many teachers, you've likely been told that EdTech will prepare kids for the future. But this is a myth. No tech skills will matter if children do not first learn how to communicate, think critically, or problem solve. Children do need technology skills, such as an understanding of what the internet is, what an algorithm is, how to discern fact from fiction, and so much more. Children need to learn how technology works and how to use it safely. But do not confuse EdTech with tech ed. Learning about technology is very different from learning on technology. Giving EdTech products to children and calling it education poses an existential threat because of the degradation of skills that ensues.
We must ask why such tools are being given to children with vulnerable brains in the name of education in the first place, and do what we can to stop it. Finally, I often hear the claim that it is too late to change things because technology is here to stay, or the toothpaste is out of the tube. It is true, technology is and will continue to be a part of our lives, but it is absolutely not a foregone conclusion that EdTech companies should be allowed to run rampant through our classrooms without consequences. Lawmakers do have the power to change this. It is possible to build a safer internet, regulate tech companies, and protect children's data and privacy as the default. Technology companies will always choose shareholders over people, so the only way these companies will meaningfully change is if they are forced to, such as by having to register with the state, as bill H650 proposes. We've regulated other health-harming products before when it comes to use by children. We can and must do the same with EdTech. Doing nothing is not an option, because as a result of the harms caused by social media and smartphones in the hands of children, plus the ongoing enmeshment of EdTech companies and products into education, we are facing four crises that warrant immediate attention. First, we face a mental health crisis. Screen use before two years of age is linked to accelerated anxiety by age 13. Today, one in three girls has seriously contemplated suicide. The youth mental health crisis is so dire it has elicited a warning from the Surgeon General. Second, we face a learning crisis. Reading and math scores are plummeting. We are literally wasting education dollars on ineffective and harmful technologies. Third, we face a crisis in creativity. A 15-year-old in Kentucky told me that when she said to the elementary school students she teaches in an after-school drama class, "Let's pretend we're flying," they looked at her and asked, "How?" If children can't pretend to fly, they cannot imagine. And if they cannot imagine, they cannot innovate. Creativity means having an original thought. Technology access in childhood does not enhance creativity; it kills it. Remember that today's tech titans had analog, play-based childhoods. Finally, the enmeshment of technology in childhood is creating a crisis for our democracy. Thomas Jefferson himself said an informed citizenry is at the heart of a dynamic democracy. When children spend hours being fed algorithmically driven, rage-bait content designed to increase engagement, on internet-connected devices given to them by the adults who are supposed to protect them, they lose the ability to form their own opinions, detect bias, and think critically. That should frighten us all. It is important to state that this is not a kid problem. It is an adult problem that is impacting children, and adults need to do something now. No one policy will be sufficient by itself. We need many levers working in concert to best protect children. In addition to what you consider today with this registry bill, warning labels on social media products and giving families the right to opt out of harmful products in schools are equally important actions to take. Thank you very much for your time.

[Vice Chair Edye Graning]: Thank you, Emily. Michael.

[Unidentified Committee Member]: So I've got a question for you. How would you address the teachers and administrators who are extremely supportive of this EdTech? Because, I mean, we saw it with the cell phone bill. It was teachers as well that were against it.

[Emily Carris, Parent/Author/Speaker/Teacher]: I think one of the problems is that because of the way teachers, especially those recently trained in higher ed, have been brought into education with all of these products as the norm, this is what new teachers especially are being told education is. One of the challenges is returning to the institutional memory of teachers who are more experienced, who've been here a long time and know how to teach without it. The problem is, it's a sort of self-fulfilling cycle here, right? We have these products that are being used, for example, online testing. And so schools say, well, we need the one-to-one devices to justify test taking, because that's where the kids are testing. And then if the one-to-ones are with the kids, we need the digital curriculum. The problem is that pulling one of these pieces out dismantles a lot of the other pieces, and it's going to be messy. I have no doubt that it is a complicated process, but the alternatives are so health-harming for children that I just don't think we can afford the luxury of trying to stay comfortable with this. Just today, I had a friend tell me she substitute taught in a class and wasn't able to log into the curriculum. All of the teachers around her were asking, well, how did you get through the day? And the students themselves said, well, you didn't put anything up on the SMART Board, so we don't know what to do today. Their independent reading is on a Chromebook, not a book, and she said they spent ten minutes scrolling through the apps to try to find a book, and when it was too hard, they gave up. So I actually believe that most teachers don't like this, or if they do, they may not like all of it. I think they feel caught between a rock and a hard place. And I think Dr. Horvath has said this before as well, but I think teachers need our support in helping put a stop to this.

[Unidentified Committee Member]: Thank you. I'm not implying that this is incomplete testimony, but I was curious: having catalogued the challenges of children who don't want to get their hands dirty physically, who think an 11 is a pause sign, and going off your most recent comments just now, is there research calculating the social costs, in terms of the resources and the time that would be devoted to making up for lost time in human development in the schools?

[Emily Carris, Parent/Author/Speaker/Teacher]: That's a great question, and maybe Faith can add to this too. I think the social costs are massive. It's a huge risk. This is a critical time in development that we don't get back. It's use it or lose it when it comes to those neural pathways in childhood. So when was the best time to stop this? Yesterday. When's the second best time? Today. So I think there is absolutely reason to act now, and I still hope we can save the current generation. I think one of the most frightening things to me is what I hear from higher education professors, and I see this in my own university students, who are struggling to focus and who are constantly assaulted with AI products marketed as a solution to help them study and learn, which they do not do. Those are children who didn't have iPads in kindergarten or even fifth grade, or maybe didn't even get a one-to-one device until middle school. Today's kindergartners are being given iPads. And if we think what's coming out of higher ed right now is problematic: I've spoken to the head of a large engineering company, who said she's hiring graduate students out of top engineering schools who "literally cannot function as humans"; those are the words she used. It doesn't matter how good their engineering skills are; they can't operate as a person. And again, those are people in their 20s. We're talking about five-year-olds right now. We haven't seen the worst of what's to come, which is, again, why I feel this extreme sense of urgency to do something now.

[Vice Chair Edye Graning]: Any questions? Thank you, Emily.

[Emily Carris, Parent/Author/Speaker/Teacher]: Thank you very much.

[Vice Chair Edye Graning]: I think our next presenter is Jeff Wallace. And I think, Jeff, you're our last presenter for the day.

[Jeff Wallace, Director of Technology, Mount Mansfield Unified Union SD]: So I'm gonna just make sure I can see. Can everyone hear me okay?

[Vice Chair Edye Graning]: Yes. Thank you. Jeff and I have known each other for a long time.

[Jeff Wallace, Director of Technology, Mount Mansfield Unified Union SD]: Oops, let me just open this over. Sorry. First, I would like to thank you for inviting me to testify today. I am the Director of Technology and Network Engineering for Mount Mansfield Unified Union School District here in Vermont, and I appreciate the legislature's attention to student data privacy and security. I share the goals of this legislation and want to offer some insight from our district's experience with educational technology and student data privacy to help inform your deliberations. First, I'd like to talk a little bit about our current student data privacy practices, and I'll back up a little and talk about the Vermont Student Privacy Alliance. Our district is a member of the Vermont Student Privacy Alliance, which was established in 2018 as a collaboration of Vermont supervisory unions and school districts that share common concerns around student privacy. Membership in the Student Privacy Alliance is free to all Vermont school districts, courtesy of the Vermont Agency of Education. The AOE is also an official partner of the Student Data Privacy Consortium, and Vermont districts may join and use the database of negotiated agreements at no cost to individual districts. The national Student Data Privacy Consortium and Vermont's partnership with it ensure that our schools have free access to the consortium's resource registry, which is a database of executed agreements; the model data privacy agreements to use with vendors; training and support resources around data privacy; and a community of other schools addressing data privacy. This standardization means that vendors and schools have common expectations when entering into relationships without having to renegotiate terms in every instance. Since 2016, over 275,000 standard data privacy agreements have been executed and subscribed to nationally through this consortium, demonstrating the widespread adoption and effectiveness of this approach. The way an agreement works is that the originating district uses a model data privacy agreement, which is then signed by the district and the vendor. Subscribing districts can then sign what's called an Exhibit E agreement, which gives them the same protections and agreement as the originating district, and the data privacy agreement is added to the database for all other members to view (see the sketch of this originate-or-subscribe flow following this testimony). This makes it easy for schools and vendors to enter into agreements, which is important because it keeps the cost down for new software, and agreements can be created in a timely manner. While the VSPA membership is free, our district does invest an additional $1 per student per year by contracting with a nonprofit called the Technology and Education Cooperative. This service provides our district with several critical functions, mostly drafting and negotiating comprehensive data privacy agreements with vendors. They also manage and maintain these agreements, constantly monitoring vendor changes to terms of service, data breaches, and any other changes to the software. They hold vendors accountable to their contractual obligations, they provide expertise in data privacy law and best practices, and all contracts are reviewed by their lawyers. I have attached to the documentation one of our student data privacy agreements for you to review. This one was for Corel Vector.
This agreement includes: a detailed description of the student data collected and how it may be used by the vendor; strict limitations on data collection, with the vendor only allowed to collect what is essential for educational purposes; a prohibition on selling or sharing student data with third parties; restrictions on de-identified data and its use; a requirement that sub-processors and subcontractors meet the same privacy requirements; requirements for encryption and data security measures aligned with NIST cybersecurity frameworks; parent access rights to review and correct any student data collected; data breach notification procedures; provisions for data disposal upon contract termination; compliance with FERPA, COPPA, and Vermont-specific privacy laws; restrictions on targeted advertising and student profiling; annual security audits; and criminal background checks for employees. This agreement is notable because it covers design and graphic software used by our middle schools for design tech classes. Despite being general purpose design software not specifically created for education, this company operates under the same strict contractual limitations that protect student privacy. Under the definition in this bill, I believe they would be considered an educational provider. I'm not sure, though, that they would submit to a certification process to prove they provide an educational program, even though this is a valuable learning tool for our students. This is just one example of many software tools that we use for education that may not consider themselves EdTech software. I would also like to add that we require a student data privacy agreement for all software that collects student data. This is not limited to only student-facing software, and agreements are also required for paid and non-paid products, whether licensed or non-licensed. I'd also like to talk a little bit about our current vetting process. When a teacher requests new educational software, we follow a multi-step evaluation process. First, the initial request: teachers complete a software request form, which includes questions such as, what is the content goal or task that you want students to accomplish? How will this tool help you accomplish your learning goals for students? What specific features does this tool offer that other tools don't? We then do a privacy review, where we determine whether we can negotiate a data privacy agreement through the Student Data Privacy Consortium. We do an educational review: our curriculum coaches, digital learning leaders, and curriculum councils work with the teachers to evaluate whether the software meets educational requirements and aligns with our curriculum standards. We do a financial analysis, reviewing the costs and assessing whether we have similar tools already in our technology portfolio; part of this includes the cost of supporting the software and professional development. If the software passes all of these hurdles, we conduct a pilot program with a limited group of teachers and students to assess its effectiveness. After a successful pilot, we make the software available to all students at the appropriate grade level, which ensures consistency and equity across our district. This process ensures both data privacy protection and educational quality while being fiscally responsible. Importantly, by going through this comprehensive educational software vetting process, we ensure that there is genuine buy-in to using the software.
The teachers and curriculum leaders who will be using the tools with their students have made the professional determination that the software supports student learning outcomes. This teacher-driven approach means that software adoption is purposeful and pedagogically sound. While our district has implemented strict data privacy procedures, I acknowledge that not all Vermont schools have the same level of oversight. This disparity is a legitimate concern that H650 seeks to address. So as you refine H650, I'd like to highlight some areas that may benefit from further discussion. First, the scope and definition of educational technology. This bill defines an educational technology product as any student-facing software application or platform that may collect, process, or transmit student data and that is used for teaching and learning purposes in a school. This definition would encompass a wide range of tools. To help illustrate the scope, here are examples of commonly used platforms in schools. Google Workspace for Education, including Docs, Sheets, and Slides: students use these to create writing assignments, presentations, and collaborative projects, and Google Docs processes and stores student-generated content. (I do want to add that Google Workspace for Education is separate from Google's commercial products, and its data privacy is separated from the commercial side.) Learning management systems, such as Schoology and Google Classroom. Video editing software, such as WeVideo. Educational content platforms, such as Renaissance Learning, Happy Numbers, and Math Facts Lab. Design and graphic software tools such as the Corel Graphics Suite, used in visual arts, graphic design, and middle school design technology classes. These tools range from general purpose platforms to specialized educational software. So, some questions to consider. Would all of these require state certification? How would the certification process handle multi-product vendors like Renaissance Learning, which offer assessment, practice, and analytics tools? Would they need separate certifications for each product, or would a vendor-level certification be sufficient? The bill's certification criteria include things like compliance with curriculum standards, advantages of using the product compared with non-digital methods, and whether the product was explicitly designed for educational use. These criteria make sense for purpose-built education software like Renaissance Learning's STAR assessments, but they become harder to apply to general purpose tools like Google Docs, which wasn't designed specifically for education but is used extensively with our students. Some general purpose tools may not pursue certification or meet these requirements. Will we then not be able to use them with our students? And then some implementation and resource considerations. Section two of the bill requires schools to submit a list of all educational technology products currently in use by 12/15/2026. To provide context, our district uses over 110 different software tools and platforms across all grade levels and subject areas. Across Vermont's 300-plus schools, this represents a significant collective effort, and more importantly, the Secretary of State's office would need resources to review and certify potentially thousands of products.
And my question is, does this office have the resources to chase down and monitor certification in a timely manner? It sounds like it may not. Understanding the resource implications for both schools and the state may be helpful in planning for implementation of this bill. Next, certification standards and process. This bill gives the Secretary of State authority to develop, publish, and annually review the standards. Again, does the Secretary of State's office have the expertise in educational technology, curriculum development, and pedagogy? Will it partner with the AOE, or would a task force or work group be created for this work? How would the Secretary of State ensure certification criteria are applied consistently across diverse products? Would professional educators be involved? Many EdTech vendors include their own studies of educational effectiveness in their marketing packages, yet their products still may not align with our local curriculum goals. Does their study outweigh our professional educators' evaluations? Will the 07/01/2027 timeline be achievable for evaluating and processing potentially thousands of software tools, and will it give schools enough time to adapt curriculum? This bill also calls for an annual review of standards, which could affect multi-year contracts between schools and vendors. Many of our purchase contracts span three or even five years to save money and reduce annual increases. Would we continue to pay for these resources but not be able to use them if they weren't certified? Would we be forced into annual contracts, thus increasing our costs? And then some of the data collection requirements. This bill specifies that the vendor will not collect demographic data of a student except for the name and grade level of the student. Some software we use does collect demographic data beyond that, which is valuable for data analysis by our educators. Cost considerations: this bill requires providers to register annually with the Secretary of State and establishes penalties of up to $10,000 for failure to register. Along with the state certification review process, it's worth considering that these costs may be passed through to schools in the form of higher software prices. For small educational technology companies, Vermont-specific registration and certification requirements may affect their ability to serve Vermont schools, potentially reducing vendor options for our schools. And then I have some approaches to consider, building on the existing Vermont Student Privacy Alliance framework, that could work for H650's implementation. First, district-level software vetting procedures: consider requiring all Vermont school districts to implement and document procedures for vetting educational technology products. These procedures could include review of vendor data privacy and security practices, evaluation of educational value and curriculum alignment, a cost-benefit analysis, and pilot testing before full implementation. The state could provide a model policy for districts that includes the required components. And then again, leveraging the existing Vermont Student Privacy Alliance.
The Vermont Student Privacy Alliance already has established requirements that could meet state certification standards, such as compliance with FERPA, COPPA, and Vermont privacy laws; prohibition on selling student data; restriction of data collection to only what is necessary for educational purposes; encryption of data in transit and at rest; data breach notification procedures; parent and guardian access rights; data deletion upon contract termination; prohibition on targeted advertising to students; and annual security assessments. This bill could mandate the use of model data privacy agreements through the Vermont Student Privacy Alliance. This would include use of the vendor agreements inventory in the Student Data Privacy Consortium database that is already funded by the AOE. It could also include funding for privacy review services for districts; this is the $1 per student through the Technology and Education Cooperative that we use, or similar services. And then vendor accountability mechanisms. To ensure vendor compliance, you could consider establishing requirements for vendors to sign Vermont Student Privacy Alliance compliant agreements, notify districts of any material changes to their data practices, and provide transparency reports on any data breaches and security incidents. As you consider implementation timelines and scope, it may be helpful to understand how teachers currently use technology in classrooms: to differentiate instruction for diverse learners, provide immediate feedback to students, enable project-based and collaborative learning, support students with special needs, engage digital-native learners, and prepare students for a technology-driven workforce. Ensuring continuity of access to these tools during any transition or certification period would help maintain instructional quality and avoid disruption to student learning. In conclusion, I think keeping our students safe is critically important, and I commend the spirit of this bill. As you refine the legislation, I hope this testimony has been helpful in highlighting some of the potential issues with the certification process done by the Secretary of State's office; the robust privacy infrastructure that already exists through the Vermont Student Privacy Alliance, free to all districts through the AOE partnership; how comprehensive data privacy agreements currently work in practice, while also highlighting the need for all districts to adopt these practices; and the vetting process for educational technology and how it is working in our schools, with a need for all schools to understand and follow a similar process. I'd be happy to provide additional information about our district's data privacy practices, share our template agreements, discuss implementation details, or serve as a resource to the committee in any way that would be helpful. Thank you for your dedication to keeping our students safe and for considering this testimony.
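
The originate-or-subscribe workflow described in the testimony above is, in effect, a registry lookup: if an executed model DPA for a vendor already exists in the consortium database, a district subscribes to it via Exhibit E; otherwise it originates a new agreement that becomes visible to all members. A minimal sketch with invented types; the real SDPC resource registry is a hosted database with far richer records:

```python
# Sketch of the originate-or-subscribe flow for model data privacy
# agreements. The dataclass and the in-memory "registry" are
# illustrative stand-ins for the SDPC's hosted resource registry.
from dataclasses import dataclass, field

@dataclass
class DataPrivacyAgreement:
    vendor: str
    originating_district: str
    subscribers: list[str] = field(default_factory=list)  # Exhibit E signatories

registry: dict[str, DataPrivacyAgreement] = {}

def adopt_software(district: str, vendor: str) -> DataPrivacyAgreement:
    """Subscribe to an existing DPA if one exists; otherwise originate one."""
    if vendor in registry:
        dpa = registry[vendor]
        dpa.subscribers.append(district)  # same protections as the originator
    else:
        dpa = DataPrivacyAgreement(vendor, originating_district=district)
        registry[vendor] = dpa            # now visible to all other members
    return dpa

adopt_software("Mount Mansfield UUSD", "ExampleVendor")  # originates
adopt_software("Another SU", "ExampleVendor")            # subscribes via Exhibit E
print(registry["ExampleVendor"].subscribers)             # ['Another SU']
```

The design choice this models is that negotiation cost is paid once, by the originating district, and every later adopter inherits the executed terms rather than renegotiating.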

[Vice Chair Edye Graning]: Thanks, Jeff. Any questions? So Jeff, roughly how many districts across the state do you think have the capacity to do what you're doing, percentage-wise?

[Jeff Wallace, Director of Technology, Mount Mansfield Unified Union SD]: I think that's hard to know. I think the main districts that I talk to on probably a weekly basis have the capacity, and a lot of them are doing it. I think it's some of the smaller districts that may not. Really, it comes down to buy-in from the leaders in the schools, the principals and the superintendent, to trust the process. We had a lot of buy-in from our educational leaders, so it was really: we're not going to use this software unless they sign data privacy agreements, period. We started from there and built on that. So I think having that support is where it starts for schools. The work itself, I think, can be done; it's not a big burden. And I think teachers appreciate it, because they can bring an idea to the group, and for us it's become a pretty quick process. Like I said, we can find out pretty quickly through the database whether there's a data agreement available for a particular software, and if there is already one there, we can have it signed that same day by the vendor and by us. And then we can work with the curriculum folks, and those are usually teacher leaders who can work with their curriculum committees and with teachers to make sure that it makes sense. The alternative is the Wild West. I think we've heard from a lot of testimony today that that is not working; it is not keeping our kids safe, and it's not improving student learning. So I think pulling that back and saying, let's be thoughtful about how we're using technology, making sure that we're keeping our students' data safe and that we're only using it in thoughtful ways, is necessary. And so I think if schools want to implement technology, they need to have a similar process in place. I would say the value of the vetting process is that technology is thoughtful in its implementation. It's not just, oh, there's a software tool on a list, go ahead, you can use that. So yeah, I hope that answers your question.

[Vice Chair Edye Graning]: Anybody else?

[Unidentified Committee Member]: Would it be helpful, or, just to back up, can you reflect on what it would be like if it were more of a centralized process, with the Agency of Education reviewing these tools?

[Jeff Wallace, Director of Technology, Mount Mansfield Unified Union SD]: Yeah, I mean, I think there can be a duality with that. I would say that for our district, we are in a place where the data privacy agreements are more comprehensive than the list in this bill. So for us, we would continue to use the Vermont Student Privacy Alliance agreement, and we would just add in that extra check of, you know, is it on the state's list?

[Speaker 0]: I see.

[Jeff Wallace, Director of Technology, Mount Mansfield Unified Union SD]: So I don't think it would affect us that much. And I think we would, again, continue to vet our software through a process of working with teachers and looking at the curriculum and making sure that it makes sense to use that tool; otherwise, we wouldn't use it. So I would say for us, the process wouldn't change very much, other than if a software tool is not on that list and we're already using it, or we would like to use it, then it would be, oh, I guess we'd find something else, or we wait a year and see if it makes that list. So the challenge for us, if there was something we wanted to use with students, would be the timeliness of getting it certified through the state, and, like I said, the potential cost increases or reduced choices coming down to our district.

[Speaker 0]: I see. Thank you.

[Vice Chair Edye Graning]: Thank you so much, Jeff. Thank you, committee. This is the end of our week. We start back up at 9AM on Tuesday. I will see you all then. Take us off the bus. Have a great weekend, everybody.

[Unidentified Committee Member]: Stay safe

[Speaker 0]: out