
[Sen. Virginia “Ginny” Lyons (Chair)]: All right, so good morning. This is Senate Health and Welfare. It is April 1. That's not a joke; it absolutely is April 1. And we're looking at H.814 and other bills. So, Jen, question for you: do you want to go first and go through the bill, or should we start with Representative Cina?

[Rep. Brian Cina (House Member)]: I think you should go first and walk through it, in case they have questions about language. Let's do that. I can answer their questions about process and the

[Sen. Martine Larocque Gulick (Vice Chair)]: rationale. Good. We'd like a lot of

[Sen. Virginia “Ginny” Lyons (Chair)]: time for this one, because it's part of a larger issue and it's important for us to understand. Good morning. Jen Harvey from the Office of Legislative Counsel. So I will put the language up on the screen. This is H.814, an act relating to neurological rights and the use of artificial intelligence technology in health and human services.

[Jen Harvey (Office of Legislative Counsel)]: As introduced, this started out as a much bigger bill that did some of the things that, as you'll see toward the end, are being looked into rather than addressed directly in legislation at this time. It starts out with an intent section: that it is the intent of the General Assembly to protect human rights, promote equity, increase efficiency, enhance accessibility, create transparency, and guarantee accountability in healthcare and human services through the ethical and responsible use of artificial intelligence technology; the intent to maximize the benefits and minimize the risks of the use of artificial intelligence in healthcare and human services; to promote the ethical and responsible use of augmented intelligence in service delivery, coverage determinations, and access to healthcare and human services; to prevent harm from the use of augmented and other artificial intelligence in healthcare and human services; to improve the experience of patients, providers, and payers through the use of augmented and other artificial intelligence; and to improve quality of care, drive positive health outcomes, and cultivate population health through the use of augmented and other artificial intelligence. So it starts out with that as just a general intent section. Section 2 adds a new chapter in Title 18 on neurological rights, although all that is going in there for now is the individual rights provision.
So this has the State recognizing that each individual has the right to mental and neural data privacy; the freedom of thought; nondiscrimination in the development and application of neurotechnologies; the right to change an individual's decision regarding neurotechnology, and the right to determine by what means to change that decision; the right to be afforded protection from neurotechnological interventions of the mind and from unauthorized access to or manipulation of an individual's brain activity; and to be afforded protection from unauthorized neurotechnological alterations in mental functions critical to personality. Section 3 amends the existing Artificial Intelligence Advisory Council, so this is in Title 3. The existing council was established, under existing law, to provide advice and counsel to the Director of the Division of Artificial Intelligence, which is within the Agency of Digital Services, with regard to the Division's responsibilities to review all aspects of artificial intelligence systems developed, employed, or procured in state government. The advisory council also engages in public outreach and education on artificial intelligence. I think it's important to look at what their existing charge is, because that is not changing here, although some of their specific short-term duties change: providing advice and counsel to the Director of the Division of Artificial Intelligence with regard to the Division's responsibilities to review all aspects of artificial intelligence systems developed, employed, or procured in state government. The bill makes some changes to the membership of the advisory council, and if you're interested, we can look later at who is in (a) through (f), which you can't see on here. But it's changing a few of the members and adding a few. Under current law, there's one member with experience in the field of ethics and human rights.
That experience description doesn't change, but instead of being appointed by the Governor, they would now be appointed by the National Association of Social Workers, Vermont Chapter. You'll see the Governor does maintain an appointee later in the list. In subdivision (h), the Commissioner of Health changes to the Secretary of Human Services, and in both cases, or designee; so it could still be the Commissioner of Health, if that's who the Secretary of Human Services designated, or some other designee. It adds one member with experience in healthcare appointed by the Vermont Medical Society, one member with experience in public education appointed by the Vermont National Education Association, then a couple of redesignations, and then it adds the State Treasurer or designee and one member with relevant knowledge and expertise appointed by the Governor. So it expands the scope of the background of the person appointed by the Governor in that subdivision. And it extends the duration of the advisory council, which is currently set to be repealed on June 30, 2027; it extends them for an additional three years. And then Section 4 gives them some specific shorter-term tasks. It directs the council, in coordination with the Director of the Division of Artificial Intelligence and in consultation with interested stakeholders, to review guidelines and recommendations from the American Medical Association, National Association of Social Workers, National Education Association, and other relevant professional organizations on the use of artificial intelligence in the fields of healthcare, human services, education, public participation, and public finance. So you can see where some of these new members are particularly relevant to this work.
It also directs the council to research existing and potential uses of artificial intelligence in public participation processes and in public finance, and to create opportunities for public education and engagement in the development of artificial intelligence policy. And then, by January 15, the council would submit a written report to the General Assembly: first, recommending any statutory changes needed to further the purposes of this act, including protections for neurological rights, protections related to neurotechnologies, and proposed definitions for relevant terminology; guidance on the use of generative artificial intelligence by regulated professions; and guidance on regulating the use of artificial and augmented intelligence in health insurance utilization review processes. The report would also summarize any additional ways the government can promote the ethical and responsible use of artificial intelligence technology in health and human services and in education. They will propose pilot projects that improve public engagement and public finance using ethical and responsible artificial intelligence technology. And finally, they will identify any reasons for further delaying or removing the new 2030 sunset, that three-year extension of the Artificial Intelligence Advisory Council that was set in Section 4. And Section 5 provides that the act takes effect on passage. So, in underlying statute, are there definitions of some of the terms that are here, for example, generative artificial intelligence? I don't believe so. I think these are some of the terminology that would be recommended by the council. I think one of the issues that we talked about a fair amount in the Health Care Committee upstairs is just being mindful that artificial intelligence is being rolled out in all sectors.
And so making sure that the definitions are appropriate for use across the statutes, so that we don't have different definitions, just for the healthcare segment, of some of these technologies that aren't healthcare-specific, definitions that are then potentially in conflict with the definitions being used in the privacy statutes. So, and this will maybe be a question for the reporter, but I'm looking at (b) on page 5. It says guidance on the use of generative artificial intelligence. So if there isn't a specific definition of that, then those recommendations might be all over the place. Well, that's why the previous provision, and actually this came out of some questioning on the floor in the House, has the council recommending additional statutory changes, including proposed definitions for relevant terminology. That could be part of their recommendations. Okay, well, we'll think about that. I'm not sure I like that thought. I mean, honestly, I think we'll have to talk about that going forward. Yes. I believe there are some definitions in existing statute, but not necessarily extensive, or sufficient to meet some of the goals of this legislation.

[Sen. Martine Larocque Gulick (Vice Chair)]: Thank you, Chair Lyons. I'm the Senate sponsor of the companion bill, and I'm also going to be reporting the third companion.

[Rep. Brian Cina (House Member)]: I don't

[Sen. Virginia “Ginny” Lyons (Chair)]: know. I don't see it. Is that a mental health chatbot?

[Sen. Martine Larocque Gulick (Vice Chair)]: No. It's AI. It should be up there.

[Sen. Virginia “Ginny” Lyons (Chair)]: What number is it? I don't know, because I knew we were... There it is. Okay. Right, that one is specific to the Department of Mental Health. That may be, yes, the bill you're gonna be talking about next with Katie, H.816. Oh, I see. Yeah, it's a different Well,

[Sen. Martine Larocque Gulick (Vice Chair)]: in any case, my question remains the same, which is: I'm gonna be reporting H.84 very soon, and this obviously contemplates the use of augmented intelligence in the recorded doctor's appointments and visits. I wanna make sure that these are working together in an appropriate way, so that this doesn't negate H.84. I don't think it does negate

[Sen. Virginia “Ginny” Lyons (Chair)]: I mean, H.84 on its face is just allowing recording of telehealth visits, in the same way that in-person visits are not currently prohibited from being recorded. Most of what is in this bill at this point is really recognizing rights, recognizing some legislative intent, and then directing the AI Council to do some exploration and come back with recommendations. Okay. So I don't think there's anything this bill does that would be in conflict with what H.84 authorizes.

[Rep. Brian Cina (House Member)]: And H.816 doesn't conflict with it either, because H.816 allows the use of artificial intelligence for administrative tasks, which is what recording, which is what note-taking, is. So you'll see that one in

[Sen. Virginia “Ginny” Lyons (Chair)]: the What about the augmented

[Sen. Martine Larocque Gulick (Vice Chair)]: intelligence that's taking what's being imported and like creating notes from it?

[Rep. Brian Cina (House Member)]: That's not technically what augmented intelligence is. Like, augmented intelligence is more

[Sen. Virginia “Ginny” Lyons (Chair)]: Hold on. Here's what I'm gonna suggest. We are now actually moving into discussion on the bill. So, Senator, I'm gonna ask you to hold your question until we've been through the bill, because I have a lot of questions too, and we all wanna make sure that whatever we pass out of here links up with whatever they pass out of there. And I think it will; it doesn't sound like there is a conflict, so that's what we'll do. Are you reporting that bill today or tomorrow? Tomorrow. So you'll be ahead of the game. Everybody has to follow your lead. Okay. It's going to be fine. And anyway, we didn't want to amend the bill you're reporting. Just be careful what you would do. Okay. You were in Health?

[Sen. Martine Larocque Gulick (Vice Chair)]: Yes, I wanted to be in Health, but okay. I know, it was a different Different bill.

[Sen. Virginia “Ginny” Lyons (Chair)]: Not in the committee. Okay. Senator Morley.

[Sen. John Morley III]: I think Senator Gulick asked some of my questions. The word is in here quite a bit, though; can you explain to me exactly what the word augmented means? I mean, from a

[Sen. Virginia “Ginny” Lyons (Chair)]: legal issue. I mean, no, I'd have to look up a definition for you on my

[Sen. John Morley III]: Because it's all through here.

[Sen. Virginia “Ginny” Lyons (Chair)]: It is, but augmented intelligence is sort of a type of artificial intelligence, and that's where we're gonna get to that need for definitions. So, yes, I think other witnesses can define it for you.

[Sen. John Morley III]: Okay, because

[Sen. Martine Larocque Gulick (Vice Chair)]: it's like prevent

[Sen. John Morley III]: the harm from the use of augmented and other artificial intelligence. So it's not artificial

[Sen. Virginia “Ginny” Lyons (Chair)]: intelligence. But it is; that's the and other. So it's not augmented and artificial intelligence, it's augmented and other, because it's a type

[Sen. John Morley III]: So it's a form of artificial intelligence.

[Sen. Virginia “Ginny” Lyons (Chair)]: Remember, this is our first crack at this bill. We have a lot of questions. You'll hear from the reporter, who is our local expert on this. And what I'm gonna ask of Ledge Counsel is, the next time we pick this up, or whenever we can, to look at what the current definitions in underlying statute are, so we know. Yes. The only one I think I'm aware of is the definition of artificial intelligence systems that is in the creation of the AI Council, but I'd have to check that; I will look. And then, beyond that, maybe the whole world of artificial intelligence experts has a glossary for us somewhere. I'm not sure that there is a set one. Nothing yet. Well, I mean, I think there are a lot of definitions, but it's an evolving field, and so there are many different definitions, or definitions that may be developed for use in a particular circumstance. Sure. But you may wanna hear from the Division of Artificial Intelligence at the Agency of Digital Services, who testified upstairs and are very knowledgeable. We'll do that. So, Representative Cina, I see you're very hesitant to come up.

[Rep. Brian Cina (House Member)]: No, I'm giving Jen a chance to

[Sen. Virginia “Ginny” Lyons (Chair)]: move so

[Rep. Brian Cina (House Member)]: I don't run her over.

[Sen. Virginia “Ginny” Lyons (Chair)]: No, we're really glad you're here and sharing with us. As reporter of the bill, maybe you could help us understand a couple of things: from your perspective, an overview of the bill, what's in it, what you hope to accomplish with it, and what was the vote in committee?

[Rep. Brian Cina (House Member)]: Coming out of it? Yeah. That's a good question. 9-1-1.

[Sen. Virginia “Ginny” Lyons (Chair)]: Okay. And then the floor vote?

[Rep. Brian Cina (House Member)]: There was no roll call.

[Sen. Virginia “Ginny” Lyons (Chair)]: Okay. So it was just a floor vote. I can

[Rep. Brian Cina (House Member)]: tell you from what I heard that on second reading, there were a few weak nos from the center of the room, and on third reading, there was, like, one. Like, one like, nay. So It was a masculine voice. But that could mean anything.

[Sen. Virginia “Ginny” Lyons (Chair)]: So welcome, and go right ahead, please.

[Rep. Brian Cina (House Member)]: Yeah. So, as everyone is well aware at this point, artificial intelligence technology is infiltrating every part of human life in the developed world, anywhere where there's technology now. It's in our phones. It's behind the scenes, constantly crunching data and directing processes, and it's being used to track us and to sell our information, and then that information is being used to manipulate our behaviors, and it's all legal. A lot of it is things we're choosing: we're choosing to buy a phone, we're choosing to use apps, we're choosing to shop. You know? There's consent involved. But there's a lot going on that people don't understand, and I think being uninformed sometimes changes the dynamic. And in the healthcare and human services sector, there are some innovations in artificial intelligence that are rapidly being deployed that are going to fundamentally threaten our rights at some point if we don't get ahead of the curve. So there's great risk of harm there, as well as harm of exploitation by actors that use our data in ways that might influence our behavior. But I think there's actually more potential benefit, more potential innovation and usefulness, in AI than harm, if we do it right. And so the intention of this bill isn't to crush innovation or to hinder progress, but to set guardrails so that we don't allow this incredible technology to be exploited, but rather used to advance humanity. So what it does is it makes a statement of intent, just so that, for the record, it's clear what we're trying to do here; Jen read that. It establishes basic neurological rights in statute, even if some of the terms are not clear yet, as a starting place.
And it uses terminology from the United Nations, so that the language of the neurological rights reflects internationally established standards that have come out of many years of discussion between scientists, leaders, the business sector, philosophers, you know, many different mindsets that have come together to create this code of rights. And then what it does is it takes an existing tool of government, an existing structure of Vermont government, adjusts it to equip that tool with the expertise needed at this time to do its job, and then gives some directions to that tool, to that existing structure of government. And I think it's important to understand a little more about the Artificial Intelligence Advisory Council if you haven't been here ten years or so, which I think at least two of you have. I don't know how long you two guys have been here exactly. Four years? Three months. Three months. You're both three months.

[Sen. John Morley III]: So we lose

[Rep. Brian Cina (House Member)]: three months. Yep.

[Sen. Virginia “Ginny” Lyons (Chair)]: They're so way ahead of us on AI.

[Sen. John Morley III]: Yeah. I got an appointment,

[Rep. Brian Cina (House Member)]: Three months ago? Yeah. It was in December. Did you take the position of the controversial Northeast Kingdom person? I believe I did. Yeah.

[Sen. Virginia “Ginny” Lyons (Chair)]: Alright, Senator. I mean, Representative. I'm almost promoting you. Foreshadowing. Yeah. I understand that. Yeah.

[Rep. Brian Cina (House Member)]: I can get easily distracted. Sorry about that.

[Sen. Virginia “Ginny” Lyons (Chair)]: That was on the UN standards. I was gonna ask about the AI Council.

[Rep. Brian Cina (House Member)]: So, in 2018, Vermont became the first state in the United States to pass artificial intelligence legislation at the state level, by creating an Artificial Intelligence Task Force. And the task force was created with a one-year mission; well, it was extended a little, so it was, you know, a year-and-a-half mission to come forward with a report on what state government should do about AI. I'm happy to find that report and send it to you, but if you Google, like, Vermont AI Task Force report, it comes right up. And it's actually on the home page of the Vermont Division of AI's page on ADS's website. It's right on the home page, I think. And I served on that task force. And so we came forward with this report recommending that Vermont create an AI commission and some apparatus in state government. We didn't recommend the exact structure you see; it was, like, an AI commission and a code of ethics and certain things. But what ended up happening is I was asked to be at this transatlantic fellowship with people from the Bundestag and Parliament. And after that, I introduced two bills. One was the recommendations of the task force. One was a public inventory of all the ways the state is using AI, so that the public would know and it would hold us accountable as a government. So it's not touching the private sector; it's the state saying, we're gonna set an example for how people should do this. And another feature of it all was that Vermont decided to use public education at every step of the way. When the AI Task Force was doing its work, we had public meetings around the state. Like, we went to Applejack Field in Manchester to see their AI setup. We went to Lyndon, to the state college. We went to different places.
We went to the incubator, or whatever it was called, in the Generator space in Burlington. So we went to these different places to meet with the public, and they would speak, and we would have speakers. It was teaching people, and learning, and just people who didn't know were coming to learn about AI. The reason I bring that up is because it's an important feature of this bill too: Vermont from the start had this open process. The idea was, we're engaging the public, we're educating the public. Then there were these two bills: one was follow-through on the recommendations, and one was to create a public inventory so the public is aware. And what we did is we merged those two bills into one. And then the Senate decided to go in a different direction and actually create a Division of AI and an AI director, because that's not what the House did; the House had, like, a commission and a code of ethics and an inventory. The Senate was basically feeling like it needed to be built into the government a little more. So ADS was given a Division of AI and an AI director, and then the commission was made into a council. There was a code of ethics established, and a public inventory. And that is now known as the Vermont model, and multiple states have been exploring that approach. Meaning, you build into the government a structure of oversight, and it's focused on the government's use, not on the private sector's use. So, now that I've explained that to you, how that fits into this bill is: we already have this existing structure in state government doing this work. And the health and human services sector isn't totally the private sector. Public education is not the private sector. State budgeting is not the private sector. School budgets are not the private sector.
So we're at a point where AI is advancing in our education system and our health and human services system, and it shows promise for improving government spending. In Slovakia, they're using it now to look at the government's budget and budgeting process, because it can project quickly; it can find things humans can't find and show them to you when you're doing financial planning. So the idea is, we're gonna take this existing piece of the Vermont model, the AI Council. We're gonna add someone from the Medical Society, like a doctor or healthcare provider. We're gonna add someone who knows public education, knows the school system. We're gonna add a social worker who knows human rights and who understands how social workers are trained to work in schools, hospitals, health clinics, all these settings, so you have a broader perspective of the different settings. We're gonna add someone from the Treasurer's office, and it's up to the Treasurer, whoever it is at any given time, who that person is; but someone who understands public finance, who understands the large financial systems involved in managing taxpayer money and state resources. And then it gives the Governor an at-large seat. The reason we did that is because we heard that the Governor's current appointee is a great asset, and we don't wanna remove that asset, but we wanted to give the human rights seat to the social workers. So whoever the Governor is can pick whoever they think is best at that moment in history. And we gave that group a mission, which Jen read to you. So I'll pause there. I just wanted to put it in perspective that we're at this moment in history where we need to take some action.
So what we're doing is using an existing piece of our government that has proven to be useful, modifying it a little, and giving it a test. And then, in a year, we'll look at what they recommend. And just to also speak to your issue of definitions: there is a lot of debate about definitions, because the technology is evolving so fast that today's definition might be outdated tomorrow. So I would recommend that you talk to our AI director, and have our ADS folks come in here and talk about that with you. You may choose to add a provisional definition, you may not. And if you did, if you amended the bill to add, like, a provisional definition, I don't think that would be the worst thing. I think what would happen is people are gonna nitpick over it and fight about it, and then it lands somewhere. We chose not to do that; that doesn't mean that you can't.

[Sen. Virginia “Ginny” Lyons (Chair)]: People are gonna ask you questions. Yeah.

[Rep. Brian Cina (House Member)]: I just wanted to address that; that's what I wanted to say about the definitions, because it was raised earlier.

[Sen. Martine Larocque Gulick (Vice Chair)]: I was just gonna say, I'm embarrassed that I didn't realize we have an AI division in ADS. So, you wouldn't know. But okay. Well, I look forward to them coming in and speaking to us. Part of me wishes they were a

[Sen. Virginia “Ginny” Lyons (Chair)]: little more visible. It just seems like I don't know.

[Rep. Brian Cina (House Member)]: You don't know where AI is. That's just true. And it's listening. That's just true. Everywhere.

[Sen. Virginia “Ginny” Lyons (Chair)]: Yeah. So, other questions? Go ahead.

[Sen. John Morley III]: Thank you, Madam Chair. You did a great job explaining it to me. So this bill basically is just looking at the neurological rights; it's not looking at healthcare in general.

[Rep. Brian Cina (House Member)]: It is. It is? Yeah. Oh, the language is still up. In the report, it asks that they look at the neurological, the neurotechnology, piece. Then it asks that they look at generative AI. It asks that they look at the use of AI in utilization review and in financial decision-making systems. Then it asks that they look at the use of AI in public participation processes, public finance, and public education. And it says other uses of AI in health, anything else recommended to meet the intent of the statute. That's basically why we left the intent in: we want them to look at that intent and say, how do we make sure that Vermont does what it needs to do in the next few years to meet that intent? So we give them some specifics, but then we say, think bigger, and we ask them to review the guidelines of professional organizations and other relevant organizations. So the American Medical Association has guidelines that they put out, and the National Association of Social Workers does, and the National Education Association does. And another group that they may look at is called NIST, the National Institute of, oh my gosh, I

[Sen. Virginia “Ginny” Lyons (Chair)]: The National Institute of Standards and Technology.

[Rep. Brian Cina (House Member)]: Yeah, and they have a definition of AI. So this is basically focused on government?

[Sen. John Morley III]: Yeah. Yes. And I've gotta believe that AI's all through our healthcare system already. Oh, yeah. It's moving quickly. Yeah.

[Rep. Brian Cina (House Member)]: You're gonna hear about it. Yeah. Okay. Thank you. Yeah. And, like, I just wanna acknowledge again that the healthcare system isn't totally public, but a vast amount of public money goes into it; it's a public good. So this isn't about infringing on innovation. It isn't about infringing on people's rights to make money or sell. It's about protecting our rights and making sure the tools are being used for the best possible purpose. I just wanna be clear about that, because the Vermont model is all about promoting public education, promoting innovation. It's not about stifling progress. And I think we wanna be careful we don't overregulate, but we also wanna make sure that we have guardrails before people get hurt. And I will recommend, and I'm gonna send this to you, that you hear from the Neurorights Foundation. Their director, I think, is based at Columbia University in New York City, and they testified in our committee. They can give you sort of the basic background on what's happening with neurotechnology. Because we have interfaces rolling out that are wireless, so someday soon, very soon, you're gonna be able to take these, put one in your ear, go like this, and it's gonna link with your brain. We're very close to that. And there are chips being implanted in people's brains currently. So, yes, and it's allowing people who are paralyzed to be able to move things with their mind.

[Sen. John Morley III]: I think eyesight was above the blame.

[Rep. Brian Cina (House Member)]: Yeah. So, I mean, what I'm getting at is, as scary as it may sound, it's here. We cannot really stop it at this point. Maybe an EMP, like if the sun had a big solar flare, or if there was a nuclear war, would stop it. But other than that, I don't know what else is gonna stop it at this point. So we might as well, like... Yeah. Well,

[Sen. Virginia “Ginny” Lyons (Chair)]: the point that you're making, I think, is really an important one, and that is that AI is used to improve the human condition, not to make decisions contrary to that. So those are the guidelines that you're talking about; completely understood. It's very helpful. So, the question then: I think we'll have to look at the task force report to fully understand that bottom line that you're talking about, public engagement, not stopping productivity or entrepreneurship, but protecting rights. So then, you also indicated the use of the UN definition for neurological rights.

[Rep. Brian Cina (House Member)]: Yeah, we took their code of rights.

[Sen. Virginia “Ginny” Lyons (Chair)]: And so it's a code? Yeah. Okay. So we'll look at that. And then, are there other things that the UN is coming up with that will solidify some of these ever-changing definitions and categories?

[Rep. Brian Cina (House Member)]: I think there are, and that's why we ask that the council look at the UN, look at all

[Sen. Virginia “Ginny” Lyons (Chair)]: of that.

[Rep. Brian Cina (House Member)]: Yeah. While they formulate, because we realized there's no way we, as a body, could figure this out in a year with our workload. But the council would have time, and that's why it goes into effect on passage. Because the sooner we pass it, the sooner these people will be appointed to the council and can start working on it.

[Sen. Virginia “Ginny” Lyons (Chair)]: So then I'll ask you another question. You have the social worker, whom I highly regard, and I know I have one sitting at the table. Me too. So was there any other discussion about the ethics network, or others who are involved in ethical decision making, being included? Not that I'm excluding social workers, but

[Rep. Brian Cina (House Member)]: There was a lot of talk about the membership, because maybe 12 of our stakeholders wanted to offer to be on the council. We heard nurses, doctors, teachers, the hospitals, the designated agencies; everyone was interested. And so we didn't want to inflate the size of the task force too large. So what we did is we stuck with the general idea of the original bill, but we did put in language to make it clear that all are gonna be welcome at the table, even if not everyone has a voting voice. And so you can continue those discussions, but we did have extensive I think the part of the bill that took up the most time in the end was talking about membership. And people really you know, can you imagine, like, 11 people instead of five talking about that? So

[Sen. Virginia “Ginny” Lyons (Chair)]: You've got nine members, which is a good number for when you have to vote.

[Rep. Brian Cina (House Member)]: Oh, there's more than the nine. You're only seeing

[Sen. Virginia “Ginny” Lyons (Chair)]: this is these are the Duke members. You're so you're only seeing some of them. I think there's 14 members. 14 members.

[Rep. Brian Cina (House Member)]: The ones you don't see are, like, public safety, commerce; there's some other

[Sen. Virginia “Ginny” Lyons (Chair)]: We should.

[Rep. Brian Cina (House Member)]: Yeah, agency heads who have seats that are in the like ellipses, is that what it's called? Ellipses part of the bill, so.

[Sen. Virginia “Ginny” Lyons (Chair)]: Okay. Alright. Okay. Well, we'll look at that. But did you think about the ethics network at all? I think they are still engaged in ethical decision making, but you did.

[Rep. Brian Cina (House Member)]: You can continue thinking about it. We never talked about clinical ethicists or this kind or that. Yeah. Understood. Yep. And I acknowledge my bias. I'm a social worker. I was the social worker on the original task force. I think Vermont wouldn't have the approach we have if there hadn't been a social worker involved, that we have a state code of ethics for the use of AI that you can look up in the state. So I think that that perspective is a little different than ethical decision making.

[Sen. Virginia “Ginny” Lyons (Chair)]: Yes.

[Rep. Brian Cina (House Member)]: You know?

[Sen. Martine Larocque Gulick (Vice Chair)]: Understood. Yeah.

[Rep. Brian Cina (House Member)]: But I think there's a space for that perspective. And if they're not on the council, I would hope that the people involved in this work really do cast a broad net. And it would be great if they're going to facilities around the state and meeting with providers and meeting with people at schools and doing public education, teaching people about what neurotechnology is and about the risks and benefits as they go along, so the public knows more before they're clicking "I agree" on their phone and then their memories and thoughts are being sold to a company, you know. Because we're very close to that, and we can't really stop that. So what we can do is look out for our rights.

[Sen. Virginia “Ginny” Lyons (Chair)]: Stop asking. Well, anyway, somebody should have thought of this. A new tax. Senator, think about my A sin tax. Your what? I've been looking for a new sin tax. Oh. I think I found

[Rep. Brian Cina (House Member)]: AI. AI robot tax?

[Sen. Martine Larocque Gulick (Vice Chair)]: Yeah. Tax.

[Rep. Brian Cina (House Member)]: Neurotech tax. Neurotax.

[Sen. Virginia “Ginny” Lyons (Chair)]: There we go. Representative Cina, this has been very helpful. Really appreciate your time.

[Rep. Brian Cina (House Member)]: Well, thanks. I'll step back and let my committee make a couple. I'm gonna get something in the office and come back. But I'm gonna send you the neurotech the NeuroRights Foundation info and and

[Sen. Virginia “Ginny” Lyons (Chair)]: Whatever you have, you know, send to Calista, and we also have connection with all the work that your committee did. So Okay. Any specific folks that you think are most useful, let us know. Okay.

[Sen. Martine Larocque Gulick (Vice Chair)]: Really good. Alright. And you've

[Sen. Virginia “Ginny” Lyons (Chair)]: got us in the right direction.

[Rep. Brian Cina (House Member)]: Alright. Well, thanks for making so much time to talk about it.

[Sen. Virginia “Ginny” Lyons (Chair)]: Well, it wouldn't happen if it weren't important. And I know that you were the driving force in 2018 that put it out on the table and everybody was saying, why are we doing this? And now we know.

[Rep. Brian Cina (House Member)]: Well, they said that for, like, a minute, and then they were like, and then

[Sen. Virginia “Ginny” Lyons (Chair)]: they they need to do something. Thank you for your work. It's good. Alright. So committee, I think this will be a bill that will take up and take some time. We also have h eight sixteen, and we have so I think Katie, I think we're gonna have the reporter first. Is your preference? Prichlows, thanks for being here. Thank you. It's time to give us some info. It's good. It's great. So let me let me ask you this. Do you know everyone around the table?

[Sen. Martine Larocque Gulick (Vice Chair)]: I don't. I I I walk you guys all the time.

[Sen. Virginia “Ginny” Lyons (Chair)]: Oh, it's so exciting. The most exciting part of your day.

[Sen. Martine Larocque Gulick (Vice Chair)]: I got it. You're on my algorithm now when when I go to YouTube, just you guys come up first, by the way.

[Sen. John Morley III]: John Morley, Benson,

[Sen. Virginia “Ginny” Lyons (Chair)]: Orange. Ginny Lyons, Chittenden, Southeast. I am Martine Ginny and Cotton's from Washington. So, our ask of you is just introduce yourself and then give us why it was introduced, what's in the bill, of your overview. We need to understand what we're looking at here.

[Rep. Wendy Kirchhoff (House Member)]: Sure. So my name is Wendy Kirchhoff. I'm from Chittenden 19, Manchester. And I'm introducing h 16, which is a bill that just establishes reasonable guardrails around the use of artificial intelligence and mental health. And I must say my my committee made the expert on all things AI, and it's been a crash course for all of us. But it's here to stay, and it this is here to just put in guardrails as far as its use in mental health. So this is just my my core course. It's free. It's so artificial intelligence is rapidly becoming integrated into many aspects of our daily lives, including tools to claim that claim to offer emotional support, mental health guidance, and therapeutic assistance. And while these technologies have the potential to increase access to information and support, they also present significant risk when used without appropriate oversight, particularly in areas as sensitive and complex as mental health care. And that's why this bill is very important because it does put in some pretty significant guardrails. So mental health treatment requires professional judgment, ethical responsibility, confidential protection, and accountability. AI systems, while powerful, do not process clinical training, licensure, or ability to understand full healing context behind someone seeking help. Age 16 does not seek to prohibit innovation. Instead, it sets clear boundaries to ensure that artificial intelligence is not used to replace licensed mental health professionals or misrepresented by providing professional care. The bill is designed to protect patients who are responsible use of technology and ensure transparency when AR AR tools are involved in mental health related services. By establishing these guardrails, Vermont can encourage innovation while prioritizing the safety, dignity, and well-being of individuals seeking mental health support. 
This bill is about protecting people at vulnerable moments in their lives and ensuring that technology serves as a support tool, not substituting qualified professionals. That's really what this bill does. And then I can go over maybe the the steps for this. Is that what

[Sen. Virginia “Ginny” Lyons (Chair)]: I'll walk through the bill. Okay. Yep. It's good. Question for you. Yeah. So so as you were going through this, did you what were the biggest hurdles you had to overcome to help modify or prove the bills there? Was it smooth sailing? It was pretty smooth sailing.

[Rep. Wendy Kirchhoff (House Member)]: Really, what we were just doing is making sure that the AI could be used. It's a it's a wonderful tool. Yeah. But it's not making the final decisions on any kind of decision making that a professional would do. That that was really Good job. The guidelines just put that into place. Thank you. Yeah. As far as liability and concerns in the steps too. Yeah. And the vote in committee? Eleven zero. Eleven zero. Well, that's good. And then

[Sen. Virginia “Ginny” Lyons (Chair)]: the vote on the floor, I'm thinking it's probably voice vote. Voice vote. They work. Yeah. We like these kinds of things.

[Rep. Wendy Kirchhoff (House Member)]: And that sounds a great one.

[Sen. Virginia “Ginny” Lyons (Chair)]: Yeah. No. Thank you. Thanks for your sharing your floor report and for bringing us the information. Absolutely. Thanks for having me. All right. All right. Now you can get back to work and work on some of our let's see. Are human you services or? Yeah.

[Rep. Wendy Kirchhoff (House Member)]: Health care.

[Sen. Virginia “Ginny” Lyons (Chair)]: You're using health health care. Yeah. So that We're looking at you right now.

[Rep. Wendy Kirchhoff (House Member)]: I know. Do good work. Okay. Thank you. Katie,

[Sen. Virginia “Ginny” Lyons (Chair)]: why don't we take a look at the bill? Good morning. Good morning.

[Sen. Martine Larocque Gulick (Vice Chair)]: Katie, we're glad that council.

[Sen. Virginia “Ginny” Lyons (Chair)]: Let me share my screen

[Sen. Martine Larocque Gulick (Vice Chair)]: with you. Here we go.

[Sen. Virginia “Ginny” Lyons (Chair)]: So we have h eight sixteen. I don't know. We have a companion bill in here that was here. Yeah. Yeah.

[Katie (Office of Legislative Counsel)]: 241. That is actually a there were two nearly similar, but not identical AI bills that were introduced on the house side, and I think yours is the companion to the other.

[Sen. Martine Larocque Gulick (Vice Chair)]: Okay. Oh my god. Yeah. So

[Katie (Office of Legislative Counsel)]: I made a chart looking at the differences between the two upstairs. So I will send that to you so you can Maybe it would be good to bring it in here so we can see if there's something in that bill we'd Okay. Like So let's put that up for some of your testimony and stuff. Okay, I'll send it to Colisto. Yeah, and I'm good. Okay. Okay. So we have H816. Already sort of gotten an overview of what this bill does, but it regulates both the profession of mental health professionals and outlines how, when AI, using AI is appropriate and when it's not appropriate, for what uses it's not appropriate and needs their professional practice. And it also prohibits the use of an entity using AI in the state for therapeutic decisions, sort of like a CHAPA. So those are the two components of the bill. We start with a purpose section. It is the purpose of this act to safeguard individuals seeking mental health services in Vermont by ensuring that therapeutic judgment, clinical decision making, and therapeutic communication remain the responsibility of medical professionals and are not delegated to artificial intelligence issues. Respecting individual choice and selecting mental health services, including community, peer and faith based options. And allowing the responsible use of artificial intelligence for administrative, operational, documentation and quality improvement functions that support access, efficiency and innovation in mental health services. Then we move to section two. This is in title three, and there is a long list in title three of what constitutes unprofessional conduct for the various professions that are regulated by OPR. So you'll see that I have the list omitted here with these ellipses at the top of page two, but being added to that list is for any mental health professional misuse of AI pursuant to a section we haven't looked at yet. So we'll look at that. 
But if AI is misused, it constitutes unprofessional conduct by a mental health professional. Next, we move to language in Section 3. This is in Title 18, the health title. And first, you'll see some definitions. These definitions are repeated in the next section of the bill, which would appear in a different title of the VSA. So first, it defines artificial intelligence to mean an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments. So I'm gonna ask this question, because I asked it in the other bill, which is generative artificial intelligence. So here there's a definition for that. Where did that definition come from originally? Yeah. So, I mean, because what we heard I'm trying to put all this stuff together. What we heard for H.814 is, we can't land on a definition at this time. Things are changing. Yep. And now we have a landing for generative. And so, just a question. Yeah. Let me answer it off the top. I don't think I can exactly answer your question, but let me tell you what I do know. Oh, good. So, generative artificial intelligence, I'd have to look through my notes to see where that originally came from. I think it probably came from the sponsor originally. The bill as introduced had a different definition of AI. Prior to this being voted out of House Health Care, House Commerce, which was also working on an AI bill, reached out to House Health Care and said, we don't love your definition in the bill that you have moving. Why don't you conform it to the one that we're moving? So this subdivision (a)(1) matches what was moved by House Commerce in their AI bill, but with the exception of this last sentence: artificial intelligence includes generative artificial intelligence. So then the question is Well, there's more to consider.
So my understanding from House Commerce is that this sentence kind of creates a circular situation. However, House Health Care felt strongly that they wanted that sentence in, because that is the only place that generative artificial intelligence was referenced in the bill, and they wanted to include a definition of generative AI. So they included that sentence to get the definition of generative AI to keep that in the bill. So I don't know that it's a problem that has been fully resolved yet, but that was the recommendation from House Commerce as we approached crossover to move the bill. So this is really helpful, because as we're looking at H.814 and asking the council to make decisions about definitions and other things, it might be useful to reference what's actually happened already. We'll sort that out later. Well, we should know the definition of those, like, just

[Sen. John Morley III]: Yeah.

[Katie (Office of Legislative Counsel)]: It's on page five of page eight fourteen, and then this one is a definition, and who knows where it goes.

[Sen. Martine Larocque Gulick (Vice Chair)]: Similar to the Harvard University definition. Yes.

[Katie (Office of Legislative Counsel)]: That's another thing. Thank you for No. Saying This is I had forgotten that piece, that the reason that House Commerce had recommended this definition is because I believe it's language that California has been using, they are becoming A leader. That language the language they're using around AI is becoming the standard in the field at this point. That's useful. And this is, you know, information this is not my practice area. So this is information I'm I'm learning from Tufts Commerce and my colleagues who practice in this area more regularly. We're all learning together. Great. I'm glad you said that. Okay, so we've covered those two definitions. We have a definition of mental health services to mean support, counseling, therapy, or psychotherapy services provided by a mental health professional to diagnose or treat an individual's mental or behavioral health or provide ongoing recovery support, excluding religious counseling. And a definition of therapeutic communication. It means a written or spoken interaction intended to diagnose or treat any type of mental or behavioral health concern, provide ongoing recovery support, or provide any advice related to diagnosis, treatment, or recovery. And then, here we have a prohibition. So a person, corporation, or other entity shall not offer, provide, or advertise mental health services in the state that represent artificial intelligence as providing therapeutic judgment, diagnosis, treatment, or therapeutic communication. Nothing in this subsection is to prohibit the use or disclosure of AI for administrative documentation, operational, or quality improvement purposes when a health professional retains clinical responsibility as authorized in the section we haven't looked at yet. Yeah. Well, everything is I'm talking to somebody who does. Mhmm. So the whole Internet, social media, we have what you need to solve all your problems, information. 
So Commerce is probably where in our oh, Institutions would be picking this up. Institutions. So Okay. Thank you. They do. Oh, thank you. Just a question about what regulatory authority there is within the state over the Internet provision of information that would be violating this. Yeah. Yeah. I can see a lot of people being taken. I think that's a good question, and I'm not always in the room when Commerce is hearing testimony, so I don't know to what extent they I'm gonna go ask around. Can you talk to me on that issue? Thank you. So, subsection (c), this is a penalty section. So this would treat a violation of this section as a violation of the Consumer Protection Act. This language probably looks a bit familiar. This is language that we use elsewhere in statute, but this gives the Attorney General the authority to bring enforcement for violations of this section. Does that also include a private right of action? It says parties have the same rights and remedies. I think it does. Again, not my title, but I will confirm again. Okay. There's been a lot of discussion about that. Oh, with the Kids Code. The Kids Code. Right. Yeah.

[Sen. Martine Larocque Gulick (Vice Chair)]: I think we had we use

[Katie (Office of Legislative Counsel)]: this language of the PFAS bill, and I think the same question came up, and I think the answer was yes. But let me let me look at my notes again before I That's okay. We're just gonna get started in the Okay. Just questions. Down on the table. Top of page four, Nothing in this section shall be construed to preclude or supplant any other statutory or common law remedies. So that would suggest other common law remedies. Civil action could potentially take place. Section four, this is now, we're in title 26. That's our title that applies to the regulation of professions. A new chapter is being created, Artificial Intelligence Regulated in Professions. The idea being that in the future, it might not only be that AI is regulated with regard to mental health professionals, but other professionals. So this is a place where this is a statutory place where future regulation of AI and other professions could live. So we have a subchapter on general provisions, and all that's in this subchapter right now is the two definitions of AI and generative AI that you've already looked at. And then there's a new sub chapter being created specific to mental health professionals. And at this point in time, there'd only be one section in it. So this is permitted and prohibited uses of AI in therapeutic settings. Can I ask, do you know where the commission is located? I know it's under ADS now. The bill that Jenna's working on? Yeah. I'd have to look at her bill. I was just wondering how that coordinates with general provisions, If that's a general oh, if it could go in here? Yeah. That would be interesting. If you were to do that, I don't know enough about Ginny's bill Okay. To answer if that's a good idea or not. I will say that if you do choose to do that, you probably want the two bills To merge. Yeah. Otherwise, one passes and the other doesn't, then you have a little bit of a pass. But if that is doable, if that is your choice. 
Well, it depends on if it fits. I think yeah. Yeah. I don't know. I haven't looked at her bill. I don't know if it makes sense, but to the extent it does make sense and you do wanna do it, I think Jen and I could work together and put it into one document. Thank you. Page five. So now we have a lengthy definition section. Administrative support means a task performed to assist a mental health professional in the professional's delivery of mental health services, such as scheduling, billing, and general logistics, but excluding therapeutic communication. Subdivision 2: clinical responsibility means the duty of the mental health professional to review, approve, and remain legally accountable for any use of artificial intelligence in connection with the provision of mental health services. Consent means an explicit affirmative act by an individual that communicates, in writing, voluntary, informed, and revocable agreement. Consent does not include acceptance of broad terms of use agreements, passive actions, or deceptive practices. So again, I'd just dive into what can we do around the Internet for that one. So we've now transitioned from what is happening potentially online, somebody advertising a service in the state. Now we're looking at the regulation of licensed mental health providers providing services in the state. And here we get into who we are talking about when we say a mental health professional, and it is a lengthy list. So anyone licensed, certified, or rostered to provide mental health services as a physician, an APRN specializing in psychiatric mental health, a psychologist, a peer support provider or peer recovery support specialist, a social worker, an alcohol and drug abuse counselor, a clinical mental health counselor, a marriage and family therapist, a psychoanalyst, an applied behavior analyst, a nonlicensed or noncertified psychotherapist, a noncertified psychoanalyst, or any other profession that provides mental health services, except as exempted in subsection (d), which we will look at. We have a definition of mental health services; I already went over that with you in the previous section. Peer support means support services provided by an individual with lived experience of a mental health condition or substance use disorder who is not certified. So, peer support that's happening by somebody who's not certified is not covered under this.
If you're a certified professional, then you fall into the definition of a mental health professional. We have a definition of religious counseling; that's one of the carve-outs. So that means counseling provided by clergy, pastoral counselors, or other religious leaders acting within the scope of the individual's duties, if explicitly faith based and not represented as clinical services. And supplementary support means a task performed to assist a mental health professional in the professional's delivery of mental health services, excluding therapeutic communication and administrative support. Therapeutic communication, we already looked at. Therapeutic decision means the final clinical determination regarding diagnosis or the selection, modification, or termination of treatment or care. Therapeutic decision does not include algorithmic risk scoring, data analytics, or other clinical decision support tools when used under the supervision and authority of a licensed mental health professional. And then we go into a subsection about permitted uses, so when AI is permitted to be used by mental health professionals. So I want to go back to I think this really falls into how did the committee look at therapeutic decision making and not including certain things. Was there any talk about, and this is probably rhetorical, bias; talk about how data can be biased in how it's developed and structured? I'll ask that question. I don't know if I'm the best person to answer that question. No. I don't know who can answer that. Alright. Subsection (b): a mental health professional is allowed to use AI for administrative support, supplementary support, and operational or quality improvement functions, provided that the professional retains sole responsibility for therapeutic decisions.
Permitted uses include scheduling, billing, coding, and claims processing; transcription and documentation support; preparation and maintenance of clinical records; de-identified data analysis for quality improvement; and workforce and capacity planning, where the mental health professional reviews, excuse me, modifies where necessary, and approves the final product. So those are allowed uses of AI. We have language about confidentiality and consent. So, in subdivision (c)(1), any administrative support or supplementary support tasks conducted using AI, including transcription and reporting, shall be subject to disclosure provisions in Title 18. These are sections that already exist in Vermont state law about protection of patient privacy. And then we have consent: that consent by the patient or client is required when AI is used to record identifiable therapeutic communications. And we have a bill. And then, in subsection (d), we get into prohibited uses. A mental health professional shall not use AI in a manner that allows the AI to independently make therapeutic decisions, independently diagnose, independently determine treatment, or independently generate treatment plans. Nothing in this subsection prohibits the professional from disclosing or describing the mental health professional's use of AI for administrative support or supplementary support purposes to a prospective, current, or former patient or client. And then we noted that there would be some exceptions. So here they are in (e). This section doesn't apply to religious counseling, peer support that's provided by a person who isn't certified, and generalized educational and self-help resources that do not purport to offer mental health services. And it takes effect on passage. Can you talk a little bit about what it means to be exempt? Really, the prohibitions on practice wouldn't apply to you.
So this limits the use of AI for mental health professionals to those more administrative and operational types of tasks. Then it prohibits mental health professionals from using AI to make independent therapeutic decisions, diagnoses, treatment plans. So this wouldn't apply to any of the individuals who are listed, meaning that they would be allowed to use AI for those purposes. However, let's say, for example, a peer support provider who is not certified: they may not necessarily be the person who is developing a therapeutic plan. Same thing with religious counseling; they might not be the person providing therapeutic care. So, I'm looking at number three there. I mean, I can look at any one of them. I'm thinking about some of the marketing that we see now in social media. I'm going back to the Internet and social media, where a lot of problems exist, for people thinking they're being treated by a professional, where it will say this is a dramatization and somebody's acting like a doctor. The marketing piece, for me, gets I'd love to talk with Representative Martine and see what Thomas has done, and we'll talk with others in our Institutions Committee, and I see you shaking your head. But I think there's a lot in here that could boil up and be problematic, you know. Just a question. Just throwing it out there. Okay. Questions for Katie. This is a neat bill. We have a lot here; there's so much here in terms of protecting human beings. Questions? I think when we take up one, we'll take up the other. I'll try to schedule them near each other so we can see how they relate or don't relate. Does that make sense? Thank you. You're welcome. Oh, great. Our brains are gonna pitch close. Are we checking your brain?

[Sen. Virginia “Ginny” Lyons (Chair)]: Yeah. That's what Playing in that.

[Sen. Martine Larocque Gulick (Vice Chair)]: My god. Facebook's cute. So

[Sen. Virginia “Ginny” Lyons (Chair)]: we actually don't have any time on our schedule for a break. Oh. April. So we can So what we're going to is we're going to take a break for