[Thomas Burditt (Vice Chair)]: We are live.

[Martin LaLonde (Chair)]: Hi. Welcome to the House Judiciary Committee this Thursday morning, January 22, and we're gonna turn our attention to H.626. And we're going to start with a walkthrough, and then we have some witness testimony. Over to you, Michelle. Thanks for being here.

[Michelle Childs (Office of Legislative Counsel)]: Thank you. Good morning. For the record, Michelle Childs, Office of Legislative Counsel. I'm going to take a look at H.626 as introduced. So I'm going to do a relatively high-level overview of the issues. The two primary statutes that you're working with are the ones with regard to voyeurism and disclosure of sexually explicit materials without consent. So I just wanted to give you a little history. So voyeurism: I meant to go back and look and see when the General Assembly adopted this, but it was a while back. And I had worked on that one originally when the General Assembly was considering it. And for the voyeurism one, in order to think about how they're different, voyeurism was originally crafted to be kind of like a peeping Tom law. So I remember at the time, it was kind of: things will happen in the community, and then legislators hear about it and say, hey, we ought to have a law. We don't have a law on this. And prior to the adoption of the voyeurism law, you could maybe try to fit something into lewd and lascivious conduct or maybe prohibited acts, but there wasn't really something directly on point for what we think of as that kind of activity. And at the time the General Assembly was considering this, this was when all of a sudden phones started to have cameras on them. So remember there was a time when that wasn't the case. So there were instances that were starting to happen in communities, and people were saying, well, we don't really have a law that quite fits this particular circumstance. And I remember the one that was being discussed around Montpelier was that there was somebody who was, and I don't remember the particular circumstances, kind of peeking through people's blinds and such in their own homes when they might be in a state of undress. And when the Montpelier police finally brought this person in, they found that he had been taking photos of people as well.
And so the voyeurism is that kind of thing. It's not consensual. There's no consensual aspect to the voyeurism. And disclosure of sexually explicit materials is different. That came a few years later, but I see Barbara nodding her head. She remembers all of this stuff. And that was to address, again, the law trying to respond to the emergence of all these new technologies and ways to share images. And so at that time, this new thing was coming on the scene, and it was more commonly referred to as revenge porn. And that was situations where someone may have consented to the taking of an image, so maybe with a former partner, and been okay with that partner having that image, but did not consent to that partner then further sharing that image. And what people started to see was people break up, and then all of a sudden, those images that you considered very private were now on the internet. And as we know, when something's on the internet, it's hard to get it back. So I just wanted to frame those two issues of how they came about. So we're always trying to keep up; the law is really slow compared to the speed at which technology is developing. And so part of what's in this new proposal in H.626 is something new that is being addressed, which is the idea of sexual extortion, which you'll see as we go through the elements there. It's not, again, addressed in the law. We do have a general extortion statute in Title 13, but there's nothing really focused on this. A lot of states have adopted something to address situations like this, because you may have read some things in the news about these types of stories, and it often affects young people who, through social media or something like that, have been fooled into sharing perhaps an image of themselves.
And then whoever it is on the other side who has received the image is then essentially extorting them, blackmailing them, saying, I'm going to release this image to everybody in school or put it on the internet unless you pay me money, unless you send me more images, unless you do certain things: either do things you don't want to do or don't do things that you could be doing. Does that make sense? And I'm going to go through the bill. I know there are going to be a lot of questions, but I also know you have some really important witnesses this morning, too, and I don't want to run into their time. So we'll do kind of a high-level overview now. And I'm always here, so we'll be coming back as needed.

[Martin LaLonde (Chair)]: We'll hold our questions in general,

[Michelle Childs (Office of Legislative Counsel)]: because I think the thing is, as we walk through and we talk about the elements, I could see people being like, well, what does that mean? What does that mean? What does that mean? And I may be getting to it in my walk through.

[Martin LaLonde (Chair)]: Gotcha.

[Michelle Childs (Office of Legislative Counsel)]: Very patient. I know, I didn't say anything. Again, just talking in generalities. So currently, the voyeurism law prohibits viewing, photographing, filming, or recording a person who is nude or engaged in sexual conduct. And sexual conduct is a defined term; I think you guys have been dealing with that already. It's defined in Title 13. Without the person's consent, when that person is in a place where they would have a reasonable expectation of privacy or under circumstances in which they have a reasonable expectation of privacy. Again, we're kind of talking about the peeping Tom law. The voyeurism law prohibits disclosing images taken in violation of the statute. The default statute of limitations is three years for the crime. And there is currently no specific statutory right of civil action against a defendant specifically in the voyeurism law. But there are civil remedies at common law, so somebody can bring a suit. But I wanted to point that out because one of the things we're doing is bringing in language from the other statute, modeling it, and adding it into voyeurism with regard to a private right of action. So the general changes to the voyeurism statute: it would extend the statute of limitations for photographing, filming, or recording a person and for disclosing those images. And it would extend it to six years for the crime. Viewing would remain three years. So it's making a distinction. And so you'll see there's a lot of changes and cross-outs and things in the way I've reorganized that statute. But most of it is just to make it so that you could have separate subsections that could have differing statutes of limitations. And it also creates the statutory civil right of action based on the remedy that's provided in the other statute and applies it retroactively to offenses committed prior to the enactment date of the statute of limitations extension.
It tolls the statute according to the date of the offense or the date discovered by the plaintiff. It also states that a diagnosis of a disorder related to trauma constitutes the harm that's required to recover damages in negligence. So right now, there has to be a showing of, and I'll have to look at the language because I always forget, essentially physical harm. And so the issue is whether or not you can recover on a negligence claim if there is a significant emotional harm that's been diagnosed as a result of trauma. And so on to disclosure of sexually explicit materials without consent. That prohibits disclosing a visual image of an identifiable person. And that's key, and that always makes sense, because this is involving an invasion of someone's privacy. It's taking someone's private moments and making them public. So this isn't AI or anything like that. This is an identifiable person. It's disclosing a visual image of the identifiable person who's nude or engaged in sexual conduct, without their consent, with the intent to harm, harass, intimidate, threaten, or coerce the person depicted, and the disclosure would cause a reasonable person to suffer harm. The default statute of limitations is three years, and it does include a statutory civil right of action. The proposed amendment there prohibits threatening to disclose a visual image of an identifiable person who is nude or engaged in sexual conduct without the person's consent, with the intent to compel a person to do something that they do not want to do. So that's the sexual extortion piece of that. The changes also prohibit threatening to cause harm to a person unless that person produces nude images or images of sexual conduct or engages in sexual conduct.
So in that circumstance, rather than the actor having a nude image of you and saying, pay me money or else I'm going to disclose it, this would be: I'm going to physically harm you if you don't send me a nude image of yourself, or I'm going to accuse you of a crime if you don't send me explicit images. And then it applies the civil action language retroactively to offenses committed prior to the enactment date of the statute of limitations extension and has the same language around emotional harm and trauma qualifying for a negligence claim in those circumstances. So let's take a look at the language. I'm not going to go through a lot of what's in existing law that is not changing, but I just want to draw your attention to some of the terms that we're talking about. So if you look at page two, and I assume our versions look the same: section one, amendments to voyeurism. So under the definitions, the first definition I want to show you is on line 16, circumstances in which a person has a reasonable expectation of privacy. It means circumstances in which a reasonable person would believe that their intimate areas, and we're going to get to what that means in a minute, would not be visible to the public, regardless of whether that person is in a public or private area. This definition includes circumstances in which a person knowingly disrobes in front of another, but does not expect nor give consent for the other person to photograph, film, or record the person's intimate areas. So if you think about that: I grew up in Atlanta, and one of my favorite things to do was to go shopping at Loehmann's with my mom. And they had this huge dressing room, and it was just a giant dressing room. And that's where everybody went. Now it seems like they're all little stalls and stuff, but it was one big dressing room. And you just all got undressed in there and tried on your clothes. So I'm doing that voluntarily.
I know that those ladies in the dressing room can see me changing and whatever, but I am not expecting someone to be taking a photograph of me with their camera and then potentially sharing that. So that's the kind of thing we're talking about. It's a public area, but the circumstances would lead you to believe that you do have some sense of an expectation of privacy to a certain extent there. The next definition is subdivision four, for intimate areas. It means the naked or undergarment-clad genitals, pubic area, buttocks, or female breasts of a person. The definitions are a little different between these two statutes around the areas that the visual image would capture. I was trying to remember why, but I honestly can't. Eric and I are always joking that our two memories combined create one halfway decent memory over the years with this stuff. But I would say the main difference there is you don't have to be nude. It could be that you are, again, let's say in a dressing room, and you're not entirely nude, but you have your undergarments on. And so that's the definition of what we're using here. So the next one is place where a person has a reasonable expectation of privacy. So that is a place in which a reasonable person (again, it's a common standard we use, right? It's not the particular person in the action if charges are brought; it's a reasonable person standard) would believe that the person could disrobe in privacy without their undressing being viewed by another, or a place in which a reasonable person would expect to be safe from unwanted intrusion or surveillance. So that's thinking about your home or something like that, where you are not expecting that someone is going to be viewing you. So you could use the example of: you're in your bedroom, but you have your blinds closed and people can't see from the street.
But if somebody walked up and they were peeking in between the little slats there and kind of looking in, that's that kind of circumstance. The next definition I'll just draw your attention to is in subdivision 7, for surveillance. It means secret observation of the activities of another person for the purpose of spying upon or invading the privacy of the person. I'll just mention that there are no changes in here around that, but there is language in the existing law around some exceptions for bona fide private investigators who are licensed by the state. And then the last one is view, which means the intentional looking upon another person for more than a brief period of time. So, you know, sometimes it comes up: oh, I accidentally saw somebody. And I didn't mean to. I wasn't trying to be peeking into their bedroom, but I accidentally did. So it's not that. It's that you have the highest mental state. It has to be intentional. It's not by accident or just happenstance that that occurs. It has to be intentional, for more than a brief period of time, other than in a casual or cursory manner, with the unaided eye or a device designed or intended to improve visual acuity. So moving on to the different elements: subsection B, you'll see that on lines nineteen and twenty-eight. Right now the statute is crafted so that viewing, photographing, filming, or recording is all in one subsection. I've broken that out, again, because we're going to need to reference the different subsections for purposes of the statute of limitations. So B is not changing anything substantively, even though there's a lot of strikeouts and underlines. I've just reorganized the statute. So the elements of B, when you're thinking about it, right?
So no person shall intentionally view the intimate areas of another person without the person's knowledge and consent, while the person being viewed is in a place where they would have a reasonable expectation of privacy or under circumstances in which the person has a reasonable expectation of privacy. So you have to prove all those elements as part of the crime. And again, I just want to direct your attention that we're talking about intentionally, so the highest mens rea. When we move on to disclosure of sexually explicit images without consent, that's a knowingly standard. So they're different. So moving on down to subsection C. That's where you now have the unlawful photographing, filming, or recording, but generally the elements are still the same. You move on to page five, subsection D. That's the surveillance that I mentioned. There's no changes to that. But it's just that no person shall intentionally conduct surveillance or intentionally photograph or record a person while the person being surveilled would have a reasonable expectation of privacy within a home or residence. Bona fide private investigators and security guards engaged in otherwise lawful activities within the scope of their employment are exempt from this subsection. E is your newly kind of recodified disclosure to a third party. So if you are violating subsection C and you take a photograph or you take a video of someone, then under E, you then share that with someone. No person shall intentionally display or disclose to a third party an image recorded in violation of subsection C. And those are different because the penalties are higher if you disclose that to a third party or you post it on the internet or somebody's Facebook page or something like that. On page six, just note that there are some existing exceptions in subsection G for law enforcement and the Department of Corrections, everybody kind of working within their existing roles.
Subsection I is an affirmative defense to a violation if the defendant was a private investigator or a security guard conducting surveillance in the ordinary course of business. So if something happened and that other exception got bypassed, they could raise it as an affirmative defense. However, it's not a defense to then sharing that image. So, you made a mistake here, but then if you further disseminate that, you can't raise that as a defense. For penalties, page seven, subsection J, I wanted to note for you that there are some typos there, and it was just miscommunication between me and my editors before it was printed. And so in J, what it should be is that the cross-references that are in subdivision one should be to B, C, and D for each one of those. We go back and forth in emails and marked-up versions and stuff, and sometimes we think we're both on the same page. And so that would be that for a first offense, for a violation of viewing, photographing, or surveilling, it is a two-year misdemeanor. And for a second or subsequent offense, it's a three-year felony. And then you'll see in subdivision two (again, this isn't changing current law substantively; I've just reorganized) that if someone violates the disclosure provision, it is a five-year penalty. And those are not changed. Subsection K is a new provision that you'll see is copied from the other statute, which is putting in a statutory right of civil action. So again, somebody can sue civilly now for damages without this language, but this is specific to this: a plaintiff shall have a private cause of action against a defendant who intentionally displays or discloses to a third party an image recorded in violation of the statute. And so in addition to any other relief available at law, the court may order equitable relief. Equitable relief is, like, non-monetary relief: something to try to kind of cure or make the plaintiff kind of whole again in a way that's not money.
And it gives examples, which are not exclusive, including a temporary restraining order, a preliminary injunction, or a permanent injunction ordering the defendant to cease displaying or disclosing the images. You'll note that the court can grant the injunctive relief while maintaining the confidentiality of a plaintiff by using a pseudonym in the case. Subdivision three is the new language that I spoke about: in an action brought pursuant to this section, the required element in a negligence claim of actual injury to the plaintiff can be satisfied by a diagnosis of a disorder resulting from trauma. So this is with respect to a case, and I do have it; Nate put it on the web for you. I think it's called the Kilburn case. It was the case that stirred this. You may have heard about this case in the news, and I'm not going to remember necessarily all the facts. But there were two young women, teenagers, who were filmed undressing without their knowledge. And they actually did not discover that that offense had happened until years later when, I believe, friends of theirs found the videos on a pornography website. And so the case that you have there is talking about damages, because the statute of limitations on the criminal side had run, so you couldn't prosecute. They could still, under common law, bring a case. So they did, for damages. And then the case is discussing damages and what they could recover for. And for the purposes of a negligence claim, the court, based on long precedent, was identifying that there has to be some type of physical injury (and we can discuss this more and go into those cases later), and that there are some circumstances where an emotional harm may manifest in a physical way that might fit that. But the court declined to extend it in that way in this particular case. There was a dissent, I believe, by Justice Reiber.
And he said in the dissent, these women were diagnosed with PTSD and they suffered and they continued to suffer extreme harm from this. And he was urging that there be an extension of the ability to recover for those damages. And so that's what that piece is about. And I think, Martin, did you introduce a bill on that?

[Martin LaLonde (Chair)]: Yeah, I think it's a bill that would be broader than just this kind of thing. Right. Right. Those are broader. So we may be looking at that later. Okay.

[Michelle Childs (Office of Legislative Counsel)]: So next is on page eight, subdivision four. A civil action authorized by the subsection can be commenced at any time after the act has caused the injury or condition, notwithstanding the provision in Title 1 around retroactive applicability that just generally says statutes, unless otherwise designated, are prospective. This is applying it retroactively to a violation that occurred prior to the enactment date, irrespective of any statute of limitations in effect at the time the violation occurred. There is case law that upholds being able to do this. Think back to the child sexual abuse cases that were in the news a few years ago and things like that; the legislature did that with the childhood sexual abuse civil actions. And so this is just kind of modeled after that. So next, section two. This is on disclosure of sexually explicit images without consent. I'll just highlight a few things in the definition section. Harm means physical injury, financial injury, or serious emotional distress. Subdivision three: there's the definition of nude, which is different from intimate areas. It means your parts are uncovered. I think that's the big way to think about the two: under the other one, you can be in your underwear; this one is nude. If you look at page nine, subdivision five, visual image includes a photograph, film, videotape, recording, or digital reproduction, including an image created or altered by digitization. But I just want to remind you that in this one, we're always talking about an identifiable person. It could be that you have a person and they somehow manipulate the image in some way, but it's still an identifiable person. Subsection B: so here's where we get to the elements of the crime.
So no person shall disclose an image of an identifiable person who is nude or engaged in sexual conduct without the person's consent, with the intent to harm, harass, intimidate, threaten, or coerce the person depicted, and the disclosure would cause a reasonable person to suffer harm. So those are the elements there, and you have to prove all of those. A question that will come up, that people may ask about, is: do you have to have that provision in there that requires intent to harm, harass, intimidate, threaten, or coerce? I recall at the time, when we were working on this with the experts, that we felt we really needed that, because we are talking about something that touches on the First Amendment. And you have a decision, which you guys have, the Van Buren decision, where as soon as this new statute was used, it was challenged as unconstitutional. It was upheld. So it has been determined to be constitutional. But that case explains a lot around the First Amendment issues involved in this type of thing. And at the time, we were anticipating that there might be a First Amendment challenge and thinking about how to meet a strict scrutiny test: how does the legislature, if you are potentially touching on First Amendment issues, craft a statute that is narrowly tailored to serve a compelling state interest? So that's the highest standard, and that was kind of the goal. And my recollection is that this kind of intent to harm was something that we included as part of that. Subsequently, now, years later, there are lots of states that have these statutes. Some have an intent to harm; some do not. I will say that in Justice Robinson's opinion in the Van Buren case, she mentions the intent to harm as one of the reasons why she feels as though it's narrowly tailored and meets the strict scrutiny test.
If you remove that, I don't know that that necessarily tips the Jenga tower enough, but I do know that that's been something that people have wondered about in recent years. And so a violation of that would be a two-year misdemeanor. However, you'll see on lines 16 through 18, if the person is doing that for financial profit, it's a five-year penalty. So now, at the bottom of page nine, we get to the new sexual extortion language. So starting on line 19: no person shall knowingly threaten to disclose a visual image of an identifiable person who is nude or engaged in sexual conduct without the person's consent, with the intent to compel the person, continuing on to page 10, to produce nude images or images of sexual conduct, to engage in sexual conduct, to engage in any act against the person's will, to refrain from engaging in any act in which the person has a legal right to engage, or to provide money or anything of value. Again, this is the person saying, I have these images, and if you don't do these things, this is what I'm going to do. Subdivision B, starting on line seven, is that, with the intent to compel a person to produce nude images or images of sexual conduct or to engage in sexual conduct, no person shall knowingly threaten to accuse the person of a crime, cause injury to the person or their property, expose a secret or publicize an asserted fact, whether true or false, intending to subject the person to hatred, contempt, or ridicule, or report a person's immigration status or suspected immigration status. And so that's the case where the person does not have images, but is saying, unless you have sex with me, I'm going to report you to the authorities for using drugs or something like that. So a violation of subdivision two is a two-year misdemeanor.
I just want to draw your attention to subdivision three, which is: for purposes of this section, a person may be identifiable from the image itself or information offered in connection with the image. So that could be something like: there's an image, but you can't really see the face, but then it's posted, and it's, oh, here's this person, right? Identifying that person even though you might not be able to identify them clearly based on the picture. So I'm going to move on and draw your attention to page 12. You'll see here in subsection E, this is where I took the language around the civil right of action and mirrored it for the voyeurism statute. But you have the new language in subdivision three, starting on line 10, about the required element in a negligence claim of actual injury being satisfiable by a diagnosis of a disorder resulting from trauma. And then, similarly, the subdivision four retroactive application for that. So now we're moving on to section three. We're working in Title 13 still, and this is on the statute of limitations for crimes. Statutes of limitations essentially are something that you create here in statute, balancing the interests of the defendant's right to a fair trial with the state's interest in achieving justice. Generally, statutes of limitations are there so that you're not bringing really stale actions, right? And so you see the existing statutes of limitations on page 13. And then we have a new subsection on the last page, page 14, subsection F: for prosecutions for voyeurism that involve photographing, filming, or recording, or voyeurism display or disclosure to a third party, and all of section 2606, which is the disclosure of sexually explicit images without consent, there is a six-year statute of limitations. And that runs within six years after the commission of the offense or within six years after the date on which the subject of the offense discovers the existence of the offense.
So I will note, and I'm sure witnesses will have more discussion about this, that generally you see the discovery rule for tolling the statute of limitations in the civil context. So take a medical malpractice action, right: the harm could be caused at one moment, but maybe you don't find out for five years that when you had that operation, they left a pair of scissors in you or something like that. So there's that discovery rule that basically allows tolling in cases where the harm isn't necessarily easily discovered at the time that it's caused. So this is brought in here for discussion purposes, and the idea is that, given the nature of these particular crimes, when there is an image taken, it may not be readily apparent to the survivor of that offense that the offense has occurred or that they're being harmed by it. So if you look at a circumstance like the Kilburn case, where the young women had been filmed, it wasn't until years later that they discovered that that offense had taken place and that their images were out there. And so that's the idea here. I will say an easier, more straightforward way legally might be just to put those offenses that involve the sharing of the digital images (not the viewing, because the viewing is not impacted by this) into something where you have, like, the forty-year one for the longer period of time. That might be a more straightforward way to go. In general, even though there's retroactive application, if the period's already run, you're not going to revive the old action. So I think it would generally be that, if you're talking about a criminal situation, if it's already expired, it's not like you're going to revive the action in that particular case. I think that makes sense.

[Martin LaLonde (Chair)]: So I assume we'll have you back to walk through our questions later. Sure. Probably not until Tuesday next week or

[Michelle Childs (Office of Legislative Counsel)]: so. Okay. All right. I can take questions, but I also know you have a lot of witnesses.

[Martin LaLonde (Chair)]: Yeah, let's skip to the witnesses. With other witnesses waiting, you'll kind of be explaining why we need some of this in our statute. So thank you very much, Michelle. So we'll turn to Brian Montgomery. And if you could unmute yourself, Brian, and introduce yourself and proceed. Thank you very much for being here this morning.

[Brian Montgomery (Parent advocate, Mississippi)]: Absolutely. Everybody hear me okay?

[Michelle Childs (Office of Legislative Counsel)]: Yes. Yes.

[Brian Montgomery (Parent advocate, Mississippi)]: Perfect. Yeah. Thank you for, you know, giving us the opportunity to speak to you guys and kind of give our perspective. So we lost our son, Walker, on December 1, 2022, to sextortion. Walker was extorted by a guy from Nigeria pretending to be a young girl. It's interesting, you know, in the conversations you guys are having, how it's often explained, you know, as an image was taken, but Walker's situation was quite different. And by the way, this all happened in one night. So Walker went to bed perfectly healthy, normal. Great social life, great friend life, very involved in athletics, the outdoors, farming. We're a rural family. But he went to bed healthy. I mean, everything was, we thought, and that's what we've later learned, everything was normal that night. He was approached around midnight. Walker engaged in a sexual act through the video portal of Instagram with someone he thought was a teenage girl. And, you know, as it's explained often, it's described as a picture being snapped or a nude photo, but, you know, this was Walker engaged in a sexual act in a video. Walker didn't know it, but he was being recorded by that person in Nigeria with a secondary device across the screen. And so, you know, when I'm telling his story, our story, I try to ask people to put themselves in that position, you know, because I think we talk about this criminal act, and many times it's described as just a nefarious, you know, activity between teenage kids or somebody that's a, you know, pedophile or fill in the blank.
It's an attack. And so, at that point, you know, as an adult, and as you guys are all adults in this room hearing me testify this morning, you can imagine your most private situation, and we've all been there as adults, whether that was in a true sexual act or alone, and imagine that being recorded and displayed on the screen we're on right now. That's where I want your mind to go, because that's where Walker found himself that night. Directly after that act was over, the person immediately started asking Walker for money, or he was gonna reveal that information to what they described as the whole world. Started going down his list of Instagram contacts, threatening to send that video to those people. And it was just, I mean, when you see the transcript between Walker and this person, it was a fight, most of the night. Then they got to the point where they were pretending to send the videos through the messaging portal of Instagram to those contacts. They truly weren't sending those videos, but they were showing Walker screenshots of the video in the outbox as though they were sending them. And, of course, at that point in Walker's mind, I can only imagine, the information was out, and he felt as though that was gonna be known by everyone. And anyway, they finally got to Walker's mom in that sequence, and Walker told them directly, I'm gonna kill myself, and their response was, go ahead, because you're already dead anyway if you don't send us the money. So, I tell that story, and, you know, within a few hours of that, we woke up the next morning to find our son had committed suicide. That was a little over three years ago.

[Martin LaLonde (Chair)]: Thank you very much, and sorry for your loss. It's a terrible story. Would you take any questions?

[Brian Montgomery (Parent advocate, Mississippi)]: Absolutely. Yeah. I think that's what I hoped we could do this morning. I'm open to whatever. We've made it a mission to try to move the needle on this type of activity with families, whether it's speaking to churches or schools or folks like you to try to pass legislation. We've passed legislation in our state. I'll be glad to answer any questions around that. Absolutely.

[Martin LaLonde (Chair)]: In which state is that? I may have missed that.

[Brian Montgomery (Parent advocate, Mississippi)]: Yeah, we're in Mississippi.

[Martin LaLonde (Chair)]: Okay, I appreciate that. Questions? Angela?

[Angela Arsenault (Member)]: I don't have a question yet. Maybe after Mary speaks. I know Mary and Brian know each other and are, in a way, here together, supporting each other and sharing their personal stories. So maybe after Mary speaks.

[Martin LaLonde (Chair)]: Okay, yeah. So we'll hold the questions. If we could go to you, Mary, if you could introduce yourself. Thank you very much for being here as well.

[Mary Roadie (Parent advocate, New York)]: I appreciate that you're talking about this. Hi, Brian. I know that stuff is hard to say. It takes a lot of bravery to be so real about what happened to our boys. I have a thing prepared, but I just wanna lead by saying you'll see the pattern right away: exactly what happened to Riley happened to Walker, and you'll hear it happened to Jordan. Okay. So good afternoon. No, good morning, chair and vice chair. You guys are more informal than I expected, which I really appreciate. So, thank you for the opportunity, and thank you to Angela for inviting me. Just for the record, my name is Mary Roadie. I've lived in very northern New York my entire life. I'm a frequent visitor to Vermont. My daughter attends UVM, and my family makes maple syrup, so off-season trips to Leader and CDL are a big part of our family culture. I'm a teacher, and I've worked in the same elementary school in Canton, New York since 2000. That's long enough to know that kids tell you the truth before adults tell you the truth. I've watched entire families grow up, and my job is to connect with children, all of them, and I can reach them with a book or a joke or a, you know, a well-timed smile. That's the magic of teaching, meeting kids right where they are. And this predatory tech environment is driving a wedge into that space that is so important that we nurture for our children. Our communities have really lived Riley's loss with us. They know him. They know me. They walk through that grief beside us always, every day. When I speak about protecting children, I'm speaking as a mother and a grandmother, but also an educator who spent more than two decades caring for other people's children with the same devotion that I gave my own. Okay. My son Riley was born 06/02/2005. He was a beautiful, curly-haired, funny, impulsive, outdoor-loving kid, same as Jordan and Walker and all 38 of them.
He lit up every room he walked into. He loved sports. He loved the farm. He loved his huge family, and he had big dreams of becoming a state trooper or an ENCON officer. And then he died on 03/30/2021. I just figured it out while Brian was talking: twenty months. For twenty months, I was telling people about this, and they still let it happen to Walker. Riley died on 03/30/2021 by spontaneous suicide after being targeted by a sextortion criminal on Facebook. He was fifteen years, nine months, and twenty-seven days old, and he's my baby. He was not a heavy social media user. He only wanted Facebook because he wanted Marketplace, because he had a dishwashing job and money burning a hole in his bank account. So I let him get Facebook, and I was honest about his age, and then Facebook denied him access to Marketplace, where he wanted to just look at dirt bikes. He still had to look at dirt bikes on my phone because I told them he was 15. And then they still let a criminal from the Ivory Coast who had 60 fraudulent accounts contact him and pretend to be a teenage girl named Megan Miller. This is gonna be the same thing as their stories: Danny, Megan, I don't know the name of Walker's. She sent Riley unsolicited imagery, CSAM. She pressured him to send her images of himself in a very convincing way that puts these boys in this ultra high. The moment he did, the threats began. This was in the morning. The other two were at night. Riley's happened in full daylight, 9 AM to 2:15 PM, and he was dead. So she pressured him to send images, and the criminal then demanded money and threatened to expose him. They isolated him from every source of love and support, and they used fear and shame as weapons against a child. From the first contact to the end of his life: less than six hours. He was excited about spring break. He was looking forward to hunting and lacrosse. He had plans, and none of them included ending his life.
His death was not the result of depression or long-term struggle. It was a moment of overwhelming panic created entirely by a criminal who used Meta's tools to terrorize him. The heartbreak of losing a child is so unnecessary, and it is something that far too many families are experiencing in this unchecked space. I could name them here. I will share with you. But I just want you to know their names should be said at parent-teacher conferences and dinner tables, not right here from me to you in memory of children lost to preventable crimes. This is a crime of coercion. A predator poses as a peer, sends and requests images, and then uses the images as leverage, threatening to expose the child unless they pay money or send more images. I shared with Nate actual visuals of how industrialized this is. These are organized networks targeting thousands of kids at once. It happens incredibly fast. A child can go from first contact to full-blown crisis in under an hour. Braden Marcus died in twenty-seven minutes. It's enabled by platform design: anonymous accounts, disappearing messages, fake profiles, AI-generated images, algorithms that push kids towards strangers. There's a lack of age verification and a lack of parental visibility that would never be allowed on the street or in a brick-and-mortar establishment. It's happening far more than the statistics show, because people are so ashamed. It's so hard to come forward. It's hard for the kids to find someone to tell. Parents immediately go into, holy moly, my baby just ran into the street, but it's verbal, it's, what did you do? No, no, no, because they don't know, because the criminal network and the social media empire have protected themselves, and they're picking kids. These 38 boys, they weren't looking for reckless or troubled kids.
They're looking for good kids: the athletes, the honor roll students, the kids with a reputation, the kids with something to lose, the kids with strong male role models. Fear and shame are the tools of the crime, and it's not the kids' fault. It's not their fault they're targeted, and that's another thing: they're trying to put that all on parents and kids. They're addicted because the platforms are designed to keep them there, and they're being manipulated by sophisticated adult criminals who know exactly how a teenage boy's brain works. Teachers, law enforcement, parents are being asked to respond to a digital landscape we don't understand and can't see. My feed, your feed: it's nothing like the feed of a child. I encourage you, for your own research, start an Instagram account as a 13-year-old, and you will vomit. We cannot protect children from what we can't see. We can't guide them through dangers we don't know exist, and we're being fooled by Johnny Cash songs on Christmas commercials into thinking Meta is safe. These people will always outsmart parents and police because, as you said, law is behind. Of course it is, because it has rules. And tech is being allowed to operate fully unregulated. It's impossible to respond to. Part of their business tactic is that we always have to stay in response mode. We never can get to the real laws, and I'll put this in my written testimony, I'm going off script, but we never get to the laws that put the accountability on the platforms. These are great, and I want them. But we're dealing with tech companies that have spent so many years embedding themselves into children's lives, even classrooms. Google didn't just show up in schools.

[Mary Roadie (Parent advocate, New York)]: They strategically positioned themselves there, creating dependence early and a third-party facade from schools that this is okay: well, it must be okay, I trust school. Meanwhile, the tech executives don't even let their kids have any of this stuff, because they know the risk. They know the design. They know the impact. Congress has failed to act for so many years. The criminal statutes aren't updated. There needs to be a serious reexamination of Section 230 of the Communications Decency Act, which was written for a completely different era of the Internet. We could go into the First Amendment stuff if that's something you guys are interested in, the way that they say this is First Amendment. But Congress is stalled, and families have been left unprotected and waiting, and predators have taken advantage of the gap. It's states like Vermont that could lead. We can't keep waiting for federal action that doesn't seem like it's coming. It's just not free speech to send sexual images to a child. It's just not free speech to put up drug menus and deliver drugs to kids who die, or to teach children how to harm themselves with a lamp cord and a door hinge while mom makes dinner. No, that's not free speech. Those are crimes. And when they happen on a platform that is choosing what users see, the platform is a delivery system. So in Riley's case, the man who extorted him was eventually identified and arrested in the Ivory Coast. He was 19 when he used Facebook's tools to kill my almost-16-year-old. That arrest is the closest thing to closure my family will ever get. I have to call it a win, even though nothing about this feels like winning. My kid's gone. And the only reason that criminal ever reached him is because Meta allowed it. H.626 matters because it names the crime, it gives law enforcement a direct path, and it sends a message to families: you are not to blame. This is a crime, and Vermont will protect you.
And most importantly, it will save lives. I'll tell you, I can't speak for Brian, but parents in this space, we get told all the time, you'll never know how many people you saved. Back at you guys: you'll never know how many kids you protect if you modernize these terms, if you get this out into the world, if you push back at these companies, because children need stronger data protections. Then we're going to start by holding people accountable who are taking advantage of your space, because that will shine a light on the space. This addresses the criminal side of sextortion, and I sent stuff to Nate about other laws, because that was kind of the plan with Angela, and I know Brian said he'll share his as well. In New York we're looking at SOPA, the Stop Online Predators Act, and I like that because that's platform accountability. Also, we're working on a seventy-two-hour search warrant bill, and I'm sure the undersheriff and other people can respond to that, but stalling on search warrants is definitely a strategy of theirs. These bills just really recognize things that we know. Every hour matters when a child is being extorted. Stalling is not neutral. It's dangerous. And Vermont has an opportunity to join the momentum with this law, any of these laws, strong privacy protections, and duty-of-care standards for things that are ubiquitously put in children's hands. I feel like I'm going on and on. I'm gonna skip some of this, but I just want you to know that I'm with you. Anything you need. I admire your service, and it takes mega courage to stand strong in the face of trillion-dollar companies that will stop at nothing to avoid civil accountability. Just the conversation and your willingness, it won't bring our kids back. But things that can really prevent future tragedies and give families and police time to intervene, time to support their children and save a life, that's just crucial.
So I, of course, am glad to answer any questions and I'm so grateful that you're talking about this.

[Martin LaLonde (Chair)]: So thank you very much for your advocacy, Mary. I really appreciate it. Also very sorry for your loss. So any questions? Yeah, Tom.

[Thomas Burditt (Vice Chair)]: Thank you very much for your testimony, and my deepest condolences. I don't even know how to ask the questions, but the one that comes to mind is: is this emanating particularly from one region of the world?

[Brian Montgomery (Parent advocate, Mississippi)]: So what we're seeing is, and I'm gonna give you a very fast overview of this, I'm speaking to the sextortion side of things, but, you know, sextortion started as a means for pedophiles to extort content from girls. That was predominantly where it started, with perverts and pedophiles. They were selling this content between themselves. And once a young girl sent this content, they were bound to continue sending it; if they stopped, they were threatened with it being shared. So that was kinda where it started, and as with many things, once that part of Africa, you know, the Ivory Coast, Nigeria area, which is infamous for schemes, understood that there was another market there, and I'm calling it a market because there was a huge financial gain in being able to extort young men. Young men are much more likely to share the content. They're much less risk-averse. They're in their prime; their bodies are overflowing with testosterone, and if you're a male, you can only imagine if you'd had access to the content that's on these phones at 15, 16 years old. So, anyway, they figured out that there was a lot of money to be made from extorting young men. I'm not saying that it's not happening with young women as well; it is, in this context. But the overwhelming attacks we're seeing are coming from Nigeria, the Ivory Coast, that region of the world. I think there is an absolute need to have this, which kind of begs the question: okay, how would a Vermont law attach to someone in that situation? I'm not sure about that. I just know that the world needs to understand that this is not something that our laws are gonna tolerate. I think that's the message.
I think you will absolutely encounter local offenders. It absolutely happens. But what we're seeing predominantly is that it's originating overseas.

[Thomas Burditt (Vice Chair)]: And I'll follow up. And I think, Mary, you started to go down that road. You said the subject in your particular case was apprehended, I believe. Under what type of indictment or charge? Was it a federal indictment or charge? And were they extradited to The United States? I'm not sure how that would happen from there. And is there also the possibility, given the ability to manipulate this technology, that this could also be occurring in The United States but appear to be happening from somewhere else?

[Mary Roadie (Parent advocate, New York)]: It is. Sorry about that. It is occurring in The United States. There is, especially now, kind of child-on-child, plus there also has to be a money launderer in The United States if money's sent. And I definitely know that the undersheriff can speak to how we could do more in that money launderer space, to be like, we're also not gonna put up with that, because if there weren't that middle person, and he can explain it better than me, I know he can. So Jordan's murderers were extradited. They were in Nigeria. Were Walker's extradited, Brian?

[Brian Montgomery (Parent advocate, Mississippi)]: No. We're still awaiting a court hearing in Nigeria for that person to be extradited. So that has not happened yet.

[Mary Roadie (Parent advocate, New York)]: Okay. In my case, and in the case of Ryan Last from San Jose, our murderers are in the Ivory Coast, and the Ivorian government does not give up their citizens. But the real story of that is that The United States has a trade agreement with Nigeria, and The United States doesn't have a trade agreement with the Ivory Coast. So it goes to a magistrate there, which I understand to be sort of a judge and prosecutor. And there are really awesome people at the Department of Justice who could answer all of this so much better than I can, if that's something you're really interested in. But yeah, no extradition for me. A picture of the 19-year-old, well, now he's 24, and a picture of his driver's license, and we got him.

[Brian Montgomery (Parent advocate, Mississippi)]: Yeah. So, for Walker's case, I think one of your questions was whether it was a federal indictment. It is, and I think it's under child exploitation, under some CSAM federal regulation. So that's the method they are using to prosecute these federal crimes. So one thing I wanna speak to that I think is really important: you guys hear our stories, and our stories are the worst-case scenarios. You know, Mary talked about how thirty-eight of our sons have died from this type of event. But it's been very interesting since I started going public and talking about Walker and our family. Especially the first two years, for the first year at least weekly, usually multiple times a week, I would be contacted by parents and kids who were going through this particular attack. And what that exposed me to was hearing the voices and hearing the calls from kids who are being affected, though not to the extent Walker was. Walker was hidden. I mean, he thought he was on an island. He didn't have somebody like us speaking about it, to understand that this was a scheme, that it was something that could be overcome, that it was somebody simply trying to exploit money out of these kids and hurt them. Walker didn't understand that. He didn't have that point of reference. But these kids do now. That doesn't change the impact it has on the kids. So what I want you to hear me say is that it's so hard to understand how these kids can be affected in such a negative way until you talk to them. I'll recall one particular call I got from a young man who simply took a picture of himself from the chest up.
He was sitting with his mom watching TV, and he was interacting with somebody through Snapchat. He took a snap of himself from the waist up without his shirt on. They immediately sent back a doctored picture where they had superimposed that on a picture of somebody, you know, showing his penis, and threatened to send it to all his friends, saying that he had shared a nude picture. Even though he hadn't, and he was sitting with his mom. What I want you to hear me say is that the stress, I mean, this kid was affected for weeks after this and still is afraid of this information getting out. The reality of how it affects a kid, with the expectation that this information will get to their friends, family, people that they care about, through technology, has changed the whole game and how our kids see life. So where I'm really encouraged is with a law like this. I will speak real quickly to our penalties. Our penalties are five, ten, and fifteen years, rather than where you guys are. I'm not saying it's right or wrong. I'm just saying they're stiffer penalties. But what I want you to hear is that that is valid, because for a kid that is extorted in this way, the damage goes on for years, and many times it can't be undone. So, yeah, I don't know if that answers your question or not.

[Thomas Burditt (Vice Chair)]: Thank you very much. Okay.

[Michelle Childs (Office of Legislative Counsel)]: Just thank you both for coming and sharing your stories. I have two teenage kids, and I can just feel all this. They are up against so much. And I just sent my whole family a text about what we're talking about, because you can never have these conversations too much. And just appreciation for you for taking away some of the shame and secrecy, so that this conversation can be out in the open and our kids have a chance. So just thank you. And we're gonna do the work that we can.

[Brian Montgomery (Parent advocate, Mississippi)]: Yeah. You're welcome.

[Martin LaLonde (Chair)]: Thank you. Questions?

[Angela Arsenault (Member)]: So not a question. But, again, just to say thank you both so much, and to let you know that I really hear you. I don't want to speak for everyone, but speaking for myself, I really hear what you're saying about the need to go further upstream and to hold the platforms accountable. And that is work that we are doing. It happens in other committee rooms, not here in judiciary. So this is the angle that we can take, given our committee's jurisdiction. But we absolutely need to be having these conversations about what we can do to prevent this from happening in the first place. You both know very well that there is a lot we can do to prevent this. And so I want to assure you that we hear that part of your message as well, and that we are engaged in that work. And hopefully, perhaps, some others are inspired to take that work to the federal level as well. But right now, it's really happening in the states, as you know. So thank you both so much. Again, I'm also a parent of two teens, including a 15-year-old boy. And I understand what you're saying, Brian, exactly. It's so important to hear your perspective so that we can really try to place ourselves in that mindset. Because in all of the cases that I've heard about, yours and others, this all happens in a matter of hours. And that tells me that it's sheer terror. It's an absolute, I guess, definition of panic and hopelessness that's being created through this crime.

[Brian Montgomery (Parent advocate, Mississippi)]: If I could speak to that just a little bit, it's a true trigger, if I could put it that way. There's some short circuit in a teenager's brain with this. I mean, I just think about our human existence: this is so new in terms of being able to distribute something so private, so rapidly, literally across the world. I think it's hard, especially for my generation and older, to relate to this, but we are not prepared for this mentally. There is literally a switch that flips in a teenager's brain once this threat is unpacked that we can't understand right now, and we're not prepared for it. So what I would suggest to you guys who do have teenagers: it may be hard to communicate it yourself, but find a video of myself or Mary or somebody talking about it. It's easier that way. You have no idea how many parents have reached out and said, you know, I showed this to my son, and he still may have fallen for the scheme two months later. I can't tell you how many of them knew it was a thing, knew it was a possibility, had seen our story, and still fell for the trick. But you know what? They end up going to somebody. They end up saying, hey, dad, I need your help. Hey, mom, I need your help. That's the difference, and that is a win. That is a huge win for us. If we can have these teenagers at least know enough to realize they're in a trap, that this is a scheme, that this is somebody that's just trying to hurt them, and find somebody to get help, I mean, if we can do that, we can right the ship.

[Mary Roadie (Parent advocate, New York)]: Agree. We have to get them to step away, literally. Put down the device. It feels crazy to them, because they have to solve their problem. But really, the sextortionist will probably just go to the next one, sadly. If the kid stops responding, the likely response of the criminal is not to keep digging there, because they're going for easy, easy, easy. So I think getting them to put it down and walk away, because I agree. I have spoken to children, kids I teach, whose moms have then said, go tell Mary what you did. And the kids, they'll do it twice: she tricked me, she was my friend for two weeks. If the kid keeps engaging, it becomes more and more legitimate, putting them in this really high serotonin, and then this quick activation of their fight-or-flight brain is just a recipe for disaster. It takes them out of any reasonable response or code of conduct that they know, even when they know they have a supportive family. For people who want to talk to your own children, I'm going to send Nate these really great info sheets from an organization called Fair Play for Kids, and they just drill it down for people, even if you just leave them on the counter. And there's an awesome thing at NCMEC, the National Center for Missing and Exploited Children, called No Escape Room. I mean, that is the easiest thing to show to everybody. It's generic, it's an actor, but it's exactly what happens. And I also didn't send yet the one-page memo tip sheet from Senator Grassley's office about three bills that they're working on, just to also look at. I think sometimes it's important that the language line up, you know, and things like that. Okay, thank you. Oh, that's why it's called spontaneous suicide. I just learned the term spontaneous suicide when my kid's been dead like four and a half years. It was so amazing.
It so separated my grief from my guilt. And it's actually a studied thing, from day traders who go from I'm so rich to ruin, and they go home and commit homicide and die by suicide, or they take themselves out on a bridge. There are like six studied men who went into complete panic, this wasn't trading, this was the Golden Gate Bridge, but they lived. And so they did a study with those men afterwards about what made them do it. But the boys don't wanna talk about it. I can send you, or you can look up, the statistics at NCMEC. I mean, they can't keep up. They cannot keep up, and all that kid is saying is: get my picture down, get my picture down, get my picture down. So they do. Pretty much 19 sites comply with NCMEC and take pictures down. That's a whole entire huge learning day. But then they say, hey, Bobby, can we talk tomorrow? Just wanna check in, see how you're doing. No. That 15-year-old is not coming back to that lady, or man, that he was just so vulnerable with, talking about what happened to him. So, to Brian's point, the shame is not erased just because they don't die and we get their picture down. They're still children.

[Martin LaLonde (Chair)]: I appreciate that. So, did I understand, Brian, that you have, like, a public service kind of video? If you do, could you send us a link so we can have it on our website as well?

[Brian Montgomery (Parent advocate, Mississippi)]: Yeah, absolutely. The Mississippi Bureau of Investigation did a professional video for law enforcement training and just for training purposes in general. They do a lot. I mean, it's been all over the country just for law enforcement. So I'll be glad to send that to you guys. I'll send it to Nate, and he can distribute it. Absolutely.

[Martin LaLonde (Chair)]: Appreciate it. And Barbara has a question.

[Barbara Rachelson (Member)]: Thank you very much. This has been super helpful. I'm wondering if you think it would be helpful if we added a component to the bill related to outreach, like making information available through PSAs, through pediatricians, through school nurses, so that we're trying to take advantage of these great materials.

[Brian Montgomery (Parent advocate, Mississippi)]: Yeah. Absolutely. I mean, everything we think about and try to do is about where we can move the needle. So there are two laws in Mississippi. There's Walker's Law, which is the criminal aspect, like some of this bill you guys are trying to pass. And then there's the Walker Montgomery Protecting Children Online Act, which is more of a design law, to prevent feeding kids bad content. It's not a content law, and it's been challenged. It's completely separate from where you guys are at right now; I understand that. But from what we understand, I don't think anybody's been prosecuted in Mississippi under Walker's Law. So you would look at that and say, maybe it's not making an impact, but I think where the impact is, similar to what Mary said, is that it sets a precedent that this is illegal, specifically. This is called out as a specific illegal act, which in turn creates awareness. So, and let me be very clear: our only current defense, without some regulatory action against tech, is awareness. That's our only defense, and so anything that you can include in this bill to bring about more awareness is where you're gonna move the needle. I want to be crystal clear about that.

[Barbara Rachelson (Member)]: I've noticed that the US, sadly, is one of the countries where this is the biggest problem, and there are many countries where it isn't a problem. And I don't know if you're aware of what some of the countries that have addressed this have done.

[Brian Montgomery (Parent advocate, Mississippi)]: I don't know the statistics in other countries. I know that if you look at Europe in general, they're very aggressive with regulatory laws against tech. We're very open. We've got something called the First Amendment, which we have to navigate, and I'm thankful for that on one hand. On the other hand, I think it's being exploited by tech. I mean, I look at our group, what Mary talked about. We've been lobbying in DC for the Kids Online Safety Act; they were working on it before I ever came into the fold, but that's been a major push, and we continue to see the amount of money that tech spends to lobby against this. That tells you something. We can always follow the money. I mean, you guys are in politics, so you understand; there's always a financial component to everything that we do. And so, you know, I don't have a great answer, but I think I know how we resolve it. It's a two-pronged approach. We've gotta be really good with awareness, continue to talk about the dangers where people and parents can do better, but at the same time, we can't have an industry that is completely unregulated without litigation and have it both ways. I mean, if you're gonna be completely unregulated, you need to be open to litigation, or if you're not open to litigation, you've gotta be regulated. You can't have it both ways, and right now, they do, and they've got the financial backing to continue to promote that atmosphere that they get to operate in. And I guess you guys just saw where Australia passed the law making it illegal for kids 16 and under to have social media. You know, I think that's for everybody to decide if that's good or bad, but social media is here to stay.
I mean, that's something that I tell parents. Parents ask me all the time, and I'm sure they ask Mary as well: what are your kids doing? Now, I have four kids; I have three left. One's just in college, two are still here at home. And so I hope I'm doing a better job of not necessarily prohibiting but educating and training on how to use the tools, because this way of communicating is not going away. Right? At all.

[Mary Roadie (Parent advocate, New York)]: I totally second that, and I'd just add: it's a unique landscape of acceptance. I agree it's not going anywhere, but that doesn't mean we always have to say that what they're doing is okay and accept it in this manner. As to why it's not a problem in other countries, I could send you some of the internal Meta documents that have been released, where Mark Zuckerberg says, we feed spinach to kids in China and opium to kids in America. He's on the record saying children in America are worth $270 a day to me. He's on the record saying this, Walker and Riley, parentheses, and all the rest of them, are the cost of doing business. And other countries aren't putting up with that. They aren't letting their children be the cost of doing business. And the truth of it is, they can change it when they're forced to. The night before the Ogoshi brothers were sentenced, they took down 600,000 fraudulent Nigerian accounts. Sick. They know exactly what they're doing. So yes, education's ultra important, but not with a full acceptance that, like, oh, your kid's gonna get extorted on social media, just accept it.

[Martin LaLonde (Chair)]: Right.

[Mary Roadie (Parent advocate, New York)]: Accept? Yes, your kid is going to get extorted on social media, so maybe you don't want a kid to have Instagram until you're ready for them to see hardcore porn. And John DeMay gives the example: I'd rather send my kid to a strip club, because there are bouncers and there are rules and there's age gating, and they would be safer there than they are on Instagram. So it's changing the narrative. I think also, if you're going to the very most basic level, some people don't even understand the difference between technology, the Internet, and social media. And people will say to you, you're anti-technology. No, I'm not. I'm a school librarian. You can bet I love scanning those books, and I don't want to do anything like Sister Mary Assumpta did with that card catalog. I'm so glad maps pop up and tell me where to go. I'm not anti-technology. I'm not anti-Internet at all. Those aren't the same. Your kid can go on the Internet; that doesn't mean they have to have Instagram. That doesn't mean they have to have Snapchat. You wanna let your kid have Snapchat so that you know where they are? Okay. So does everybody else. Everybody. You think, oh, I want to give my kid the world. Social media is giving your kid to the world. And that's the real education: you don't need it. After Columbine, we got into this whole culture around school shootings, that we needed our kids to be able to contact us all the time, and all of that's been broken down as totally unnecessary. I'm sure you guys know Laura Dee, and Laura Dee can fight every single argument about why kids need screens in schools. So it's not that they can't have technology or can't have the Internet; I don't think they need to go back to a World Book Encyclopedia in their hands. But that's not the same as social media.
So that's where I think we have to educate just the everyday American parent, because in New York, the top five addictions right now are gambling, pornography, alcohol, drugs, and social media overuse. Which one of those did kids just get for Christmas? And that's the truth.

[Martin LaLonde (Chair)]: Appreciate it. Thank you very much, Mary and Brian. I really appreciate your testimony. So we were gonna have a break, but I don't wanna keep Undersheriff Larson waiting any longer. So we'll go ahead with your testimony, Undersheriff Larson. I appreciate you being here as well. And if you could identify yourself and proceed.

[Brian Montgomery (Parent advocate, Mississippi)]: Do you want us to stay on the call here?

[Martin LaLonde (Chair)]: Yeah, you're welcome to stay or you can leave. It's up to you. Thank you.

[Thomas Burditt (Vice Chair)]: Chair, my time is very important, but this is way more important. And if you guys need a break, we've been at this an hour and a half. I'm totally fine with that. And I'd love for Brian and Mary to stay on to hear what I have to say.

[Martin LaLonde (Chair)]: Alright. No. That's fine. We will take a quick five-minute break, because I don't wanna keep the undersheriff from getting out there enforcing the law on

[Thomas Burditt (Vice Chair)]: the