Episode Transcript

CNN One Thing

MAR 29, 2026
Is This Social Media's Big Tobacco Moment?
Speakers
David Rind, Clare Duffy, Caroline Koziol, Boris Sanchez, Jane Conroy
David Rind
00:00:00
Welcome back to One Thing, I'm David Rind, and we call them tech giants, but some parents and advocates say giants can be slayed.
Clare Duffy
00:00:08
I also think this is a turning point in the sense that it proves that these companies are not invincible.
David Rind
00:00:14
Stick around.
00:00:17
Can you tell me when you first signed up for social media as a kid, like how old were you? What do you remember about it?
Caroline Koziol
00:00:23
Yeah, I think the first social media that I downloaded, I was probably 10 or 11 years old, definitely in either.
David Rind
00:00:31
10 or 11.
Caroline Koziol
00:00:32
Yeah.
David Rind
00:00:37
Caroline Koziol is a 21-year-old senior at the University of Hartford, and like she said, she had a very, very early start to social media, much to her parents' chagrin. She says all her friends were on Instagram, and she had a serious case of FOMO, so even though she wasn't allowed to have an account, she made one anyway.
Caroline Koziol
00:00:55
So I was able to make an account on my tablet and bypass the age restriction by just putting in a pretend birthday and then started using it.
David Rind
00:01:05
Oh so there was a restriction but you just put in a fake birthday and on you went.
Caroline Koziol
00:01:09
Right, yeah, there was no, like, true check. It was just put it in and that was that.
David Rind
00:01:17
Fast forward to high school and Caroline is on the swim team when COVID hits. Suddenly she was spending a lot of time at home but still wanted to keep in shape. So she starts searching for workouts and healthy recipes on Instagram and TikTok. But she says what the algorithm served her was not plant-forward recipes or simple at-home workouts. It was way beyond that.
Caroline Koziol
00:01:38
Nothing that I saw was ever a level of moderation. It was always like extremes. It was eat 500 calories a day or exercise for X amount of hours a day. And so I started to believe that I needed to shrink my body as much as possible.
David Rind
00:01:57
Caroline insists she was not going on these apps to seek out weight loss or body shaming content. She says it was coming through her feed, whether she wanted to see it or not. Eventually she developed anorexia. She says she lost 30 pounds in one year.
Caroline Koziol
00:02:10
There were some times during swim practice where I would have to be pulled out of the pool because I was on the verge of passing out. I wasn't even able to walk up and down a flight of stairs without seeing spots and getting very lightheaded.
David Rind
00:02:26
Caroline says the eating disorder prevented her from swimming during her senior year. It completely derailed her plans to swim in college. Eventually she entered a treatment program and discovered to her surprise, the women in that program were seeing the exact same content on their social media.
Caroline Koziol
00:02:42
The idea that the feed that I was receiving had caused my eating disorder or led to these disordered thoughts had not really occurred to me. And I think a big part of it was because I had so much tunnel vision towards what my feed was that I didn't really even consider the fact that all of the disordered content that I was being fed was abnormal.
David Rind
00:03:04
Caroline sued TikTok and Instagram, alleging their product design was directly responsible for her eating disorder. We asked Meta, the owner of Instagram, about Caroline's case. They told us in a statement that they strongly disagree with the allegations and pointed to changes they've made to protect young people, including teen accounts and parental controls. They also said content that encourages suicide, self-injury, and eating disorders goes against their rules and is removed when it's found. TikTok did not comment directly on Caroline's case, but emphasized they too do not allow content that promotes disordered eating and pointed to tips and resources that pop up when someone searches for terms like anorexia. Caroline's case is just one of many. She is one of more than 1,800 plaintiffs from across the country who have active lawsuits against social media giants. And on Wednesday, a jury in one of those cases delivered a landmark verdict that could set a precedent for all of them.
Boris Sanchez
00:04:00
We're watching a watershed moment in a California courtroom. A jury finding Meta and YouTube liable after a now 20-year-old woman accused them of addicting her to their apps, causing her years of mental health issues, including body dysmorphia, anxiety, and suicidal thoughts.
David Rind
00:04:16
We're going to dive deep into the ramifications of this trial in a bit, but in short, 10 out of 12 jurors agreed that Meta and YouTube were liable for the harm done to this young woman named Kaylee. The jury ordered the companies to pay a combined $6 million in compensatory and punitive damages. Meta said in a statement, quote, we will continue to defend ourselves vigorously as every case is different and we remain confident in our record of protecting teens online. Google, which owns YouTube, says the verdict misrepresents YouTube, which is a responsibly built streaming platform, not a social media site. Both companies say they will appeal. What went through your mind when you heard about this verdict?
Caroline Koziol
00:04:57
My first thought is that I think it's very important that people are finally realizing that it's not the user's fault and it's not the parents' fault. It's that these companies have purposefully designed this to make it addictive and to promote harmful content. And I think that the fact that people are finally starting to realize that is extremely important. And one of the biggest criticisms that I've received about the lawsuit is, well, why didn't you just stop, or you should have just...
David Rind
00:05:27
That was going to be one of my questions, to be honest. I mean, couldn't you just put the phone down? You said you deleted Instagram when you were younger to kind of hide it from your parents.
Caroline Koziol
00:05:35
Right, and what I meant by that is I would delete it and then redownload it so that they couldn't see. But yeah, absolutely, the question, why didn't you just stop, is completely valid. And I would say, well, that's the addictive nature of these apps, of their algorithms. When you're surrounded by all of your classmates and all of your friends who are using this, there is an extreme sense of FOMO. Not only that, but social media is so immersed in all of life now, it's really not easy to just delete it and be done with it.
David Rind
00:06:06
Do you still use social media?
Caroline Koziol
00:06:09
I do still use social media. I have done a lot of work over the years to clean up my feed. I have tons of creators blocked. I have a ton of keywords and hashtags blocked. I have weight loss blocked. But every once in a while, yeah, a video definitely does slip through that's promoting weight loss. And I mean, if you think about society now, we have this epidemic of weight loss being so centered. And for me, that's a huge issue for my mental health, seeing all this messaging about losing weight. So I do see it sometimes and I block it, or I put not interested; I don't engage with it, to do my best to avoid getting future videos about it.
David Rind
00:06:51
It sounds like a lot of work though for an experience that is in theory supposed to be like fun and connect you to other folks, but you have to be vigilant about making sure that it's not harming you. I mean, that sounds like a bummer.
Caroline Koziol
00:07:03
Yeah, it certainly is. I mean, there's definitely like, yeah, you don't think of, oh, I'm gonna go hang out on, scroll on TikTok and Instagram and have to be aware, like hyper aware of what I'm seeing. But I do enjoy using social media when it's not feeding me these things. And I still enjoy staying connected with my family and with my friends and seeing the funny videos that I downloaded the app in the first place to see. So I do have to put in that extra work to be able to reap those other benefits of social media.
David Rind
00:07:39
Again, Caroline is just one story. And experts say this trial in Los Angeles could have huge ramifications for all of them. But will we actually see any change from these social media companies or from Congress? And are there free speech concerns to think about here? We're gonna dig into all of this with CNN Tech reporter and host of the CNN podcast Terms of Service, Clare Duffy, after a quick break. Stick around.
00:08:12
Okay, Clare. So we've had some time to digest this big verdict. What has the reaction been across the tech world?
Clare Duffy
00:08:18
Well, I think this decision really sends a message about just where sort of the American public stands in terms of thinking about big tech and the impact of big tech. Of course, this jury in Los Angeles is only 12 people, but this is the first time that regular, everyday Americans have been asked to issue a verdict on this issue of whether the big tech platforms have harmed young people. And they sent a pretty clear message here, finding that Meta and YouTube were liable on all counts. I also think this is a turning point in the sense that it proves that these companies are not invincible. They can be held accountable for the impact of their platforms on young people. And so I think that is the real sense today. Of course, the platforms are appealing. They are trying to say, look, we're going to overturn this. We disagree with this decision. But I do think this marks a real turning point.
David Rind
00:09:05
Yeah, we keep hearing that this case could be social media's big tobacco moment. Like, why is that comparison being invoked? Why is this such a reckoning?
Clare Duffy
00:09:14
Well, this was the first of hundreds of similar cases that are set to go to trial. And it really could set a roadmap that those other cases could follow. This is sort of similar to big tobacco in the sense that federal lawmakers have sort of dropped the ball when it comes to passing regulations and guardrails to rein in social media. And so what the parents and advocates who for years have been raising concerns about this issue have decided to do is try to create change through the courts. And this is the start of that process. I also think that we could see, similar to the big tobacco moment, just cultural change that comes out of this. We have gotten so much information from this trial, testimony from Meta whistleblowers, internal documents, that gives us a sense of what these companies have known about the risks of their platforms to young people. And so I do think that we could start to see just sort of cultural change in terms of how people think about and engage with these platforms.
David Rind
00:10:11
About the trials that could be down the road, like, how many are we talking? How many more of these decisions are we going to see?
Clare Duffy
00:10:19
So this Los Angeles case that was decided is part of this larger, consolidated litigation of more than 1,500 similar cases that have been brought by individuals and families. So the next of these bellwether trials, as this trial was known, is set to go to trial later this year. So we've got 1,500 individual and family cases. You've also got hundreds of school districts that have brought cases against the big tech platforms. You've also got many, many state attorneys general who have brought their own state cases against these companies. So we are looking in total at thousands of lawsuits against the big tech platforms. And of course, this could also inspire other plaintiffs to come forward who haven't yet filed cases.
Clare Duffy
00:11:02
Hi, Jane.
Jane Conroy
00:11:03
Hi, Clare, how are you?
Clare Duffy
00:11:04
I'm well, how're you?
Jane Conroy
00:11:07
I'm really good today.
Clare Duffy
00:11:09
And I actually spoke with Jane Conroy. She is a member of the trial team who brought this LA case. They're involved in many of these other cases that are forthcoming.
Clare Duffy
00:11:18
How does this sort of shape your strategy for the next cases that are going to go to trial?
Jane Conroy
00:11:24
Well, you know, it really helps our strategy, because as we enter a case like this, we're really looking at millions of documents.
Clare Duffy
00:11:34
And I asked her about what this verdict is going to mean for those future cases. She said a big part of this is it's going to help them build their legal strategy to try to convince those juries in the future cases.
Jane Conroy
00:11:46
Documents that are now public as a result of this trial are so stunning and will be so stunning to parents in the United States that I feel we could use any one of those documents and it would really tell volumes about what these companies have been doing to kids and teenagers.
David Rind
00:12:07
And remind me, Clare, there was, like, evidence presented during this trial that kind of pointed to what these companies knew about how these products were designed, right?
Clare Duffy
00:12:16
Yeah, I mean, there's millions of pages of documents that have come out in this litigation. We also heard during trial from Meta whistleblowers. One of those Meta whistleblowers, Arturo Béjar, worked on safety at the company before he left. And he talked about warning Mark Zuckerberg that his 14-year-old daughter, Arturo's 14-year-old daughter, had been propositioned on Instagram, and telling Mark Zuckerberg that this was a problem, that children were being approached by predators. And his argument is basically that Meta decided not to act on that issue. There was another really striking moment that came up both during Mark Zuckerberg's testimony and Instagram head Adam Mosseri's, where the plaintiffs' attorney asked about this internal study that Meta did, where it asked 18 independent experts to weigh in on the potential risks of allowing teens to access beauty filters, which manipulate their face. And all of those experts said, this is dangerous for teens. And Meta decided to go ahead and allow teens to access beauty filters anyway; it says it doesn't promote them on the app. But Mark Zuckerberg said that he felt like this was sort of a free speech issue, that he shouldn't restrict people from accessing these filters, despite the fact that the 18 experts that Meta went to for advice about this said that they could be harmful. So those are the kinds of things that we heard come out during this trial.
David Rind
00:13:36
When it comes to the product design or the policies that are kind of at issue here, what exactly could these companies change about them that would like keep them clear of lawsuits in the future?
Clare Duffy
00:13:48
Well, I think there's a few things. I mean, sort of to step back from that even just a little bit more, the companies came to this trial and said, look, we've spent a lot of money. We've invested heavily in trying to build out safety features and measures. And that is true. Things have gotten better over the last five years. The problem that many parents and advocates still have is that a lot of those features, things like parental oversight tools and take-a-break reminders, still put a lot of onus on parents and teens themselves to navigate their online experience. And what parents and advocates would like to see is the tech platforms taking more of that responsibility. So some of the things that people bring up are these algorithmically driven, endlessly scrolling feeds, where there's just always something new to look at. Maybe we go back to timeline-driven feeds. Maybe we just make it so you can't scroll forever. A couple of other things that advocates have talked about, and that came up in this trial, are the beauty filters on all of these platforms that manipulate the way that you look. Maybe those shouldn't be available to young people. And also auto-play, the fact that you open an app like Instagram or TikTok and videos just start playing. They just suck you in right away. That is another issue that folks have raised. And then beyond that, I also know that there are many parents and advocates who are hoping that U.S. lawmakers will pass legislation similar to what we've seen in Australia, where they are requiring these platforms to verify the ages of users and block any users under the age of 16.
David Rind
00:15:20
Right, but do we actually think any of that would happen? Because it's like the companies are making gobs of money off this model, right? And we saw the damages here in this particular case were really like a drop in the ocean compared to the revenue they're pulling in.
Clare Duffy
00:15:34
Yeah, it's a really good question. I do think that if this were the only case, this $6 million damage award wouldn't necessarily convince these companies to change their platforms. But if these companies continue to lose these cases, we could see them on the hook for potentially billions of dollars in damages. And in particular, with the state cases, the remedy could potentially be courts ordering changes to these platforms. I also think that, again, if these companies continue to lose these cases, this is a tension for the companies. Their business model is to try to get people to spend as much time on the platforms as possible, view as many ads as possible. But if they continue to get this message from jurors, from their customers who are saying, we're not happy with the way your platforms are operating, maybe they decide to try to serve that need and find a middle ground.
David Rind
00:16:26
Does any of this impact Section 230? This is that law that basically shields social media companies from liability because they're viewed, through the eyes of the law, as platforms that just host the users and whatever they may post, as opposed to actual publishers. Does this verdict change that in any way?
Clare Duffy
00:16:43
Yeah, I'm so glad that you asked this question, David. This verdict, or this case, I should say, essentially tested a new legal theory, which was that, as you said, these companies have for a long time avoided legal responsibility because of Section 230, which prevents them from being held accountable for what users post. But what the lawyers in Los Angeles did is they said, no, you should be held accountable for the design choices, the way that you operate your platform. That doesn't necessarily have anything to do with the content. And the jury has affirmed that. They found that Meta and YouTube were negligent in the operation of their platforms, that they knew that there were risks to children, that they failed to warn parents and users of those risks, and that that played a substantial role in the mental health harms that this young woman experienced. So I do think that this is a huge moment in terms of potentially opening up the floodgates for other cases using this new legal theory that gets around Section 230.
David Rind
00:17:38
So, I mean, they're kind of aiming the concerns right at the product design. But are there free speech concerns here as well? Because, you know, you hear certain groups, like the civil libertarian Foundation for Individual Rights and Expression, or FIRE, and they say that design choices are speech, you know? That the form is an extension of content. I mean, how much weight does that argument have, anyway?
Clare Duffy
00:18:03
I think that the challenge is that these companies, through Section 230, have sort of avoided the responsibility that other publishers like CNN have. These companies can't be held accountable if somebody defames somebody else on their platform or posts illegal content. They get that protection, and yet they can't sort of get protection on both ends. That's the real argument that a lot of advocates are making here: if they get protection for the content that users post and they're not treated like a publisher in the traditional sense, then they shouldn't also be protected when it comes to their design choices. Sure, you can have content that talks about suicide on your platform, but should you be protected if you send many, many of those videos all at once to a 15-year-old? Maybe not.
David Rind
00:18:54
Well, I mean, you mentioned the sentiment around the world towards some of this stuff, you know, the law in Australia, barring children under 16 from using social media. It just seems like an uphill battle, especially here in the US, looking at Congress and the way they haven't really stepped up to the plate. Like, how much more of a fight is there for these parents who are raising the concerns about all this?
Clare Duffy
00:19:19
I think when it comes to the possibility of new legislation, new federal legislation, there still is a significant uphill battle, which is really interesting because this is one of those issues where there is actually bipartisan support. It is such a rare thing, but people on both sides of the aisle would like to see changes to how these social media platforms operate with regard to kids. These tech platforms have such a loud voice in Washington. And what we've seen over the last few years is, even when lawmakers on different sides of the aisle all agree that something needs to be done, they can't necessarily agree on what that should look like. But again, I think as we start to see more information coming out about what these companies have known, more detail, you could see some lawmakers feeling more motivated to try to get something done here. We are already hearing from folks like Senator Blackburn and Senator Blumenthal, who have pushed for this legislation for years, saying, look, now is the time. A jury has affirmed that these companies knew of the risks. Now is the time for us to act. We'll see if anything gets done. But in the meantime, I should say, there are many state lawmakers who are already passing child safety laws, online safety laws. And I think that is starting to make a difference in some places. But it does make it a bit tricky when you've got different states passing different laws, and a sort of patchwork where kids might be protected in one state and not in another. In general, Silicon Valley says that it would prefer to have one piece of federal legislation rather than a bunch of different state laws that it has to navigate.
David Rind
00:20:53
Also, I guess for you and me, for the average person who scrolls endlessly on social media throughout the day, and I'm certainly guilty of it, like, what should the takeaway be here in your mind?
Clare Duffy
00:21:03
Oh, it's such a good question. I mean, I feel like part of the problem in sort of giving any advice around this is what this seems to prove is that it's really an uphill battle. You're fighting against this really powerful system, these companies that have invested a lot of money in figuring out how to get people hooked on their platforms. And so there is, I think, sort of a personal responsibility piece here for all of us as individual users, certainly for parents to be paying attention to what their kids are doing online. But you're fighting these companies that thought a lot about how to get people to spend a lot of time there.
David Rind
00:21:34
And that's why the internal documents and conversations that came out during trial were such a wow moment, right? People were like, they knew about this, they were discussing the pros and cons, and still this is what we end up with on the other end.
Clare Duffy
00:21:49
Right, and it's not just me, it's not just because I'm a bad person who can't convince myself to get off of Instagram, you know, do something else. It is because these companies have been very intentional about designing their platforms in such a way that people spend a lot of time there. Now, we should say there are tools to help manage your time. They've got time limits, they've got reminders. All of those things are good to do in the meantime while we wait for potentially some of these larger structural changes to come into place.
David Rind
00:22:18
Well, thank you, Clare, for all of this. And we know you'll be talking about this more on your podcast, Terms of Service. So I recommend everybody go check that out after they're done here. Thanks, Clare.
Clare Duffy
00:22:28
Thank you so much, David.
David Rind
00:22:33
Just one day before the verdict in Los Angeles, Meta was also found liable in a New Mexico case where it was accused of failing to warn users about the dangers of its platforms and to protect children from sexual predators. The jury ordered the company to pay $375 million in damages. A Meta spokesperson says it respectfully disagrees with the verdict and plans to appeal. And we should say, if you or someone you know is struggling, you can call the National Alliance for Eating Disorders hotline at 866-662-1235 or the National Suicide Prevention hotline at 988. That's all for us today, thank you as always for listening, we'll be back with another episode on Wednesday. Talk to you then!