Joseph Gordon-Levitt
It was almost 20 years ago that my brother first set up an off-the-shelf PHP message board on a website we were running called HitRecord. And that message board became the sort of birthplace of the HitRecord community that ended up so dear to me over the years and years that we made art together. It was such a sincere source of human connection, creative inspiration and collaboration.
And, I think it’s a big part of why I feel a real love for the internet. But at the same time, over the last decade or more, we have to acknowledge that the internet has taken a dark turn. I’ve said this before, but you know what? What used to feel like a place centered on connection has become a place more about addiction.
And I think at the center of that are these engagement optimization algorithms that are driving the biggest social media platforms today. An engagement optimization algorithm just means a computer program, a piece of math, basically, that takes data points from billions of people and uses those data points to calculate, statistically, the next piece of content to serve a user that will hook them and keep them, so the platform can serve more ads and make more money.
And these algorithms have all kinds of damaging side effects. Harms to individuals, I think especially young people, whether we’re talking about mental health or the more subtle damage to people’s ability to think critically, to pay attention to something for more than 30 seconds, or to relate to other human beings in real life. And then maybe even more concerning than the harms to individuals are the harms to our society.
And when I talk about society, of course there are benefits to this technology too. People use social media to organize and to advocate, and all of that is really good, of course. But I would say that even more often the internet is being used to spread confusion, extremism, polarization, anger, hatred, fear. I would say that authoritarians and dictators are benefiting more from the internet today than good, reasonable, kindhearted people and citizens of democracies.
And then, of course, here comes AI. And the biggest AI businesses are using the same engagement optimization algorithms and driving the same advertising business models as the social media companies have for the last decade or more. And this, to me, is all really worrying and even painful because perhaps of my real love for the internet and what it could be.
And so lately I’ve been dedicating a lot of my time and energy to trying to do my small part to help. I’m currently in pre-production on a movie about AI that I co-wrote and am directing for Netflix, and I’ve also been doing a bunch of advocacy work for the past year: writing op-eds and blog posts and making little videos.
I’ve also been supporting various pieces of legislation, because I think those kinds of guardrails and laws are necessary to try to make this go better. And then recently, I also supported a bill called the Sunset Section 230 Act. Section 230, for those who don’t know, in simple terms, it says that a platform on the internet can’t be held legally liable for something that someone else posts on it.
And in many ways, that’s super important and really good. A platform like HitRecord and so many others would have a lot of trouble existing without the protections afforded by Section 230. At the same time, the biggest big tech companies have really abused the protections of Section 230 and used it to get away with so much harm, to individuals and, I would argue, even to our society. So the Sunset Section 230 Act sets a time limit that says by 2027, Section 230 is going to go away. And it was presented to me as a strategy toward reform: to force Big Tech to come to the table and, ideally, find a way to reform the parts of Section 230 that are being abused while preserving the parts of it that are good and important.
When I supported this Sunset Section 230 Act, a lot of concerns were raised that I think were valid: that this strategy of threatening to take away all of Section 230 in order to achieve reform is actually not the best way to achieve that reform, to achieve the larger goals I want to support. So I’ve decided not to support this particular bill moving forward, because I’ve been learning about what I think are better strategies to achieve these larger goals of holding Big Tech accountable for all the harm being done to individuals as well as to our society.
So I wanted to have a conversation about all of this with some people who probably know a lot more about it than I do. We have Cody Venzke from the ACLU. Cody is a senior policy counsel in the ACLU’s national political advocacy department, working on issues in surveillance, privacy and technology.
And I’m gonna take just a second to give some love to the ACLU, if you haven’t heard of it: the American Civil Liberties Union. It’s sort of the premier legal nonprofit in America that fights in court for Americans’ civil rights. I’ve been a supporter of the ACLU for many, many years. We’ve done a bunch of different collaborative projects between HitRecord and the ACLU, and I’m really, really happy to have Cody here.
So thanks for being here, man. And I’ve also invited Olivia Carville. Olivia is an investigative reporter for Bloomberg News focused on the intersection of child safety and the digital world. Thank you both for coming and talking to me about this.
Cody Venzke
Great to be here, Joe.
Olivia Carville
Yeah, great to be here.
Joseph Gordon-Levitt
I thought I would just start with a little getting-to-know-you question for both of you. I was just talking about my own personal connection to the internet; I’d love to hear, from each of your perspectives, about your personal connection to the internet and how it connects to the work you’re doing today.
Joseph Gordon-Levitt
Cody, do you want to start?
Cody Venzke
Yeah, happy to. And thanks again for having us here, Joe, for this important conversation. And thank you, as always, for your support and shout-out to the ACLU and the work we’re doing. I think for me, the internet is about community and connection. And I think the way that you describe the early days of HitRecord is spot on for the sort of internet many of us seek to cultivate, and in some corners still can cultivate, whether that’s sharing your love for indie gaming with others, or for cats, or crafting or quilting or whatever.
Cody Venzke
It’s a way to find that connection. And then in addition to that, it’s so key for advocacy. As a lawyer at the ACLU, we use the internet to help motivate people, to bring people out, to hear their voices and concerns, and to try to spur change in defense of civil rights and civil liberties. So community and connection, I think, are really the core of the internet for me, personally and in my work.
Joseph Gordon-Levitt
Thanks, man. All right, Olivia, tell us about your personal connection to the internet, how you got to where you are today.
Olivia Carville
Well, thanks so much for having me on as well, Joe. I feel like that is such a broad question, and it’s a really difficult one to answer. But, as an investigative journalist, you know, I use the internet every single day. It is crucial to the work that I do. And I often think about reporters in an era before mine, you know, how did you actually do your job without the internet, without that connection and the ability to just look something up immediately, the ability to find documents, to find a history or a timeline of things that have occurred, to find people like Cody to interview for stories, for example.
I also see the internet as that kind of connection portal to friends and family that you’ve both referenced. I live in New York City, but I’m originally from New Zealand, and I got my 90 year old Nana, who lives in Tauranga, to download Facebook Messenger so I can call her on video and talk to her and remain connected to her.
That’s the kind of priceless connection and sentiment that the internet gives us every single day. And there are so many good things that we get from the internet and from social media platforms. But unfortunately, in recent years, I agree with you, Joe, I think we have seen a dark side to some of these platforms that we’re using on a daily basis. These smartphones are now in the pockets of every child out there, and we need guardrails in place.
And that’s the work I’ve been focusing on in the past few years: understanding where we’ve been, where we’re at today, and what kind of impact social media and the internet are having on kids. Because, you know, up until now, we really haven’t been able to study or fully understand that.
Joseph Gordon-Levitt
Great. Thank you. All right, I want to start with a first question for Cody, but it’s a big question, and Olivia, certainly feel free to jump in. The ACLU is an organization dedicated to protecting people’s civil rights, and especially our First Amendment right to free speech. But today it does feel like, as we were just discussing, the internet and these big social media platforms are certainly places where people are saying things, but really what’s happening is people seeing things through the lens of these engagement optimization algorithms, which I think arguably are causing a lot of harm.
So how do you, as someone at the ACLU dedicated to protecting free speech, think about this? The question is: are these algorithms a form of speech? Should these big tech companies be allowed, under the First Amendment, to run whatever algorithm they want on their platform? How do you think about balancing the harms that seem to be coming from these algorithms with our First Amendment right to free speech?
Cody Venzke
That’s a great question, Joe. And I think you’ve drilled down on one of the key issues. You mentioned Section 230 earlier, and it meshes with the internet and with the First Amendment in some really key ways. If you think about the history of speech, for a long time your ability to speak, to address others, was limited by intermediaries.
An editor at the newspaper who agreed to print your article or your letter to the editor or a book publisher who would sign a book deal with you and make sure that your book made it to book stands. The internet, in its early days, showed a whole new promise: a multitude of voices speaking on a whole variety of issues.
But we saw courts struggle early on with how to apply traditional First Amendment rules to this new, blossoming era. And in some ways those early courts got it wrong: they took cases where companies were moderating content on old-school message boards, which it sounds like you’re familiar with from the early days of HitRecord.
And they said that if you were moderating, you must have been aware of the sort of defamatory statements at issue in those cases. Fun fact: one of those cases involved the investment firm at the center of The Wolf of Wall Street, so in hindsight, maybe not so defamatory. But that rule could not have been right. And so what we saw was Congress passing Section 230 to ensure that platforms had the breathing room to moderate content, so that people could speak and platforms would not feel a chill when trying to take down defamation, true threats, or other speech that’s unprotected under the First Amendment.
Now, this is where the First Amendment comes in. Much of the speech we see out there is protected by the First Amendment, and for good reason: we don’t want elected officials or others in power deciding what can or can’t be said, unless it falls within certain historic exceptions. And so much of the speech at debate in these issues today is First Amendment protected speech.
And so whether or not you can say these things, and whether or not platforms can carry them, depends not on Section 230 but on the Constitution.
Joseph Gordon-Levitt
But isn’t it true, because what I’ve heard folks say is: yes, technically speech could be protected by the First Amendment, but in practice, without Section 230, the liability could still exist. Even if you would ultimately win a court case on First Amendment grounds, trying that case would be expensive, prohibitively so for some smaller platforms.
Joseph Gordon-Levitt
And therefore it would have the practical effect of chilling free speech, because it can just be so expensive to go to court, even if you’re right.
Cody Venzke
That’s exactly right. And that was the promise of Section 230: helping protect platforms from having to go to court and jump through all the hoops litigation involves to show, hey, this speech was protected and we had a constitutional right to carry it. It’s a way to make sure those cases reach the constitutional conclusion a bit more quickly.
Joseph Gordon-Levitt
Right. But then on the other side, and Olivia, maybe I’ll just ask you this: in the reporting you’ve done, and in my own conversations, I’ve spoken to a number of people who are really worried about the harms happening on these social media platforms. I’ve spoken to parents whose kids have suffered tragedies.
They’re really moving stories. And they often talk about Section 230. I’ve spoken to parents who say things like: I’m trying to hold Meta accountable for what happened to my child, and I can’t hold them liable because Section 230 protects them. Have you heard stories like that, Olivia?
Olivia Carville
Yeah, I’ve heard exactly the same thing. These families have been fighting for going on half a decade now, fighting for accountability and fighting for justice, because they believe their children were targeted by these platforms. It’s engineered addiction: they designed these platforms to try to keep kids glued to the screen for as long as possible.
And these parents have been fighting to get access to the justice system, and Section 230 was the legal shield that prevented them from gaining access to the courtroom. And I think it’s interesting what you’re talking about with the origin of Section 230, why it came to be. Cody mentioned the moderator’s dilemma, which is: how do you moderate a platform if you’re going to be held liable for every single thing third-party users say on it?
And I think it’s fascinating, when we look back at that era, to realize that Section 230 was really written to try to protect kids. It was written to ensure that the platforms moderated content, to prevent violent, vile, disgusting content from being allowed on their networks and to keep the internet a safer, more sanitized place for kids. But unfortunately, we’ve now seen it turned into what some have described as a get-out-of-jail-free card, where the platforms have relied on this law to grow into the biggest corporations the world has ever seen, making billions, if not trillions, of dollars because of the legal protection Section 230 has afforded them. And I think what the families are fighting for is the chance for accountability, not over third-party content. The lawsuits we’ve recently seen go to trial aren’t about third-party content. This isn’t about what one user posts on Facebook or what one person uploads to Instagram. This is about the way the companies designed their algorithms to addict children.
This is about engineered addiction, the fact that these companies knew what they were doing, that they were trying to hook kids. They didn’t warn the children, they didn’t warn the parents, they didn’t warn teachers, they didn’t warn society. And that’s what we’re seeing in the court system right now, and that’s why I think this moment in time is so important.
Joseph Gordon-Levitt
There’s actually a big, kind of historic trial happening right now. Cody, I want to hear what you have to say, but I just want to place us in our moment in history, because there is this incredible trial that, Olivia, I know you’ve been covering. So, Cody, will you hold that thought for a second?
Olivia, can you tell us about this trial that’s happening in the California courts right now?
Olivia Carville
Sure. So right now in Los Angeles Superior Court, we essentially have one young person versus Big Tech: a 20-year-old woman from Northern California going up against Meta and YouTube. She had also sued Snapchat and TikTok, but those two platforms settled before the case went to trial. She alleges that these companies essentially addicted her to their networks from the age of six, that she was addicted to social media and it caused her significant mental health harms, anxiety, depression, suicidal thoughts, and that it took over her life in some respects.
And the reason this case is so important is that it’s a bellwether case. Even though just one plaintiff is represented here, this one 20-year-old woman, once this case is completed we’re going to see thousands more waiting in the wings. More than 4,000 plaintiffs have sued Big Tech, in addition to a thousand school districts and dozens of state attorneys general.
This trial represents the start of a legal reckoning for Big Tech. And the reason it’s being described as a landmark or historic trial is that this is one of the biggest legal fights of our time. This case is being presented as the Big Tobacco litigation of our generation.
Joseph Gordon-Levitt
Cody, did you, did you want to chime in there about all that?
Cody Venzke
I think Olivia really laid out some of the parameters around Section 230, which, as she pointed out, protects platforms from being held liable for what their users say on the platform. So if you tweet something, or, excuse me, post something on X, that a politician might consider defamatory, they can of course come after you for the allegedly defamatory statement, but they can’t bully X into taking it down.
What Section 230 does not protect, and this is so critical for understanding the moment we’re in, is tech platforms’ own conduct, their own speech, or the ways they may contribute to the illegality of speech that’s on their platforms. And that can be legally complicated. But as Olivia observed, this is a bellwether trial, and there are literally hundreds of others waiting in what’s called a multidistrict litigation.
And this will be a pivotal moment. So Section 230 is, yes, a shield to liability in some instances, but it’s not a full shield. And that’s why these cases are advancing.
Olivia Carville
And if I could just jump in there, it’s not hundreds of others, it’s literally thousands of others. You know, 4,000 people are waiting in the wings to have their day in court. And while this one case is ongoing, there are dozens of families flying into LA from across the country and around the world because they want to be there to mark this historic case, because their own children have been impacted by social media, and in many situations their own kids have tragically died in ways they believe are linked to social media harms.
So this is a crucial fight for them as well. This isn’t about just one plaintiff. It’s about society in general.
Joseph Gordon-Levitt
What I’d love to zero in on, and I think it’s a subtle distinction that’s so important here, is what is protected, and what isn’t or maybe shouldn’t be protected, under 230. And Cody, you’re getting at it. I’ll give you an example: Cody, I read a piece you wrote raising concerns about the Kids Online Safety Act, because you were saying that individual attorneys general could use that legislation to crack down on particular content they don’t like, and you used the example of LGBTQ content.
And I couldn’t agree more that the government should not be allowed to pick and choose what messages are okay or not okay. That’s the whole idea of the First Amendment. That’s what a free society is. But these engagement optimization algorithms aren’t any particular idea. And this is where, to me, the change I’d love to see is a distinction made between human speech and this algorithmic amplification, which isn’t any particular message, isn’t any particular ideology, isn’t speech.
It’s just a sort of commercial moneymaking machine that has the damaging side effect of causing all kinds of harms, both to individuals and to societies. So, to me, the question is: where do you, Cody, fall on whether an algorithm should count as speech? And if there were a law regulating these algorithms, would you feel that it somehow violated the First Amendment rights of Meta or X or TikTok or Snapchat or YouTube or any of these platforms?
Cody Venzke
You might have seen me laugh a little bit there, because the answer that popped into my head is, unfortunately, the most lawyerly one: it depends. And the reason I say that is that we are really on the frontier between the technology and the law. Many of the cases Olivia has described are framed as product liability cases, which, back in the day, would be something like: you got a soda bottle, it broke in your hands and cut you.
Who can you sue over that? And courts have long drawn a distinction when we talk about things that are normally in the speech world, like newspapers and books: you can bring a products liability case over the container, but not over the ideas in the words. So if the book had poisonous ink, there’s your products liability case. If it had poisonous ideas, not so much.
And that’s a much harder question when we’re talking about digital services. And so we’re seeing all of these courts wrestle with the ways social media is a product versus speech and ideas online. If you sit down and look at some of the decisions that have come out in these thousands of cases Olivia has described, they are really expertly pulling apart the different functions of social media to analyze them.
Is this a product? Is it speech? Where do we draw the line between the two? It’s a painstaking process, but that’s how the law proceeds. So when you ask where we would weigh in on the algorithm, and what if we regulated it, I say: which algorithm? How are we regulating it? Because it is, as I said, a painstaking process.
Olivia Carville
And if I could jump in here, because the easiest, most simple way I’ve come to understand what you’re talking about, Joe, is this: what is content, what is the algorithm, how do we differentiate the two, how do we actually simplify it down? Think about the Facebook that you first downloaded. It was a way to connect you to friends and family.
I remember looking at Facebook and seeing posts from my high school friends’ weddings, or my auntie’s holiday, or what my old school teacher was doing. It was all posts about the people you loved and the people in your life. You open Instagram now and you’re seeing content from strangers you don’t know, often shoved to the top of your feed because that’s the algorithm kicking in, trying to keep you scrolling, to keep you attached to the platform as long as possible.
So we’ve really seen the business model of social media change over time. Initially it was a way to connect friends and family. But now, with the rise of platforms like TikTok and their infinite-scroll features, we’ve seen all the other social media platforms follow suit, because the rivalry between these companies is so intense, because they all want eyeballs on the screen. They’ve all followed that same pattern, and they’ve really turned into the world’s biggest addiction machines.
So it’s no longer about seeing what your family and friends are doing. It’s about: what content can we deliver to you that will keep you online, keep you on our platform for as long as possible? And understanding that helped me understand the difference between content and algorithmic manipulation, which is what you’re describing, Joe. Because really, these cases and what’s going on in the courts right now are not about the third-party content and what people are posting online.
It’s about the way the platform itself is designed, the specific features that YouTube or Facebook or Snapchat created to keep you connected. One example is the autoplay feature on YouTube: you watch a video, it stops, and immediately another one starts. They designed it that way because they want you to watch the next one. Or you’re scrolling Instagram and it’s infinite; at no point does it stop. They want you to keep scrolling, and they designed it that way. Or the like notifications, where you get a ping saying, hey, someone liked your photo, and you get that dopamine hit. Facebook designed it that way because they want you to keep opening their app. And that’s the difference, I think.
And that’s been the most simple way for me to think about it and write about it, as a journalist.
Joseph Gordon-Levitt
Olivia, as I’ve read more of the debate and discussion around these topics, I’ve noticed some folks raise concerns about the kinds of things you’re saying: the suspicion of the big tech platforms, the words like addiction, the talk about harms and about kids and their well-being.
And there is, I think, a valid concern there, because some folks out there want to talk about the well-being of kids as a sort of veil for true censorship, for trying to suppress certain cultures, certain ideologies, certain political messages. Olivia, you’ve obviously dedicated a lot of yourself and your time and effort to this work. Are you doing it because there are certain political ideologies or messages or things you would like to see suppressed?
Olivia Carville
I mean, the short answer to that is just no.
Joseph Gordon-Levitt
I think I anticipated that answer. But what would you say to folks that might be worried about that sort of ulterior motive in your work?
Olivia Carville
I think you’re drilling down on such an important point here. It’s so much bigger than just this one case or just this one harm; it’s how all of this discussion and conversation is being weaponized for other, ulterior motives. And this is a very polarizing subject, and both sides strongly disagree about the right way to go about reforming Section 230, if we’re even going to get to that point.
I almost feel like, when you look at the harms that kids have been experiencing, the sextortion scams driving teenage boys to suicide, the eating disorder content resulting in teen girls being sent videos about how to eat fewer than 100 calories a day or how to get skinnier legs.
When you look at drug dealers using platforms like Snapchat to connect with teenagers and sell them fentanyl-laced pills, which has resulted in hundreds of kids accidentally dying. When you look at the suicidal content, or the mental health and depression-related content, being pushed to vulnerable kids who are then, as the parents allege, being driven in some respects to suicide.
I feel like those harms are so extreme that they need to be addressed. And that’s what these lawsuits are attempting to do: hold the companies accountable for harming kids in ways we never would have imagined when these platforms were first created. But these are the edge cases, and that’s where this question is so important.
Are you using the edge cases to make an argument that’s going to have broader implications for society as a whole? I think that’s where there’s a real tension in the work these platforms are doing and the way they’re attempting to address these problems, because in recent years we have seen all of these platforms roll out safety measures and new policies to try to protect kids, building safeguards.
But every time they do something, they get criticized for it, because they either don’t go far enough or they go too far. It’s like a swinging pendulum, where one side says, you’re going too far, that’s censorship, and the other side says, you’re not going far enough, you have to do more to protect the kids. And it puts them in an impossible position, because when you look at the size of these companies, we’re talking billions of users online on a daily basis, and the moderators of these platforms are trying to work through this content as fast as they can.
But they’ve described it to me as a tidal wave of content, and they’re standing there with a mop and a bucket. How do you possibly moderate a platform with that amount of content coming through, in a way that’s safe and in a way that both sides would agree on? It is an impossible task. I just don’t think we’re ever going to get to a situation where everyone believes we’re doing the best thing and the right thing.
But I think what we can do, and what a lot of child safety advocates argue, is make these platforms at least put the safety of children above profits, and try to build guardrails that ensure kids are at the top of their mind rather than money. Because, as you pointed out, these are moneymaking machines.
They’ve made billions of dollars advertising to minors, and in some cases minors have been tragically impacted. In some cases, children are dead. And I think more can be done to protect kids without necessarily limiting the kinds of things we’re talking about here or censoring the platforms themselves.
Joseph Gordon-Levitt
To me, again, I know I keep coming back to this, but when I hear you talk about the harms of content about eating disorders or content about fentanyl, et cetera, where my mind keeps going is these algorithms. Because I actually don’t think it should be disallowed to post on the internet about fentanyl or eating disorders or whatever.
You should ideally be able to say whatever you want. To me, the issue is that these platforms, driven by engagement optimization algorithms, are pushing this content. And it’s not because any human being in the Meta offices is saying, ha, I’m going to try to get these kids hooked on fentanyl. All it is, is an algorithm calculating the content most likely to hook the user, keep them there, serve them more ads and make more money.
But what actually happens is the algorithms select this damaging stuff more often than they select nuanced, sensitive, subtle, or any other kind of worthwhile content. They’re more likely to select sensationalism, extreme things, dangerous things, because that keeps your attention, because it hooks you. So to me, if you could regulate those algorithms, would you still see posts online about fentanyl? Yes. And in a free society, people should probably be allowed to talk about that. But that doesn’t mean an algorithm on the biggest platforms in the world should be able to push and amplify it just because it makes the most money. And the question this leads me to is about this bellwether case you mentioned, the one happening in California right now.
And this is a question for both of you, because, Olivia, I know you’ve been following this trial very closely, and Cody, I’m sure you are too, and you have your legal perspective. Is there a version of this case, a potential outcome, where we are able to remove some of the protections Section 230 seems to be providing, the ones allowing these platforms to leverage their engagement optimization algorithms? Is there a version where a good outcome in this case could surgically address that part of the law while leaving intact the good and important parts of 230 that I think we all would like to see remain?
Cody Venzke
Joe, I’ll hop in there.
Olivia Carville
That works, Cody can go first. I’d love to hear what you have to say.
Cody Venzke
You know, my first reaction is, no. And the reason for that is, of course, courts won’t be removing anything about Section 230. They will be interpreting and applying it to these hard questions.
Joseph Gordon-Levitt
Sorry, I might have phrased it wrong. Let me not say “remove anything.” Is there a potential outcome of this case where we see these attention-maximizing algorithms brought to heel a little bit somehow, where platforms can be held liable for the harms that come from their algorithms, while leaving intact the protections of 230 that protect free speech online, which I think we’d all like to see protected?
Cody Venzke
Parts of those algorithms are at issue in this trial, and Facebook might be held liable for them. And like I said, this court has been very, very careful in dissecting what it is these platforms do. And some of the things that Olivia has mentioned, including, for example, Infinite Scroll, have been identified as things that are not protected by Section 230, but in other respects, Section 230 not only protects that third party content, but, as I said, your ability to moderate it.
And Olivia asked a really hard question. If you have, if you’re there with a mop and a bucket and you have a sea of content, you’re expected to clean up, how do you do that? And algorithms are an important way to do that, where they reflect not just engagement maximization, which is undoubtedly part of what these platforms are up to, but also content moderation decisions around politics, hate speech and things like that.
And the Supreme Court has said that is First Amendment protected. So some of these features that Olivia has described that are at issue here, they have not been barred by Section 230. And we might see the court and the jury rule that Facebook can be held liable.
Joseph Gordon-Levitt
Olivia, what do you think would be a good outcome in this case, a bad outcome in this case? And what do you see happening next?
Olivia Carville
Well, I mean, I feel like this bellwether trial is so important to so many people. You know, that’s one plaintiff on the stand, but she represents so many more people. We saw teens and youth outside LA Superior Court just last week holding up signs that said, we are K.G.M. In other words, we are this plaintiff: we’ve been manipulated by Big Tech, or we claim to have been manipulated, or have had these platforms impact our lives. What K.G.M. is describing on the stand is how they feel as well. We’re also seeing parents and families fly in from across the country to attend this trial, because the cases coming up next, and the questions these companies are being asked on the stand, are questions that they themselves have been asking for years now.
So for many of them, the fight has already been won because the fight was to gain access to the court system. It wasn’t to get a verdict arguing that they are liable necessarily. It was to get the chance to go to court, to get the chance to get into that courtroom and get before a jury and in many cases, they feel like that discovery process, that pretrial process where the plaintiffs are allowed to pull internal documents from the company and…
Joseph Gordon-Levitt
So many of those internal documents are crazy. I’ve seen them coming in, and it has been great that the courts have had the authority to demand that those documents be produced.
Olivia Carville
100%. And for the first time, we’ve seen 6 million internal documents obtained through the discovery process in this litigation. 150 company employees have been deposed. I’m sure most people who are listening heard that Mark Zuckerberg took the stand to testify in this case just a few weeks ago. You know, regardless of what the outcome of this trial is, people are listening and people are paying attention now.
And a lot of these parents feel as though they’ve been screaming into the wind like they’ve been saying, you know, my kid was harmed and you’re not listening. And I think for a long time now, a lot of the conversation was around, you know, when things went wrong online and when kids were impacted and when kids maybe tragically died, the onus was put on the parents that this is bad parenting or these are bad kids misusing good products.
And what these parents are saying is, no, that’s a lie. These are good kids who are being manipulated by bad decisions inside these companies. And that’s what they’re pointing to in these internal documents, which show company employees comparing the platforms to digital casinos targeting kids as young as 4 or 6 years old, and in some cases employees comparing social media to Big Tobacco and the way they’ve been trying to addict kids to the platform too…
Joseph Gordon-Levitt
The employees themselves making those comparisons, not activists, not people outside Meta, but employees, in internal communications within the company.
Olivia Carville
That’s right. Internal messages between employees where they’re referencing Big Tobacco and saying that they feel like this is kind of the Big Tobacco moment for social media. And I think having documents like that come out, having these depositions come out, seeing Mark Zuckerberg being forced to testify. These parents stood outside the courtroom holding photos of their children, and there were camera crews from all over the world documenting that moment.
This is giving them a voice. So I think from their perspective, regardless of what happens with this verdict, they feel like they’ve won part of the fight because they’re finally being heard. We have closing arguments in this case starting tomorrow, and we’re likely to see a verdict possibly as soon as Friday, if not into next week. And a lot of people are going to be watching, because what happens in this case is going to impact the way children use social media going forward.
There’s no doubt about that.
Joseph Gordon-Levitt
Cody, what other paths do you feel hopeful about when it comes to trying to make the internet a more positive place and trying to mitigate some of these harms, both to individual people and to society at large? What are you working on that could potentially help move the needle?
Cody Venzke
It is so wonky, Joe, and so boring. But it’s privacy. Yeah. And you know, one of the things I’ve come back to over and over in this conversation is that the algorithm is nebulous, and there’s actually lots of technology underlying these platforms, much of which is at issue in that courtroom in Los Angeles. But all of these features that Olivia has described have one thing in common.
They’re driven by our data, and that’s how platforms seek to identify individuals and keep them on their platforms for extended amounts of time. The reality is, as a policy lawyer, I am in Congress, I am in state legislatures, advocating for robust privacy laws that really put people in control of their data. And over and over, a whole alphabet soup of tech associations show up, line up, and universally oppose that legislation.
And why? Because at the end of the day, our data is their dollars.
Joseph Gordon-Levitt
It’s funny you say it’s so wonky and boring, you’re right. I’ve cared about privacy for a long time. I played Edward Snowden. That’s actually how I got to know Ben Wizner, who’s your boss and, you know, Edward Snowden’s attorney at the ACLU. And it’s funny, when you say privacy, a lot of people, I think, not everybody, but a lot of people instantly feel like, who cares really? I don’t feel like a private person. I don’t care if a tech company knows where I live or knows what I like to watch. It doesn’t actually matter to me. And I understand that instinct. Not everybody cares about privacy per se. But I feel like what underlies privacy is this thing that you’re getting at: the unethical and sort of aggressive and predatory use of people’s personal data that’s having all these damaging side effects on society.
I wonder if, I wonder sometimes about using a different word besides privacy, but it all sounds wonky, you’re right. It’s like how do you get people to really… It’s complicated how this cascades into the harms we’re seeing. But I guess that’s why we’re having these conversations. Right? I know that you both have to go. Thank you so much for making the time to talk to me about this. I really look forward to continuing the conversation. And anybody out there who listened, thanks for listening. And, let us know what you think. This is an ongoing process. We’re going to try to make the digital world a better place.
It’s only going to happen because we’re communicating and having conversations, and that doesn’t necessarily happen in, you know, little 30-second, scatterbrained, attention-seeking videos. It happens in longer form conversations like this and frankly, even much longer dialogues than the hour that we’ve got to spend together. So I really appreciate you taking the time. I appreciate everybody out there who’s listening.
And, anything either of you want to say further?
Olivia Carville
From my perspective, I just think that you hit the nail on the head there. The most important thing is that we’re talking about this, that we’re raising awareness of this fight and that people understand where we’re at. Because for so long now, I feel like no one really fully understood the impact that technology was having on children.
And we recently saw that neuroscientist testify before Congress about how Generation Z is the first generation in modern history that has underperformed their parents on every cognitive measure: memory, numeracy, literacy. And you know what changed with Generation Z? The introduction of the smartphone. The introduction of social media. And I think we need to fully understand the impact that these platforms and technology are having on kids because, as Cody mentioned a short time ago, technology is outpacing regulation.
The laws and the litigation cannot keep up with just how fast technology is moving. And we haven’t even touched on generative AI or chat bots or anything like that in this conversation, because it’s all going so fast and we’re all scrambling to keep up. And I think, unfortunately, parents are left in a really impossible position where they’re trying to protect their children who are begging for smartphones, begging for access to social media from the age of ten years old.
And they don’t really know the best way to go about it. So conversations like this can at least help parents understand what the monsters are and understand what they’re comfortable with their children, you know, being given access to, and when. So I feel like this is just one step in the right direction.
Joseph Gordon-Levitt
Great. Thanks, Olivia. Cody, anything else?
Cody Venzke
I’m thankful that we’re having this conversation. You know, we brought in a number of views about Section 230, about speech online, about kids and their relationship to technology. But in the privacy space, we all have this sort of surveillance fatalism: I’m already being surveilled, they already know everything, why should I even try? We’re here trying. We’re having this conversation, and that is the most important step.
Joseph Gordon-Levitt
All right, Cody, Olivia, thank you so much and all of you. Thanks again.
Olivia Carville
Thanks, Joe.
Cody Venzke
Thanks, Joe.