My first time at Web Summit was in 2016. Back then, even though we’d been running HITRECORD for years, we never really considered ourselves a tech company. I met some incredibly smart and accomplished people who were generous enough to suppress what must have been a frequent urge to laugh at just how little I knew about the tech industry. Now, nine years later, taking the same stage was a trip. So much has changed—in my life, in the wider world, and certainly in the intersecting spheres of media and technology.
I was honored to be interviewed by Jennifer Cunningham, Editor-in-Chief of Newsweek. It was particularly illuminating to hear her take on what today’s digital technology means for journalists, and how much her concerns overlap with what so many of us artists are feeling. Spoiler alert: neither one of us is against generative AI, but she did seem to share my conviction that today’s AI companies need guardrails to rein in their purely-profit-driven amoral compasses.
Thanks to Jennifer, Web Summit, and to all of you who joined us at the MEO Arena in beautiful Lisbon. Here’s a transcript…
🔴
[TRANSCRIPT]:
JENNIFER: Hey, everybody. Good afternoon, everyone. Are you enjoying Web Summit, Lisbon? It’s wonderful to be here with all of you.
JOE: Hold on, should we just do that again? This is a huge, like, are you enjoying Web Summit, Lisbon? Come on, let’s hear it. Yeah.
JENNIFER: Yeah. You never do talks like this in an arena. It’s pretty funny. All right. It’s wonderful to be here with all of you. And I’m super excited for this chat with you, Joseph Gordon-Levitt, actor, filmmaker, entrepreneur, and advocate. And I’m really excited for a conversation that is multidimensional and goes beyond the surface level topics that can come up in AI and creativity. So without further ado, let’s jump right into it. Let’s do it. You’ve argued that our digital selves should truly belong to us. What does that mean for the working artist or content creator who is navigating the social media platforms and AI tools in the modern era?
JOE: You know, I’ve heard it said that being a creator online today is sort of like building your dream house on rented land because you know you join one of these big platforms and you post your stuff and you start building your audience, but none of that belongs to you. It could be taken away from you, the algorithm could change, or the platform could decide they don’t like what you’re saying. And you really don’t have control over your digital self. And right now, that feels natural.
But it doesn’t have to be. I would make a comparison in history that a long time ago, hundreds of years ago, kings owned all the land, right?
Ordinary people didn’t own land, they were serfs. And at the time, that also felt natural. And if a serf had said, “hey, you know, this little plot of land where I built my house and I farmed my crops and I raised my family, I think this plot of land should belong to me, not the king.” That would have been thought of as sort of crazy. And then, of course, that changed and for the better. Not just better because it’s humane and kind to let people own their own land, it’s actually a much better way to run a society and an economy. It leads to all kinds of prosperity and advances when you incentivize people with ownership over their own selves and their own lives, right?
So I think the digital world needs to go through a similar revolution, if you will, from the sort of feudal internet that we’re living in today, where just a few big businesses control the whole internet and all the people are just serfs on these platforms, to a different kind of infrastructure where, yeah, you own your own self. Your ideas and your data, your content, your voice, those things belong to you, and if a tech company is going to make money off of those things, well, they need to share that money with you.
JENNIFER: It’s very interesting. And that dovetails very nicely into my next question, because I want to talk about the concept of data provenance. On your Substack, you’ve spotlighted data provenance. How urgent is the need for creative credit and traceability? And what’s broken right now in the way that tech companies handle it?
JOE: Well, so now we come to AI, right? Because here’s the thing about quote-unquote artificial intelligence. Even in the name, it sounds like there’s this independent entity, this other robot or alien or even God that is intelligent on its own and can make things. But that’s actually not how this technology works. Large language models don’t work like that. And I imagine if you’re here, you probably have some understanding of this. The way these models are built is they’re fed with enormous amounts of content and data that was produced by humans. So there’s no actual intelligence inside these models other than the human intelligence, all the different people’s skill and talent and labor and humanity that went into producing all this content and data that’s used to train these models.
I’m not against the technology known as generative AI at all. I actually think it’s really exciting and really powerful and could lead to all kinds of great leaps forward in creativity and more. But the way the economics are set up right now are not fair and are only bringing us further into this sort of digital feudalism I was talking about a minute ago. And if we want to live in a more free and open digital world where people have incentives and can be rewarded for having a good idea and working hard and making something, then we need to set up a system where these AI models are giving attribution and compensation and asking for permission from the people whose data and content they’re using.
There are a number of credible technologists working on this. But if you ask some of the big AI businesses, they say it’s impossible. Of course, it’s not really impossible. It just wouldn’t be as good for their bottom line.
I think this is actually important beyond creators, because if we establish the principle that anybody’s valuable work can be hoovered up into an AI model, and a huge tech company can make money with that idea or that work without paying the person, what kind of economy are we ultimately heading for? We’re headed towards something very dystopian and feudalistic. So I think this is something we all really have to put our heads together and get right, right now.
JENNIFER: It’s really interesting to hear your perspective. Oh, yes, please.
JOE: Thanks. You’re too kind.
JENNIFER: It’s interesting to hear your perspective on that particular topic, because that’s something that my colleagues in the news business and I are grappling with as well. It’s a huge issue, and it’s not necessarily black and white, either.
JOE: Tell me about the meetings at Newsweek. I don’t know if you all know. This is like an extremely accomplished journalist here. I’m honored to be talking with her as the editor-in-chief at Newsweek. So I’m so curious to hear about the conversations that you all are having about this. Are you seeing the impact of, oh, when people want to learn something about the news,
they’re getting it from this chat bot and that chat bot was maybe trained on work that our journalists did, but we’re not seeing any of the economic upside. And what is that doing to the business?
JENNIFER: So what I will say is that myself at Newsweek, as well as my colleagues in the media business, we want our work to be widely read. We want to have huge audiences around our content. And we want to train LLMs because that is the future. AI is disrupting all industries, including journalism. But my position is that we want to be compensated for that.
I am in an industry that is facing tremendous headwinds. And if there was an opportunity for folks in the news business to come to the table with a lot of these tech companies, we could all walk away with a compromise where we are feeding the LLMs and our work is being compensated for. And so that’s a place I hope that we can get to in the near future.
JOE: I’m really with that. I want to get even a little more specific, because I’ve read about some news organizations that are doing deals with some of these companies. And the problem I see, from what I understand of these deals, and I’m not in the room where it happens, is that they’re often one-and-done buyouts. They forgive the years of past theft.
Whereas what we should be negotiating for, I think, as creators, whether in the world of entertainment or journalism, is an ongoing compensation system, one that understands and attributes the importance of the training data: okay, this training data was particularly important to this output that was generated by this model, and that output generated this amount of ad revenue, so let’s share that ad revenue with the creator, whether a journalist or an entertainer. I’m worried about some of these short-sighted deals that are being struck nowadays. We all, I think, need to get on the same page as creators and say, let’s work out a sustainable system.
That’s probably where, frankly, the government needs to get involved.
JENNIFER: Tech companies, you listening? Yeah, yeah. It’s an ongoing discussion, you know, and I do hope and believe that we will get to a place in the short term where everybody is happy at the table.
JOE: We have to figure it out. It’s about even more than art or the news. This is… Any kind of issue that you care about is going to be impacted by this, whether you care about the environment or you care about the economy or you care about, I don’t know, crime or immigration or criminal justice or whatever it is that you care about, whether it’s a left-coded issue or a right-coded issue, whatever it is, it’s going to be heavily impacted by this.
So this is part of why I’m going to… stages like this and talking about this stuff. I’m not selling anything. I don’t have a company that’s going to solve this problem. I just think that this is something we all need to be thinking about and talking to our lawmakers about and voting about and understand that this is a crucial moment and something we really need to be focused on.
JENNIFER: Let’s stay in this space for a moment because there is a lot of anxiety about AI stealing art for training data. How do you define the line between inspiration, appropriation, and frankly, exploitation in this new digital marketplace?
JOE: Yeah, it’s a really good question. So I’m a big fan of remix culture, and you know, I ran an online artistic community for years that was all about remixing each other’s work. It was called HITRECORD.
Yeah, in fact, the first time I was on this stage here at Web Summit, I was talking about HITRECORD. And the whole premise was, okay, all of us in the community, we’re gonna build off what the other one does. But at the end, when a project was finished, if that project made money, we always made sure that credit was given to all the different contributors, and we made sure that that money was split up in a way that made sense for what happened in the creative process.
So to me, in a lot of ways, that’s the difference. I’m not opposed to anybody using my stuff and remixing my stuff. That’s how the world works. That’s how the creative process works. But if you’re gonna make money off of my stuff, then yeah, you should share the money. And it’s not just for me.
One thing you hear a lot, and I think it’s really true, is how much these tools can help micro-budget productions, independent artists, or even people who don’t have any budget at all and are just making something in their bedroom. I’m all for two 15-year-olds in, wherever, Mumbai, say, making a movie in their bedroom and using AI to make it look as good as The Avengers. Sounds great to me. But when that happens, all of the different creative people whose work went into those AI tools need to be compensated. So if that movie that those two kids made blows up and makes money, okay, well let’s split it up. And the reason why it’s important, even for those two kids, is because what if they do succeed?
If they do succeed, and the AI company is allowed to just take their thing, run it through their model, spit it out, and not pay them, well, now they’re the ones being screwed.
So this really isn’t about preserving… I hear people say sometimes, you’re just trying to preserve the Hollywood status quo. I’m very grateful for the career that I have in Hollywood, but it’s really not about that. It’s about establishing a principle moving forward that… creativity needs to be compensated if it makes money.
JENNIFER: And speaking of Hollywood, what would you say is the biggest myth or misunderstanding that you hear from Hollywood or Silicon Valley about AI’s impact specifically on creativity?
JOE: Well, I’ll tell you what that makes me think of. I sat in a room… with, I’m not going to say who, but a bunch of really renowned and established Hollywood entertainers, filmmakers, artists, executives, et cetera, and some really established and renowned folks from Silicon Valley, technologists, entrepreneurs, et cetera. And the argument was made by one of these Silicon Valley heavy hitters:
We have to take your stuff and we don’t have time to work out some way to compensate you because if we don’t, we’re gonna lose to China. And this is bigger than art and creativity. This is a national security issue.
This is something you’ll hear a lot coming from Silicon Valley right now. And I do think it’s fair to call that a bit of a myth. It’s not to say that there’s not a kernel of truth in it. I think there is. It is important for free societies and democratic nations to compete with what I would say is an autocratic regime in China, for sure. I don’t believe that setting up this kind of dystopian feudalism in the digital world is the best way to compete with China.
A lot of this myth is based on a comparison that’s made sometimes explicitly, sometimes implicitly with the nuclear arms race, with the Manhattan Project in particular. Oppenheimer is one of the best movies I’ve seen in years. If you saw that movie, you saw what it was. There was a great world conflict between the fascists and the democracies. The scientists went and had this race of like, who could build the nuclear bomb first? And whoever built it first would win the war. And that came to pass. The U.S. won. We built the bomb first, we dropped it on Hiroshima, it was a tragedy, many people died, but it did stop fascism and the West won the war, right?
So there’s an analogy that gets made nowadays that we are in that same moment and China’s trying to build this thing they call AGI and we’re trying to build this thing called AGI and whoever gets there first will like push the button and all of a sudden the world will change and we’ll win the war. That’s not happening. We can see, look at how the models are advancing. They’re advancing really well, but look at how they’re coming out. It’s gradual. It’s bit by bit.
The world is going to move into an economy and a set of systems where this technology is integrated, and it’s going to do so in a more gradual way. There’s not gonna be a Hiroshima moment. We don’t need to race for that. So frankly, I don’t buy this notion that we must forgo fairness, that we must forgo a healthy free-market economy in favor of this dystopian feudalism, so that we can beat China to the button. I think it’s a myth, and we should stop believing it when Silicon Valley tells that story.
JENNIFER: You’ve worn many hats in your career, actor, filmmaker, entrepreneur, advocate. How has your perspective at each intersection shaped the way you approach technology’s role in protecting and amplifying creative voices?
JOE: So I learned so much running HITRECORD. I mentioned it a minute ago, right? It started as something that very much was not a startup. It was just an online community, something very small that I was running with my brother. But it grew and grew and grew, and eventually we were a VC-backed startup.
We went up to Sand Hill Road and raised venture capital. I learned a lot about what it means to have investors and have a burn rate and be that kind of company. This is why I said a second ago, the government needs to be part of this because we can’t rely on businesses like that to prioritize the public good. They can’t do it.
They’re not set up for it. They have to prioritize their bottom line, their business interests. And if they don’t, a competitor will come in and they’ll do it. I don’t think there’s any real version of this where we shame the private businesses to do the right thing. They can’t do it.
That’s why we have laws. And look, there are laws that govern every major industry. There are laws around your food, laws around your medicine, laws around airplanes, laws around banks, laws around everything. Do you know how many laws were, I’m sure, taken into consideration when this arena was built?
Of course, and that’s good. We want those laws, because we trust them to know that we can sit in this crazy arena and nothing’s going to fall on our heads and kill us. The reason we can have that confidence is because there’s a building code, and that construction was done according to code.
That wasn’t a private business following pure market incentives. There’s a balance that needs to be struck. And so I feel like, to answer your question, my time sort of running one of these businesses made me all the more sure that this is a role that the government has to play. And I’m sitting on a stage in Europe right now, so I’ll just be blunt about this. In the United States right now, as I’m sure many of you know, the dominant attitude in the government is we are not going to make any laws about AI.
In fact, they’re trying to prevent the individual states in the United States from making laws about AI. And to me, this is just completely counterproductive. And again, I said this at the beginning, I’ll say it again: I’m not against this technology. I think this technology has so much potential to do so much good. But we’re kidding ourselves to think that it will do all that good motivated purely by profits. It has to be a balance: yes, profit incentives, and yes, private businesses, and yes, entrepreneurship, and also guardrails from the public through our democratic governments (and I don’t say democratic as in the Democratic Party, I mean democracy), guardrails that our governments put in place to protect the public interest.
We can accelerate and go fast and achieve great things and build beautiful things and also steer and be responsible about it.
JENNIFER: Joe, as we wrap… You talk about optimism and hope for creative futures. Where do you see the greatest opportunities for artists in this new era?
JOE: Well, I mentioned these two hypothetical kids who get to make an amazing movie. That is incredibly inspiring to me. I remember using… my family’s old video camera to make little videos when I was, I don’t know, nine, 10. And by the way, I’m a dad. I have a 10, eight, and three-year-old. And they’re making stuff now. It’s the most inspiring thing. If you ever want to get really inspired about the future, have some kids. I highly recommend it. But when you see them making things, it’s enough to get me emotional.
I want there to be a future where their creativity is valued and is cherished because what else is valuable in life on earth other than your kids’ creativity, right? So let’s not build the machine that turns us all into just numbers and products and, you know, to make the numbers go up as much as possible.
Let’s build these machines to further value and further enhance that human creativity. We can do it. And if you want to know what makes me optimistic, it’s that there’s this huge room of people here who are all talking about it. So thank you for being here. I’m really, really glad to have had this conversation with you all.