We can never truly see the world through another person’s eyes, but art is the closest thing we have to that. Art has value because it offers a window into another person’s mind, their ideals, their beliefs. It shares an authentic, vulnerable, and humane story. Media created by AI offers nothing but a view into the average: the average of society, of the data it was trained on. For that reason it will never be exceptional.
True, if you’re talking about media generated by AI without creative human input. That’s what most people see, I guess: the low-effort slop. Which is the same as GIFs at the dawn of the internet.
AI is a tool, the same as 3D graphics was, remember? When 3D graphics emerged, purists said exactly the same thing: 3D is soulless, it will never be artistic like Disney cartoons, 3D will never be part of a serious movie because you can see it’s 3D from a mile away.
You can already see AI-generated material in some major TV shows, and you don’t even know it’s there.
CGI still requires human input at every step of the process. You don’t just tell the software to add an explosion and call it done. And while I agree that AI can be a tool, it should be limited to monotonous functions that intrinsically don’t require creativity, for example denoising or subject tracking in a frame. When AI gets involved further in the artistic process, even with human input, it takes away from the authenticity of the message because the human no longer fully controls their art. So the message becomes diluted.
AI video also requires human input at every step. It’s deceptively simple to generate something quickly, but producing something worth watching could be as much work as 3D, if not more. We don’t know yet, because AI video hasn’t even reached that quality level.
What makes people worry about generative AI is that the minimal-guidance output looks flashy and impressive at a quick glance. We know it’s AI-generated and it looks that good, and that creates panic. But look closely: every movement of every object has unnatural motion, the physics is always wrong, and so on. You wouldn’t see such trash on TV, except maybe in a music video, where anything goes.
If it wasn’t AI, we wouldn’t be looking. As content, those clips are just tiresome jokes, visual experiments with no point to them. Everyone wants to play with it now and get it out of their system.
I think you’re onto something, Roman. Over time, using AI will be more about how a person learns the tool and manipulates it to reflect their voice.
But have we truly accepted it, or have we accepted it only because it was, in a way, forced onto our eyes until we got used to it? Like insurance companies wanting us to believe they are always there for us.
We also have a choice: we either like it for the mood we’re in, or we don’t because we’d rather have art. Just like I want real insurance, so State Farm doesn’t cut it for me, lol.
And not to forget that AI was in fact trained on that very art.
Just like every other major innovation throughout history, some people see AI as an existential threat. The same thing happened with the loom, the printing press, and the automobile. But AI differs from those moments in a few important ways that make the concern feel more urgent.
First, AI systems were built on the hard work of millions of people without their permission — often in legally gray or unresolved ways. Regardless of how individual cases shake out, the reality is that vast amounts of creative labor were absorbed without consent, and that pattern is continuing.
Second, AI isn’t transforming one sector at a time; it’s impacting nearly every industry simultaneously, and at a speed that far outpaces our ability to adapt laws, labor protections, or social norms. The benefits of that disruption are likely to flow disproportionately upward, while the risks fall on workers and creators who have little real ability to opt out.
What concerns me most is the combination of scale, speed, and asymmetry of power. Individuals didn’t choose to participate in this system, yet they’re expected to absorb the consequences — including job loss, precariousness, and decisions made by opaque systems with very little accountability when harm occurs. Taken together, this feels less like technological progress and more like a fast track toward a technocratic future layered on top of late-stage capitalism.
I’m honestly not sure we can still avoid that outcome. But I’m glad to see people trying to slow things down and put real guardrails in place while there’s still time.
Hi Joseph!
I gaze at you lovingly on Facebook... Celestial wings are sprouting from my back.
Your sweet photos...
And what a surprise to see your little navel...
My own little center of the world...
My euphoria...
And that suit... What a man!
My sweet mirage...
I'm not going to keep playing this waiting game with Travis Barber. He should've helped, but he's in a football mindset, which means he doesn't want to show up unless he's pumped up and drunk.
Your eyes inhabit and bewitch me… I feel as though I can sense your eloquent gaze…
And yet, God knows there are handsome young men in the art world… But you, your way of being, and that little impish look… never stop tormenting me…
I believe I am more than marvelously inspired.
I am bewildered, surrendered to this sweetness…
Forgive all these slightly dreamy messages…
It feels so good to write it, for me…
My own little love…
In the depths of the mystery of my dreams.
My little secret love… 💘
Tony Abbate/Nick Leatherman are still on the iPad screen in the Uber. They have to 🛑 STOP.
They spread a rumor that I'm going to be in an emotional relationship with you. We're friends, and that's more important.
I was right that they were having secret meetings without me, inviting men in suits to talk about who can and cannot be in my life. We talk pretty much every single day, and I hope they don't think they can get you to stop talking to me and just walk in the hallway while I take an Uber with a stranger. I keep getting offers to stay here longer, but I'm hoping to get out; I'm just waiting to put in my two weeks' notice.
I think you are actually too generous in saying that “there are also positives to AI.” The tech industry has used its cultural and financial influence to shove AI down our throats while telling us we like it.
The problem is not that there are no ultimately useful applications for AI. There is a world of potential applications in health science, bioinformatics, bioengineering and prosthetics, radiological imaging analysis, etc.
There are three fundamental, root problems with generative AI, as I see it, in its current incarnation:
1) Models were fed massive amounts of stolen or non-consensually mined data in the form of intellectual property. How is this being rectified, and is there any talk of fiscal reparations to the people who were victims of that theft?
2) They are trained with what is becoming more and more apparent as white supremacist, patriarchal, capitalistic, colonial bias. Lead computer scientists at Google were fired or pushed out for voicing specific concerns about racial bias in AI. Their concerns were ignored or mocked.
3) They are incredibly destructive to the natural environment and strain or damage natural resources, especially water; yet this aspect of AI, its massive draw on natural resources, is minimized and played down.
Any real and critical conversation about AI has to acknowledge that tech companies have propagandistically laid the groundwork for people to accept AI as a ubiquitous part of “technology use” without questioning in what way AI would parse data on their behalf before they even see it. In other words, AI by its nature is designed to manipulate data, but the manner of that manipulation is opaque, not transparent.
There has to be a conversation explaining how much consent and personal information people give to AIs to do the “fun” things Google and Amazon have portrayed through targeted media campaigns. For example, for an AI to fulfill the seemingly simple and helpful request, “find and purchase two tickets to the Brooks and Dunn show in the city nearest to where I am,” a person has to let the AI know where they live or where they are in real time, give it access to banking information, and hand over their address and date of birth, possibly along with their social security number and “mother's maiden name,” data used to digitally confirm ID and age. What does the AI do with that information afterwards? Who, and what company, may have access to it after it goes into the nebulous cloud of AI data storage?
My strongest point of contention is that until all these things are solidly hashed out and legislated to prioritize the protection of consumers, it's premature to talk about the "useful applications" of AI.
I'm not sure I understand your question, but if I'm reading it right, it's essentially the same equation as before AI. It depends entirely on the person, their motivation and inspiration, and how much effort they put in. Regardless of the medium, just taking unedited output from an LLM and posting or publishing it is almost always going to result in something less than good. I don't start with a draft from an LLM. I start with my own writing and feed that through, and it's a kind of back-and-forth process until I have a final draft that I have edited line by line; then I just have the LLM do a final pass for errors only. So it's not really "this was created with AI" vs. "this was not created with AI." It's a process, and it involves both inputs.

That's the process for text, anyway. Images are different, and I think a slightly different discussion, but generally the images I need are not something an artist would just have kicking around, and I'm certainly not in any position to pay anyone. For instance, if I need an image of a cell phone on a table so I can put a waveform on it to make it look like it's playing a recording, that seems like a good situation for me to use an AI image generator. I also use traditional image editors afterwards to get it where I really want it. So again, it's a process, not just a binary use-or-don't-use decision.
Also, ElevenLabs allows me to create just about any sound or voice I may need for my audio work. All of my character voices are custom-made through very specific prompts, and again it's a back-and-forth process. The audio editing (trimming, cutting, positioning, distorting, laying in background sounds)... that's all me using Audacity, doing each clip from scratch. I only use ElevenLabs to create the raw voice clips and some sound effects. I'm creating multi-character conversations in many different environments and themes, and soundscapes like radio transmissions and sound effects, all sewn together by me in Audacity.
It's still pretty easy to tell when someone is just copy-pasting directly from an LLM, with nothing but a prompt to start and no editing afterwards. The constant contrastive padding and the overuse of the word "like" in metaphors that often sound hollow are a giveaway. Here's the thing, though, and this has less to do with art and more to do with writing essays or journalism or think pieces: just because someone is using an LLM for the bulk of the work doesn't mean they don't have something valid and important to say. It's the subject, the meaning, the feeling, the motivation, the effort, and the inspiration behind it all that's really important, not the tools that are used. That has ALWAYS been the case in art and writing. It hasn't changed and likely never will.
I think we're going to start to see a split between those who use AI as only PART of their process and those relying on it to do everything. You can already see the differences in the work. Eventually, those without the motivation to spend the time really shaping their work into something that's uniquely theirs will fade out, because they won't be getting the response they want, while the people putting the real work in and creating from real human expression and emotion will stick around and keep doing what they do.
I think people in general need to be a little more open to the idea that AI is going to allow a certain subset of people to become multi-format creators, where before it would have been very difficult or impossible. I essentially have an entire production studio in my computer now. But it still takes lots of time and effort to create something that people are going to want to read, hear, or see. Most of my work involves all three.
Yes, slop will proliferate for a while, and there will always be growth gurus grifting with LLM copy-pasta. But there are also people who now have access to tools that are allowing them to create amazing work they just couldn't before, because the right tools didn't exist.
There's an entirely different and very interesting discussion to be had about neurodivergent people, how they use AI, and why it's like stepping into a whole new world of creative ability. But yeah, anyway, sorry for the novel 😂
I wrote. I also carved, finished, and hung axe handles; hand-tied fishing lures and poles; I was an accomplished carpenter and woodworker; I built a boat; and I did martial arts for decades, if you want to count that. Is that sufficient? Because if not, there's more.
I worked out that we have one major issue, in a recent paper I'm writing. The biggest problem is public action. You can't motivate the public these days without a moral-panic overreaction, and that makes them extremely emotional for about one to two weeks. No one is organizing people in effective ways to make an impact. So politicians can do sadistic things like "We are bringing back slavery," and people will go crazy for a week, maybe become vilified as psycho protestors, and then everyone forgets about it. There are three reasons:
#1 Desensitization. Both in not being surprised by disgusting behavior and in not caring, because that's just the way corporations and governments are. "It's just normal and it's fine." But also, "Oh, it's just another protest. Morons. Don't care." So people aren't informed about the issues and aren't monitoring the facts. They are being stimulated by emotional triggers that they either act on or ignore.
#2 When protests happen these days, they are more likely to fail, so people see protests as useless; why would they waste their time joining others to push for change? So even if they were informed, they won't do anything, because they don't see it making a difference.
#3 We don't have a clear solution to fight for. "End slavery" was an extremely clear goal and extremely simple to implement to fix the core problem. People were clearly informed of what the change would be and what it would mean once the changes were in place.
Solution: The Creators Coalition on AI needs to:
#1 Get a clear and concise solution (many people on the planet already have one). The Creators Coalition on AI can crowdsource ideas about how to solve it. You can create a competition and pay out $5,000 across the top 10 people who contribute to the solution, plus $5 for every person who echoes the best-written version. So for $7k or so, you end up getting the best direction forward for the world.
#2 Organize a way for people to monitor the levels of the problems. Think of a site that monitors the weather: it's not millions of tiny local problems bombarding people as hype stories, not a tree that smashed a car or a flooded street. It's a place to monitor the facts about how dangerous the situation is, with graphs to see the rates at which the bad consequences are rising or falling.
#3 Organize people to take effective collective action. Shirts that express the protest message; flags, signs, mugs, etc.; anything that lets people spread the message constantly, without the emotion and with very little effort. But also organize marches on a global scale in effective ways. Without the Creators Coalition on AI organizing people as a whole, the message will be scattered, the action will be scattered (some of it violent, some of it just looking weird), and turnout will be spread thin across different times and places.
But without a central force like the Creators Coalition on AI organizing all this, we are doomed to fail, because you'll only be preaching to the converted, and the mass media will use emotion to keep people fighting over meaningless fights.
"How can we make sure that the VFX artists and the animators and all these people are protected?" My feeling is they are going the way typesetters and paste-up people went when Apple came onto the scene. They disappeared into the sunset.
It is sad that we’ve reached the era of AI and are still mentally bound in slavery to capitalism, with its requirement that everyone work all their life. For 50 years we have been producing more than enough to feed and shelter everyone while working only a few hours a week. Fifty percent of jobs are so-called bullshit jobs, and we’re just running out of those. Temporary relief came from the crypto industry, and now it’s people using AI to generate useless content to post on social media. But we can’t all just be running AI-generated content farms.
And the next chapter: the robots are coming! More jobs lost! Whew!
The conversation is supposed to be not about losing jobs but about switching to UBI. About stupid government bureaucrats and when they will finally realise that UBI is a right, not a privilege. And when the stupid Elon Musks will realise that there will be no human army of slaves for them, because there’s nothing left for us to do. What do they imagine we are going to do, run alongside the robots and lubricate their joints?
It's good to keep some sort of positive spin on AI in the industry because, as many have said, in the right hands the possibilities are amazing! So imagine how the industry can use it to its advantage 👍🧐👏