Humans are incredibly creative, especially when it comes to wasting time. Throughout the ages, we’ve pursued time-wasting with zeal, inventing new methods and distractions to pass the hours and avoid contemplation. In the current age, the tool of choice is generative AI. Generative AI has turned out to be not an indispensable productivity tool, but a babysitter. While AI companies push hard to convince enterprises that their tools are a great fit for business use cases, for many people, generative AI has become a way to fill time exploring AI slop in its many forms. Welcome to the world of sloputainment.
Sloputainment: Next-Generation Time Wasting With AI Slop
No matter your position on generative AI’s usefulness for day-to-day productivity, it’s undeniable that generative AI truly excels at producing slop. The risks and impacts of slop outputs depend on the context in which they are generated. In business or safety-critical use cases, slop can have a significant negative impact, causing damage and endangering people. However, people messing around on the internet or playing with these tools on their own typically present a lower risk. There are exceptions, such as using generative AI to bully or harass others, but that’s largely not what most people use these tools for.
In a previous post, I commented that people secretly like slop and that it is here to stay. Slop is a way for people to entertain themselves, pass the time, and generate content, something we all witness daily.
Ethan Mollick, one of AI’s biggest cheerleaders and proponents of its productivity, spends much of his time playing around with image and video generation. Or at least, this is what his social media feed suggests.

The AI music generation app Suno reported annual recurring revenue of $150 million. You don’t think all these people are getting gold records out of this, do you?

And before someone says, “But did you see that the number 1 country album was AI-generated,” let me stop you there. You should watch this video instead of listening to that garbage.
Not only does the song suck, but apparently, someone only had to spend $3,000 for this publicity. Pretty good investment.
Using AI to make music makes someone no more of a musician than a child wearing a firefighter’s helmet makes them a firefighter. Nobody is going to see a child with a firefighter’s helmet on and send them into a burning building, remarking that it’s what they signed up for.
Using AI to make music makes someone no more of a musician than a child wearing a firefighter’s helmet makes them a firefighter.
People are even using AI for gender reveals in totally normal ways, such as smashing into the Twin Towers or taking down the Hindenburg. You know, perfectly normal, totally sane shit.
OpenAI is even making the shift towards porn. This isn’t the move a company makes when it is on the cusp of AGI. It’s the move a company makes when it is hemorrhaging money and desperate for any avenue to revenue whatsoever.
Sloputainment is popular because it checks critical boxes. It’s a form of entertainment for the person creating it, one that requires no talent or effort, resulting in extremely low friction. It also creates content that the person can share on social media. Many people are constantly searching for things to transform into content, fearing that a single day without posting on social media will make them irrelevant. The fact that sloputainment checks both the entertainment and content boxes all but guarantees it’s here to stay.
Sloputainment and the Illusion of Productivity
But directly using AI to create slop images and video is only one form of sloputainment. There is another form that masks itself as productivity or hustling. In a new trend, people boast publicly about how they’ve abandoned things like video games in favor of building software: largely inconsequential software projects for themselves, where the task itself is transformed into content. And a perspective based on isolated projects is transformed into an entire worldview.

Why is it bad to let AI creep into every moment and aspect of your life? I don’t know, why would that be bad? With some people, there’s no delineation, no line they won’t let AI cross. Don’t get me wrong, I’m sure it’s fun for people playing around with this stuff. I mean, building things with Legos can be fun, too. But nobody building something with Legos is confused into thinking they are building the next big thing, that something real could come out of it, or that they may be onto the next multi-million-dollar app. This isn’t AI psychosis, but it is delusional.
There is a distinction that needs to be made here between the forms of sloputainment. The only way to get good at something is to practice it. For example, you can’t get good at building AI agents without building agents, which is why vibe coding doesn’t teach you much about real-world development. Even the person who coined the term vibe coding understands this. This activity is different from people just posting slop images to social media.
The issue is the undercurrent of hustle bro culture, which gives the impression that if you aren’t hustling, you’ll be left behind. In many ways, this public performance is meant to show that their personal activities are better than everyone else’s. It’s self-flagellation in the era of generative AI. People playing around with things to learn and understand is good. People replacing other activities in their personal lives with the illusion of productivity is bad.

In many ways, these activities resemble people playing around with their friends, similar to garage bands having fun jamming on nothing in particular while secretly hoping they get their big break and a gold or platinum album. Only, instead of a gold or platinum album, this is their award. It’s fitting that in the generative AI age, you get awards for spending money instead of making it.

It’s fitting that in the generative AI age, you get awards for spending money instead of making it.
However, instead of representing people enjoying the output of their creative pursuits, it just represents a system chewing up tokens. The dystopia says, “Nom nom.”
Sloputainment Is Content
AI is not only making everything entertainment, but as a byproduct, it’s creating content. This is the knockout punch for our modern information junk food diet. It’s the ice cream piled high on a cake, topped with potato chips, chocolate, caramel, and a pound of M&Ms for breakfast, lunch, and dinner.
Social media has rewired our brains to see everything and everyone as content. The whole world is our content oyster, and nothing escapes the content lens. It’s why you see things like idiots defacing coral reefs with graffiti. If only this were AI slop. This is why some people have no problem using AI to harass other people and organizations. As I mentioned back in 2020, harassment is the true legacy of technologies like deepfakes. That prediction does seem to be borne out in the age of generative AI.
I think what bothers me most about this hustle bro nonsense is that it gives the impression of devaluing so much of what is truly valuable. I mean, quiet contemplation looks like time-wasting to morons in motion. Call it Newton’s fourth law of motion: morons in motion stay in motion. It’s also why so many people get so many things so wrong. They are so busy hustling that they never stop to contemplate, you know, to get things right.
It takes me forever, by generative AI standards, to write an article like this. Writing is thinking, and in each of these articles, I’m working my way through the topics like everyone else while attempting to give them the level of contemplation they deserve. If this site were about content instead of contemplation, I’d be blasting out AI-slop articles with clickbait headlines, trying to game SEO. You know, like over 99% of the internet. Instead, I’m content to labor away in obscurity. If writing is thinking, then generating with AI is the lack of thinking. This gets lost in the tidal wave of content slop.
If writing is thinking, then generating with AI is the lack of thinking.
Sloputainment Is Entertainment
Somewhere along the way, we conned ourselves into thinking everything needs to be entertainment. Things now have to be converted into entertainment to be valuable. Even something as mundane as our own data can now be transformed into entertainment. I mean, Google created NotebookLM, allowing the generation of a podcast from our data. Because learning from reading, analyzing, and engaging with ideas is boring, and only losers would learn that way. We now have to be entertained to learn.
There’s a problem, though. When something is viewed as entertainment, it appears to carry a more truthful weight or feeling. To use modern vernacular, you could even say that data-as-entertainment emits truth vibes. Where we may question an AI overview or summary, we are less likely to question the same data in the form of a podcast that sounds like humans or a video presentation with a human voice. But it’s the same data with the same issues. Especially since much of this data doesn’t conflict with our biases; otherwise, we wouldn’t consider it entertainment.
Where we may question an AI overview or summary, we are less likely to question the same data in the form of a podcast that sounds like humans or a video presentation with a human voice.
Take Graham Hancock’s Ancient Apocalypse docuseries on Netflix. It’s presented on a platform like Netflix, with stunning locations and cinematic shots, but it’s all 100% pure, unadulterated bullshit. Yet presenting the content this way lends it weight and credibility. Of course, it doesn’t help that the docuseries doesn’t feature any actual experts either.
Graham Hancock is a fraud with no expertise and a peddler of bullshit. Yet, he was given a platform to spread his nonsense to a wider audience. AI-as-entertainment platforms risk doing the same for every type of content and data. We risk embedding falsehoods and misperceptions deep in our brains due to formatting issues.
We risk embedding falsehoods and misperceptions deep in our brains due to formatting issues.
Text presented in paragraph form with easily clickable links is a much better, easier way to verify content than an audio podcast you listen to in the car or while doing something else. The same can be said of video presentations or cartoon talking heads. But paragraphs have higher friction than podcasts.
Admittedly, NotebookLM is neat technology, but the disconnect lies in failing to distinguish between a technology being cool and the value it truly provides, and in a far greater failure to consider what the technology does to us. You know, the tradeoffs. So much of our current moment consists of waving hands, directing our attention to how cool a technology is without consideration of use cases or impacts.
I’m sure this is just a continuation of a trend that Neil Postman identified as starting with television. But AI supercharges it. Next up, instead of a podcast, will be video. As a matter of fact, while I was writing this post, Google updated NotebookLM to include generative video overviews, taking data-as-entertainment to the next level. Using video, we can learn about someone like Neil Postman not by engaging with his work but through cartoon summaries that may or may not capture the important aspects. It’s an approach Postman would detest, but no doubt see coming. There is a good chance these summaries miss important details as they focus on what’s most interesting, shocking, or exciting. We are about to “true crime” everything.
I should note that the impacts aren’t the same for all types of information. For some things, simple summaries are fine. However, the difficulty we face is understanding where the true delineation point lies, and, of course, the tendency to overestimate our knowledge based on trivial information.
There was an early attempt to use AI to cut together movie trailers. The AI identified the most “exciting” aspects of the movie (explosions, car chases, etc.), but the result didn’t connect with people or follow a story. It was just a bunch of cutscenes with no through line. Now, everything is cutscenes, fueling our entertainment addiction. There’s a lot of history that’s boring but important, and we risk paving over history as we reengineer it into entertainment.
We risk paving over history as we reengineer it into entertainment.
If you are lucky, there’s a memory of an entertaining school teacher who made classes more tolerable, and you may have even learned more because of it. We also have memories of learning things from documentaries. These memories may lead us to think that entertainment is the best way to engage with a topic and learn. However, there’s a distinction between entertaining and entertainment.
Entertaining is a method that still requires friction to get to a goal. It’s just that the friction becomes more tolerable due to heightened interest and engagement. For example, the entertaining school teacher still required the same reading and homework assignments. Entertainment is content that promises a complete reduction of friction. No need to read a book or engage with the content; just watch this AI-generated short instead. Remember, knowledge and understanding aren’t generated from summaries or bullet points.
You may be thinking that there’s not a lot of harm in these activities, and for the most part, this is correct. How each person wastes their time is up to them. Fair enough, to each their own. However, consider that when we use AI to harass other people, we cause them harm. When we use AI to create entertainment masquerading as something else, we harm ourselves. This is what I take issue with: the tradeoffs that nobody considers and the false perception that hustling this way is the only way to make progress in the modern age.
There is no doubt that people use generative AI daily for productivity tasks. Great. And if people truly are using these activities to learn, fair enough. However, things like vibe coding and AI summaries don’t teach valuable lessons. Quite often, the lessons come afterward, when you get owned or try to apply your newfound summarized knowledge.
Conclusion
Sloputainment leaves so many things undiscovered, about ourselves, others, and the world. With every prompt, it steals from us, taking our time, understanding, and even our sense of who we are. We get sucked into the content vortex spinning chaotically around a hollow center with no ability to center ourselves, and it takes effort to break free.
Wasting time isn’t a modern concept. However, we have supercharged it with AI. In On the Shortness of Life, Seneca explains that it isn’t that life is too short, but that we waste so much of the time we have. Seems some things haven’t changed since the first century AD. But we’ll close with some words from the great American philosopher, Sebastian Bach, who posed this concept in a series of questions:
Is it all just wasted time?
Can you look at yourself,
When you think of what you left behind?
Is it all just wasted time?
Can you live with yourself,
When you think of what you've left behind?