We are living in the golden age of bullshit—Trump proved that!—but also in the golden age of artificial intelligence (AI). It shouldn’t be a surprise—they’re basically the same thing. That’s what this post will be about: AI and bullshit.
More and more you see breathless hype about new AI—in 2022 it was ChatGPT, which generated crappy essays. Type a prompt and get a crappy wall of text that is kind of like an essay, but not really. Then late last year the big thing was AI-generated images—type a description of an image into the AI and it spits out a picture. That was popular for a little while but seems to have died down now, because so many of the generators are restricted—if you try to produce an image, you often get a message saying it can't be made, because of some violation, like using a celebrity's likeness, or breaking "safety" standards that are never really defined.
I’m sure both of those services are still popular and will be used a great deal in our awful future. But the buzz has died down a bit. There’s a new one that just popped onto my radar this week—it’s called Sora, a text-to-video AI model. You type in a text prompt and it produces a video. In my opinion the video looks like shit—the faces it produces look grotesque and inhuman. But it is a video, it is a product, it exists—and that is enough for these tech people.
So what’s my point? These new AI products are excellent at producing bullshit—essays, pictures, and videos that seem like real things, but aren’t. They are characterized by a basic indifference—to truth, to quality, to accuracy, to usefulness, to explanatory power. It’s just filler content meant to take up space and give the appearance that something is being read, seen, viewed, experienced, or understood. Whether anything is actually being communicated, expressed, or understood doesn’t matter—there’s an indifference on both sides. The generator of the content is indifferent to its value, and the audience receiving the content is indifferent as well. They aren’t expecting to receive much of value, and the generator of the content isn’t expecting it to be good either—it’s good enough to seem like maybe it could be something, and that’s all that matters.
This indifference is the essence of bullshit. There’s a good book about bullshit, called On Bullshit, by Harry G. Frankfurt, that basically defines bullshit as indifference to truth. This is a useful definition—and it’s helpful to distinguish indifference from lying. Bullshit is not lying—it has a different relation to truth. Lying means you know the truth, but you’re concealing it, and misdirecting people away from it—keeping the truth in your back pocket, rather than revealing it. Liars actually have a better relation to the truth than bullshitters, because they at least value the truth enough to hide it. Bullshitters don’t care about truth—they just say things without caring whether they’re true or not. They are indifferent to it. The dark irony of all of this is that so much brainpower and energy is going into creating these AI tools—and all it produces is indifference, in various ways.
It seems to me that this is a good way to think about these fancy new AI products—they are artificial, and fake, but they aren’t necessarily lies. They are below the level of lies—lies would be an improvement over what they are.
AI isn’t artificial in the sense of fakeness or lies. Artificial intelligence produces something real—real indifference. What’s “artificial” is the impression that anything other than indifference is produced. AI is a massive machine that produces indifference.
The rise of AI tools like ChatGPT for text and Sora for video will just create more bullshit—content that we are indifferent to, and which is indifferent to us. So is this a big deal? What’s so bad about indifference? I think indifference fundamentally comes down to a lack of love. If you are indifferent, you simply don’t care if what you are saying is true, or if it’s being understood. You just want to shove out some content to the world to distract it and occupy it, because what becomes of the people hearing or reading what you say means nothing to you. As indifference increases, love necessarily decreases—they are something like polar opposites; they can’t exist in the same space.
I suppose the AI “revolution” arrived so late because they had to turn everyone into a consumer first.