Years ago, I read Eckhart Tolle’s A New Earth, a commentary on the malevolent shadow that ego casts on human existence. Reading your piece, I was reminded of a strain of that argument: ego loves drama because it provides a stronger sense of self. Drama is a textbook symptom of the collective human ego. We can apply this to our current AI discourse and join the dots to see how the Apocalyptic Default becomes an inevitable psychological byproduct.
If Tolle were to analyse this, I suspect he would argue that the ego defines itself through opposition: by framing AI as a ‘predator’ or an ‘existential threat’, the human ego finds a grand, cinematic foil against which to assert its own importance. A peaceful, utility-driven AI is boring to the ego because it offers no friction. The apocalypse, however, provides the ultimate drama!
Or have I gone down the wrong rabbit hole?
Not at all! This is a really interesting lens to bring in.
The ego–drama connection maps quite cleanly onto the structural argument in the piece. What shows up in the data as “high-signal, high-intensity content” can also be read as psychologically salient content - the kind of material that gives people something to react to, define against, or feel within. In that sense, the system isn’t just amplifying randomness, it’s amplifying what reliably captures attention at a human level.
The point about framing AI as a “foil” is also compelling. The more dramatic the frame, the more it sustains engagement, which then feeds directly back into the datasets and the outputs. Whether the starting point is psychological (ego, narrative, identity) or structural (engagement, data weighting), the effect seems to converge in the same place: intensity becomes the most reinforced signal.
What’s interesting is that this doesn’t require intent at any level. Individual psychology, platform incentives, and model training all point in the same direction without needing coordination.
It would be interesting to see how far this line of thinking goes - is the “drama bias” a stable feature of human systems, or something that can be moderated if the surrounding structures change?