The attention economy is inverting
Once upon a time, in the way back before generative AI, people had to work hard to produce a digital object of any real quality - an image, a document, a report, a PowerPoint. This was helpful in one way: the obvious effort that went into an artifact's creation was a good signal, or proxy, for quality. We've discussed this before, and the effect was present across all kinds of media, both personal and broadcast. If you could get on TV, you likely had something to say, because (for a while at least) it was hard to do that. Filters and signals are great heuristics for deciding what to pay attention to - or they used to be.
But there's a different kind of economy at play here, too. At some fundamental level, giving you an artifact like this is a "transaction" in what we might call an attention economy. That is, I "spent" a bunch of my time and attention to create something, and "gave" that attention to you. You can consume it with (often) less attention. Value flows from me to you in the "attention economy" - I am doing more attentional work than you are. (This is a little more complicated because I might make one thing and share it with many people, amortizing my own attentional work, but that's okay - we can still consider each individual interaction for our purposes here.)
But what happens in the generative world? In that world, the signal is broken - a complex, well-crafted artifact is much easier to build, so its appearance is no longer a proxy for quality. This is a familiar effect. But the other thing that's happening is that the attentional transaction is now "inverted" - I may have spent a small amount of my time and attention to take up a large amount of yours. I'm no longer "paying" you with attention, or transferring value to you; I'm taking it away.
This problem has recently been labeled "work slop", and work is a good example of where the inversion shows up. Work often has this signaling problem, or even a 'busy work' problem: make someone produce a report not because it's valuable per se, but because the act of producing it serves as assurance that the work was done, or as a proxy for some other activity. And often you would see someone do careful work to think through a business problem and create a report or memo from it. Receiving something like this was, in some sense, a gift - someone else had done work so that you would have a better understanding of the world.
But now, it’s not a gift - it’s a tax. And the presence of these kinds of false signals taxes the entire system - now you have to be on guard for wasted time and spend more energy detecting “fakes”.
Unfortunately, I don't have a neat answer to this one; it's just an observation. Something fundamental has shifted - digital artifacts now often have 'negative attentional value', if you want to think of it in economic terms. I think that's worth noting because it's so foundational - we don't often see a sign flip like that in real life.
The modern art of creation might be less about adding to the pile and more about subtracting from it. Probably we need to find new signals.


So form and medium are no longer reliable signals of quality. This feels like an opportunity to rethink how quality signals might work. They could come from something like product reviews for all sorts of work product, but designed with eyes open to the ways typical review mechanisms are gamed today. My intuition is that any new mechanism should be tied to something that is valuable to keep uncorrupted -- something like reputation. If, for instance, my evaluations of the quality of some article or video or book prove valuable in aggregate to others, and others can register trust (revocably) in my future evaluations for their own recommendations, and if I receive some value for maintaining that trust (giving me an incentive not to sell out), then I suspect there is enough incentive in the system to replace the old form-based signals of quality. Clearly there is a bootstrapping problem, but this feels akin to the influencer dynamic people are already familiar with -- though this version would hopefully be more distributed and scaled than just having a handful of Kardashians.
The result could be that people amortize the attention spent evaluating the quality of any given work across the population, which would free most of us from consuming content we don't find valuable, and provide explicit value to those who facilitate that ability for the rest of us.
I've recently had similar thoughts. If we go one step back, I see the following progression: Access > Generate > Filter.
Before the Internet, information was highly siloed and localized. That changed, and anyone could access information. But, as you point out, creating or generating anything still required time and effort - work and skill. We've solved that now, too, with generative AI. So, if anyone can generate anything, what happens next? We drown in empty passages of text that fill our inboxes. We have a filtering problem. What is true and what is false? What is important and urgent versus merely informational (at best)? The productivity gains this technology promises are being circumvented by people using it to satisfy the systems and constructs in which they operate.