When Efficiency Suffocated Modernity
Efficiency installed itself as the rule in the nineties; with AI, it now hands us back a sea of mediocrity. My thesis: the next edge is not depth in one discipline, it is the capacity to connect several.
It was the nineties. We were in the middle of a deep technological shift — from analog to digital, from cassette and vinyl to CD, from the landline to the cellphone. That race started fast and, with time, accelerated beyond anything we could have predicted.
That acceleration installed a mindset that conquered the corporate world first, then quietly seeped into daily life: efficiency. Doing more with less. In a more competitive, higher-demand world, efficiency became the rule of the last twenty years.
The quiet installation of a mindset
That mindset shaped the way we embraced every new technology that arrived. The premise was almost religious: these tools would save us from ourselves — from our slowness, from our scatter, from our "inefficiency."
The philosopher Byung-Chul Han called this the achievement society: an order where the imperative is no longer to obey but to produce, optimize, perform. Where the supposedly free subject ends up exploiting itself in the name of output (Byung-Chul Han, 2015). I see this every time I walk into a room with a client and the first question is no longer what should we build but how much can we ship this week.
It was under that frame that we embraced large language models. We didn't ask: what does this change in the way we think? We asked: how much more content can I produce today?
The mirror: a sea of mediocrity
What we got back is the logical consequence of asking the wrong question.
With LLMs at scale, we have produced a sea of more of the same. We have traded critical thinking for "speed" and "efficiency." That same efficiency is flooding the servers with content, apps, and whatever technological artifact anyone can think of — copy of the copy of the copy, and at the end of the day, one more snapshot of our own mediocrity.
This isn't just a personal hunch. In Science Advances, Doshi and Hauser found that AI-assisted writers produced stories judged individually more creative than human-only stories — but collectively more similar to each other than the unassisted control (Doshi & Hauser, 2024). Each person gets a little better. The ecosystem gets flatter.
And there's a technical mechanism underneath the cultural observation. When models train on data generated by previous models, the tails of the distribution — the rare, the original, the strange — get lost. Shumailov and his co-authors showed in Nature that this produces model collapse: what's left is the statistical center, the average of the average (Shumailov et al., 2024). What we mistake for abundance is in fact compression.
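The tail-loss mechanism can be shown with a toy simulation. This is not the Shumailov et al. setup, just a minimal sketch under invented assumptions: a Zipf-like vocabulary of 200 tokens, a model that is simply a frequency table, and each "generation" re-fitting that table on a finite corpus sampled from the previous one. Any token that fails to appear in one corpus gets probability zero and can never come back, so the support of the distribution only shrinks.

```python
import random
from collections import Counter

def train_generation(probs, corpus_size, rng):
    """Sample a finite corpus from the current model, then re-fit token
    probabilities by frequency counting (the 'next-generation model')."""
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    corpus = rng.choices(tokens, weights=weights, k=corpus_size)
    counts = Counter(corpus)
    # Tokens absent from the corpus drop to probability 0 and can never return.
    return {t: counts[t] / corpus_size for t in tokens}

rng = random.Random(0)

# A Zipf-like vocabulary: a few common tokens, a long tail of rare ones.
vocab_size = 200
raw = [1 / (rank + 1) for rank in range(vocab_size)]
total = sum(raw)
model = {f"tok{rank}": w / total for rank, w in enumerate(raw)}

# Track how many tokens survive (nonzero probability) per generation.
support = [sum(1 for p in model.values() if p > 0)]
for generation in range(10):
    model = train_generation(model, corpus_size=500, rng=rng)
    support.append(sum(1 for p in model.values() if p > 0))

print(support)  # a non-increasing sequence: the tail erodes generation by generation
```

Run it and the surviving vocabulary shrinks monotonically toward the high-frequency head: the statistical center survives, the rare and the strange do not. That is the compression the essay describes.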
And still we keep believing that two clicks will make us millionaires.
The symbiosis (what the machine cannot be)
Years ago I used to write and argue with colleagues about a technology that didn't yet go by this name. I said something close to this: a moment will come when machine and human coexist in symbiosis, in codependence.
I still believe that — sharper now.
The machine will never be human. Being human is an emergent property of enormous complexity, not just an organized pile of cells. We still don't fully understand that complexity, which is why we are far from producing a synthetic version of ourselves, no matter how close the imitation gets. And the inverse holds: the machine depends on us, because it is humans who evolved it and will keep evolving it.
That isn't romantic. It's the material condition of the system. An EEG study from the MIT Media Lab showed that users who lean most on ChatGPT exhibit the lowest cognitive engagement during the task, not the highest (MIT Media Lab, 2025). The machine extends us — but it can also empty us, if we don't put an operator with judgment on the other end.
Where we're going: the generalist who connects
What's clear to me is that, in this era, deep knowledge of a single discipline will be displaced by the generalized view: not because depth stops being valuable, but because a wider perspective lets you make connections across areas and see problems from different angles, and that is exactly where these new technologies fit.
I know there's a strong counter-position. Cal Newport defends deep work as the scarce asset of the 21st century; Anders Ericsson's tradition of deliberate practice shows that mastery is built with concentrated hours, not with scatter. I don't dismiss them. On the contrary: I think the profile that will stand out is neither the shallow generalist nor the monastic specialist, but something closer to a T-shape — enough depth to defend a technical judgment, enough breadth to connect it to another discipline. The difference with the previous era is that the horizontal bar of the T now carries more weight than it used to.
And the question I'm leaving here is the one I find most interesting:
What would a solution to a medical problem look like, seen through the lens of creativity or art? How could a musician unlock new potential through technological tools and produce applications, products, forms of care that don't exist yet?
If efficiency was the rule of the last twenty years, the connection between domains could be the rule of the next shift. Not to produce more. To produce what wasn't there.
That's what I'm watching.