The Gravity of Intelligence: Why AI Cannot Become Social Infrastructure
Author: Reichi Kirihoshi (min.k) | mncc.info
Every time I hear that "AI will become social infrastructure like electricity or the internet," the same question surfaces. Electricity and running water became infrastructure because they got cheaper the more people used them. AI is moving in the opposite direction — so why do we keep telling the same story?
Starting with a question: what is infrastructure, exactly?
The word "infrastructure" is often used as a synonym for "important technology." But the actual meaning is narrower. Infrastructure is technology that, when it stops working, disrupts the majority of society — technology whose costs keep falling, whose alternatives disappear, and which eventually becomes so embedded in daily life that people no longer think about whether to use it.
Electricity, running water, and mobile phones share a common pattern. Unit costs fell as more people used them, and at a certain point, the reasons not to have them simply evaporated. Using them became a precondition of living.
That path is not yet visible for AI.
The smarter AI gets, the heavier it gets
Conventional IT services tend toward near-zero marginal cost once the underlying system is built. Search engines and social networks are the clearest examples.
Generative AI works in the opposite direction: every inference requires computation to run.
Higher-performing models consume more GPU power and electricity, and the demands grow heavier still as AI expands into long-form text, video, and multimodal tasks. Any technology where "more useful" directly means "more expensive to operate" is in fundamental tension with the logic of infrastructure.
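The asymmetry can be made concrete with a toy calculation. Every number below is a hypothetical assumption chosen for illustration, not a measurement: a conventional service amortizes a fixed build cost over its requests, while a generative model pays a compute cost on every single inference.

```python
# Toy sketch of the cost asymmetry described above.
# All figures are hypothetical assumptions for illustration only.

def conventional_cost_per_request(fixed_cost: float, requests: int) -> float:
    """Conventional IT: one fixed system cost amortized over all requests,
    so the unit cost falls toward zero as usage grows."""
    return fixed_cost / requests

def generative_cost_per_request(gpu_seconds: float, cost_per_gpu_second: float) -> float:
    """Generative AI: each inference consumes compute, so the unit cost
    stays roughly flat with scale and rises with model size."""
    return gpu_seconds * cost_per_gpu_second

# Hypothetical: a $10M system serving 1B, then 10B requests.
for requests in (1_000_000_000, 10_000_000_000):
    unit = conventional_cost_per_request(10_000_000, requests)
    print(f"conventional @ {requests:,} requests: ${unit:.5f} per request")

# Hypothetical: a small model (0.5 GPU-seconds per query) vs a larger
# one (5 GPU-seconds), at an assumed $0.001 per GPU-second.
for gpu_s in (0.5, 5.0):
    unit = generative_cost_per_request(gpu_s, 0.001)
    print(f"generative  @ {gpu_s} GPU-s per query: ${unit:.5f} per request")
```

Under these assumed figures, tenfold growth cuts the conventional unit cost tenfold, while the generative unit cost is untouched by scale and multiplies with model size, which is the tension the paragraph above describes.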
I tentatively call this the AI Infrastructure Paradox: the deeper AI embeds itself in society, the more indispensable it becomes, yet that same process intensifies rising costs and raises the barriers to universal adoption. The more important AI becomes, the higher the wall against widespread access.
The dead end of free tiers and revenue structure
AI adoption today is advancing on the back of expanding losses. Most users encounter AI through free tiers, but the more capable the model, the heavier the operational costs on the provider's side. Users accustomed to free access tend to leave when pricing is introduced; expanding free access widens the deficit.
Higher performance drives up costs. Higher costs create pressure to charge. Charging drives away casual users. And losing casual users forecloses the possibility of universal infrastructure.
The external environment adds further headwinds: rising electricity costs, constrained GPU supply, and soaring data center construction expenses. AI depends deeply on vast physical infrastructure behind the cloud, and many model companies are currently viable largely because of investment capital. A scenario in which the entire industry converges toward higher prices is not far-fetched.
So where does AI actually take hold?
Here is a somewhat bold prediction. For individuals, AI may be headed toward something resembling Twitter or Discord — a polarization between heavy users and non-users, where "people who use it do so every day, and people who don't use it never touch it" becomes the fixed state.
For enterprises, penetration will likely go considerably deeper. Operational efficiency, knowledge retrieval, customer support, automated systems — AI is steadily taking root in these contexts.
The key distinction is this: it will not be "AI itself" that becomes infrastructure, but rather "business systems with AI embedded inside them." AI will recede from view, dissolving into the architecture of systems, and that is how it will settle into society.
Why AI will not follow the path of the mobile phone
Consider the mobile phone — itself a relatively recent addition to the category of infrastructure.
Mobile phones achieved infrastructure status because not having one meant being unreachable, because prices fell rapidly, and because network buildout created uniform coverage nationwide. None of those conditions apply to AI.
Life functions fine without it. The variance in how intensively individuals use it is enormous. And the relationship between capability and cost — smarter AI costs more — shows no sign of reversing.
The result may be a three-tier structure: those who use high-performance models daily, those who only access free-tier AI, and those who do not use AI at all. This could calcify into a gap in cognition and productivity — not an electricity gap, but a gap in access to thinking assistance.
As this analysis suggests, infrastructure is defined by the moment when most people stop thinking about whether to use something. Given the structural dynamics at work, AI will probably not reach that point.
But business systems with AI built in will quietly become infrastructure. It will not be AI itself, but structures containing AI that form the skeleton of society — and in that form, AI will indeed permeate.
"Infrastructure" should be a word of design, not of fantasy. Applying it to AI as it currently exists requires a somewhat higher resolution than the conversation has yet reached.
☕️ If this was worth your time, a coffee is always welcome. https://buymeacoffee.com/mink_obs
Written by: Reichi Kirihoshi (min.k) / Research & structural support: Claude Sonnet 4.6, ChatGPT / AI-assisted / Structure observation