
The Data Center Revolution
April 5, 2026
Over a hundred years ago, transformations in the technological structure of capitalism took the form of the struggle of workers to master their product. Strikes, lockouts, co-operative associations, factory legislation — all the way up to party resolutions and the seizure of power — were inseparable from the transformations in technology proper. Changes in capitalism could be understood as the history of the proletariat’s own coming to class-consciousness. [I]
But such practical unity, after a long political liquidation, is now absent. Technology seems to change arbitrarily, as a function of the stock market “casino” and/or of geopolitical priorities, subjecting the masses to a new deluge. Because of this practical regression, the Left’s orientation has been less about grasping changes in capitalism as forms of estranged consciousness than about reading the tea leaves of the “next crash.”
Let’s say the AI boom is a bubble. Because the 2008 Great Recession was preceded by a stock market crash, a generation of Leftists has been primed to read any run-up in the stock market as definitely a bubble and therefore the Götterdämmerung of Capitalism — or at least a recession (see Roberts, Reese, RCA, Cosmonaut, Second Thought, Ann Pettifor).
For example, the Left has passed around the infamous statistic by economist Jason Furman that “data centers” explained 92% of GDP growth in the first half of 2025. [II] This is supposed to give the impression of a tottering Empire hanging on by an accounting trick. But we must be careful not to identify too closely with bourgeois ideology.
Unfortunately, the “Marxist economists” are fond of focusing on profit rates and “surplus value,” strangely to the detriment of “use-value.” In particular, the use-value of capital. The latter is regularly treated as the use-value of some specific thing — the activity of a machine — but it is rather “a collective product, and only by the united action of many members, nay, in the last resort, only by the united action of all members of society, can it be set in motion.”
What does this mean? The industrial revolution made the world interdependent both on and beyond bourgeois social relations. In the fight over the working day, the workers politicized their immediate cooperation — or what Smith called the detailed division of labor — against their past cooperation. The latter was embodied by the master manufacturers that hired labor. Only over time were these figures transformed into capitalists. [III] This separated the living time of producers from their product, or what is known as the separation of the producers from the means of production. Not separation in the arithmetic or spatial sense, but in the sense that “science [became] a productive force distinct from labour[.]”
With this, the world built by labor was able to shed itself of the organic limits that characterized associated labor in the Manufacturing period and manifest as alien, technological knowledge. But this didn’t overcome the property relations that characterized even the manufacturing era, with its detailed division of labor, as emancipatory. All of the famous phrases — “property in the means of production”, “industrial reserve army”, etc. — are phenomena reflecting the contradiction of the bourgeois and industrial world. Speculatively, they suggest another world on the other side of the present.
Thus, it is not that there is continuity between forces of production; continuity is itself a force of production. The common slip reflects the self-contradiction of cooperation, which manifests as separate sites of production. In the detailed manufacturing era, continuity was formed by the workers themselves. They owned their means of production — their cooperation. This is why Smith demanded that workers be paid according to time rather than piecework: their value lay in being part of the “collective labourer formed by the combination of a number of detail labourers...the machinery specially characteristic of the manufacturing period.” This is no longer simply the case. The continuity now has a dualistic reality, appearing as the antagonism between the value of present labor and that of past labor.
When the data center relates to the semiconductor company, the relation is both as if between private property owners (labor) and as if they operated on one giant conveyor belt in a factory. In other words, as property, the data centers are dependent upon the labor that sets them in motion. Therefore, their product is compared to other properties on the basis of labor time.
On the other hand, the rare metals and energy sent to the foundries, that produce the semiconductors, that go into the data centers, that power the hyperscalers, that allow for the training and inference of AI models, etc. — all the parts can only meet their use-value, individually, if they meet the use-value of the whole. They have to relate as if they were exactly aligned according to some unconscious, technical plan. However, they cannot (under present conditions) literally be rationally planned. The logic of machines must appear independent; machinery must stand on its “hind legs vis-à-vis the worker and confront him as ‘capital’.” If this were not the case, it would not be possible for capital to make an independent claim to being a necessary condition of production, for it could be traced back to Mother Earth and Father Labor in the last instance. Only when machines were producing machines did this independent character “[stand] on its own feet.”
The cooperation of the machines is what is known as machine time — it is actualized by the cooperation of labor and yet undermines that very same cooperation. From the standpoint of labor, this appears wrong or irrational. Labor is the “yeast” that puts the machinery into motion, but the combined industrial force only fulfills itself by blocking the claim made by labor. [IV] It is important that what is meant by industrial forces is not limited to technology but includes the whole social and political apparatus acting at once.
Hence the phrase, the “material relations between persons and social relations between things.” Capital is not one side of the contradiction but sits at the contradiction. When labor-time contracts, it is registered in the economy in the form of falling profit rates or wages, etc. When machine time collapses, however, it is registered as heteronomy and incongruity.
Thus, it is not reducible to the profit rates, the gross domestic product, the wages — to labor value. Whether or not the “bubble” bursts [V], the question would still be what happens to all this build-out. A market sell-off, flushing out “zombie” AI startups, will not eliminate accumulation, but transform how the accumulation is expressed.
In fact, it should be remembered that the major tech companies that drove the present AI explosion grew and invested during the last several financial crises. ASML, which has a near-monopoly on the advanced lithography machines that etch transistor patterns onto silicon wafers, was doubling down on investment when its remaining Japanese competition was retreating under the pressure of the 2008 Crisis. [VI] Likewise, Apple began offshoring the production of its products in the wake of the East Asian Financial Crisis and accelerated this through the dotcom bust. [VII] “Data Center Alley’s” current infrastructure advantage has its roots in the fiber-optic bubble.
The reason 2008 had the stagnant recovery that characterizes the Millennial experience is that the financial crash meant a whole latticework of concrete forms of production had to be unraveled to free the production process to revolutionize itself. If it had been just a housing bubble and a stock market crash, then the liquidation of the assets would simply have gone to the “vulture investors,” who might have put them to better use. Even the more sophisticated falling-rate-of-profit theories, while focused on the ratio of living to dead labor, leave out why profitability is the problem that it is. Only under certain conditions does a profitability crisis necessitate a complete revolution in the forms of production, and it is the unconscious way in which this happens that is associated with prolonged economic slumps.
The new latticework is now being built out. What we refer to as data centers only came into being through the compounding of several technological trends. The buildings went from being passive data storage to programmable entities-in-themselves. [VIII] This has constituted the technological standard that measures what counts. Producing capital requires the precise coordination of the whole socialized apparatus — from rare-earth extraction to the GPUs to the electricity to the liquid cooling and financing that allows the process to remain continuous. [IX] These parts relate as separate properties, but their product is the combined fact of all.
Let’s trace some history.
The replacement of the vacuum tube by the transistor allowed not only the translation of analog to digital logic, but greater energy efficiency in a more compact space. At the time of the invention, it was the Cold War-era military that drove the demand for packing electronics into smaller spaces to make them more computationally efficient.
Because capitalist production seeks to maximize the efficiency of all its fixed capital, computer scientists soon began pushing the dimensions of the machines, both spatially and virtually. To avoid dead-time, computers were automated and scheduled to run batches of programs through a virtual computer. Continuity in production is a force of production brought by the industrial revolution. Once separated from humans, its limit becomes unbounded.
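The economics of dead-time can be put in a few lines of arithmetic. A minimal sketch — with invented job and idle times, not data about any real machine — of why batch scheduling was worth automating:

```python
# Illustrative sketch: machine utilization under interactive vs. batch scheduling.
# All numbers are invented for the example.

jobs = [5, 3, 8, 2, 6]        # compute time each job needs (minutes)
operator_gap = 4              # idle minutes between jobs when run interactively

# Interactive: the machine waits for a human between every job.
interactive_total = sum(jobs) + operator_gap * (len(jobs) - 1)

# Batch: jobs are queued and run back to back; the machine never waits.
batch_total = sum(jobs)

interactive_util = sum(jobs) / interactive_total
batch_util = sum(jobs) / batch_total

print(f"interactive utilization: {interactive_util:.0%}")  # 60%
print(f"batch utilization:       {batch_util:.0%}")        # 100%
```

With human-speed gaps between runs, nearly half the machine’s time is dead; queueing the jobs restores continuity — which is the whole point of the batch systems described above.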
While computing expanded virtually, its hardware shrank physically. It wouldn’t take long before the invention of the integrated circuit led to the relentless pursuit of placing more transistors onto ever-shrinking spaces — Moore’s Law. [X] This is really just the latest iteration of a power discovered during the industrial revolution: economies of scale. [XI] Once the object is given and separated from organic limitations, scientific advance can perfect the execution. Efficiency also meant preventing “leakage,” the spontaneous flow of electrons where they should be impeded. The silicon transistor was invented to solve that problem at a lower cost, and thus began the “Silicon Age.”
Prior to the rise of the internet and the commercial spread of the personal computer, a data center would have meant a room or part of a floor in a building that housed a company’s data files. [XII] Over time, the mainframes that had occupied entire rooms in the mid-20th century were shrunk to “minicomputers,” opening the way to the home personal computer boom of the 1980s. Basic client-server models were developed to connect multiple computers, forming the basis for the internet. As use of the internet became more regular and was adopted by businesses, internet data centers became a necessity. [XIII] What had been essentially a component part internal to an enterprise now became the cooperation between enterprises. It is as if the factory floor itself had become alienated into separate factories.
If data centers were going to be reliable to other businesses, they would have to be explicit about the continuity of their services. Along with the first internet data centers came Uptime’s Tier Classification system. Data centers’ share of global electricity use grew from 0.5% to 1–2% over 23 years.
However, over time this scaling — “Dennard scaling” — began to break down. [XIV] While chips improved voltage efficiency through shrinking, the diminishing size and changed chemical composition of the chips created new avenues for leakage, i.e., quantum tunneling.
At around the time of this plateau, competitive video gamers were driving demand for graphics processing units (GPUs) independent of any concern for machine learning. [XV] Higher framerates translated to more frags in a Quake deathmatch. While parallel processing predates GPUs, and CPUs can perform the task, the architecture of GPUs is more readily built for it. It wasn’t long until computer scientists at Stanford began daisy-chaining GPUs to produce exponential growth in compute speed [XVI], culminating in the development of the Compute Unified Device Architecture (CUDA), the interface for programming several chips to coordinate on general-purpose processing.
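The architectural point is that graphics work decomposes into many small, mutually independent computations — which is why it parallelizes so readily. A toy sketch (the kernel and pixel values are invented for illustration, not drawn from any real graphics API):

```python
# Toy illustration of the data-parallel model GPUs are built around:
# one small "kernel" applied independently to every element, so all
# elements could in principle be computed at the same time.

def shade_pixel(p):
    """A hypothetical per-pixel kernel: brighten by 50% and clamp to 255."""
    return min(int(p * 1.5), 255)

pixels = [10, 80, 200, 255, 34, 120]

# A CPU runs this as a sequential loop; a GPU launches one lightweight
# thread per element, since no iteration depends on any other.
result = [shade_pixel(p) for p in pixels]

print(result)  # [15, 120, 255, 255, 51, 180]
```

Because no pixel's result depends on any other pixel, the same program scales from six elements to millions simply by adding hardware lanes — the property CUDA generalized from graphics to arbitrary computation.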
The drive to efficiency was diverted to alternative paths: reducing software bloat and hacking algorithmic speed. [XVII] This continues today. DeepSeek-V3’s and TurboQuant’s algorithmic advances challenged the market shares of GPUs and memory chips, respectively.
Various machine learning techniques used in artificial intelligence had been around for decades but remained consigned to academic or purely novel settings — such as playing backgammon. [XVIII] Two articles in particular — Google’s 2017 “Attention Is All You Need” and OpenAI’s 2020 “Scaling Laws for Neural Language Models” — exploited the strategy of chaining GPUs together for parallel processing. Thus, the market has shifted from central processing units (CPUs) to GPUs. Intel had 68% of the chip market in 2021, whereas it now has 6%; Nvidia, on the other hand, has gone from 25% to 86% in the same period.
The CPU’s previous reign was tied to the era of producing the training set for these AI models. The popularity of commercially available digital products — smartphones, cameras, computers — created the material for data to become Big Data. This quantitative increase laid the ground for the qualitative transformation of the “warehouse-scale machine” [XIX] or treating data centers as computers.
These smart devices have already transformed ways of living. Only the Unabomber could be expected to not have a phone today. But all the apps and internet connectivity have become needs or means of satisfying needs — they enter the reproduction of life and are presupposed in our wages.
Alex Karp and Nicholas W. Zamiska identified the impetus:
The creative energies of Silicon Valley engineers would end up being directed toward solving their own problems, which, for many, stemmed from a fundamental disconnect between the life they thought they had been promised as a result of their intellectual talents — a life of ease and wants sated, of car services and assistants at the ready to fetch meals and groceries — and the reality of their relatively modest incomes. . . . So they would ultimately go about constructing the apps and consumer services that would create an illusion of the good life for themselves and their peers by making it possible to summon taxis, make restaurant reservations, and book vacation home rentals with only a few swipes on a phone. [XX]
Thus, whatever efficiency was gained by the innovations in computing could not stop the drive to accumulate capital. The Jevons paradox eventually reared its head: any advance in efficiency drove competition further.
This naturally puts pressure on energy demand. Tech companies such as Microsoft have become involved in local infrastructure, investing in water plants and negotiating electricity prices to limit the effect of their demand on local populations. Bitcoin miners have developed ways to smooth energy consumption over time. Jet engines have been refurbished to power data centers.
The previously cyclical character of semiconductors has become regulated. For example, Micron, known for its memory chips, recently signed several five-year Strategic Customer Agreements — previously, these agreements were one year in length. It makes sense that hyperscalers would wish to integrate their supply chains: they can only be what they are supposed to be through the continuous utilization of an enormous mass of semiconductors. Behind their backs, they have been socializing the means of production to an ever closer degree. These chips were formerly “commodity-like,” when their main source of revenue was consumer products like personal computers. But because their function is becoming more a part of the orchestration of many types of semiconductors at mass scale, they take on the qualities of a phantom fixed capital.
The 2017–2018 “Cloud Boom . . . driven by smartphone upgrades and early cloud adoption” ended in the memory-chip glut of 2019. Covid rescued the bust by expanding work from home. Companies were already moving to the cloud before the pandemic, and the shock simply accelerated the migration to cloud services. [XXI] This pressure led to a six-fold jump in average data center investment in the first half of 2020 alone — still before ChatGPT was launched. The 2021–2022 surge in PC demand from the jump in work-from-home ended in an “inventory glut” of CPUs. [XXII]
The capex boom since 2022 has seemed to fly in the face of financial constraints, but this is because the firm-level conditions of the major tech companies have differed from those of the rest of the economy. When interest rates shot up in the last few years, many tech companies were cash-rich and able to maintain an effectively low borrowing spread. By the time interest rates were cut again, companies such as Nvidia had to spend money because they had too much cash.
Data centers are now moving into a new phase. The buildout is shifting its focus from training to inference. This will necessitate closer, low-latency edge centers to mediate between the accumulated training at central data centers and real-time inference for consumers. The turn towards agentic AI creates a new need for CPUs to plan out agent swarms. As a result, Jensen Huang recently said at the 2026 NVIDIA GTC keynote that tokens — units of AI output — are the “new commodity.” It follows that the AI models and firms that can produce tokens in the most energy-efficient manner will capture more of the market. Again, the use-value is not this or that, but the whole coordination — AI models, semiconductors, cooling, energy — in a way that goes beyond the limits of the cooperation of labor. [XXIII]
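The claim that tokens behave like a commodity can be made concrete with back-of-envelope arithmetic: if the marginal cost of a token is dominated by energy, then joules-per-token sets the floor price. A sketch with invented figures (none of these numbers describe any real model or deployment):

```python
# Back-of-envelope sketch of "tokens as commodity": marginal cost per token
# approximated as energy per token times electricity price.
# All figures are invented for illustration.

def cost_per_million_tokens(joules_per_token, price_per_kwh):
    kwh_per_token = joules_per_token / 3.6e6   # 1 kWh = 3.6 million joules
    return kwh_per_token * price_per_kwh * 1_000_000

# Two hypothetical deployments serving the same model quality:
efficient = cost_per_million_tokens(joules_per_token=0.5, price_per_kwh=0.08)
wasteful  = cost_per_million_tokens(joules_per_token=2.0, price_per_kwh=0.08)

print(f"efficient: ${efficient:.2f} per million tokens")  # $0.01
print(f"wasteful:  ${wasteful:.2f} per million tokens")   # $0.04
```

A fourfold gap in energy per token translates directly into a fourfold gap in marginal cost, which is why efficiency across the whole stack — model, chips, cooling, power contracts — decides who captures the market.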
For example, Larry Ellison infamously said that the code under Uber was so simple his “cat” could have written it. Whether or not this is true doesn’t matter. Uber’s strength is in the network; it is capitalist only because it meets the scale and complexity required. Likewise, Nvidia does not necessarily have the fastest chip on the market — Cerebras has a chip 21x faster, at a third of the cost — but Nvidia has CUDA and the associated network and support of a veteran ecosystem.
The explosion in the use of drone warfare, especially since 2022, has made data centers an essential part of the military complex. Data centers have since been targeted as “critical infrastructure” during the US-Israeli attack on Iran. The recent closure of the Strait of Hormuz has revealed yet another chokepoint: the world is dependent not just on oil, but on the helium needed to produce semiconductors. [XXIV]
Thus, we have all played a part in the development of AI and the data center boom. But we didn’t play it as a class. The turn towards more passive, structural or sociological determinations of class — wage-earner, level of education, having wealth — tries to make something absent into something else. The political formation of the class is objective, and consequently the absence of such formation has an objective effect. Whatever contradiction may be theoretically present still expresses itself in a way that can plausibly be called contingent. To make the necessity explicit would require practical and organized engagement with the contradiction — in other words, something like an organized class to embody the problem and provide the basis for rendering it self-conscious.
This account might seem unsatisfactory because it seems to lack the connection with the struggle of the working class. It is more like a story of the white-collar class or a jeremiad of “techno-feudalism.” But this emptiness that echoes in history is the muteness of the unorganized class. Capitalism seems to continue without them, and sometimes we are led to ventriloquize in their absence. This means rather than the contradiction being engaged, it is turned into some set of grievances to be captured by the contending parties.
Already capitalist politics is organizing around the issue: Maine might be the first state to (temporarily) ban new data centers. But the current local resistance to the placement of data centers is really a fight over whether local labor is going to count in AI production, not over whether people will participate in it.
Ways of living continue to change. The compounding of the housing shortage and high mortgage rates has stalled residential development. Into the vacuum have stepped the cash-rich tech companies, buying up houses for nearly $1 million each and razing them to open new land for data centers. But why stop there? Maine might follow several Asian countries in building underwater data centers, and Elon Musk has offered an extraterrestrial answer to the limits of our earth — build data centers in space and run them on solar power. What’s to stop him?
Part of the evidence that we are in an AI bubble is skepticism about the productivity gains — what is it exactly doing? Is it complementary to labor or labor-replacing? [XXV] Capitalist production is both. Either way, labor is going to plumb the depths to find what it means to be valuable in the new era. Rather than expecting Skynet or Black Friday, socialists should seek to be ahead of the curve of the changes in capitalism and see the new terrain as an opportunity to fight for the rights of labor.
[I]: “A worker who takes an anythingarian attitude towards the history of his own movement cannot be considered class-conscious.” V. I. Lenin, May 4, 1914, “The Ideological Struggle in the Working-Class Movement,” CW 20.
[II]: Really, Furman uses the broader “information processing equipment & software,” and he notes that absent the AI boom, lower interest rates and electricity would likely go elsewhere in the economy. Furthermore, Goldman Sachs analysts point out that AI capex spending, as a share of GDP (0.8%), is about half of what has regularly been the share in past technological booms (1.5%). This would suggest more capital investment could fall within historic norms.
[III]: “The trade of Manchester may be divided into four periods. The first is that, when the manufacturers worked hard merely for a livelihood, without having accumulated any capital. The second is that, when they had begun to acquire little fortunes, but worked as hard, and lived in as plain a manner as before, increasing their fortunes as well by economy as by moderate gains. The third is that, when luxury began to appear, and trade was pushed by sending out riders for orders to every market town in the kingdom. The fourth is the period in which expense and luxury had made a great progress, and was supported by a trade extended by means of riders and factors through every part of Europe.” John Aikin, A Description of the Country from Thirty to Forty Miles round Manchester. Cambridge University Press, 2014: 181-182. Hence, the old view that “because he is a leader of industry that a man is a capitalist” gets transformed into “he is a leader of industry because he is a capitalist.”
[IV]: The idea of being able to trace everything back to Labor is a bourgeois desideratum, most explicitly defended by Adam Smith, which I detailed in another article — “Marx’s Mercantilism?” The reason why the Ricardian Socialists are criticized by Marx for repeating “Smith’s Mistake” is because they deny the cooperation of machines. As Thomas Hodgskin puts it, capital is a “cabalistic” word that ascribes the subjectivity of “co-existing labor” to implements that would “rust” by themselves. This cooperation of machinery was not present in Smith’s time, as recognized by Robert Owen and Andrew Ure, and therefore, the one-sided character of the Ricardian Socialists is not recognizing the historical change.
[V] And as Eugene Fama has rightfully pointed out, how do you prove it is a bubble? One has to be able to say what the true prices should be in order to say there is a bubble — but that knowledge would simultaneously imply the foresight to predict bubbles with precision. “Bubble” seems rather to be defined by “twenty-twenty hindsight.”
[VI]: See Hijink (2024): 111.
[VII]: See chapters 6 & 12 in McGee (2025).
[VIII]: AI workloads currently occupy 25% of the capacity of Data Centers and are expected to rise to 50% by 2030. See Kimberley Steele, January 6, 2026, “Global data center sector to nearly double to 200GW amid AI infrastructure boom,” JLL.
[X]: To avoid confusion, Moore’s law concerns the density of transistors per chip, whereas Dennard scaling is the observation that as transistors shrink, their power density stays roughly constant — so smaller transistors could run faster without consuming more power.
[XII]: See Climent (2025:4) and Bakshi (2025: 5).
[XIII]: Exodus’s dedicated internet data center, opened in 1996, was perhaps the first; the company later went bankrupt in the dot-com bust.
[XIV]: The development of lithography from visible to extreme ultraviolet light allowed Moore’s Law to be extended for some time, but the writing on Dennard scaling was already on the wall.
[XV]: Witt 2025: 67-70
[XVI]: ibid: 90.
[XVII]: The period of algorithmic hacking was associated with a boom in software. As Silicon Valley venture capitalist Marc Andreessen wrote in 2011, “Software is eating the world,” supplanting hardware’s dominance in the tech world. Two years later, famed investor Jim Cramer would dub the market dominators “FANG” (later FAANG) — Facebook, Amazon, Netflix and Google (and later Apple). These companies grew based on their software. Soon came the public campaign: “Learn to Code.” The very success of this — the glut of programmers into the job market — was a condition for the emergence of AI. The trillions of lines of code were the material for AI to train on. In early 2026, the announcement of Anthropic’s Claude “Cowork” AI agent led to a trillion-dollar sell-off in the software market, known as the “SaaSpocalypse.” How was this AI created? Through all the hardware of the data centers. It seems hardware has had its revenge.
[XVIII]: AI itself has been subject to several “winters” since the 1960s. In particular, Marvin Minsky’s early criticism of the perceptron, the AI winter in 1974 following the “symbolic” AI boom and then “expert systems” getting caught up in the 1987 bubble. See Witt (2025: 60).
[XIX]: Shehabi, Arman, Sarah J. Smith, Eric Masanet, and Jonathan Koomey. "Data center growth in the United States: decoupling the demand for services from electricity use." Environmental Research Letters 13, no. 12 (2018): 124030.
[XX]: Karp and Zamiska (2025): 108.
[XXI]: See also Lydia Brissy, November 3, 2020, “Covid-19: a tipping point for the data centre industry?”, Savills.
[XXII]: Intel has recently had to internally reallocate production from PC production to server CPUs as a consequence of both trailing the data center boom and the scarcity of wafer availability.
[XXIII]: Data center applications differ even in the technical ratio of different classes of computer chips. High-performance computing and inference tend to have balanced GPU-to-CPU ratios (e.g. 2:1 or 1:1), whereas AI and machine learning training tends to be skewed towards GPUs (e.g. 8:1 to 16:1).
[XXIV]: A recent McKinsey report found AI-related goods were the fastest growing sector of global trade last year. The US saw a 66% increase in AI-related goods that year alone. This is despite the approximately 30% drop in US-China trade last year.
[XXV]: For a debate on the labor productivity, see “Gen AI: Too Much Spend, Too little Benefit?”, Goldman Sachs, Global Macro Research, Issue 129, June 25, 2024, available at < https://www.goldmansachs.com/insights/top-of-mind/gen-ai-too-much-spend-too-little-benefit >.
Bibliography:
Hijink, Marc. Focus: The ASML Way. Inside the Power Struggle over the Most Complex Machine on Earth. Uitgeverij Balans, 2024.
Karp, Alexander C., and Nicholas W. Zamiska. The Technological Republic: Hard Power, Soft Belief, and the Future of the West. Random House, 2025.
McGee, Patrick. Apple in China: The Capture of the World's Greatest Company. Simon and Schuster, 2025.
Witt, Stephen. The Thinking Machine: Jensen Huang, Nvidia, and the World's Most Coveted Microchip. Penguin, 2025.


















