The same pattern recurs throughout the history of human civilization: when a society’s maintainers — the craftsmen who repair aqueducts, the technicians who manufacture materials, the engineers who tend to server rooms — disappear due to resource misallocation, the systems they maintained collapse. And when posterity encounters the ruins, their response is not to reflect on “why we failed to keep them” but to marvel that “our predecessors’ skills were truly remarkable.” This cognitive cycle of awe without introspection causes the same rupture to repeat across different civilizations, different centuries, and different technological conditions. This paper examines multiple cases spanning from the disappearance of writing in the Bronze Age to the 340,000 unfilled data center positions in 2026, demonstrating that the root cause of every civilizational rupture is not that system complexity exceeded human capacity, but that society’s value-ordering systematically denied the significance of foundational maintenance work. “Complexity” is merely a fig leaf; lift it, and underneath lies resource misallocation. Every single time.
The Same Script, Performed for Millennia
The following cases span four thousand years, six civilizations, and three continents. On the surface, they bear no relation to one another — some concern writing, others aqueducts, one involves nuclear warhead materials, another data centers. Yet they all share the same structure: maintainers disappear → knowledge ruptures → systems fail → posterity cannot comprehend what was once possessed.
Not a Complexity Problem, but a Value-Ordering Problem
After every civilizational rupture, posterity’s explanations are strikingly similar: “The systems were too complex,” “The external shocks were too severe,” “The loss of technology was inevitable.” These explanations share a common function — they absolve everyone of responsibility for the rupture.
Yet looking back at each case, not a single one was fundamentally caused by complexity.
Roman aqueducts were not complex — they were stone channels driven by gravity, technology mastered centuries earlier. They collapsed because the Empire directed its resources toward military expansion and court consumption rather than aqueduct maintenance. FOGBANK was not complex — it was an aerogel with a specific impurity. The knowledge was lost because no one considered it worthwhile to document the methods of technicians approaching retirement. Jackson’s water pipes were not complex — they failed because successive administrations channeled budgets toward visible prestige projects rather than invisible underground infrastructure.
“Complexity” is a fig leaf. Lift it, and underneath lies resource misallocation. Every society possesses sufficient resources to sustain its maintainers. Every society simply chooses not to.
In his classic work The Collapse of Complex Societies, Joseph Tainter argues that sociopolitical systems require energy to sustain themselves; increasing complexity means rising per-capita costs; and investment in complexity as a problem-solving strategy inevitably reaches a point of diminishing marginal returns. Collapse can be understood as the loss of energy necessary to maintain social complexity.
Tainter’s framework is correct, but it requires a critical supplementary dimension: the loss of energy is not due to an insufficient total supply, but to a problem of distribution. The Roman Empire on the eve of collapse was not lacking in wealth — wealth had concentrated in the hands of a few aristocrats while the public finances that maintained shared infrastructure were drained. The AI industry of 2026 is not lacking in capital — $710 billion in annual capital expenditure proves that capital is available to an almost absurd degree. The problem is that all of this capital is flooding into the application layer, while the electricians and technicians who maintain AI’s physical infrastructure cannot even secure basic social respect.
Rupture does not occur because energy is exhausted. It occurs because energy flows to the wrong places.
Awe Without Introspection: The Cognitive Trap of Civilizations
After every rupture, posterity’s reaction follows the same pattern:
Classical Greeks gazing at Mycenaean walls — “The Cyclopes must have built them.” Medieval people confronting Roman aqueducts — “Giants must have constructed these.” Modern observers learning about FOGBANK — “They actually forgot how to make nuclear warhead materials.” Today’s people hearing about AI data center staffing shortages — “They actually can’t find people to maintain the servers.”
Awe. Every generation is in awe.
But not a single generation has pursued the question that truly matters: When those people were still here, why didn’t we keep them?
Posterity marvels at its predecessors’ craft, publishes papers studying their achievements, builds museums exhibiting their artifacts — and does everything except reflect on the mechanism that caused the rupture. That same mechanism then continues operating, producing the next rupture. This is not a tragedy of history. It is a cognitive trap.
Why no introspection? Because introspection means acknowledging something profoundly uncomfortable — the problem is not how brilliant our predecessors were, but that our own generation’s value-ordering is flawed. We directed money, attention, and respect toward those who needed them least, and allowed those who needed them most to disappear in silence.
Admitting this is too painful. Far easier to say, "The ancients were truly remarkable; a pity their technology was too complex and was lost." That way, no one bears responsibility. "Complexity" is not merely a fig leaf; it is also a disclaimer. It disguises a value-choice problem as a technological-fate problem, enabling an entire society to collectively avoid the question that truly needs answering.
The Social Function of Awe
Awe itself is not harmful. The problem is that within social narratives, awe serves as a substitute — it substitutes for introspection. When a society admires the engineering achievement of Roman aqueducts in a museum, it simultaneously derives a form of psychological comfort: “At least we recognize the value of these things.” But recognizing value and protecting the people who create value are two entirely different things. Museums commemorate achievements that have already died, not maintainers who are still alive.
In contemporary society, this substitution mechanism manifests as follows: tech media celebrates the greatness of Linux, but no one pays attention to maintainers’ salaries or mental health. Industry conferences bestow “lifetime achievement awards,” but the projects maintained by the recipients still lack funding. Society substitutes symbolic respect for substantive support, then assumes the problem has been solved.
Kong Yiji’s Scholar’s Robe: The Cultural Root of Knowledge Rupture
Kong Yiji, a character in Lu Xun’s fiction, is a man who has studied the classics but earned no official rank. He stands at a tavern drinking among manual laborers, wearing the long scholar’s robe that marks the educated class. He would rather starve in his tattered robe than remove it and take up physical labor. In his value system, the identity of “a learned man” matters more than survival itself.
This literary figure resurfaced as a pop-culture symbol in China in 2025. With 11.58 million fresh graduates, elevated youth unemployment, and 230,000 unfilled semiconductor positions alongside an urgent need for data center technicians, young people still prefer unemployment to working production lines in wafer fabs, wearing cleanroom suits to calibrate parameters, or pulling night shifts in server rooms. Because in today's social value system, "sitting in an office building as an AI product manager" is a hundred times more respectable than "maintaining the physical infrastructure that keeps AI running."
But this is not exclusively a Chinese problem. In America, it is called “blue-collar stigma.” An experienced HVAC technician can earn over $150,000 per year; a specialized data center electrician can reach $300,000. Yet young people still flood toward universities and white-collar positions, even as those positions are being automated by AI. As one HVAC technician told CBS: “The trades have been overlooked, so now there’s a gap that needs to be filled.”
Kong Yiji’s scholar’s robe and the naming conventions of the tech industry are fundamentally the same thing — a systematic denial of “foundational work.” The former is worn by an individual; the latter is worn by an entire industry. The effect is identical: making everyone believe that “real value” resides at the top, not at the bottom.
This cultural psychology is the deepest root of civilizational rupture. Technology can be documented; money can be redirected through policy. But if a society culturally denies the dignity of foundational maintenance work, it will never attract enough people to fill those roles. FOGBANK's technicians retired not because the government paid them poorly, but because no one considered the act of documenting how they did their work worth the investment. Jackson's pipe workers received no budget not because the city was broke, but because no voter considered underground water pipes a cause worth voting for.
Every society possesses sufficient resources to sustain its maintainers. The problem is never “whether there is money,” but “whether it is deemed worthwhile.”
The AI Era: Civilizational Rupture at Accelerated Speed
Previous civilizational ruptures unfolded over centuries, sometimes millennia. The Roman aqueduct system took several centuries to decline from its peak to abandonment. Mycenaean construction knowledge dissipated gradually across multiple generations. Though lamentable, this pace at least afforded certain peripheral communities a time window to preserve partial knowledge.
The rupture of the AI era is occurring at unprecedented speed.
What makes this rupture even more lethal is that previous ones affected only specific regions. When Rome’s aqueducts failed, Eastern technologies survived. When Mycenaean knowledge vanished, Egypt and Mesopotamia preserved theirs. But the infrastructure of the AI era is globally shared — the same data center clusters serve users worldwide, the same chip supply chain spans every AI company. When foundational maintainers are simultaneously in shortage across the globe, there is no “other civilization” to carry the knowledge forward.
Moreover, this rupture has an unprecedented accelerant: AI itself. The convenience and glamour that AI creates at the application layer are accelerating young people's abandonment of foundational work. Every "AI will change the world" narrative implicitly signals that "foundational work no longer matters." Every success story of an AI product manager reinforces the belief that "value resides at the top, not at the bottom." AI is not merely the victim of this rupture; it is simultaneously its accelerant.
In previous collapses, the ruins remained — stone aqueducts could be visited, megalithic walls could be measured, and posterity at least knew “something once existed here.” But code is not like stone. When the last person capable of writing an operating system kernel from scratch retires, when the last technician who understands how to debug a liquid cooling system departs — what vanishes will leave behind no ruins for future generations to marvel at. It will simply, silently, cease to exist.
Breaking the Cycle: From Awe to Action
If human civilization has a bug that has persisted for millennia, it is this: we learn to cherish things only after losing them, and our way of cherishing is to stand in awe — not to reflect. To break this cycle, we must act while we still have what we are about to lose.
First, acknowledge that the problem is not complexity, but value-ordering
Stop explaining ruptures with “the technology was too complex,” “the times changed,” or “it was inevitable.” The root cause of every rupture is the same — society decided that maintainers were not worth the investment. This is not fate; it is a choice. Acknowledging it as a choice means we can make a different one.
Second, make the work of maintainers visible
The greatest danger is not the disappearance of maintainers, but the invisibility of their disappearance. The Log4j maintainer crisis was not seen until the vulnerability erupted. Jackson’s pipe crisis did not make the news until 150,000 people lost water. FOGBANK’s knowledge rupture was not discovered until warheads needed refurbishment. If we could establish an “Infrastructure Maintainer Health Index” — continuously monitoring the number, age distribution, and knowledge-transmission status of maintainers for critical systems — we could at least issue warnings before collapse.
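Such an index could be computed from a handful of observable signals. The sketch below is a minimal illustration of the idea, not an established metric: every name, weight, and threshold is a hypothetical assumption chosen only to show how staffing, age risk, succession, and documentation could be folded into one early-warning score.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class MaintainerPool:
    """Snapshot of the people maintaining one critical system."""
    system: str
    headcount: int           # active maintainers
    ages: list[int]          # ages of active maintainers
    trained_successors: int  # people formally trained to take over
    docs_coverage: float     # 0.0-1.0, share of procedures documented

def health_index(pool: MaintainerPool, retirement_age: int = 65) -> float:
    """Combine staffing, age margin, succession, and documentation
    into a single 0-100 score. Weights are purely illustrative."""
    # Staffing: a pool of one or two is a single point of failure.
    staffing = min(pool.headcount / 5, 1.0)
    # Age margin: how far the average maintainer is from retirement.
    age_margin = max(retirement_age - mean(pool.ages), 0) / retirement_age
    # Succession: ideally at least one trained successor per maintainer.
    succession = min(pool.trained_successors / max(pool.headcount, 1), 1.0)
    score = 100 * (0.3 * staffing + 0.2 * age_margin
                   + 0.3 * succession + 0.2 * pool.docs_coverage)
    return round(score, 1)

# A FOGBANK-like pool: few people, near retirement, no successors, scant docs.
fogbank = MaintainerPool("interstage material", 3, [61, 63, 64], 0, 0.1)
print(health_index(fogbank))  # low score -> warning long before the rupture
```

The point is not the particular formula but the monitoring posture: any such score for the FOGBANK-like pool collapses years before the knowledge actually disappears, which is exactly the warning window the cases above never had.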
Third, build institutional frameworks for knowledge transmission
The most profound lesson of FOGBANK is not “knowledge can be lost” but “no one thought recording the knowledge was worth the investment.” Facilities were dismantled, documentation was not preserved, and technicians retired without training successors. If one million dollars had been spent at the time on detailed process documentation and video records, the subsequent tens of millions spent on reverse engineering would have been unnecessary. Knowledge transmission does not happen naturally — it requires institutional investment, documentation, and safeguards.
Fourth, redefine what is “respectable”
This is the hardest step, but also the most fundamental. As long as “wearing overalls and working night shifts” ranks below “wearing a suit and making PowerPoints” in a society’s value system, the maintainer shortage cannot be solved through salary increases alone. Wages address economic incentives but cannot address social identity. A society must acknowledge at the level of cultural narrative that the people who keep civilization running and the people who create new things possess equal — or even greater — value. Because without the former, everything the latter creates is built on air.
Human civilization does not need more innovators — it needs respect for its maintainers. Not museum-style respect, not awards-ceremony respect, but respect that manifests on pay stubs, in social standing, and in the moment when a young person choosing a career thinks, “I want to be that kind of person too.” Until this respect is established, the same script will continue to play out. Not because humanity is not smart enough, but because humanity has consistently directed its intelligence to the wrong places.
A Final Note: This Paper Is Itself Evidence
This paper was produced through collaboration between a human author and AI (Claude Opus 4.6). During the writing process, AI searched across millennia of civilizational rupture cases, synthesized data, organized logic, and generated text. But the judgment of “where to look” — starting from a screenshot of an AI morning briefing, questioning the honesty of naming conventions, leaping to the maintainer paradox, leaping again to civilizational rupture patterns throughout history, and ultimately distilling the insight that “this is not a complexity problem, but a value-ordering problem” — all of this came from the human author.
This itself is an empirical demonstration of the paper’s thesis: AI can process information but cannot determine what information deserves attention. It can optimize within an existing framework but cannot step outside a framework to establish a new coordinate system. This capacity to step outside the frame is precisely the most precious quality of human maintainers — not what they can do, but their knowledge of when something must be done and what must not be lost.
And if one day even this judgment is abandoned because it is “not respectable enough” for anyone to cultivate and pass on, then it will not be merely another rupture — it will be the last rupture of human civilization. Because this time, there will be no posterity left to stand in awe.
[1] Palladium Magazine (2024). “Why Civilizations Collapse.” Analysis of knowledge loss across civilizations.
[2] Wikipedia / Multiple sources. “Fogbank.” Nuclear weapon interstage material, knowledge loss and recovery (2000-2009).
[3] The War Zone (2020). “Fogbank Is Mysterious Material Used In Nukes That’s So Secret Nobody Can Say What It Is.”
[4] Scitales. “Fogbank: How the United States Forgot How to Make Its Nuclear Weapons.”
[5] U.S. Government Accountability Office (2009). Fogbank production delays and knowledge loss assessment.
[6] CollapseLife (2025). “Infrastructure is the collapse indicator no one is talking about.” Jackson, MS case study.
[7] Tainter, J. (1988). The Collapse of Complex Societies. Cambridge University Press.
[8] Wikipedia. “Societal Collapse.” Overview of Tainter’s complexity-energy framework.
[9] PNAS (2012). “Critical perspectives on historical collapse.” 12 case studies of societies under stress.
[10] Uptime Institute (2025). Annual Global Data Center Staffing and Recruitment Survey.
[11] Introl (2026). “340,000 Unfilled Data Center Jobs Threaten AI Boom.”
[12] IEEE Spectrum (January 2026). “AI Data Centers Face Skilled Worker Shortage.”
[13] CNBC / Randstad (March 2026). AI data center skilled trade worker shortage analysis.
[14] CBS News (August 2025). “Data center demand is booming. Can the supply of trade workers keep up?”
[15] Fortune (April 2026). “This talent CEO says laid-off tech workers are ignoring a $300K trade job.”
[16] School of Public Affairs, Zhejiang University (2025). “Kong Yiji-Style Youth Anxiety: The Divergence Between High Education and Employment Difficulties.”
[17] China Semiconductor Industry Association (2024). Industry talent demand of 790,000, with a gap of 230,000.
[18] World Politics Substack (2026). “How the aqueducts made Ancient Rome possible.” Maintenance workforce analysis.
[19] Medievalists.net (2020). “Changing Landscapes: Roman Infrastructure in the Early Middle Ages.”
[20] RSIS International (2024). “Civilization Collapse: Analyzing Historical Civilizations.” Maya, Indus Valley, Roman cases.