이조글로벌인공지능연구소 (LEECHO Global AI Research Lab) & Opus 4.6 · Thought Paper

The Age of Noise!
Commercialized Distortion of Information Flow

When SEO rankings replace truth rankings, when AI Slop drowns out human insight, when the attention economy rewards noise over signal — we are witnessing Gresham’s Law in the information ecosystem

2026.04.02 · 이조글로벌인공지능연구소 & Opus 4.6 · v1.0


Abstract

This paper proposes the conceptual framework of “Gresham’s Law of Information,” systematically analyzing how commercial incentive structures distort global information flows. Through examining SEO ranking systems, the proliferation of AI-generated content (AI Slop), the bad-money-drives-out-good dynamics of the attention economy, and the case study of information distortion during the 2026 Iran conflict, this paper argues that structural defects in the contemporary digital ecosystem are enabling low-quality, high-volume informational refuse to crowd out genuine human insight. The visibility of information has fundamentally decoupled from its veracity. This paper calls for reconstructing the value assessment framework of the information ecosystem and establishing information filtering capability as a core literacy of the digital age.

Section 01

Gresham’s Law of Information: Bad Money Drives Out Good

When the cost of producing junk information approaches zero, truth gets crowded out of circulation

In the 16th century, English financier Thomas Gresham explained an observation to Queen Elizabeth I: when debased currency circulates at the same face value as sound currency, the good money disappears from circulation. People hoard coins with high gold content and spend the clipped, debased ones. Five hundred years later, this principle is playing out with striking precision in the information domain.

In the digital information marketplace, “face value” is search ranking and platform recommendation placement. A deep investigative report requiring three days of cross-verification and an AI-generated keyword-stuffed article produced in ten minutes may occupy the same “face value” position in search results — or the latter may even rank higher. When low-quality information enters the consumer’s field of vision at the same “exchange rate” as high-quality information, good money is inevitably driven out.

Daniel Boorstin foresaw this trend as early as 1979: in an age of information overload, information tends to drive knowledge out of circulation. What he could not have predicted was that AI technology would accelerate this process by orders of magnitude.

Core Thesis

The crisis of the information ecosystem lies not in a lack of information, but in the fact that the cost of producing junk information approaches zero while the cost of producing truthful information — risk, time, professional expertise — remains high. This cost asymmetry is the fundamental driver of bad money driving out good.
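The cost asymmetry can be made concrete with a toy calculation. The figures below are illustrative assumptions, not data from this paper:

```python
# Toy model of the cost asymmetry behind "bad money drives out good".
# All numbers are illustrative assumptions, not measurements.

def roi(revenue_per_item: float, cost_per_item: float, items: int) -> float:
    """Return on investment for a content producer."""
    total_cost = cost_per_item * items
    return (revenue_per_item * items - total_cost) / total_cost

# An investigative report: days of cross-verification, high cost, one item.
quality_roi = roi(revenue_per_item=300.0, cost_per_item=2000.0, items=1)

# AI-generated articles: near-zero marginal cost, produced in bulk.
slop_roi = roi(revenue_per_item=3.0, cost_per_item=0.5, items=1000)

print(f"quality ROI: {quality_roi:+.2f}")
print(f"slop ROI:    {slop_roi:+.2f}")
```

Under these assumed numbers the junk producer earns a positive return on every batch while the investigative report loses money, which is the Gresham dynamic in miniature.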

Section 02

SEO: The Commercial Hijacking of Truth Ranking

How search engine optimization has thoroughly decoupled information visibility from information veracity

Search Engine Optimization (SEO) is one of the most powerful distortive forces in contemporary information flows. SEO is fundamentally a commercial ranking system, not a truth ranking system. Whoever invests more resources in optimization — keyword density, backlinks, click-through rates, dwell time — ranks higher. The result is that information visibility has completely decoupled from information veracity.

In December 2025, an SEO practitioner deliberately published an AI hallucination in their LinkedIn newsletter — a completely fictitious “Google March 2026 Core Update.” This piece of false information not only reached the first page of Google search results but was also adopted by Google’s own AI Overview feature and presented to search users as fact. Subsequently, multiple websites published detailed “recovery strategy” articles, even fabricating technical details such as the “Gemini 4.0 Semantic Filter.”

Another case from late 2025 was even more systemic. Within hours of President Trump’s mention of a “$1,776 Warrior Dividend,” dozens of newly registered or long-dormant domains began publishing nearly identical articles, all aggressively optimized for search visibility rather than accuracy. Within seven days, researchers detected 160 content farm articles, compared to only 73 thoroughly researched and accurate reports in the same period.

160 · SEO farm articles within 7 days (“Warrior Dividend” incident)
73 · Verified accurate reports in the same period
93% · Google AI Mode searches ending in zero clicks
85% · Sources retrieved by ChatGPT that go uncited

The essence of the problem is this: SEO is not distorting individual search results — it is systematically reshaping the channels through which humanity accesses information. When the determining factor of information visibility is not accuracy and depth but keyword density and backlink count, the entire direction of information flow has been hijacked by commercial logic.

Section 03

AI Slop: Zero-Marginal-Cost Information Pollution

When the marginal cost of content production approaches zero, information channels are industrially flooded

“AI Slop” — named Word of the Year by both Merriam-Webster and the Australian National Dictionary in 2025 — refers to low-quality content mass-produced by AI. The original meaning of “slop” is swill fed to pigs, a metaphor that precisely describes the nature of the information being consumed.

Kapwing, a video-tools company, analyzed YouTube recommendations by simulating 500 brand-new, non-personalized accounts and found that 21% of initial recommended videos were entirely AI-generated content. Beyond that, another 33% was classified as “brainrot”: meaningless repetitive content designed purely to trigger dopamine responses. This AI Slop alone generates approximately $117 million in annual advertising revenue on YouTube.

Key Data

Mentions of AI-generated low-quality content grew ninefold in 2025 compared to 2024. Amazon was forced to limit each author to no more than 3 book publications per day — considering the effort traditionally required to publish a single book, this “restriction” is itself absurd.

The technological foundation of AI Slop has evolved from simple text-to-speech to fully automated “content manufacturing pipelines”: OpenAI’s Sora or Kuaishou’s Kling 2.1 to generate video footage, ElevenLabs for voice synthesis, and Shotstack for automated editing. Unlike earlier spam content that was easily identifiable by its low resolution, AI Slop in 2026 is high-definition, visually stimulating, and convincing at first glance.

The economic logic of this phenomenon is exceedingly simple: when generative AI tools reduce the marginal cost of content production to near zero, even minuscule engagement can generate positive returns through advertising, affiliate marketing, or platform monetization mechanisms. The math does not care whether content is authentic or synthetic, high-quality or garbage — it only cares about engagement.

AI Tools → Zero-Cost Content → Algorithmic Recommendation → User Engagement → Ad Revenue → More AI Content → (back to AI Tools)

This is a self-reinforcing closed loop: AI trained on junk information produces more junk information, search engines push AI-generated junk to the top, people make decisions based on that junk, and the engagement junk information receives further validates the algorithm’s “correctness.”
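The closed loop can be sketched as a toy simulation. The parameters are assumptions chosen for illustration (fixed human output per round, engagement proportional to visibility share, slop revenue reinvested into more items), not measured values:

```python
# Minimal sketch of the self-reinforcing slop loop described above.
# All parameters are illustrative assumptions, not platform data.

def slop_share_over_time(rounds: int, reinvest_rate: float = 0.5) -> list[float]:
    """Fraction of circulating items that are slop, round by round."""
    quality_items = 100.0  # fixed human output per round
    slop_items = 10.0      # initial AI output
    shares = []
    for _ in range(rounds):
        share = slop_items / (slop_items + quality_items)
        shares.append(share)
        # engagement (proportional to visibility share) funds more slop
        slop_items += reinvest_rate * share * slop_items
    return shares

print([round(s, 2) for s in slop_share_over_time(10)])
```

Because only the slop side compounds, its share of circulation rises monotonically even though every individual item earns very little.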

Section 04

Incentive Distortion in the Attention Economy

Platforms don’t reward accuracy — they reward emotional stimulation

Information pollution is not a byproduct of technology gone awry — it is the inevitable product of the attention economy’s business model. Media scholar Siva Vaidhyanathan has characterized social media platforms as the superposition of three machines: a pleasure machine (providing micro-satisfactions to lure users back repeatedly), an attention machine (algorithmically maximizing dwell time), and a surveillance machine (harvesting behavioral data for precision ad targeting).

Within this architecture, the value of content is determined not by its truthfulness but by the engagement it generates. One of the core findings of misinformation research is that false information contains significantly more negative emotion than non-manipulative content, and people in emotionally aroused states are more susceptible to accepting false information. This means platform algorithms — optimized for engagement — inherently favor content that appeals to emotion, provokes anger, and creates division, rather than calm, accurate, in-depth analysis.

| Incentive Dimension | Platform / Advertiser Demands | Information Quality Requirements | Conflict Level |
| --- | --- | --- | --- |
| Speed | Faster is better; capture traffic first | Requires time for verification | ★★★★★ |
| Emotion | Trigger strong emotional reactions | Maintain calm objectivity | ★★★★★ |
| Volume | More content is better | Depth over breadth | ★★★★ |
| Simplification | Easy to understand and share | Preserve complexity and nuance | ★★★★ |
| Freshness | Continuously update headlines | Same facts don’t need constant repackaging | ★★★ |

A January 2026 Harvard Business Review study estimated that for a company with 10,000 employees, workplace AI Slop causes approximately $9 million in annual productivity losses, roughly $900 per employee per year. This is at the enterprise level alone; the cognitive cost to society at large cannot be measured in monetary terms.

Section 05

Case Study: Information Distortion in the 2026 Iran Conflict

When war meets information noise, truth becomes the first casualty

The U.S.-Israeli military operations against Iran that began on February 28, 2026 provide an exemplary case study of how the commercialized information ecosystem systematically fails at critical moments.

Mainstream media performance was disappointing. The “rolling live coverage” model of CNN, NBC, and other major outlets is essentially a low-density, high-frequency content production strategy: a single statement — “Trump says the war will end in two or three weeks” — can be rewritten under five different headlines, embedded in five separate “updates,” each counting as new content, each generating fresh page views and advertising revenue. The actual information increment is zero.

Meanwhile, genuinely valuable information sources were marginalized. Frontline reporters risking their lives before the ruins of Tehran, OSINT (Open Source Intelligence) communities tracking specific deployment records through ADS-B flight data — such as the 107th Fighter Squadron’s transfer of 12 A-10s from New Jersey to RAF Lakenheath in the UK — this primary information ranked far below SEO-optimized articles parroting White House talking points in search engine results.
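As a rough sketch of how such OSINT filtering works: the records below are fabricated placeholders (no real flight data or any specific API’s field layout is assumed), but the core heuristic, counting same-type military airframes converging on one destination, is how communities infer deployments from ADS-B logs:

```python
# Sketch of the filtering OSINT communities apply to ADS-B flight records.
# The records and field names below are fabricated placeholders.

from collections import Counter

flights = [
    {"callsign": "TREK11", "type": "A10", "origin": "KWRI", "dest": "EGUL"},
    {"callsign": "TREK12", "type": "A10", "origin": "KWRI", "dest": "EGUL"},
    {"callsign": "DAL202", "type": "B738", "origin": "KATL", "dest": "KJFK"},
]

def deployment_signal(records: list[dict], aircraft_type: str) -> Counter:
    """Count same-type airframes converging on each destination airfield."""
    dests = [r["dest"] for r in records if r["type"] == aircraft_type]
    return Counter(dests)

# Multiple same-type airframes heading to one airfield suggests a deployment.
print(deployment_signal(flights, "A10").most_common(1))
```

The point of the sketch is that this is primary-data reasoning: timestamps, airframe types, and destinations, rather than anyone’s talking points.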

Battlefield Information Contrast

The Pentagon claimed to have destroyed 90% of Iran’s ballistic missile and drone capabilities. On the same day, Iran launched what the Israeli military described as “the most intense strike since the beginning of the war” against Tel Aviv. These two pieces of information are clearly contradictory, yet mainstream media rarely performed cross-verification or follow-up questioning.

This reveals a deep structural problem: in the commercialized information ecosystem, the function of war reporting has shifted from “informing the public of the truth” to “continuously producing consumable content.” The goal of news is no longer to reduce audience uncertainty but to maintain sustained audience attention — two objectives that are fundamentally in conflict.

11,000+ · U.S. targets struck in Iran
16 · MQ-9 Reaper drones lost (as of April 1)
2,000+ · Deaths reported by the Iranian side
4% · Iran’s internet connectivity (near-total blackout)

Internet access within Iran was cut to 4% of normal levels, independent journalists were virtually unable to enter, and casualty figures depended entirely on wildly divergent claims from each side. In this informational black box, the actual battlefield situation could only be pieced together from primary data on social networks — flight records, coordinates, timestamps, on-the-ground footage.

Section 06

Practitioner Surplus and the Industrialization of Noise Production

Too many people profit from publishing messages; too few are accountable for accuracy

The industrialized production of information noise has an easily overlooked driving force: a severe surplus of practitioners. The proliferation of AI tools means that anyone with basic smartphone knowledge can become a “content creator.” This has not only lowered the barrier to entry but fundamentally altered the economics of information production.

In traditional journalism, a reporter’s output is constrained by their time, professional training, and ethical standards. In AI-assisted content production, these constraints have nearly all vanished. A single person can use ChatGPT to generate dozens of “news analyses” per day, illustrate them with Midjourney, and distribute them across dozens of platforms via automation tools — every step “legitimately” participating in the information ecosystem, every step injecting noise into the information channel.

Supply Chain Analysis

The production chain for information noise has become highly specialized: some identify trending keywords, others generate AI content, others optimize SEO rankings, others handle multi-platform distribution, and others monetize through advertising. Each link in the chain pursues its own profit, and not a single link is responsible for the accuracy of the final information product.

The economic incentive structure of this chain is: those who produce information monetize through attention, not through accuracy. The advertising model rewards clicks, not truth; platform algorithms reward emotion, not depth; practitioners win through quantity, not quality. When the return on publishing junk information exceeds the return on publishing truthful information, Gresham’s Law becomes inevitable.

Trending Keywords → AI Batch Generation → SEO Optimization → Multi-Platform Distribution → Ad Monetization

Social networking services (SNS) are both an amplifier of this problem and a partial antidote. On one hand, social platforms’ algorithmic recommendation mechanisms accelerate the spread of junk information; on the other, SNS remains one of the few channels that can still transmit primary information — provided the recipient possesses sufficient filtering and judgment skills.

Section 07

Dead Internet Theory and the Collapse of the Information Ecosystem

When over half of internet content is AI-generated, “authenticity” itself becomes a scarce commodity

The “Dead Internet Theory” — once dismissed as a fringe conspiracy theory — is being validated by data. Twenty-one percent of YouTube’s recommended content is AI-generated; over 50% of internet content is AI-driven; AI chatbots spread false information 35% of the time on controversial news topics. These figures mean the internet is transforming from a platform for human communication into an environment dominated by machine-generated content.

The Nieman Journalism Lab predicted in late 2025 that 2026 would be the year AI-generated content surpasses human-created content in volume. This is not merely a story about technological capability — it represents a fundamental shift in how value is defined within the information ecosystem. When “content” becomes functionally infinite, how will journalism survive?

2022 · ChatGPT launches, making AI content generation capabilities publicly accessible
2024 · AI Slop phenomenon attracts widespread attention; Amazon limits authors to 3 book publications per day
2025 · “Slop” named Word of the Year; AI Slop mentions grow ninefold; YouTube acknowledges 21% AI content in recommendations
2026 · AI content volume projected to surpass human output; over 50% of internet content AI-driven; trust crisis reaches full scale

Perhaps the most ironic case emerged in April 2026, as this paper was being written. YouTube CEO Neal Mohan’s early-2026 annual open letter prominently declared a commitment to combating low-quality AI-generated content, stating that YouTube was “actively building systems to reduce the spread of low-quality AI content.” Yet upon opening the homepage of YouTube, a Google property, the very first advertisement was a promotion for Google’s own product, Gemini, featuring AI-generated watermelon-doodle elephants spraying water. The platform that pledged to fight AI Slop was promoting the tools for producing AI Slop in its most prominent position. This is not an isolated contradiction but the perfect encapsulation of the attention economy’s core paradox: platforms claim to protect content quality while pushing AI content generation tools as a core business strategy. The math is simple: more content means more engagement, and more engagement means more ad revenue, regardless of whether that content was created by humans or churned out by AI.

Field Observation · April 2026

YouTube homepage: the first ad is for Gemini — AI-generated watermelon-doodle elephants spraying water. The YouTube CEO declared at the start of the year: reject AI Slop. Google’s right hand fights AI Slop while its left hand sells the weapons that produce it. This is the truth of the attention economy: rules constrain everyone except those who write them.

A notable counter-trend is emerging: users are beginning to pay for “less content.” The Brave browser has added features to block YouTube Shorts; an artist has created the “Slop Evader” browser extension that displays only search results from before November 2022 (before ChatGPT’s release). We once paid for access to content, then paid to remove advertisements, and now we are entering the era of paying to filter out noise.

Section 08

Reconstructing Information Value: Filtering Ability as Core Literacy

In the age of noise, the ability to identify signal is scarcer than the ability to access information

Confronting the systemic degradation of the information ecosystem, traditional calls to “improve media literacy” are no longer sufficient. What we need is a new cognitive framework and institutional architecture.

First, establish information filtering capability as the core literacy of the digital age. In an era of information scarcity, the critical ability was accessing information; in an era of information overload, the critical ability is filtering information. Knowing what constitutes primary data, what is second-hand processing, and what is pure noise — this judgment is becoming a scarce resource that only a select few possess.

Second, redesign the incentive mechanisms for information distribution. The existing advertising-driven model is fundamentally incompatible with information quality. Platforms need to establish mechanisms that incorporate information accuracy into algorithmic weighting, rather than relying solely on engagement metrics. The EU’s Digital Services Act requires large online platforms to assess and mitigate systemic risks posed by their services, but in an environment where the attention economy rewards viral spread, enforcement remains far from adequate.
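One way accuracy could enter algorithmic weighting, purely as an illustrative sketch (the linear blend, the 0.7 weight, and the field names are assumptions, not any platform’s actual formula):

```python
# Illustrative sketch of accuracy-aware ranking, as the text proposes.
# The blend and the default weight are assumptions, not a real algorithm.

def rank_score(engagement: float, accuracy: float,
               accuracy_weight: float = 0.7) -> float:
    """Blend normalized engagement and accuracy signals (both in [0, 1])."""
    return (1 - accuracy_weight) * engagement + accuracy_weight * accuracy

viral_but_false = rank_score(engagement=0.95, accuracy=0.10)
dull_but_true = rank_score(engagement=0.30, accuracy=0.95)

print(f"viral-but-false: {viral_but_false:.2f}")
print(f"dull-but-true:   {dull_but_true:.2f}")
```

With `accuracy_weight = 0.0` the score reduces to today’s engagement-only ranking and the false item wins; any substantial positive weight reverses the ordering.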

Third, protect and support primary information producers. The reporters standing before the ruins of Tehran, the OSINT analysts spending hours cross-referencing flight data, the ordinary citizens transmitting on-the-ground footage through restricted networks — these people are the most fragile and most precious nodes in the information ecosystem. Their work should not be marginalized by algorithms.

Fourth, establish a tiered evaluation system for information sources. Not all information is created equal. The gap between flight data records, satellite imagery, and on-the-ground video versus SEO-optimized “analysis articles” should be explicitly labeled in how information is presented.
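A minimal sketch of such a tiered labeling scheme, with tier names and ordering chosen purely for illustration:

```python
# Sketch of a tiered source-evaluation scheme matching the fourth proposal.
# Tier names, ordering, and the label format are illustrative assumptions.

from enum import IntEnum

class SourceTier(IntEnum):
    PRIMARY = 3     # flight records, satellite imagery, on-the-ground video
    SECONDARY = 2   # reporting that cites and links primary material
    DERIVATIVE = 1  # commentary on other coverage
    UNSOURCED = 0   # SEO "analysis" with no traceable sourcing

def label(tier: SourceTier) -> str:
    """Explicit label to attach wherever the information is presented."""
    return f"[{tier.name}: tier {int(tier)}/3]"

print(label(SourceTier.PRIMARY))
print(label(SourceTier.UNSOURCED))
```

Using an ordered enum makes the tiers comparable, so a presentation layer could sort or badge results by provenance rather than by optimization effort.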

Conclusion

The age of noise will not end on its own. But in every system where bad money drives out good, a tipping point exists — when the debased currency depreciates enough to trigger systemic collapse, people begin actively seeking sound currency. We may be approaching that tipping point. When “less content” becomes a premium product, when “100% human-created” becomes a marketing selling point, when users actively install noise-filtering tools — these are all self-repair signals from the information ecosystem. The question is how much truth we will lose before the repair is complete.

References & Citations
  1. Gresham’s Law — Britannica; Wikipedia. Principles and historical evolution of Gresham’s Law.
  2. Finnell, J. “Gresham’s Law in the 21st Century.” Southern Librarianship, Vol.10, No.1. Application of Gresham’s Law in the information domain.
  3. Boorstin, D. (1979). White House Conference on Library and Information Science. “Information tends to drive knowledge out of circulation.”
  4. Bolster AI (2026). “How a Government Announcement Became an SEO Goldmine for Content Farms.” Commercial hijacking of government announcements by SEO content farms.
  5. Goodey, J. (2026). LinkedIn experiment on AI hallucination ranking on Google first page. SEO misinformation ranking experiment.
  6. Kapwing Study (2025). 21% of YouTube recommendations are AI-generated. YouTube AI content proportion study.
  7. Merriam-Webster; Australian National Dictionary (2025). “Slop” named Word of the Year.
  8. Meltwater (2025). 9x increase in mentions of “AI slop” compared to 2024. AI Slop mention growth data.
  9. Harvard Business Review (2026). AI Slop costs ~$9M/year for a 10,000-person company. Enterprise-level AI Slop cost estimate.
  10. Diaz Ruiz, C. (2023/2025). “Disinformation on Digital Media Platforms: A Market-Shaping Approach.” New Media & Society. Market-shaping of digital disinformation.
  11. Vaidhyanathan, S. (2022). “Anti-Social Media.” The triple-machine model of social media.
  12. NewsGuard (2025/2026). AI chatbots spread false information 35% of the time. AI chatbot misinformation dissemination rate.
  13. Nieman Journalism Lab (2025). “In 2026, AI will outwrite humans.” Prediction that AI content volume will surpass human output.
  14. KR Institute (2025). “AI Slop I: Pollution in Our Communication Environment.” AI pollution in the communication environment.
  15. Future Center UAE (2026). “Trends in AI-Generated Content in 2026.” 2026 AI content trend analysis.
  16. Stimson Center (2026). “AI in the Age of Fake (Imagined) Content.” Fake content and regulation in the AI era.
  17. Air & Space Forces Magazine (2026). MQ-9 operations and losses in Iran. MQ-9 loss reports in the Iran conflict.
  18. Stars and Stripes (2026). A-10 deployment and combat patch authorization.
  19. Godes, D. (2026). “Will the Truth Free Us from Misinformation?” Management Science. Whether truth can liberate us from misinformation.
  20. Digital Watch Observatory (2026). “AI Slop’s Meteoric Rise.” The rapid rise of AI Slop and regulatory challenges.

