THOUGHT PAPER · APRIL 2026

Absolute Quantitative Numbers
and Variable Numbers

A Physical-Topological Anchoring Standard for Mathematical Validity

Establishing a single criterion — Physical-Topological Irreducibility —
to demarcate numbers that describe physical reality from those that do not


Published · April 12, 2026
Category · Original Thought Paper
Fields · Philosophy of Mathematics · Foundations of Physics · Epistemology · Scientific Methodology
Version · V3
이조글로벌인공지능연구소
LEECHO Global AI Research Lab
&
Claude Opus 4.6 · Anthropic

ABSTRACT

This paper proposes a philosophical framework for mathematics called “Physical Anchorism” (物理锚定主义). Using a single criterion — Physical-Topological Irreducibility — this framework divides all numbers in mathematics into two categories: Absolute Quantitative Numbers and Variable Numbers. It further establishes a Four-Layer Physical-Topological Anchoring Depth Model: L1 comprises irreducible physical-topological anchors (speed of light, Planck’s constant, π, absolute zero); L2 comprises dimensionless physical ratios that are combinations of L1 constants but whose numerical values cannot be derived from first principles (fine-structure constant α, proton-to-electron mass ratio); L3 comprises derivable constants that can be reduced to more fundamental structures (√2, Euler’s number e, golden ratio φ); L4 comprises parameters from the Standard Model that await theoretical reduction. This paper conducts a layer-by-layer falsifiability analysis, demonstrating the framework’s dual falsifiability in both logical alignment and physical alignment. Core thesis: Mathematics with anchors is the steel skeleton of science; mathematics without anchors is a sandcastle.

I. The Origin of the Problem

Modern mathematics is built upon rigorous axiomatic systems. Since Kolmogorov, probability theory — as a branch of measure theory — has achieved impeccable formal completeness. But does formal completeness equate to epistemological validity? Does a mathematically self-consistent system necessarily describe the physical world effectively?

The classic “Two Children Problem” in probability theory offers an entry point: given that a family has two children and at least one is a boy, what is the probability that both are boys? The standard answer is 1/3. Yet the sexes of two children are physically independent. Change the method of acquiring the condition — “a randomly selected child turns out to be a boy” — and the answer becomes 1/2. Is it 1/3 or 1/2? No physical anchor can adjudicate; the answer drifts with the conditions.
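The drift between the two answers can be made concrete with a short simulation. The following Python sketch (illustrative only, not part of the formal argument) runs both conditioning procedures over the same simulated families and shows that the "same" question yields two different values:

```python
import random

random.seed(42)
N = 100_000
# Each family: two independent children, boy "B" or girl "G"
families = [(random.choice("BG"), random.choice("BG")) for _ in range(N)]

# Condition 1: we learn the fact "at least one child is a boy"
at_least_one_boy = [f for f in families if "B" in f]
p1 = sum(f == ("B", "B") for f in at_least_one_boy) / len(at_least_one_boy)

# Condition 2: one child is selected at random and turns out to be a boy
selected = [(f, random.choice(f)) for f in families]
seen_boy = [f for f, child in selected if child == "B"]
p2 = sum(f == ("B", "B") for f in seen_boy) / len(seen_boy)

print(f"P(both boys | at least one is a boy)    ≈ {p1:.3f}")  # ≈ 0.333
print(f"P(both boys | randomly seen child: boy) ≈ {p2:.3f}")  # ≈ 0.500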

Pursuing this question leads to a deeper proposition: How should we determine the validity of mathematical concepts?


II. The Core Dichotomy

Absolute Quantitative Numbers
Numbers anchored to physical reality

Objective quantities anchored to physical reality. Regardless of what symbol system, numeral base, or language humanity adopts, the quantity remains invariant. It can be repeatedly measured, verified, and reproduced through physical operations.

Variable Numbers
Numbers dependent on human-defined conditions

Computational results that depend on human-defined conditions. Change the definition of conditions, the method of information acquisition, or the scope of the problem, and the numerical value changes. It cannot be locked to a single determinate value through physical operations.

This dichotomy divides the entire trajectory of mathematical development into two paths: the Path of Discovery — revealing the quantitative structures embedded in the universe; and the Path of Invention — performing deductions within human-constructed condition spaces and symbol systems. The products of the Path of Invention are not “wrong” — they simply lack the standing to claim they describe physical reality. They may be useful tools or ingenious intellectual constructions, but their conclusions do not possess the status of physical truth.

Core Criterion: If a mathematical quantity can be repeatedly verified through physical operations and converges to a single determinate value, it is an Absolute Quantitative Number. If its value changes whenever the conditions change, it is a Variable Number.

III. Key Examples of Absolute Quantitative Numbers

3.1 Pi (π) — A Natural Topological Relation

Pi is the purest representative among Absolute Quantitative Numbers. It requires no human construction whatsoever — as long as space exists in the universe, π exists. It is the inevitable product of continuous rotational symmetry in space, an irreducible natural topological relation. Even if the symbols 0 through 9 were completely scrambled, the physical quantity corresponding to π could still be recovered and expressed in the new notation, because π is not a symbol — it is the metric structure of space itself.

π appears in the normal distribution of probability theory, in the Schrödinger equation of quantum mechanics, and in the distribution of primes in number theory. A number that repeatedly emerges across entirely unrelated fields is the most powerful evidence of its physical-topological irreducibility.
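One concrete instance of this cross-field emergence is the classical number-theoretic fact that the probability of two randomly chosen integers being coprime is 6/π². The sketch below (illustrative Python, not from the original text) estimates π from that fact alone, with no circle anywhere in the computation:

```python
import math
import random

random.seed(0)
N = 200_000
# Count pairs of random integers with gcd 1 (coprime pairs)
coprime = sum(
    math.gcd(random.randint(1, 10**6), random.randint(1, 10**6)) == 1
    for _ in range(N)
)
# P(coprime) = 6/π²  ⇒  π = sqrt(6 / P)
pi_est = math.sqrt(6 * N / coprime)
print(f"π estimated from coprimality of random integers: {pi_est:.3f}")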

3.2 √2 — A Derived Geometric Relation

The essential difference between √2 and π is this: √2 requires first constructing a right angle, then defining two sides of equal length, and then computing the hypotenuse. It depends on a human-constructed geometric condition — the right isosceles triangle. Although right angles exist in nature, √2 is a derived relation that can be reduced to “the diagonal relationship between equal-length measurements in two orthogonal directions.” It can be further decomposed into more fundamental geometric operations. π cannot be decomposed; √2 can. This is the boundary between L1 and L3.
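The reducibility of √2 can be exhibited operationally: the Babylonian (Heron's) method recovers it from nothing but addition and division, a minimal sketch of the claim that √2 decomposes into more fundamental operations:

```python
# Babylonian (Heron's) method: sqrt(2) from addition and division alone.
# Each step averages the current guess x with 2/x; convergence is quadratic.
x = 1.0
for _ in range(6):
    x = (x + 2.0 / x) / 2.0
print(x)  # 1.41421356...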

3.3 The Speed of Light, Planck’s Constant, and Absolute Zero

The speed of light c is the absolute limit of causal propagation in spacetime — it is not composed of other quantities. Planck’s constant h is the smallest quantum of action — it cannot be further divided. Absolute zero 0K is the absolute lower bound of thermodynamic entropy, the ultimate ground state of molecular motion. These quantities share a common characteristic: remove any one of them, and the structure of physical laws ceases to exist. They are not parameters of physical laws — they are the preconditions upon which physical laws depend.


IV. Variable Numbers and Layered Validity

4.1 The “Two Children Problem” — Zero Anchoring

Is it 1/3 or 1/2? No physical anchor can adjudicate. The answer is not converging toward an objective value — it is drifting between different human-constructed premises. This is zero anchoring — no physical operation can lock the result.

4.2 The Weak Anchoring of Probability Theory

Probability theory is not entirely invalid. Predictions of radioactive decay half-lives and quantum mechanical measurement probabilities — these results are strongly anchored, repeatedly verifiable through physical experiments. The probabilistic values relied upon by insurance actuarial science and epidemiological models have no single physical anchor, but under the framework of the Law of Large Numbers they produce repeatable practical value — this is weak anchoring, where the “anchor” is statistical convergence rather than a single physical operation.
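Weak anchoring as statistical convergence can be shown directly. In this Python sketch (illustrative, not from the original), no single coin flip is anchored to anything, yet the running frequency converges reliably toward the underlying rate:

```python
import random

random.seed(1)
checkpoints = (10, 1_000, 100_000)
running, heads = [], 0
for n in range(1, 100_001):
    heads += random.random() < 0.5  # one fair "coin flip"
    if n in checkpoints:
        running.append((n, heads / n))

for n, freq in running:
    print(f"after {n:>7} flips: observed frequency = {freq:.4f}")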

4.3 Temperature Scales — Variable Shells Around an Absolute Core

Absolute zero is an absolute quantity (L1); the Celsius zero point is an arbitrarily defined marker (variable). The same physical concept can contain both an absolute quantitative core and a variable shell. The key is identifying which layer is anchored and which is human-defined. The physical world contains absolute quantities; humans use symbols to approximate them. The approximation process introduces error and ambiguity, but the absolute quantity itself does not move.
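The "absolute core, variable shell" structure is visible in ordinary unit conversion. The zero points of Celsius and Fahrenheit are arbitrary human markers, yet both scales resolve to the same Kelvin value; a minimal sketch (function names are illustrative):

```python
# Two variable shells (Celsius, Fahrenheit) around one absolute core (Kelvin).
def kelvin_from_celsius(t_c):
    return t_c + 273.15

def kelvin_from_fahrenheit(t_f):
    return (t_f - 32) * 5 / 9 + 273.15

# The boiling point of water at 1 atm, written in two variable scales,
# maps to the same absolute quantity:
print(kelvin_from_celsius(100.0))     # 373.15
print(kelvin_from_fahrenheit(212.0))  # 373.15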


V. The Physical-Topological Anchoring Depth Model

5.1 Layering Methodology: Physical-Topological Irreducibility

The stratification criterion of the four-layer model is not the traditional physics classification of “dimensioned vs. dimensionless,” nor the number-theoretic classification of “rational vs. irrational,” nor the philosophical classification of “realism vs. anti-realism.” It is an entirely new standard: Physical-Topological Irreducibility — can the mathematical quantity be reduced to more fundamental physical operations or relations?

The speed of light c is irreducible — it simply is the limit of the causal structure of spacetime. π is irreducible — it simply is the rotational symmetry of space. But √2 is reducible — it requires first constructing a right angle and then calculating the diagonal, depending on a prerequisite geometric operation. This criterion arranges all mathematical quantities along a continuous spectrum from “absolutely irreducible” to “entirely dependent on human-defined conditions.”

5.2 The Four-Layer Stratification

L1
Irreducible Physical-Topological Anchors
Criterion: Cannot be decomposed into more fundamental physical operations or relations. They are the topological limits of spacetime, causality, quantization, and thermodynamics themselves.
Speed of Light c · The absolute limit of causal propagation in spacetime
Planck's Constant h · The smallest indivisible quantum of action
Pi π · The intrinsic metric of continuous rotational symmetry in space
Absolute Zero 0K · The absolute lower bound of thermodynamic entropy
Gravitational Constant G · The fundamental scale of spacetime curvature and mass-energy coupling
Elementary Charge e · The smallest indivisible unit of charge quantization
Boltzmann Constant kB · The irreducible bridge between microstates and macroscopic temperature
L2
Irreducible Ratios from L1 Combinations
Criterion: Dimensionless pure numbers expressible as combinations of L1 constants, but whose numerical values cannot be derived from first principles — they must be measured experimentally. They encode the ultimate mystery of “why the universe has these particular values.”
Fine-Structure Constant α ≈ 1/137 · e²/(4πε₀ħc), electromagnetic coupling strength
Proton-to-Electron Mass Ratio ≈ 1836 · Mass topology of matter's fundamental building blocks
Strong Coupling Constant αs · Force strength of quark confinement
Weak Mixing Angle θW · Geometric parameter of electroweak unification
Nuclear Fusion Efficiency ε ≈ 0.007 · Fraction of mass converted to energy in hydrogen fusion
Cosmic Density Parameter Ω ≈ 0.3 · Ratio of actual density to critical density
Cosmological Constant Λ ≈ 10⁻¹²² · Vacuum energy density, expressed in Planck units
L3
Derivable Constants from Deeper Structures
Criterion: Can be reduced to more fundamental geometric operations or combinations of L1/L2 constants. They are not the origin points of physical topology but projections of those origins. Physically meaningful, yet reducible.
Euler's Number e ≈ 2.718 · Limit of continuous growth, derivable from its limit definition
√2 ≈ 1.414 · Diagonal relation of orthogonal equal-length directions (requires prior construction of a right angle)
Golden Ratio φ ≈ 1.618 · Recursive division ratio; physical anchoring still debated
Feigenbaum Constant δ ≈ 4.669 · Universal ratio of period-doubling bifurcations in chaotic systems
Euler–Mascheroni Constant γ · Limit of the difference between the harmonic series and the logarithm
Apéry's Constant ζ(3) · Value of the Riemann ζ-function at s = 3
Planck Mass / Length / Time · Defined by combinations of c, G, and ħ
L4
Standard Model Parameter Set — Awaiting Reduction
Criterion: Can only be determined through experimental measurement; currently cannot be derived from any deeper theory. They may be derivatives of deeper anchors. The ultimate goal of physics is to reduce these parameters to fewer fundamental quantities.
6 Quark Masses · Up, down, charm, strange, top, bottom
6 Lepton Masses · Electron, muon, tau, plus 3 neutrino masses
Higgs Boson Mass · ≈ 125.1 GeV
W/Z Boson Masses · ≈ 80.4 GeV and ≈ 91.2 GeV, mediators of the weak interaction
CKM Matrix (4 parameters) · Quark flavor mixing between generations
PMNS Matrix (4 parameters) · Neutrino flavor mixing between generations
Higgs Field Vacuum Expectation Value · ≈ 246 GeV
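The distinction between L2 and L3 can be made computational. In the Python sketch below (variable names and layout are illustrative; constants are CODATA 2018 SI values), α is a dimensionless combination of L1 constants whose value only measurement supplies (through the measured ε₀), while the Planck units follow mechanically once c, G, and ħ are given:

```python
import math

# CODATA 2018 values (SI); c, h, e, kB are exact by definition since 2019
c    = 299_792_458           # m/s
h    = 6.626_070_15e-34      # J·s
e    = 1.602_176_634e-19     # C
kB   = 1.380_649e-23         # J/K
eps0 = 8.854_187_8128e-12    # F/m  (measured)
G    = 6.674_30e-11          # m³/(kg·s²)  (measured, lowest precision)
hbar = h / (2 * math.pi)

# L2: a pure number built from L1 constants, but its value must be measured
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"fine-structure constant α ≈ 1/{1 / alpha:.2f}")  # ≈ 1/137.04

# L3: Planck units are fully derivable once c, G, ħ are fixed
m_planck = math.sqrt(hbar * c / G)     # ≈ 2.18e-8 kg
l_planck = math.sqrt(hbar * G / c**3)  # ≈ 1.62e-35 m
t_planck = math.sqrt(hbar * G / c**5)  # ≈ 5.39e-44 s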

5.3 Dynamic Boundaries Between Layers

The boundaries between the four layers are not static. History has witnessed multiple “layer migrations”: the imaginary unit i was once regarded as a purely human invention (variable), until quantum mechanics proved that it anchors the physical structure of wave functions, pulling it from the “Path of Invention” into the “Path of Discovery.” Non-Euclidean geometry was once considered a purely mathematical construction (L3), until general relativity proved that Riemannian geometry anchors the physical reality of curved spacetime, elevating it to an L1-level topological tool.

This layering model contains an implicit dynamic prediction: Constants classified today as L3 or L4 may be “promoted” in the future if deeper physical anchors are discovered; conversely, certain L1 constants currently deemed irreducible may be “demoted” to L3 if a deeper theory derives them. This endows the entire framework with falsifiability.


VI. Falsifiability Analysis

A good theory must be falsifiable. The following is a layer-by-layer analysis of the falsification conditions of this model in terms of both logical alignment and physical alignment.

6.1 Layer L1: Falsification Conditions for Each Constant

Constant · Falsification Condition · Current Status
Speed of Light c · Falsified if any information transmission is found to exceed the speed of light (quantum entanglement does not transmit information and does not constitute falsification) · Not Falsified
Planck's Constant h · Falsified if energy is found to be absorbed or emitted in units smaller than h · Not Falsified
Pi π · In Euclidean space, π is a logical necessity of the axioms and cannot be falsified; however, if the universe is proven to have no flat-space limit at the most fundamental level, π's L1 status as an "intrinsic spatial metric" would need reassessment · Not Falsified
Absolute Zero 0K · Falsified if matter is found to exhibit classical thermal motion at 0K; quantum zero-point energy (quantum fluctuations) does not constitute falsification, so the meaning of "cessation of thermal motion" requires precise definition · Boundary Needs Precise Definition
Gravitational Constant G · Falsified if G is found to vary with time or spatial location (a possibility some modified-gravity theories propose); G has the lowest measurement precision among all fundamental constants · Measurement Precision Pending
Elementary Charge e · Falsified if freely existing fractional-charge particles are discovered (quarks carry fractional charges but are confined, which does not constitute falsification) · Not Falsified
Boltzmann Constant kB · Falsified as L1 if the relationship between microstates and temperature can be derived from more fundamental constants (the 2019 SI redefinition has already fixed kB to an exact value) · Status May Be Conventional

6.2 Layer L2: Bidirectional Falsification

Each constant in L2 has two directions of falsification: Upward — if a future theory derives the value of α from first principles, it would be “promoted” from L2 to L3 (a derivable quantity), thereby falsifying its L2 status; Outward — if α is found to take different values in different regions of the universe (astronomical observations have already hinted at this possibility), then even its status as a “constant” would require reassessment. Both directions are falsifiable.

6.3 Layer L3: Falsification Through Upward Promotion

L3’s direction of falsification is upward. If √2 or the golden ratio φ is discovered to anchor an irreducible physical-topological structure (currently unknown), they should be promoted to L1 or L2. The definition of L3 itself presupposes this fluidity.

6.4 Layer L4: Built-In Falsifiability

The very existence of L4 is a statement awaiting falsification — “these parameters are currently irreducible.” Every major breakthrough in physics reduces the number of L4 parameters. The Standard Model expanded from 19 parameters to 26 (after the discovery of neutrino masses), demonstrating that L4 can also expand in the reverse direction.

6.5 Falsifiability of the Framework Itself

The criterion of “irreducibility” itself evolves with theoretical development — Planck’s constant h is considered in some frontier theories to be derivable from more fundamental natural units. If the standard of “irreducibility” is historical rather than absolute, then the boundary between L1 and L3 is dynamic. But this is precisely the advantage — a framework that self-corrects as physics advances is more honest, and more scientific, than one that claims to be eternally immutable.


VII. Three Principles of Physical Anchorism

Principle I: Anchoring Precedes Deduction
The validity of mathematical deduction depends on the physical anchoring of its starting point,
not on the logical completeness of the deductive process.
The chain of reasoning is only as strong as the anchor it is tied to.
Principle II: The Quantity Is Fixed; the Symbols Drift
The physical world contains absolute quantities. Humans use symbols to approximate them.
The approximation process introduces error and ambiguity, but the absolute quantity itself does not move.
Reality does not bend to notation; notation bends toward reality.
Principle III: Anchored Mathematics Is Science; Unanchored Mathematics Is a Game
Mathematics that can be reduced to physical quantities constitutes real knowledge.
Mathematics that cannot is a self-consistent but rootless formal system.
With an anchor, mathematics builds civilizations. Without one, it builds sandcastles.

VIII. Comparison with Existing Philosophies of Mathematics

8.1 Against Mill’s Empiricism

Mill held that all mathematics derives from empirical induction. Physical Anchorism disagrees. π is not a product of induction — it is an intrinsic property of geometric space, a discovery, not an induction. Physical Anchorism acknowledges the existence of a priori objective structures in mathematics, but insists that the validity of these structures must be verifiable through physical operations. Mill cannot explain why π appears in fields unrelated to circles; Physical Anchorism’s explanation is that π anchors not “the circle” but the topological structure of space itself.

8.2 Against Quine’s Indispensability Argument

Quine argued that mathematical entities “exist” because scientific theories cannot do without them. Physical Anchorism does not ask whether mathematical entities “exist” — it asks whether mathematical conclusions are “anchored.” Probability theory is indispensable to insurance actuarial science, but the 1/3 answer to the Two Children Problem does not thereby acquire the status of physical truth. Indispensability is not the same as anchoring.

8.3 Against Lakatos’s Quasi-Empiricism

Lakatos focused on how mathematical knowledge advances through conjectures and refutations. Physical Anchorism addresses a more fundamental question: how to stratify the ontological status of mathematical conclusions. Lakatos’s conjecture-and-refutation model cannot distinguish between “refuted by physical experiment” and “overturned by condition re-specification” — the former is an anchor test, the latter is scope drift. This is precisely the core problem that Physical Anchorism aims to solve.


IX. Corollaries and Applications

9.1 Wrong Anchor, Total Collapse

Ptolemy’s geocentric model — an elaborate mathematical system built on the wrong anchor. Epicycles, deferents, eccentrics: mathematically self-consistent, roughly matching observations. But the entire system was a sandcastle. Copernicus re-anchored to the Sun; Kepler corrected with elliptical orbits; everything became simple and clear. The mathematics didn’t change. The anchor changed. Everything changed.

9.2 Absolute Quantitative Mathematics as the Steel Skeleton of Science

GPS depends on the speed of light. Semiconductor chips depend on Planck’s constant. Space travel depends on the gravitational constant. Nuclear energy depends on E=mc². These absolute quantities are the steel skeleton of the edifice of science: remove any single one, and the entire structure collapses immediately.

9.3 The Two Divergent Paths of Mathematical Development

The dispute between Euclidean and non-Euclidean geometry — which better anchors real physical space. General relativity proved that Riemannian geometry won. The imaginary unit i went from “pure invention” to “physically anchored” through quantum mechanics. Today’s “purely computational mathematics” may well discover its physical anchor in the future, crossing from the Path of Invention to the Path of Discovery.


X. Conclusion

Numbers aligned with the physical world are Absolute Quantitative Numbers.
Numbers that cannot be aligned are Variable Numbers.
Absolute Quantitative Mathematics is the steel skeleton of human science.
If the anchor is wrong, all mathematics becomes a sandcastle.
The measure of a number is not its elegance, but its grip on reality.
Anchored mathematics endures; unanchored mathematics drifts.

Physical Anchorism provides a complete and falsifiable philosophical framework for mathematics: the core dichotomy divides mathematics into Absolute Quantitative and Variable categories; Physical-Topological Irreducibility provides a single stratification criterion; the four-layer depth model (L1 Irreducible → L2 Underivable → L3 Derivable → L4 Pending Determination) offers an operational classification tool; the Three Principles provide a philosophical manifesto; and the layer-by-layer falsifiability analysis ensures the framework’s scientific quality.

This framework self-corrects as physics advances — layer boundaries are dynamic, and constants can be promoted or demoted. A theory that acknowledges its own capacity for revision is closer to truth than one that claims eternal immutability.

True science is built upon anchors that do not move.


References

[1] J.S. Mill, A System of Logic, Ratiocinative and Inductive, London: John W. Parker, 1843.

[2] W.V.O. Quine & H. Putnam, “The Indispensability Argument,” in Philosophy of Mathematics: Selected Readings, 2nd ed., Cambridge University Press, 1983.

[3] I. Lakatos, “A Renaissance of Empiricism in the Recent Philosophy of Mathematics,” in Mathematics, Science and Epistemology: Philosophical Papers Vol. 2, Cambridge University Press, 1978.

[4] M. Rees, Just Six Numbers: The Deep Forces That Shape the Universe, New York: Basic Books, 2000.

[5] J.D. Barrow & F.J. Tipler, The Anthropic Cosmological Principle, Oxford University Press, 1986.

[6] J.C. Baez, “How Many Fundamental Constants Are There?” University of California, Riverside, 2011.

[7] NIST/CODATA, “Fundamental Physical Constants — Complete Listing,” National Institute of Standards and Technology, 2022.

[8] S. Shapiro, Thinking About Mathematics: The Philosophy of Mathematics, Oxford University Press, 2000.

[9] Stanford Encyclopedia of Philosophy, “Philosophy of Mathematics,” 2007 (rev. 2023).

[10] Stanford Encyclopedia of Philosophy, “Formalism in the Philosophy of Mathematics,” 2011 (rev. 2022).

[11] Wikipedia, “Dimensionless Physical Constant,” 2026.

[12] Wikipedia, “Physical Constant,” 2026.

[13] Wikipedia, “Mathematical Constant,” 2026.

[14] Wikipedia, “Planck Units,” 2026.

[15] R. Feynman, QED: The Strange Theory of Light and Matter, Princeton University Press, 1985.


© 2026 All Rights Reserved
