What is Mathematics?
How working with Lean changed my understanding of what mathematics actually is — from classical definitions to directed acyclic graphs of theorems.
The Classical Definition
Ask a mathematician what mathematics is, and you’ll get a range of answers:
- “The science of quantity.” — Aristotle
- “The queen of the sciences.” — Gauss
- “Mathematics is the study of patterns and structures.” — most modern textbooks
The standard definition from philosophy of mathematics goes something like: mathematics is the study of abstract objects — numbers, shapes, functions, spaces — using logical reasoning and proof. It’s the discipline where we establish truths that are absolutely certain, derived from axioms through chains of deduction.
This definition is fine. It’s also incomplete. It describes the activity of mathematics without revealing its structure.
Working with Lean changed how I see that structure.
Proving Theorems in Lean
Lean is a proof assistant — a programming language where you write mathematical proofs that a computer verifies down to the last logical step. No hand-waving, no “it’s obvious,” no implicit appeals to intuition. Either the proof is complete and correct, or the compiler rejects it.
When you prove a theorem in Lean, something interesting happens. Say you want to prove that addition of natural numbers is commutative: for all natural numbers a and b, a + b = b + a. You might start the way you would on paper:
By induction on `b`. Base case: `a + 0 = 0 + a`. Inductive step: assume `a + b = b + a`, then show `a + succ(b) = succ(b) + a`.
On paper, this might take half a page. You’d invoke a few “obvious” facts and move on.
Lean doesn’t let you do that. For the base case, you need:
- `a + 0 = a` — true by the definition of addition (adding zero does nothing)
- `0 + a = a` — this is not true by definition, because addition is defined by recursion on the second argument; this requires its own proof by induction
For the inductive step, you need:
- `a + succ(b) = succ(a + b)` — true by definition
- `succ(b) + a = succ(b + a)` — again, not definitional; requires its own lemma
Each of those lemmas depends on more basic facts. And those depend on even more basic facts. The chain continues all the way down to the axioms — the foundational rules hardcoded into Lean’s logical framework.
You don’t just prove one theorem. You build an entire tree of dependencies.
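In Lean 4 that dependency tree can be made explicit. The sketch below proves commutativity for the built-in `Nat`, but reproves every helper from scratch; the primed names avoid clashing with the library's own `Nat.zero_add` and `Nat.succ_add`, and the exact tactic steps are illustrative rather than canonical:

```lean
-- Helper 1: 0 + a = a is NOT definitional (addition recurses on the
-- second argument), so it needs its own induction.
theorem zero_add' (a : Nat) : 0 + a = a := by
  induction a with
  | zero => rfl
  | succ a ih => rw [Nat.add_succ, ih]

-- Helper 2: succ b + a = succ (b + a) also needs induction.
theorem succ_add' (b a : Nat) : b.succ + a = (b + a).succ := by
  induction a with
  | zero => rfl
  | succ a ih => rw [Nat.add_succ, ih, Nat.add_succ]

-- The theorem itself rests on both helpers plus induction:
-- three nodes and their edges in the dependency graph.
theorem add_comm' (a b : Nat) : a + b = b + a := by
  induction b with
  | zero => exact (zero_add' a).symm
  | succ b ih => rw [Nat.add_succ, ih, succ_add']
```

Delete either helper and `add_comm'` no longer compiles: the compiler is checking the edges of the graph, not just the final statement.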
The Directed Acyclic Graph
This is the insight that changed how I think about mathematics.
Every axiomatic system gives rise to a directed acyclic graph (DAG) of theorems.
The nodes of the graph are of two kinds:
- Axiom nodes: statements accepted without proof — the roots of the graph, with no incoming edges
- Theorem nodes: statements with proofs — each one connected by directed edges to the results it depends on
There is a directed edge from node A to node B when the proof of B uses A. Axioms have no incoming edges. Every theorem traces back, through a chain of dependencies, to the axioms.
The graph is acyclic because circular reasoning is forbidden: theorem A can’t depend on theorem B if B already depends on A. If circularity were allowed, a statement could ultimately “justify” itself, proofs would no longer be grounded in the axioms, and the mathematics would be meaningless.
A Simple Example
Let’s build a small piece of this graph from scratch, using the natural numbers.
Lean’s type theory gives us the following axioms for natural number arithmetic (simplified and presented informally):
Axiom 1 — Zero. 0 is a natural number.
Axiom 2 — Successor. Every natural number n has a successor succ(n). This gives us 1 = succ(0), 2 = succ(1), and so on.
Axiom 3 — Addition (recursive definition).
- 3a: `n + 0 = n` — adding zero yields `n`
- 3b: `n + succ(m) = succ(n + m)` — adding a successor is the successor of the sum
Axiom 4 — Induction. If a property P holds for 0, and whenever P holds for n it also holds for succ(n), then P holds for all natural numbers.
These four axioms are our foundation. Now let’s derive some theorems and see the DAG emerge.
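These axioms translate almost directly into Lean 4. In this sketch, `N` and `add` are illustrative stand-ins for Lean's actual `Nat`; note that Axioms 1 and 2 become a single inductive type, and the induction principle of Axiom 4 is generated automatically:

```lean
inductive N where
  | zero : N       -- Axiom 1: 0 is a natural number
  | succ : N → N   -- Axiom 2: every n has a successor

-- Axiom 3: addition, defined by recursion on the second argument.
def add : N → N → N
  | n, .zero   => n                  -- 3a: n + 0 = n
  | n, .succ m => .succ (add n m)    -- 3b: n + succ m = succ (n + m)

-- Axiom 4 comes for free: Lean derives the induction principle
-- N.rec from the inductive definition itself.
#check @N.rec
```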
Lemma 1: 0 + 0 = 0
This looks trivially obvious, but notice: Axiom 3a says n + 0 = n. It tells us what happens when zero is the second argument. By substituting n = 0, we get 0 + 0 = 0.
Depends on: Axiom 3a
Lemma 2: 0 + succ(n) = succ(0 + n)
Axiom 3b says m + succ(n) = succ(m + n). Substitute m = 0: we get 0 + succ(n) = succ(0 + n).
Depends on: Axiom 3b
Theorem: 0 + n = n for all n
This is Nat.zero_add in Lean’s Mathlib. The proof is by induction on n:
- Base case (`n = 0`): We need `0 + 0 = 0`. This is Lemma 1.
- Inductive step: Assume `0 + n = n` (the inductive hypothesis). We need to show `0 + succ(n) = succ(n)`.
  - By Lemma 2: `0 + succ(n) = succ(0 + n)`
  - By the inductive hypothesis: `0 + n = n`, so `succ(0 + n) = succ(n)`
  - Therefore: `0 + succ(n) = succ(n)` ∎
Depends on: Axiom 4 (induction), Lemma 1, Lemma 2
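In Lean 4 the whole derivation fits in a few lines. This self-contained sketch (again with `N` and `add` as stand-ins for the real `Nat`) makes the two lemmas visible as the two cases of the induction:

```lean
inductive N where
  | zero : N
  | succ : N → N

def add : N → N → N
  | n, .zero   => n                  -- Axiom 3a
  | n, .succ m => .succ (add n m)    -- Axiom 3b

theorem zero_add (n : N) : add N.zero n = n := by
  induction n with
  -- Lemma 1: 0 + 0 = 0, an instance of Axiom 3a
  | zero => rfl
  -- Lemma 2: 0 + succ n = succ (0 + n) is Axiom 3b applied at zero;
  -- rewriting with the inductive hypothesis 0 + n = n finishes it
  | succ n ih => simp only [add, ih]
```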
The DAG
Here’s the directed acyclic graph for this small fragment:
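A sketch of that graph, reconstructed from the “Depends on” lists above (each arrow runs from a statement to the result whose proof uses it):

```
        Theorem: 0 + n = n
          ▲        ▲        ▲
          │        │        │
      Lemma 1   Lemma 2     │
          ▲        ▲        │
          │        │        │
      Axiom 3a  Axiom 3b  Axiom 4
```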
Each arrow means “is used in the proof of.” The axioms sit at the bottom with no dependencies. The lemmas depend on axioms. The theorem depends on lemmas and an axiom. Every node is either a starting point (axiom) or is fully justified by the nodes below it.
This is a tiny fragment. The full proof of Nat.add_comm (commutativity of addition) depends on Nat.zero_add (our theorem above), plus Nat.succ_add (which has its own dependency subtree), plus induction again. And Nat.add_comm is itself used in thousands of further theorems about arithmetic, algebra, number theory, and beyond.
Lean’s mathematical library, Mathlib, contains over 180,000 declarations — theorems, lemmas, and definitions — each one a node in this graph, each one traceable back to the axioms.
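That traceability is directly inspectable: Lean's `#print axioms` command lists the axioms a theorem's proof ultimately rests on. A quick example (assuming a Mathlib environment):

```lean
import Mathlib

-- Lists the roots of the subgraph beneath this theorem. Elementary core
-- facts like Nat.add_comm are proved by pure induction and typically
-- report no axiom dependencies at all; much of Mathlib traces back to
-- Lean's three standard axioms: propext, Quot.sound, Classical.choice.
#print axioms Nat.add_comm
```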
Mathematics as Structure
This perspective reframes what mathematics is.
Mathematics is not just “the study of abstract objects.” It is the construction and exploration of directed acyclic graphs that grow from axiom sets.
Each choice of axioms defines a different graph — a different mathematical universe:
- Start with Euclid’s five postulates, and you grow the DAG of Euclidean geometry — triangles, circles, parallel lines, the Pythagorean theorem.
- Replace the parallel postulate, and you get hyperbolic or elliptic geometry — entirely different DAGs where parallel lines behave differently and the angles of a triangle don’t sum to 180°.
- Start with the ZFC set theory axioms, and you get the vast DAG that underpins most of modern mathematics — from real analysis to abstract algebra to topology.
- Start with Lean’s type theory (the Calculus of Inductive Constructions), and you get a DAG that is largely equivalent to ZFC’s but formally distinct — and mechanically verifiable.
Different axioms, different mathematical universes. Each axiom set grows a DAG of its own, and the number of theorems reachable from each varies enormously; ZFC’s graph is by far the largest.
The axioms you choose determine the universe of theorems available to you. Some axioms are independent of others — meaning there exist theorems you can only reach by including specific axioms. The Axiom of Choice is the most famous example: include it, and you can prove that every vector space has a basis, that there exists a well-ordering of the reals, and many other results. Exclude it, and those theorems simply don’t exist in your graph.
This is a profound realization. Mathematics isn’t one monolithic structure. It’s a family of structures, each defined by its axioms, each with its own frontier of reachable theorems.
Why This Matters
This isn’t just a philosophical curiosity. Understanding mathematics as a DAG has practical implications.
For formalization
When you formalize mathematics in Lean, you are literally constructing this DAG. Every theorem declaration adds a node. Every lemma invoked in a proof adds a directed edge. The compiler verifies that the graph is acyclic and that every edge is valid — that every theorem really does follow from what it claims to depend on.
Mathlib’s dependency graph is, in a very concrete sense, the largest rigorously verified mathematical structure ever built. And it’s growing daily.
For education
We typically teach mathematics as a linear sequence: arithmetic → algebra → calculus → analysis. But the real structure is a graph, not a line. Many concepts are independent and can be learned in parallel. Understanding the dependency structure reveals what you actually need to know before tackling a new topic — and, just as importantly, what you don’t.
A student who understands the DAG can see that group theory doesn’t require calculus, that combinatorics doesn’t require topology, and that logic can be studied from day one. The linear curriculum is a convenience, not a necessity.
For research
Mathematical research is the act of extending the DAG — adding new nodes at the frontier. You take existing theorems and axioms, and you explore what new results can be derived.
Sometimes you discover that a new axiom produces a richer graph. Sometimes you find surprising shortcuts — a theorem everyone thought required deep machinery can be proven from more elementary results. Sometimes you find that two distant branches of the graph are connected by an unexpected theorem, revealing deep structural relationships.
The Clay Millennium Problems, the Riemann Hypothesis, Goldbach’s conjecture — these are theorems that we believe should be nodes in the graph, but for which no one has yet found the path from the axioms. Finding that path is what mathematical research is.
The View from the Leaves
When you work on paper, you’re usually standing at the leaves of the DAG, looking at theorems and their immediate neighbors. You invoke lemmas by name, trust that they’re correct, and move on. The deeper structure is invisible — a lemma might depend on ten other lemmas, each depending on ten more, all the way down to the axioms. But on paper, you don’t see any of that.
When you work in Lean, you see the graph. You feel it. Every sorry (an unfinished proof) is a gap in the DAG — a missing node that blocks everything above it. Every import connects your work to a subtree of dependencies you can inspect. Every red squiggly line means you claimed a dependency that doesn’t actually hold.
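The gap is easy to see in a toy example (the theorem names here are hypothetical): a sorried lemma compiles with a warning, but `#print axioms` exposes everything built on it as resting on an extra axiom:

```lean
-- A missing node in the DAG: `sorry` is a placeholder, not a proof.
theorem missing_lemma (n : Nat) : 0 + n = n := sorry

-- This compiles, but the dependency is hollow.
theorem built_on_gap (n : Nat) : (0 + n) + 0 = n := by
  rw [missing_lemma]

-- Reports `sorryAx`: the proof secretly depends on an axiom that
-- can assert anything, so the node above the gap is not justified.
#print axioms built_on_gap
```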
You learn, viscerally, that mathematics is not a collection of isolated facts. It is a single, vast, interconnected structure — rooted in axioms and growing outward, theorem by theorem, into territory that no one has mapped yet.
That, to me, is what mathematics is.