A few days ago, when the weight of school was too much and everything felt worn thin, I called my mother. We talked for a while and I went for a walk. I just needed to move, to let the air fill my lungs and clear my head, to feel like the world had more to offer than textbooks and expensive coffee. I walked until I found myself in front of the bookstore. I didn’t know I was going there, but I went in anyway.
Inside, they were having a sale. Seventy percent off. A small nudge, a lifeline thrown by the universe. I slipped on my headphones and let the rhythm of the music guide my hands through the boxes of books. I wasn’t looking for anything in particular—just the feel of pages under my fingertips, the solid weight of human ingenuity in written form. Most books were in Dutch. The English ones were technical, part of the For Dummies series, simple and lifeless.
Then I found a book. Or maybe it found me.
"Scientonomy: The Challenges of Constructing a Theory of Scientific Change."
A strange book. Niche. It felt like something you’d find in a forgotten corner of a university library, gathering dust. Not the kind of thing you pick up when you’re looking for comfort, or distraction, or meaning.
But I bought it anyway.
I still don’t know why. Maybe it was curiosity. Maybe it was the feeling that it had found me, not the other way around. Or maybe it was just simple, undeniable math—it used to be 80 euros, and now it was 24. That was reason enough.
Scientonomy
After opening it to a random page and reading a bit, I understood why it found me. The chapter detailed a mathematical model of scientific change.
Scientonomy is the study of how scientific knowledge evolves over time. In each episode, a scientific consensus emerges—an accepted framework of truths that guides inquiry and shapes understanding. Over time, the unquestionable dogma becomes the vestigial remnant of more evolved ideas.
For instance, the acceptance of probabilistic models in quantum mechanics marked a Cambrian explosion in the evolutionary trajectory of science. Transitioning from the deterministic framework of classical physics to the probabilistic nature of quantum mechanics fundamentally challenged our worldview. This change, a seemingly organic shift in perspective, follows a deeply rooted pattern, characteristic of many other paradigm shifts in science.
We can think of the current scientific consensus (M_t) as a mosaic:
a collection of accepted theories (propositions explaining phenomena)
modal methods, or rules governing how theories can change or be accepted
enabling constraints, which are assumptions that shape what kinds of theories and methods are considered acceptable. Constraints reduce the degrees of freedom, simplifying the problem while allowing solution spaces to reveal hidden symmetries and conservation laws. Often, though, a relaxation (Δκ) of these constraints is necessary for a new mosaic (M_{t+1}) to emerge.
anomalies (A) stress the current mosaic. When unresolvable within the existing framework, they catalyze the introduction of new methods and/or the modification of constraints.
A community C at time t accepts a theory T only if T meets the criteria defined by the community’s current methods. If a community uses a modal method—a rule for determining which theories are accepted—that method is itself part of the current explanatory framework (M_{C,t}), or "mosaic."
However, this mosaic is not a fixed structure. When a new theory is accepted, it means the prior theories or modal methods were inadequate. Thus, for a theory to be genuinely new, it could not have been part of the previous mosaic M_{C,t}.
Thus, a simplified relation for the change in mosaics is:
M_{C,t+1} = f(M_{C,t}, Δμ, Δκ, A)
Where:
M_{C,t} = Current mosaic accepted by a community C at time t
Δμ = Change in modal methods (introduction of new methods)
Δκ = Relaxation or modification of enabling constraints
A = Accumulation of anomalies
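To make the relation concrete, here is how I might sketch it in Python. This is a toy model: the class names and the update rule are my own reading of the definitions above, not the book's formalism.

```python
# A toy sketch of the mosaic model: sets of labels standing in for
# theories, modal methods, and enabling constraints.
from dataclasses import dataclass

@dataclass
class Mosaic:
    theories: set[str]     # accepted propositions explaining phenomena
    methods: set[str]      # modal methods: rules for accepting/changing theories
    constraints: set[str]  # enabling constraints shaping admissible theories

def next_mosaic(current: Mosaic, new_methods: set[str],
                relaxed: set[str], anomalies: set[str]) -> Mosaic:
    """M_{C,t+1} = f(M_{C,t}, Δμ, Δκ, A): the mosaic changes only when
    anomalies accumulate beyond what the current framework can resolve."""
    if not anomalies:  # no unresolved anomalies, no shift: the mosaic persists
        return current
    return Mosaic(
        theories=set(current.theories),             # re-evaluated under Δμ
        methods=current.methods | new_methods,      # Δμ: new modal methods
        constraints=current.constraints - relaxed,  # Δκ: relaxed constraints
    )
```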
This can be extended with the Law of Requisite Variety, which relates response variety and perturbation variety to scientific change:
Response variety (V_R) is the range of methods, models, and theoretical tools available to a community to address and explain phenomena. It represents the internal complexity of the scientific system—how many different ways scientists can approach, model, or solve a problem. Response variety is often limited by enabling constraints, so that theories fit within the current mosaic.
Perturbation variety (V_Π) is the range of anomalies and unexplained phenomena in the external environment that challenge existing theories. It represents the external complexity of the scientific world—how many different problems or irregularities need to be addressed. As science progresses, perturbation variety often increases as the existing mosaic fails to explain new observations.
The Law of Requisite Variety states:
For a scientific community to maintain control (predictive power) and stability, the response variety must be equal to or greater than the perturbation variety.
When perturbation variety exceeds response variety, the system experiences instability, leading to a paradigm shift as new methods and theories are introduced to restore balance.
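In symbols, the two conditions above reduce to a compact restatement (my notation, following the definitions):

$$V_R \ge V_\Pi \;\Rightarrow\; \text{stability}, \qquad V_\Pi > V_R \;\Rightarrow\; \text{instability (paradigm shift)}$$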
This is all very abstract. Let’s apply it to the shift from Newtonian physics to quantum mechanics.
Before quantum mechanics, the scientific mosaic of physics was dominated by Newtonian mechanics and Maxwell’s electromagnetism, which were rooted in determinism and continuity.
Key Enabling Constraints in Classical Physics:
Determinism: The belief that, given complete knowledge of a system's initial conditions, its future behavior can be predicted with certainty.
Continuity of physical quantities: Physical variables like energy and momentum were assumed to change in a continuous manner.
Locality: Causal interactions were assumed to occur locally—effects happen through direct contact or within the speed of light limit.
Objective reality: The belief that physical systems have well-defined properties independent of observation.
Classical physics, with its deterministic worldview, was once seen as a near-complete account of reality. But as new phenomena—such as blackbody radiation and the photoelectric effect—defied explanation within this framework, it became clear that the underlying assumptions were flawed. These anomalies stressed the enabling constraints to the breaking point, forcing a shift in the modal methods of physics.
Key Anomalies and Modal Method Shifts:
Blackbody radiation: Classical physics predicted that a blackbody should emit infinite energy at short wavelengths, which didn’t match experimental observations.
Max Planck introduced the idea of quantized energy levels, where energy is emitted in discrete packets (quanta).
Photoelectric effect: Classical wave theory couldn’t explain why light below a certain frequency, regardless of intensity, couldn’t eject electrons from a metal surface.
Albert Einstein proposed that light consists of particles (photons), each carrying a quantum of energy.
Atomic spectra: Classical models couldn’t explain why atoms emitted light at discrete frequencies rather than a continuous spectrum.
Niels Bohr introduced the concept of quantized orbits for electrons, where they could only occupy specific energy levels.
Wave-particle duality: Electrons and other particles exhibited interference patterns characteristic of waves, even when fired one at a time.
Led to the development of wave mechanics by Erwin Schrödinger and the uncertainty principle by Werner Heisenberg.
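For reference, the textbook relations behind each of these shifts (standard physics, not drawn from the book itself):

$$E = h\nu \quad \text{(Planck's quanta)} \qquad K_{\max} = h\nu - \phi \quad \text{(Einstein's photoelectric equation)}$$

$$E_n = -\frac{13.6\ \text{eV}}{n^2} \quad \text{(Bohr's hydrogen levels)} \qquad \Delta x\,\Delta p \ge \frac{\hbar}{2} \quad \text{(Heisenberg's uncertainty principle)}$$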
Relaxing Enabling Constraints
While the accumulation of anomalies and experimental evidence played a role in the emergence of a new scientific framework, these alone were not sufficient to bring about a new mosaic of understanding. Scientific progress is not a simple reaction to data; it depends fundamentally on the creation of better explanations. The anomalies posed problems, but it was the philosophical shift—the rethinking of the very foundations of knowledge—that made the acceptance of probabilistic models in quantum mechanics possible. This shift involved a relaxation of the enabling constraints that had long governed how physicists conceived of reality, causality, and measurement. To accept the probabilistic models of quantum mechanics, physicists had to:
abandon determinism and realize that nature itself might be intrinsically probabilistic
accept the observer effect and that the act of measurement could affect the system being measured
realize that quantum mechanics does not describe an objective reality independent of observation but rather the probabilities of measurement outcomes
In the theory change model, the shift from classical mechanics to quantum mechanics involved modifying enabling constraints and introducing new modal methods that made probabilistic models acceptable (see the sketch after this list):
New Modal Methods (Δμ): Schrödinger’s wave mechanics, Heisenberg’s matrix mechanics.
Relaxation of Enabling Constraints (Δκ): Acceptance of probabilistic models, non-locality, and observer-dependent reality.
The New Mosaic (M_{t+1}): Quantum mechanics as the dominant paradigm, with new theories like the Copenhagen Interpretation.
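In the toy Python model from earlier, that transition might read like this (the labels are illustrative shorthand):

```python
# The classical-to-quantum transition, reusing the Mosaic sketch above.
classical = Mosaic(
    theories={"Newtonian mechanics", "Maxwell's electromagnetism"},
    methods={"deterministic modeling"},
    constraints={"determinism", "continuity", "locality", "objective reality"},
)

quantum = next_mosaic(
    classical,
    new_methods={"wave mechanics", "matrix mechanics"},          # Δμ
    relaxed={"determinism", "continuity", "objective reality"},  # Δκ
    anomalies={"blackbody radiation", "photoelectric effect",    # A
               "atomic spectra", "wave-particle duality"},
)
quantum.theories |= {"quantum mechanics", "Copenhagen interpretation"}
```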
Just as physics was once confined by the rigid determinism of classical mechanics, biology today finds itself tethered to mechanistic models and reductionist approaches. In both cases, the prevailing scientific mosaics provided powerful tools for understanding the world, but they also imposed constraints on how new phenomena could be interpreted. Biology is now encountering its own set of anomalies—vast, complex datasets from omics technologies, the biomarker paradox, and overlooked phenomena like bioelectric fields and quantum effects. Just as physics had to relax its enabling constraints to embrace the uncertainty of the quantum world, biology may need to undergo a similar paradigm shift to account for these complexities.
Key Enabling Constraints in “Classical” Biology:
Mechanistic explanations: Biological phenomena are explained through mechanistic models that describe clear cause-and-effect relationships at the molecular, cellular, and systemic levels. Since the rise of molecular biology, the field has assumed that to understand life, it must be dissected into its components (genes, proteins, pathways) to map out the precise mechanisms that underlie biological functions.
Hypothesis-driven science: Scientific inquiry follows an old rulebook: formulate a hypothesis, design experiments to test it, analyze results. The linear process of “theory before data” ensures scientific rigor and falsifiability.
Current modal methods of controlled experiments, statistical hypothesis testing, and mechanistic modeling reinforce the enabling constraints and dominant theories in biology. However, key anomalies are revealing our blind spots with these methods.
Key Anomalies:
Omics revolution: The vast amounts of data generated by high-throughput sequencing and omics technologies have introduced complexities traditional models cannot usefully capture. Unexplained patterns in gene expression, protein interactions, and metabolic pathways outpace the explanatory power of existing biological theories.
Biomarker paradox: Of the more than 100,000 biomarkers reported in published studies, only about 100 are used in clinical practice. They are identified in preclinical and omics studies but fail in clinical trials. The context-dependency of biomarkers is often oversimplified, and systems-level interactions are ignored.
Dogmatic (DNA → RNA → protein) focus: The data we collect is based on inherent biases about what we believe to be important and predictive. Studies continually treat the DNA → RNA → protein pipeline as the basis of biological research. But this leaves out so much—bioelectric fields, mechanical forces, quantum effects, biophotons, and more. These phenomena might not be as peripheral as we once believed.
A Modal Method Shift—AI-Integrated Biology
AI models directly challenge the assumptions of the current mosaic in biology. They provide predictions and probabilities, identifying complex, non-linear relationships, but they lack the familiar, transparent, mechanistic explanations the field is built on. With its black-box models and probabilistic outputs, AI defies the mechanistic paradigm.
AI’s success implies that predictive power might not always require a mechanistic explanation. But does that constitute understanding? The current mosaic in biology resists this notion, holding tightly to the belief that science advances through dissecting systems into their fundamental parts and mapping their interactions.
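To make that concrete, here is a toy sketch (assuming scikit-learn is available; the data is synthetic, not a real omics set) of a black-box model predicting an outcome without offering any mechanism:

```python
# A black-box model can predict a phenotype from synthetic "omics" features
# without yielding any mechanistic story about why the prediction works.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))  # 20 synthetic molecular features
# The phenotype depends on a hidden non-linear interaction of two features.
y = (X[:, 3] * X[:, 7] > 0).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(model, X, y, cv=5).mean())  # typically well above chance
# The fitted forest predicts, but offers no causal mechanism linking
# features 3 and 7 to the outcome.
```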
Current crisis
Despite its disruptive advances in other fields, AI still operates within the enabling constraints of the current mosaic. The scientific community frames the problems, defines the variables, selects the datasets, and interprets the outcomes through existing theoretical lenses. AI excels at optimizing within these predefined problem spaces—it solves problems efficiently, but it doesn’t generate new ones. It can identify patterns in biological data, but it doesn’t inherently know which patterns matter; it relies on human-designed representations. The hard problem in AI is not solving existing problems but formulating them.
We are using AI to replicate our minds—the equivalent of breeding faster horses for transportation instead of engineering cars. We are stuck in its skeuomorphic comfort, talking to it like it’s another person, feeding it prompts and questions because that’s what feels familiar. AI today optimizes solutions to complex biological problems faster, but it doesn’t inherently redefine those problems.
What if we stopped trying to make it like us? Ultimately, it’s a statistical pattern-matching machine, generating responses based on vast amounts of data. But because its interface mimics human conversation, we instinctively interact with it as if it shares our way of thinking—as if it reasons, understands, or possesses consciousness in the same sense that we do. This illusion becomes a crutch as we grasp what is possible, but it also limits us. We are projecting our cognitive structures onto a system that doesn’t operate by them.
Relaxing Enabling Constraints
The previous constraints have served as useful scaffolding, but the next mosaic shift requires an acceptance of new beliefs:
a data-first philosophy, taking an exploratory approach without predetermined questions and embracing serendipitous discovery
acceptance of pattern recognition without immediate mechanistic understanding
focus on data reproducibility and robustness rather than on statistical significance in isolated experiments or publication metrics
In the end, relaxing these enabling constraints is not about abandoning rigor or descending into chaos. It’s about recognizing that our current frameworks are insufficient to grapple with the complexity of life.
After extensive reading, my eyes were heavy. I let a podcast play in the background, Modern Wisdom’s interview with Alain de Botton. They were talking about emotions—how we drift through life convinced that the way we feel is normal, that our reactions are natural, automatic, just the way things are. But they’re not.
Our minds are quantum sensory compressors, over-trained on childhood experiences. While our external environment has matured, our internal dialogue, instinctive reactions, and cognitive scaffolds often remain underdeveloped. We don’t even realize we are serving the needs of our five-year-old self until we verbalize them and allow language to provide a mirror to see our own reflection. We fold the complexity of the universe into pocket-sized shapes of familiar mental patterns. We see so much yet notice so little. Sitting with ourselves is the modal method we need to take what is intellectually compressed in our minds and expand it, study it, change it.
Therapy also does that. It pulls at the threads we’ve grown used to, the ones we don’t notice anymore. It shows us the patterns we’ve lived by without question. The ones we’ve mistaken for truth. Therapy has a way of exposing implicit assumptions, of finding doors you always thought were walls. Personal evolution happens through subtracting, when we peel away the layers, the masks, the little lies that feel like truths because we’ve worn them so long they’ve molded to our skin. Science is the external echo of the internal quest: progress comes from realizing the constraints we thought enabled us, the frameworks we clung to like lifeboats, the beautifully wrapped equations and tables, were cages all the same. Both processes demand courage: to face the beliefs we didn’t even know we were carrying, the stories we’ve told ourselves for so long they felt like facts.
Each discovery, whether in science or in ourselves, is like taking the scraps and weaving something new. To draw analogies, thick and thin, stretch comparisons till they groan at the seams, pull the threads, till the differences tumble out, tucked deep like lost coins in old coat pockets. We weave a new world from the scraps—the odd ends and loose threads, from the things that were never meant to fit, never meant to meet. Both science and life require the same bravery: the willingness to question the fixed and embrace the fluid. To see that the walls confining us were always of our own making, and the doors to new worlds—whether in a dusty bookstore or the quiet corners of our minds—have been there all along, quietly waiting for us to turn the knob.