The Limits of Looking Ahead

The case for quantitative projection models

Dr. Elliott More

8/4/2025 · 3 min read

Peering into the future has never been easy. In a world of interlocking systems, feedback loops and global shocks, forecasting even the near term feels increasingly like a dark art. And yet decisions must be made—by policymakers, planners and corporate sustainability teams—based on what might lie ahead.

The instinctive human approach is to reason our way through the problem. From a young age we learn to argue our case, verbally or internally, often with great confidence. But while rhetorical reasoning feels logical, it often conceals bias, fallacy and wishful thinking. In complex systems, verbal logic alone is seldom enough.

Take the so-called “single effect trap”. If action A reduces outcome B, and B is known to drive outcome C, then surely A must reduce C. It sounds perfectly reasonable. But in systems with non-linear dynamics, feedback mechanisms and competing influences, such logic falls apart.

A Case of Unintended Consequences

Consider the example of abstinence-only sex education in the United States. Its logic was disarmingly simple. Teaching young people to delay sex until marriage (A) would reduce the number of teens having sex (B). With fewer sexual encounters, the number of unintended pregnancies and sexually transmitted infections (C) should fall.

What this model neglected was D: that those who did have sex would now be doing so without adequate knowledge or access to contraception. In practice, abstinence-only education appears to have increased the rate of unprotected sex, which in turn fuelled precisely the outcomes it sought to avoid.
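The arithmetic of the trap is easy to see with a toy model. The numbers below are invented purely for illustration, not drawn from any study: the outcome C is the number of encounters (B) multiplied by the risk per encounter (D), and an intervention that halves B while tripling D makes C worse, not better.

```python
# Toy illustration of the "single effect trap": intervention A lowers
# the number of encounters (B) but raises the risk per encounter (D),
# so the harmful outcome C = B * D can rise overall.
# All numbers are invented for illustration only.

def outcome(encounters: float, risk_per_encounter: float) -> float:
    """Expected number of adverse outcomes: C = B * D."""
    return encounters * risk_per_encounter

baseline = outcome(encounters=100.0, risk_per_encounter=0.05)
# The intervention halves encounters but triples per-encounter risk
# (e.g. less knowledge of, or access to, protection):
with_intervention = outcome(encounters=50.0, risk_per_encounter=0.15)

print(baseline, with_intervention)  # the "reduced" scenario is worse
```

Verbal reasoning sees only the first factor falling; a model, however crude, forces us to write down the second.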

A 2015 study found no link between abstinence-only education and reduced STI rates. In fact, the rise of the abstinence-only movement in the mid-2000s coincided with a reversal in the long-term decline in teen pregnancy. Meanwhile, comprehensive sex education—which acknowledges system complexity—has been shown to reduce teen birth rates more reliably.

The lesson is clear: verbal reasoning, particularly when linear, is prone to failure in complex environments. Simple stories can be powerfully persuasive, even when they are wrong.

Modelling the Real World—Imperfectly

This is not to suggest that mathematical models are perfect. Far from it. The real world contains non-linearities, tipping points and chaotic dynamics that even the most robust equations cannot fully capture. Chaos theory tells us that small uncertainties in starting conditions can snowball into wildly different outcomes, placing limits on how far ahead any model can reliably project.
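The classic demonstration of this sensitivity is the logistic map, a one-line recurrence that is fully deterministic yet chaotic. The sketch below (a standard textbook example, not anything specific to emissions modelling) starts two trajectories one billionth apart and watches them diverge.

```python
# Sensitive dependence on initial conditions in the logistic map
# x_{n+1} = r * x_n * (1 - x_n), with r = 4 in the chaotic regime.

def logistic_trajectory(x0: float, r: float = 4.0, steps: int = 50) -> list[float]:
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)  # a billionth apart at the start

# Within a few dozen iterations the tiny gap is amplified until the
# two runs bear no resemblance to one another.
print(max(abs(x - y) for x, y in zip(a, b)))
```

No amount of extra decimal places in the starting value removes the problem; it only delays it, which is why chaotic systems cap the useful horizon of any forecast.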

But models, for all their flaws, are preferable to gut feeling. They enforce internal consistency, quantify uncertainty, and help us test assumptions rigorously. Without quantitative modelling, we are left to the whims of our own intuition—and intuition, as behavioural science has shown time and again, is a poor guide in complex systems.

Complexity, Uncertainty and Carbon

The world of greenhouse gas (GHG) emissions is one such system. A company’s future emissions depend not only on what it chooses to do internally—retrofitting buildings, electrifying fleets, redesigning processes—but also on a web of external trends:

  • The rate of grid decarbonisation

  • Shifts in transport and mobility patterns

  • Changes to national policies or carbon pricing

  • Technology cost curves and supply chain evolution

Each of these influences both activity data and emission factors. A firm that fails to model these external trends—choosing instead to focus solely on its internal actions—may be blindsided. It could invest heavily in measures that turn out to be redundant, or miss opportunities to benefit from broader shifts already underway.
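A minimal sketch makes the interaction concrete. Here emissions are modelled as activity data times an emission factor, with all figures invented for illustration: the firm's internal efficiency measures shrink the activity data, while grid decarbonisation, entirely outside its control, shrinks the emission factor.

```python
# Minimal sketch (invented numbers): yearly emissions as
# activity data x emission factor, where the electricity emission
# factor falls with grid decarbonisation regardless of what the
# firm does internally.

def project_emissions(activity_mwh: float,
                      grid_ef_t_per_mwh: float,
                      grid_decarb_rate: float,
                      internal_reduction_rate: float,
                      years: int) -> list[float]:
    """Yearly emissions (tCO2e) under compounding internal and external trends."""
    out = []
    for _ in range(years):
        out.append(activity_mwh * grid_ef_t_per_mwh)
        activity_mwh *= (1 - internal_reduction_rate)    # internal action
        grid_ef_t_per_mwh *= (1 - grid_decarb_rate)      # external trend
    return out

# A firm modelling only its own actions overestimates the benefit of
# measures the grid would have delivered anyway:
with_grid = project_emissions(10_000, 0.4, 0.05, 0.02, 10)
no_grid   = project_emissions(10_000, 0.4, 0.00, 0.02, 10)
```

Comparing the two runs shows how much of the projected decline belongs to the external trend rather than to the firm's own investments.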

Worse, it might suffer unintended consequences akin to those of abstinence-only education: an action intended to reduce emissions could inadvertently drive them up if feedback loops or trade-offs are ignored.

Quantitative Models, with Caveats

That is why quantitative modelling, with explicit assumptions and uncertainty bounds, is essential. Not to predict the future with pinpoint precision, but to understand how different forces interact, and what outcomes are plausible under varying scenarios.
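One common way to make assumptions and uncertainty explicit is Monte Carlo sampling: draw the uncertain drivers from stated ranges and report a band of plausible outcomes rather than a single number. The sketch below uses illustrative distributions chosen for this example, not any real firm's data or Viable Pathway's actual method.

```python
# Hedged sketch of scenario exploration: sample uncertain drivers and
# report a plausible range rather than a point forecast.
# All ranges and figures are illustrative assumptions.
import random

def sample_2035_emissions(rng: random.Random) -> float:
    """One sampled 10-year-out emissions outcome (tCO2e)."""
    activity = 10_000 * (1 - rng.uniform(0.00, 0.04)) ** 10  # internal action
    grid_ef  = 0.4    * (1 - rng.uniform(0.02, 0.08)) ** 10  # grid decarbonisation
    return activity * grid_ef

rng = random.Random(42)  # seeded for reproducibility
samples = sorted(sample_2035_emissions(rng) for _ in range(10_000))
low, high = samples[500], samples[-500]  # roughly the central 90% of outcomes
print(f"Plausible 2035 range: {low:,.0f} to {high:,.0f} tCO2e")
```

The output is a range conditional on the stated assumptions, which is exactly the honest form a projection should take.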

At Viable Pathway, our emissions projections do not claim clairvoyance. Rather, they model how emissions might evolve if certain conditions hold—accounting for both internal strategies and exogenous drivers. They are tools for planning, not prophecies.

Models have limits. But so too do verbal arguments, especially when informed by linear thinking. The real world is messier, more entangled, and more surprising than our intuitions allow. It demands tools that can cope with that messiness—and challenge our assumptions rather than reinforce them.

Conclusion: Modelling with Humility

There is wisdom in recognising that no model is perfect, but some are useful. Knowing the limits of projection is not a reason to avoid it—it is a reason to do it better, and more transparently. Linear thinking is comforting, but inadequate. In a complex world, modelling with humility is not a weakness. It is the foundation of sound decision-making.