From wild business expectations to accountable business choices
Most business leaders have experienced a bold initiative launched with confidence, only to watch it unravel in execution. The revenue targets were ambitious but achievable. The timeline was aggressive but realistic. Yet somehow, six months later, the project is over budget, behind schedule, and delivering only a fraction of the promised benefits.
Business expectations are rarely the problem; the problem begins when those expectations move into execution without being tested, specified, and constrained. Modern business environments are characterized by uncertainty, interdependence, and tight resource boundaries. Rapid technological change has altered the structure of environmental uncertainty for many businesses. Digital platforms reconfigure industry boundaries, artificial intelligence shifts cost curves, regulatory frameworks struggle to keep pace with innovation, and customer behaviour adapts faster than traditional planning cycles can respond. Under such conditions, uncertainty is no longer just stochastic variation around a stable trend; it becomes structural and discontinuous. As a result, traditional forecasting methods, which rely on the assumption of stability and continuity, frequently prove insufficient. Under such uncertainty, systematic evaluation is not optional; it becomes a core strategic discipline. This complex and dynamic nature of the business environment highlights the need for adaptive decision-making (Jijidiana Bakhary et al., 2024).
Traditional planning forecasting extrapolates from historical data using trend analysis, seasonal adjustments, or linear regression. This implicitly assumes that the structural drivers of the past will persist into the future (West, 1996). However, this premise may not hold in rapidly changing business environments. To address uncertainty in decision-making, it is essential to construct probability-weighted models that account for a range of possible future scenarios. Techniques such as system dynamics modelling, Monte Carlo analysis, and agent-based simulation offer sophisticated means of projecting requirements and outcomes (Sterman, 2009).
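The contrast between single-trend extrapolation and a probability-weighted projection can be sketched with a minimal Monte Carlo simulation. The base revenue, growth mean, and volatility below are illustrative assumptions, not figures from any cited study.

```python
import random

random.seed(42)

def simulate_revenue(base=10.0, years=3, n_runs=10_000):
    """Monte Carlo projection: sample an uncertain growth rate per year per
    run instead of extrapolating one fixed historical trend."""
    outcomes = []
    for _ in range(n_runs):
        revenue = base
        for _ in range(years):
            # Illustrative assumption: growth ~ Normal(8%, 10%), not fixed.
            growth = random.gauss(0.08, 0.10)
            revenue *= 1 + growth
        outcomes.append(revenue)
    return outcomes

outcomes = sorted(simulate_revenue())
p10, p50, p90 = (outcomes[int(len(outcomes) * q)] for q in (0.10, 0.50, 0.90))
point_forecast = 10.0 * 1.08 ** 3  # the traditional single-trend extrapolation
print(f"point forecast: {point_forecast:.1f}")
print(f"P10 / P50 / P90: {p10:.1f} / {p50:.1f} / {p90:.1f}")
```

Where the trend extrapolation yields a single number, the simulation returns a P10/P50/P90 band that makes the downside explicit before any commitment is made.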
The Problem of Wild Business Expectations
Nearly every transformational initiative, growth target, digital strategy, and product roadmap begins as a wild expectation. The question is never whether wild expectations will be generated. The question is whether the organisation will subject them to the structured discipline required to convert them into traceable, accountable choices, or whether it will carry them directly into execution, where their wildness will be revealed, expensively, by reality.
According to the findings of Flyvbjerg & Gardner (2023), projects go over budget, over time, and under benefits, repeatedly, with a statistical consistency that is, in their words, ‘remarkable, exhibiting a level of significance rarely seen in the typically variable realm of human behaviour.’ Specifically, across their dataset only 8.5% of projects meet their time and budget expectations simultaneously, and a mere 0.5% deliver on time, on budget, and with the full promised benefits. In the high-stakes arena of modern commerce, “wild business expectations” are more than just optimistic goals; they are structural risks. They typically stem from optimism bias and the planning fallacy. Research consistently shows that decision-makers rely on overly favorable assumptions, leading to systematically inflated revenue projections and underestimated costs (Son & Rojas, 2011; Täuscher & Chafac, 2016). The primary causes of project failure are not technical difficulty, market change, or bad luck; they are delusional planning and strategic misrepresentation at the expectation-formation stage (Flyvbjerg, 2023).
When a business leader sets an ambitious target such as “30% market share,” it is not simply a statement of intent. It is an implicit forecast, built on a series of underlying assumptions: that previous growth drivers will continue to operate, that competitors will not respond aggressively, and that the organisation’s internal capacity will expand as required.
However, empirical evidence shows that these assumptions frequently fail to hold in practice. Structural discipline exists to challenge the conviction that they will reliably deliver the desired outcomes. To apply it, the organisation must document explicitly, in writing, how the current initiative diverges from these assumptions and why those differences might justify a different distribution of outcomes.
A wild business expectation is a narrative, a story about future possibilities; a traceable business decision is a data point, anchored in documented rationale and evidence.
The transition from wild business expectations to accountable choices requires a “pipeline” that acts as a cognitive filter for the business. This pipeline ensures that every business model choice is made with a clear understanding of the associated trade-offs.
Opting for rapid growth, for example, entails a trade-off against short-term profitability.
To establish accountability, the rationale behind each decision must be explicitly documented. This includes recording the assumptions that underpin the choice, specifying the scenarios in which the decision was tested, and identifying the criteria that would render the decision invalid. By doing so, the organisation constructs a robust framework for accountability, making the escalation process transparent. Furthermore, this transparency allows decision-makers and reviewers to directly assess whether the original conditions that justified the decision remain valid. It provides a systematic mechanism to revisit and evaluate choices, ensuring that decisions are not only traceable but also adaptable to changing circumstances. The filtering of wild expectations into structured business requirements is facilitated by three primary lenses:

| Filter | What It Does | Answers | Prevents |
|---|---|---|---|
| Specificity | Convert vague aspirations into measurable claims | “What exactly do we mean?” | Fuzzy goals that cannot be tested |
| Feasibility | Evaluate expectations across multiple scenarios | “What could go wrong?” | Optimism bias and planning fallacy |
| Trade-off | Record what was decided and what was rejected | “What are we giving up?” | Escalation of commitment and blame culture |
Specificity converts vague aspirations into precise, measurable claims. Its empirical importance derives from a basic principle of decision theory: a claim cannot be feasibility-tested, scenario-modelled, or trade-off-compared unless it is specific enough to have a definable truth value.
The claim ‘we will become the market leader in digital banking’ has no testable content until it specifies: in what geography, by what metric, within what timeframe, against which competitors.
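One minimal way to operationalise specificity is to represent a claim as a structured record whose fields force the missing qualifiers to be stated. The field names and target values below are hypothetical illustrations, not part of any cited framework.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StrategicClaim:
    """A claim is testable only once every qualifier is pinned down."""
    metric: str          # by what metric, e.g. "digital-banking market share"
    geography: str       # in what geography
    deadline_year: int   # within what timeframe
    target: float        # the measurable threshold
    competitors: tuple   # against which competitors

    def is_met(self, observed: float) -> bool:
        # A definable truth value: the claim is now true or false, not vague.
        return observed >= self.target

# "Market leader in digital banking" made specific (hypothetical values):
claim = StrategicClaim(
    metric="digital-banking market share",
    geography="Benelux",
    deadline_year=2027,
    target=0.30,
    competitors=("Bank A", "Bank B"),
)
print(claim.is_met(0.27))  # observed 27% share, so the claim is not yet met
```

The point of the structure is that an unfilled field is a visible gap, whereas in prose the same gap hides inside an aspiration.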
Feasibility testing forces a specific expectation to be evaluated not just in the best-case world but across a structured range of plausible conditions. Scenario planning methodology, refined through Shell’s experience from the 1970s through today (Wack, 1985), has accumulated substantial evidence that organisations that subject strategic decisions to multi-scenario stress testing make better decisions than those that plan against a single baseline.
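A multi-scenario stress test can be sketched as a table of named scenarios against which a single expectation is evaluated. Every parameter, threshold, and scenario name below is an illustrative assumption.

```python
# Stress-test one profit expectation across several plausible scenarios
# instead of a single baseline. All figures are illustrative assumptions.
SCENARIOS = {
    "baseline":         {"demand_multiplier": 1.00, "price": 100.0},
    "recession":        {"demand_multiplier": 0.70, "price": 95.0},
    "competitor_entry": {"demand_multiplier": 0.85, "price": 90.0},
    "boom":             {"demand_multiplier": 1.30, "price": 105.0},
}

BASE_UNITS = 1_000
FIXED_COST = 60_000.0
UNIT_COST = 30.0
TARGET_PROFIT = 20_000.0  # the success criterion the strategy must meet

def profit(params):
    units = BASE_UNITS * params["demand_multiplier"]
    return units * (params["price"] - UNIT_COST) - FIXED_COST

results = {name: profit(p) for name, p in SCENARIOS.items()}
feasible = {name for name, v in results.items() if v >= TARGET_PROFIT}
for name, v in sorted(results.items()):
    print(f"{name:>16}: profit {v:>10,.0f}  {'OK' if name in feasible else 'FAILS'}")
```

In this sketch the target is met only under the boom scenario, which is exactly the single-best-case dependence that multi-scenario stress testing is designed to expose.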
Trade-off documentation is the mechanism that records what was decided and what alternatives were rejected. Its empirical basis rests on two streams of research. Walsh and Ungson’s foundational work on organisational memory demonstrated that organisations that systematically retain and access records of past decisions make better subsequent decisions (Walsh & Ungson, 1991). And the absence of documented trade-offs is a primary driver of decision failure, because undocumented decisions cannot be reviewed, learned from, or adjusted when circumstances change (Nutt, 2002). Thus, accountability without documentation is retrospective blame assignment; accountability with documentation is prospective learning governance.
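A trade-off record of the sort described above might be sketched as a small data structure holding the decision, its assumptions, the rejected alternatives, and explicit invalidation criteria. All names, metrics, and thresholds here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DecisionRecord:
    """Prospective learning governance: what was decided, what was rejected,
    and the explicit conditions that would invalidate the decision."""
    decision: str
    assumptions: list
    rejected_alternatives: list
    invalidation_criteria: list  # (description, predicate) pairs

    def still_valid(self, observations: dict) -> bool:
        # The decision survives only while no invalidation criterion fires.
        return not any(pred(observations) for _, pred in self.invalidation_criteria)

record = DecisionRecord(
    decision="Prioritise rapid growth over short-term profitability",
    assumptions=["CAC stays below 120", "monthly churn stays below 3%"],
    rejected_alternatives=["Profit-first consolidation", "Hold and observe"],
    invalidation_criteria=[
        ("CAC exceeds 120", lambda obs: obs["cac"] > 120),
        ("monthly churn exceeds 3%", lambda obs: obs["churn"] > 0.03),
    ],
)
print(record.still_valid({"cac": 110, "churn": 0.025}))  # conditions still hold
print(record.still_valid({"cac": 140, "churn": 0.025}))  # CAC criterion fires
```

Because the invalidation criteria are executable, a later reviewer can re-run them against current observations instead of reconstructing the original reasoning from memory.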
Wild expectations fail not because business leaders are unintelligent or malicious. They fail because optimistic ambition, combined with the absence of structured organisational scrutiny, leads to premature commitment to expectations that are inadequately specified, untested for feasibility, and undocumented. Furthermore, Tetlock’s (2017) research on forecasting accuracy demonstrated that the single most reliable predictor of good forecasting performance was not intelligence, expertise, or experience; it was the structural practice of keeping explicit records of forecasts and systematically comparing them to outcomes.
Quantitative Modeling of Business Strategy Using System Dynamics
Once strategy is expressed as specific, bounded, and accountable propositions, a new class of questions becomes not just answerable but necessary:
- How do these propositions interact?
- What happens when the variable most favourable to one strategic choice is simultaneously the most unfavourable to another?
- How does the strategy perform not at its central estimate but across the full distribution of conditions?
- What is the probability that the strategy meets its own success criteria?
These are questions that no amount of qualitative reasoning can answer reliably. They require computation. Among the most powerful frameworks for quantitative strategy modelling is system dynamics (Sterman, 2009). System dynamics models a business system as:
- a network of stocks, the accumulated quantities such as customers, revenue, capabilities, inventory, and reputation,
- flows, the rates at which stocks change, such as acquisition, attrition, investment, and depreciation, and
- feedback loops, the causal circuits through which a system’s outputs become inputs to its future behaviour.
A business is itself a web of such interconnected variables: price influences demand, demand influences capacity utilisation, capacity utilisation influences error rates, and error rates influence customer retention.
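The causal chain above can be sketched as a minimal discrete-time stock-and-flow loop. The capacity, acquisition, and error-rate parameters are illustrative assumptions, chosen only to make the feedback visible.

```python
# A minimal discrete-time stock-and-flow sketch: demand loads capacity,
# overload raises the error rate, errors erode retention, and the customer
# stock feeds back into future demand. Parameters are illustrative.

customers = 1_000.0          # stock: accumulated customers
CAPACITY = 1_200.0           # constraint: service capacity per period
BASE_ACQUISITION = 80.0      # inflow: new customers per period

history = []
for period in range(24):
    utilisation = customers / CAPACITY
    # Feedback: overload (utilisation > 1) pushes the error rate up.
    error_rate = 0.02 + max(0.0, utilisation - 1.0) * 0.30
    # Errors drive attrition: the outflow grows with the error rate.
    attrition = customers * (0.01 + error_rate)
    customers += BASE_ACQUISITION - attrition
    history.append(customers)

print(f"final customer stock: {history[-1]:,.0f}")
```

The balancing feedback (overload raises errors, errors raise attrition) caps growth just above capacity, a behaviour that no single-trend extrapolation of the early growth would reveal.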

System dynamics models make these causal chains explicit. Instead of asking whether a new pricing strategy will “probably work”, modelling helps to evaluate probability distributions of revenue, cost volatility, and service-level degradation. Businesses also operate within fixed budgets, staffing limits, regulatory requirements, and capacity ceilings. By incorporating these constraints into the modelling process, only strategic alternatives that respect such restrictions are quantitatively assessed. This prevents consideration of unrealistic growth scenarios and maintains focus on viable business options.
System dynamics is not simply a computational exercise appended to business analysis. It is the natural and necessary extension of the three filter lenses discussed earlier. It requires that every stock and flow, whether of revenue, customers, or process capacity, be defined with a specific unit and an explicit relationship to other variables. The model therefore operates within a clearly defined boundary, comprising endogenous variables whose dynamics the model explains and exogenous variables that originate externally. Feasibility testing then identifies the conditions under which the strategy should operate, thereby defining the model’s parameters and the boundaries of its application.
System dynamics models are artefacts of institutional reasoning and learning. They encode, in mathematical form, the organisation’s current best understanding of how its strategic environment works. Documenting trade-offs ensures that the assumptions embedded in the model are recorded explicitly, that the alternatives considered in model design are preserved, and that the reasoning behind structural choices is available for audit and revision (Sterman, 2009).
Setting the Foundations of a Conceptual Business Model
Before any quantitative simulation can be meaningfully applied, the business requirements must first be established, along with a scenario architecture within which the simulation can operate. Scenarios are the specific conditions, events, and external factors under which a business model will operate. Scenario construction is a process of imagining several informed, plausible, alternative future environments in which decisions about the future can be played out. Scenario architecture emerged from strategic management and systems thinking research. It defines what future states are conceivable, what structural forces shape them, and which variables meaningfully shift outcomes. Its purpose is to change current thinking, improve decision-making, enhance human and organizational learning, and improve performance (Chermack, 2004).
If price increases by 10%, will demand decrease by 5%?
Scenarios force all parts of a foundational model to interact simultaneously. They introduce and quantify uncertainty to determine the stress capacities of different business-process relationships. Moreover, they are the bridge between qualitative insight, in the form of narratives, and the mathematical quantification needed to build actionable strategies (Alessa, 2024). With them, simulation becomes structured experimentation within defined future logics. They shape the model’s boundary conditions, variable interactions, and evaluation thresholds.
By building a set of distinct, internally consistent scenarios, such as rapid growth or stagnation, each scenario is sampled with specific input values and the model computes specific results. This produces a distribution of potential outcomes in terms of revenue, profit, or cash flow.
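This sampling step can be sketched as follows: named scenarios carry probability weights and concrete parameter values, and repeated draws turn them into an outcome distribution. All weights, growth rates, and margins below are illustrative assumptions.

```python
import random

random.seed(7)

# Internally consistent named scenarios with illustrative probability weights.
SCENARIOS = [
    ("rapid_growth", 0.25, {"demand_growth": 0.15, "margin": 0.22}),
    ("baseline",     0.50, {"demand_growth": 0.05, "margin": 0.18}),
    ("stagnation",   0.25, {"demand_growth": -0.02, "margin": 0.12}),
]
names, weights, params = zip(*SCENARIOS)

def annual_profit(p, base_revenue=1_000_000.0):
    """Each scenario plugs concrete numbers into the model and gets a
    concrete profit out, giving the variables a specific economic reality."""
    revenue = base_revenue * (1 + p["demand_growth"])
    return revenue * p["margin"]

# Sample scenarios by weight to build a distribution of potential outcomes.
draws = random.choices(params, weights=weights, k=5_000)
profits = sorted(annual_profit(p) for p in draws)
median = profits[len(profits) // 2]
print(f"median profit: {median:,.0f}")
print(f"worst / best: {profits[0]:,.0f} / {profits[-1]:,.0f}")
```

With only three discrete scenarios the distribution has three atoms; in practice, continuous within-scenario variation (as in the Monte Carlo sketch earlier) widens it into a full spread.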
Without scenarios, the model variables are just placeholders. A scenario gives them concrete numerical values, creating a specific economic reality. Hirsch et al. (2013) argue that quantified scenarios increase the impact foresight has on strategic planning by linking qualitative narratives directly to the strategy process. A key concern is ensuring a strong model-reality fit, meaning the simulation accurately mirrors real-world business processes and produces valid outcomes (Lin et al., 2024). Scenario construction operates at three levels.

Macro
At the macro level, scenarios describe the external business environment in which the system will operate.
| Driver | Variable | Measures |
|---|---|---|
| Regulatory change | compliance cost factor | % increase in operating cost |
| Economic cycle | demand multiplier | demand index (0.7–1.3) |
| Competitive entry | market share erosion rate | % loss of customers |
| Demographic shift | customer base growth rate | annual growth % |
The scenarios at this level help to set the boundary conditions and constraints of the business model. Macro factors rarely appear directly in formulas; instead, they shift the parameters of meso and micro variables (Manuj et al., 2009). These drivers are exogenous variables, while operational meso and micro drivers are endogenous variables.
Meso
Scenarios at the meso level represent organizational strategic responses and resource configurations. These translate into capacity and structural variables.
| Driver | Variable | Measures |
|---|---|---|
| Staffing change | number of workers | employees per shift |
| Strategic pivot | product mix ratio | % of orders by category |
| Budget cycles | marketing spend | monthly spend |
| M&A activity | integration delay | months to process alignment |
The scenarios constructed at this level help to test whether the organization can function under different staffing or budget constraints. They are the organizational dependencies of success.
Micro
At the micro level, scenarios describe the behavioural variability of the system’s users.
| Driver | Variable | Measures |
|---|---|---|
| Adoption rates | user conversion probability | % of users |
| Usage patterns | transactions per user | events/day |
| Error frequencies | defect probability | % |
| Support demand | tickets per user | requests/day |
Micro scenarios describe day-to-day behavioral variability, which becomes the stochastic input of the simulation. The scenarios constructed at this level help to test whether users will behave as assumed. They are the behavioral assumptions underlying the forecasts.
When structured properly, the three layers form a causal hierarchy of variables. This hierarchical structure is commonly used in system simulations where environmental uncertainty influences operational variables. The next step is to encode dependencies explicitly, because scenario variables are rarely independent: they are correlated.
In a college canteen “high demand” correlates with “higher error rate” and “longer service time”, while “supplier delay” correlates with “menu restrictions” and therefore “order mix.”
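The canteen example can be sketched as a sampler in which these dependencies are encoded explicitly, rather than drawing each variable independently. Every coefficient and probability below is a hypothetical illustration.

```python
import random

random.seed(1)

def sample_day():
    """Sample one canteen day with dependencies made explicit: demand drives
    error rate and service time together (correlation, not independent
    draws), and supplier delay restricts the menu. Figures are illustrative."""
    demand = random.choice(["low", "normal", "high"])
    orders = {"low": 150, "normal": 250, "high": 400}[demand]
    # Dependency: high demand raises error rate and service time jointly.
    pressure = {"low": 0.0, "normal": 0.5, "high": 1.0}[demand]
    error_rate = 0.01 + 0.04 * pressure
    service_minutes = 3.0 + 4.0 * pressure
    # Dependency: a supplier delay restricts the menu, shifting the order mix.
    supplier_delay = random.random() < 0.10
    menu_size = 6 if supplier_delay else 10
    return {
        "demand": demand, "orders": orders, "error_rate": error_rate,
        "service_minutes": service_minutes, "menu_size": menu_size,
    }

days = [sample_day() for _ in range(1_000)]
high = [d for d in days if d["demand"] == "high"]
low = [d for d in days if d["demand"] == "low"]
print(f"high-demand error rate: {high[0]['error_rate']:.2f}, "
      f"low-demand: {low[0]['error_rate']:.2f}")
```

Sampling the variables independently would understate risk: the bad days in the real system are bad on several dimensions at once, and the encoded dependencies preserve that.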
Documentation of requirements, constraints and trade-offs
In a data-driven decision process, the conceptual model is rarely correct in its first formulation. Instead, it evolves through iterative feedback loops between assumptions, simulation outcomes, and stakeholder interpretation. Thus, documenting requirements, trade-offs, and constraints creates a structured record that allows the conceptual model to be refined systematically as new insights emerge from simulation experiments. Without documentation, a simulation becomes an exploratory exercise without accountability.
Requirements define the objectives that the conceptual model must accomplish and establish the specific outcomes that the simulation needs to assess. They guide model refinement by indicating where the conceptual model must be reconsidered. Trade-offs represent the tensions among competing factors. Operations strategy research highlights that factors such as cost, quality, speed, and flexibility cannot all be maximised simultaneously (Slack & Lewis, 2024). Documenting trade-offs helps to observe how different priorities influence system outcomes. If reducing waiting time increases operating cost beyond acceptable thresholds, the simulation has revealed a structural tension embedded in the system. Constraints define the boundaries within which decisions must operate. These boundaries include regulatory restrictions, budget limitations, technological capacity, or staffing availability. Constraints ensure that simulated scenarios remain feasible and help to identify bottlenecks in the system. Decision theory likewise suggests that business problems become meaningful when choices are made under constraints with measurable consequences.
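Requirements and constraints of this kind can be encoded as explicit checks applied to each simulated option, so that every option is either feasible or rejected for a recorded reason. The thresholds and option figures below are hypothetical.

```python
# Encode one requirement and one constraint as explicit, documented checks so
# each simulated option carries its acceptance or rejection reason with it.
# All thresholds and option numbers are illustrative assumptions.

REQUIREMENTS = {"max_wait_minutes": 10.0}     # what the system must achieve
CONSTRAINTS = {"max_operating_cost": 50_000}  # the boundary it must respect

options = [
    {"name": "status_quo",    "wait_minutes": 14.0, "operating_cost": 40_000},
    {"name": "add_one_lane",  "wait_minutes": 9.0,  "operating_cost": 48_000},
    {"name": "add_two_lanes", "wait_minutes": 6.0,  "operating_cost": 56_000},
]

def evaluate(option):
    """Return the list of recorded rejection reasons (empty means feasible)."""
    reasons = []
    if option["wait_minutes"] > REQUIREMENTS["max_wait_minutes"]:
        reasons.append("fails waiting-time requirement")
    if option["operating_cost"] > CONSTRAINTS["max_operating_cost"]:
        reasons.append("violates cost constraint")
    return reasons

report = {o["name"]: evaluate(o) for o in options}
feasible = [name for name, reasons in report.items() if not reasons]
print(report)
print("feasible options:", feasible)
```

The rejected options keep their reasons attached, which is precisely the traceability the documentation is meant to provide: the structural tension between waiting time and operating cost is visible in the report itself.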
Simulation models operate through cycles of experimentation and refinement. Each iteration of the model produces results that can challenge initial assumptions. Therefore, documentation is essential, as it provides a reference for evaluating whether the simulation is correct or the model has missed variables or the business expectations are unrealistic. Over time, documentation also creates traceability between stakeholder expectations, system design choices, and observed outcomes. Such traceability not only strengthens the credibility of simulation-based decisions but also ensures that future analysts can understand, replicate, and improve the reasoning that guided earlier decisions.
Moreover, experimentation projects tend to drift from a business strategy when requirements, trade-offs, and constraints are not documented clearly. Feature accumulation, misinterpretation of objectives, and evolving assumptions can gradually reshape projects into systems that no longer address the original strategic problem. Comprehensive documentation acts as the stabilizing mechanism within the project lifecycle. By linking stakeholder objectives to measurable requirements and maintaining traceability throughout design, implementation, and evaluation, organizations ensure that solutions remain aligned with their strategy. In complex environments where uncertainty and competing priorities are inevitable, disciplined documentation is therefore essential for sustaining strategic coherence and enabling continuous learning.
The central challenge of modern strategy is not generating ambitious ideas but subjecting them to disciplined scrutiny before they are carried into execution. Specificity forces expectations to become measurable claims. Feasibility subjects these claims to future scenarios rather than a single optimistic projection. Documenting trade-offs records what the organization deliberately chose and what it consciously rejected. When these mechanisms operate together, strategy moves from narrative ambition to accountable decision-making. In this way, an organization does not eliminate uncertainty; it learns to navigate it systematically.
Failure does not begin in strategy execution. It begins months earlier, when a bold business expectation is allowed to pass for a decision.
References
- Alessa, A. (2024). Developing an End-to-End Scenario Simulation Tool for Strategic Decision-Making in a Large Cap OEM—Simulating Multiple Futures to Understand Business Implications of Key Strategic Variables [Aalto University]. https://urn.fi/URN:NBN:fi:aalto-202501292192
- Flyvbjerg, B. (2023, September 25). What You Should Know About Megaprojects and Why: An Overview. Social Science Space. https://www.socialsciencespace.com/2023/09/what-you-should-know-about-megaprojects-and-why-an-overview/
- Chermack, T. J. (2004). Improving decision-making with scenario planning. Futures, 36(3), 295–309. https://doi.org/10.1016/S0016-3287(03)00156-3
- Flyvbjerg, B., & Gardner, D. (2023). How big things get done: The surprising factors that determine the fate of every project, from home renovations to space exploration and everything in between (First edition). Currency.
- Hirsch, S., Burggraf, P., & Daheim, C. (2013). Scenario planning with integrated quantification – managing uncertainty in corporate strategy building. Foresight, 15(5), 363–374. https://doi.org/10.1108/FS-09-2012-0064
- Jijidiana Bakhary, N., Azman, N., & Elabjani, A. (2024). Decision-Making Under Uncertainty: Lessons from Renewable Energy Sector Professionals. Journal of Resource Management and Decision Engineering, 3(2), 32–40. https://doi.org/10.61838/kman.jrmde.3.2.5
- Lin, G.-Y., Wang, Y.-M., & Wang, Y.-S. (2024). Developing and validating an instrument to measure model-reality fit in business simulation-based learning contexts. The International Journal of Management Education, 22(3), 101074. https://doi.org/10.1016/j.ijme.2024.101074
- Manuj, I., Mentzer, J. T., & Bowers, M. R. (2009). Improving the rigor of discrete-event simulation in logistics and supply chain research. International Journal of Physical Distribution & Logistics Management, 39(3), 172–201. https://doi.org/10.1108/09600030910951692
- Nutt, P. C. (2002). Why decisions fail: Avoiding the blunders and traps that lead to debacles (1st ed). Berrett-Koehler Publishers.
- Wack, P. (1985, September). Scenarios: Uncharted Waters Ahead. Harvard Business Review. https://hbr.org/1985/09/scenarios-uncharted-waters-ahead
- Slack, N., & Lewis, M. (2024). Operations Strategy (Seventh edition). Pearson.
- Son, J., & Rojas, E. M. (2011). Impact of Optimism Bias Regarding Organizational Dynamics on Project Planning and Control. Journal of Construction Engineering and Management, 137(2), 147–157. https://doi.org/10.1061/(ASCE)CO.1943-7862.0000260
- Sterman, J. D. (2009). Business dynamics: Systems thinking and modeling for a complex world. Irwin/McGraw-Hill.
- Täuscher, K., & Chafac, M. (2016). Supporting business model decisions: A scenario-based simulation approach. International Journal of Markets and Business Systems, 2(1), 45. https://doi.org/10.1504/IJMABS.2016.078107
- Tetlock, P. E. (2017). Expert political judgment: How good is it? How can we know? (New edition). Princeton University Press.
- Walsh, J. P., & Ungson, G. R. (1991). Organizational Memory. The Academy of Management Review, 16(1), 57. https://doi.org/10.2307/258607
- West, K. D. (1996). Asymptotic Inference about Predictive Ability. Econometrica, 64(5), 1067. https://doi.org/10.2307/2171956
I am an interdisciplinary educator, researcher, and technologist with over a decade of experience in applied coding, educational design, and research mentorship in fields spanning management, marketing, behavioral science, machine learning, and natural language processing. I specialize in simplifying complex topics such as sentiment analysis, adaptive assessments, and data visualization. My training approach emphasizes real-world application, clear interpretation of results, and the integration of data mining, processing, and modeling techniques to drive informed strategies across academic and industry domains.