Category: Case Studies

  • Decision Intelligence in Route Optimisation

    Decision Intelligence in Route Optimisation

    A 6-Week PoC with FedEx European Linehaul

    Executive Summary

    Decision Intelligence moves an organisation beyond the fixed-plan trap toward proactive, automated resilience in route optimisation. By evaluating the strategic trade-offs between explainable Stochastic Programming and scalable Reinforcement Learning, we proved that move-level agility is the key to maintaining flow in a high-uncertainty environment.

    Key Takeaways:

    • Beyond Rigid Scheduling: Shifting from historical templates to dynamic, operational-time decision-making to maximise capacity utilisation.
    • The Technical Showdown: Comparing the audit trails of Stochastic Programming against the autonomous adaptability of Reinforcement Learning.
    • Predictive Simulation: Utilising a road-based “digital sandbox” to test courses of action and mitigate risks before committing resources.
    • Tangible ROI: Delivering financial returns by improving linehaul utilisation and significantly reducing the need for costly ad-hoc transport.

    The Challenge: The Friction of Fixed Planning

    In the high-stakes corridors of European logistics, fixed plans are often the first casualty of reality. For an Operations Director managing a distribution network across the UK and EU, the daily friction is visceral. You are constantly forced to ask: “Should I delay this trailer, so it leaves full, or stick to the schedule? Do I need to commission an expensive ad-hoc truck to cover this surge, or will the bottleneck clear itself?”

    When package volumes fluctuate unpredictably at major hubs, static schedules become more than just an inconvenience—they become a drain on margins and a threat to service levels. At Decision Lab, we operate under a foundational truth: the success of an organisation is nothing but the sum of all its decisions. To help global leaders move beyond the fixed-plan trap, we conducted a six-week Proof of Concept (PoC) with FedEx. This project tackled real-world complexity head-on, proving that Decision Intelligence is the key to transitioning from reactive firefighting to proactive, automated resilience.

    Static Schedules are the Enemy of Efficiency

    The core challenge identified within FedEx’s European network was the inherent limitation of pre-defined linehaul schedules. These schedules were designed for averages, whereas logistics are often defined by exceptions. When incoming and outgoing package volumes at European hubs diverged from the forecast, rigid plans could not adapt.

    A dynamic approach, powered by operational-time decision-making, is the only way to maintain flow in a volatile environment. By rerouting assets and scheduling departures based on real-time parcel traffic rather than historical templates, an organisation can achieve step-change improvements in capacity utilisation.

    Our expertise in AI, ML, simulation, and mathematical optimisation helps organisations cut through complexities in strategic, tactical and operational processes.

    The Solution: Bridging the Gap with Decision Intelligence

    The choice between technical approaches is rarely straightforward; in this case it was a strategic balancing act between Explainability and Scalability. During our PoC, we evaluated two competing methodologies: explainable Stochastic Programming and scalable Reinforcement Learning (RL).

    Stochastic Programming vs. Reinforcement Learning (RL):

    • Primary Strength: Stochastic Programming offers fast solving speed and is mathematically explainable and provable; RL reacts to high uncertainty using World Models and Graph Neural Networks.
    • Logic Basis: Stochastic Programming locates the best strategy to optimise expected outcomes over uncertainty; RL uses a dynamics model to predict the optimum next action.
    • Adaptability: Stochastic Programming handles multiple objectives, making it uniquely suited to balancing cost against customer service levels; RL is observation-size invariant, handling environments with variable data lengths and network nodes.
    • Strategic Risk: Stochastic Programming is consulting-intensive, being very sensitive to human-built heuristics that are expensive and time-consuming to develop; RL is compute-intensive, requiring significant hardware resources to train the World Model.

    While Stochastic Programming offers a clear audit trail for every decision, RL provides the adaptability required for massive, interconnected networks. The right choice depends on whether your organisation prioritises a provably optimal solution or a highly performant best-effort that can autonomously learn the shifting dynamics of global markets.
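    The trade-off Stochastic Programming formalises can be made concrete with a toy two-stage decision: commit a trailer now, or wait and absorb the recourse cost under uncertain incoming volume. Every cost, capacity, and scenario probability below is a hypothetical illustration for exposition, not a FedEx figure.

```python
# Illustrative two-stage decision: dispatch a trailer now, or wait for more
# volume? All numbers are hypothetical, chosen only to demonstrate the logic.

SCENARIOS = [           # (probability, extra parcels arriving if we wait)
    (0.5, 120),
    (0.3, 40),
    (0.2, 0),
]
CAPACITY = 400          # parcels per trailer
LOADED = 300            # parcels already on the trailer
COST_PER_TRAILER = 500  # scheduled linehaul cost (hypothetical)
ADHOC_COST = 900        # emergency ad-hoc truck (hypothetical)

def cost_if_dispatch_now():
    # Dispatch immediately; any later arrivals need an ad-hoc truck.
    return COST_PER_TRAILER + sum(p * (ADHOC_COST if extra > 0 else 0)
                                  for p, extra in SCENARIOS)

def cost_if_wait():
    # Hold the trailer; pay for an ad-hoc truck only if volume overflows.
    expected = 0.0
    for p, extra in SCENARIOS:
        overflow = max(0, LOADED + extra - CAPACITY)
        expected += p * (COST_PER_TRAILER + (ADHOC_COST if overflow else 0))
    return expected

best = min([("dispatch now", cost_if_dispatch_now()), ("wait", cost_if_wait())],
           key=lambda t: t[1])
print(best)
```

    Here the expected-cost calculation favours waiting; changing the scenario probabilities flips the answer, which is exactly the sensitivity a stochastic programme makes explicit and auditable.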

    Don’t Just Predict—Simulate the Impact

    One of the most powerful tools developed for FedEx was a road-based, hub-to-hub package movement simulator. This provides a digital sandbox where controllers can explore alternative COAs (Courses of Action) before committing resources.

    Our completely data-driven deployment method allows us to build these simulations without the months of manual coding traditionally required. By accessing relevant operational and transport data directly, we can simulate supply chain environments to predict the ripple effects of a delay or reroute.

    This tool predicts the impact of different actions, helping to mitigate risks and optimise routes.

    Data Maturity is the Ultimate Competitive Moat

    For large-scale firms with a £200M+ turnover, the transactional backbone—usually an ERP or MRP system like SAP or Oracle—is necessary but insufficient. To achieve true antifragility, you must layer Decision Intelligence over these systems.

    Antifragility is the ability to not just survive volatility, but to actually improve because of it. By utilising a World Model within an RL framework, the system treats every fluctuation in package volume as a learning opportunity, refining its dynamics model to better anticipate future shocks. This requires three layers of data maturity:

    • Strategic Level: Long-term high-level routes, fleet capacity, and cost-per-mile data.
    • Operational Level: Real-time visibility into items currently loaded or waiting at the depot.
    • Historical Level: Deep archives of how volumes fluctuated in similar time slots in the past.

    The Result: Antifragility and Bottom-Line Returns

    In the C-suite, the value of AI is measured by the bottom line. The FedEx project was not an academic exercise; it was focused on delivering the financial returns demanded by an industry with tight margins. The PoC demonstrated that an autonomous planning agent directly impacts:

    • Improved Linehaul Utilisation: Driving higher Overall Equipment Effectiveness (OEE) across the fleet.
    • Reduced Rescheduling: Eliminating the administrative friction and cost of mid-stream plan changes.
    • Minimised Ad-hoc Linehauls: Directly de-risking Operational Expenditure (OPEX) and informing more accurate Capital Expenditure (CAPEX) by reducing the need for emergency transport.

    Conclusion: Toward the Global Digital Twin

    The ultimate evolution of this journey is an advanced road-based package movement digital twin. By connecting multiple hubs in real-time, organisations can create a living model of their entire network that learns, adapts, and optimises itself.

    What is the sum of your organisation’s decisions? How many of your current logistics choices are being left to a fixed plan that no longer fits your reality? In a world of increasing volatility, the goal is no longer just to have a plan—it is to have a system that provides decision clarity and reliable value.

    Transform your logistics operations today. Reach out directly via our contact page, or connect with us on LinkedIn to start a conversation about de-risking your future.


  • Project T-DA: Securing the Shadows with Temporal-Aware AI

    Project T-DA: Securing the Shadows with Temporal-Aware AI

    Sector: Critical Infrastructure Security
    Origin: Decision Lab’s AI Innovation Lab 

    Executive Summary

    • Delivers 24/7 autonomous situational awareness on standard edge hardware, removing the need for costly server infrastructure refits.

    In the high-stakes world of critical infrastructure protection, the gap between a routine patrol and a security breach is often measured in seconds. Yet traditional surveillance systems can be outmanoeuvred by intelligent behaviour. Standard algorithms rely on invariant detections (classification of known objects) and often lack the ability to determine action and intent. While these algorithms can effectively detect and classify inanimate objects that might be inherently harmful or broken, they falter when asked to detect complex actions such as abnormal or harmful patterns of behaviour.

    Emerging from our AI Innovation Lab, the Threat-Detection using Autoencoders (T-DA) programme was designed to close this gap. By combining state-of-the-art Computer Vision with novel temporal awareness, we delivered an unsupervised Deep Learning solution capable of learning the ‘pattern of life’ aspects of behaviour. The result is a system that doesn’t just see movement but understands context—differentiating between a scheduled patrol and an unscheduled intrusion without requiring massive, labelled datasets.

    The Challenge: The Signal in the Noise 

    Our client, responsible for the security of high-sensitivity sites, faced a strategic pain point: operational blindness caused by data overload. Their existing surveillance infrastructure relied on simple motion detection and rule-based triggers. 

    These legacy systems suffered from two critical failures relating to Pattern of Life (PoL): 

    Figure: Pattern of Life anomalies. Action (suspicious, harmful); Time (unusual, after hours); Object (weapon, mask); Role (insider, impersonator).

    High False Positive Rate: Innocent environmental changes (e.g., wind-blown debris) or routine scheduled events triggered constant alarms, leading to operator fatigue and desensitisation. 

    Contextual Blindness: The systems could not distinguish between visually similar but contextually different events. A guard walking a perimeter at 14:00 is routine; a person walking the same path at 03:00 could be a threat. Standard models saw only ‘person walking’. 

    The client required a solution that could autonomously detect anomalies in real-time, operate on resource-constrained edge devices, and—crucially—learn what normal looks like without needing thousands of manually labelled “threat” examples. 

    The Solution: Temporal-Aware Deep Learning

    Decision Lab deployed a cutting-edge unsupervised anomaly detection pipeline that fundamentally changes how machines perceive security footage.

    Traditional systems require training on thousands of examples of threats (which are rare and varied). Instead, we taught the model what normality looks like. By learning the standard pattern of life, the system can autonomously flag any event that deviates—whether it is a known threat type or an entirely new anomaly.

    Figure: Anomaly detection pipeline. Raw sensor datapoints and temporal factors are fused and fed into an autoencoder. In the training loop, the encoder produces a latent representation, the decoder reconstructs the input, and the loss (MSE/binary cross-entropy) is back-propagated to update the weights. At inference, the trained model scores new data and raises an anomaly-detection flag.

    1. The Core Architecture

    To ensure our solution remained model-agnostic, we experimented with various encoder models, ranging from standard LSTMs to Gaussian Mixture Models. For the Proof of Concept (PoC), we implemented a Vision Transformer (ViT)-based Variational Autoencoder (VAE).

    • The Encoder (ViT): Unlike standard CNNs that look at localised pixels, the ViT uses self-attention mechanisms to capture global contextual information from video frames.
    • The Decoder: This component attempts to reconstruct the video frames from the encoder’s summary.
    • The Trigger: If the model cannot accurately reconstruct a scene (resulting in a high reconstruction error), it indicates the event is not in its learned database of normal behaviours, instantly triggering an anomaly alert.
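    The trigger logic can be sketched with a linear autoencoder (truncated SVD) standing in for the ViT-based VAE: the thresholding on reconstruction error is the same even though the real encoder is far richer. All data below is synthetic.

```python
import numpy as np

# Minimal stand-in for the reconstruction-error trigger: a linear
# autoencoder (truncated SVD) learns "normal" frames; anything it cannot
# reconstruct well is flagged. The real PoC used a ViT-based VAE; this
# sketch only illustrates the thresholding logic on synthetic data.

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(200, 32))    # "normal" frames (flattened)
anomaly = rng.normal(5.0, 1.0, size=(32,))       # an out-of-distribution frame

# Fit: keep the top-k principal directions of the normal data.
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
components = vt[:8]                              # 8-dimensional latent space

def reconstruction_error(x):
    z = (x - mean) @ components.T                # encode
    x_hat = z @ components + mean                # decode
    return float(np.mean((x - x_hat) ** 2))      # per-frame MSE

# Threshold at the 99th percentile of errors seen on normal data.
threshold = np.percentile([reconstruction_error(f) for f in normal], 99)
is_anomaly = reconstruction_error(anomaly) > threshold
print(is_anomaly)
```

    Because the model only ever learns normality, a frame it has never seen produces a large error and trips the flag, with no labelled threat examples required.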

    2. Innovation: Temporal Integration

    Standard computer vision models are time-blind; they see a person walking but do not know if it is 14:00 (routine) or 03:00 (suspicious). To solve this, we engineered a novel Cyclic Time Encoding mechanism.

    • Cyclic Encoding: We encoded timestamps using sine and cosine functions. This captures the periodic nature of time (24-hour cycles) more effectively than linear numbers.
    • Contextual Conditioning: This time vector modulates the model’s latent space, effectively teaching the AI that Activity A is normal at Time X, but anomalous at Time Y.
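    A minimal sketch of the cyclic encoding, assuming a 24-hour period: each timestamp is mapped to a point on the unit circle, so times just before and just after midnight end up adjacent rather than 24 units apart.

```python
import math

def cyclic_time_encoding(hour: float) -> tuple[float, float]:
    """Encode an hour-of-day (0-24) on the unit circle, so that
    23:59 and 00:01 are neighbours rather than 24 units apart."""
    angle = 2 * math.pi * hour / 24.0
    return math.sin(angle), math.cos(angle)

def cyclic_distance(h1: float, h2: float) -> float:
    # Euclidean distance between the two encoded points (chord length).
    return math.dist(cyclic_time_encoding(h1), cyclic_time_encoding(h2))

# 23:00 and 01:00 are close on the circle; 03:00 and 15:00 are opposite.
print(cyclic_distance(23, 1) < cyclic_distance(3, 15))  # True
```

    A linear encoding would place 23:00 and 01:00 at opposite ends of the feature range; the sine/cosine pair preserves the 24-hour periodicity the model needs.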

    This approach yielded two critical capabilities:

    1. Temporal Anchoring: We successfully introduced a temporal factor—contextual metadata that anchors the model in time, rather than relying solely on visual pixel data.
    2. Scalable Context: While this PoC used timestamps, the architecture can ingest any form of metadata. The model can be conditioned on geographical data (weather, pressure), or solution-specific constraints (security clearance levels, job titles), making T-DA highly adaptable across defence, supply chain, and rail verticals.

    3. Edge Deployment (SWaP Optimised)

    Meeting the strict requirements for defence operations, we optimised the model using FP16 (16-bit floating point) precision.

    • Size Reduction: This compressed the model size by 50%.
    • Performance: The system runs efficiently on resource-constrained edge devices (e.g., drones, remote sentries) without sacrificing detection accuracy.
    • Security: Data is processed locally, reducing bandwidth requirements and closing the attack surface associated with cloud transmission.
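    The arithmetic behind the 50% saving is simply a cast from 32-bit to 16-bit floats, shown here on a hypothetical weight matrix; real deployments use framework-level mixed-precision tooling rather than a raw array cast.

```python
import numpy as np

# FP16 sketch: casting weights from 32-bit to 16-bit floats halves the
# memory footprint. The weight matrix below is a hypothetical layer;
# in practice this is done with framework mixed-precision tooling.

weights_fp32 = np.random.default_rng(1).normal(size=(1024, 1024)).astype(np.float32)
weights_fp16 = weights_fp32.astype(np.float16)

ratio = weights_fp16.nbytes / weights_fp32.nbytes
print(f"size reduction: {1 - ratio:.0%}")

# Rounding cost of the cast: small relative to unit-scale weights.
max_err = float(np.abs(weights_fp32 - weights_fp16.astype(np.float32)).max())
print(f"max rounding error: {max_err:.2e}")
```

    For unit-scale weights, the rounding introduced by the cast is on the order of 10⁻³, which is why detection accuracy can survive the compression.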

    Results & Impact 

    The T-DA project successfully demonstrated that autonomous systems can reduce the cognitive load on human operators while enhancing threat detection. 

    • High Precision: The ViT-based VAE achieved an ROC AUC of 0.855 on general visual anomaly detection, validating the unsupervised approach. 
    • Operational Efficiency: The move to FP16 precision resulted in a 50% reduction in model size and significant runtime memory savings, enabling deployment on standard edge hardware rather than expensive server racks. 

    • Reduced Fatigue: By automating the detection of contextually specific anomalies, the system significantly reduced the time security personnel spent reviewing false alarms, allowing them to focus on genuine threats.

    FOCUS: AI TRiSM (Trust, Risk, and Security Management) 

    As part of Decision Lab’s commitment to Responsible AI (read our full series here), the T-DA project was developed in strict alignment with the AI TRiSM framework. In high-stakes defence environments, an AI model must be as trustworthy as the officers using it. 

    1. Trust: Explainability beyond the Black Box 

    A security operator cannot act on an alert they don’t understand. We moved beyond simple “anomaly scores” by integrating Explainable AI (XAI) techniques. 

    • Heatmaps: The system provides real-time reconstruction error heatmaps, visually highlighting exactly where in the frame the anomaly is occurring (e.g., highlighting a specific backpack or unauthorised vehicle). 
    • Contextual Logic: We explored the integration of LLMs to generate natural language explanations, translating complex vector data into clear summaries: ‘Unusual activity detected: Person running at 02:45 AM (high deviation from routine).’ 
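    The heatmap idea in miniature: keep the reconstruction error per pixel instead of averaging it into a single score. The "frame" and "model output" below are synthetic placeholders, not real surveillance data.

```python
import numpy as np

# Heatmap sketch: per-pixel reconstruction error localises the anomaly.
# A region the model cannot reconstruct (here, a synthetic 4x4 patch)
# lights up in the error map; the rest of the frame stays dark.

frame = np.zeros((16, 16))
frame[6:10, 6:10] = 1.0              # anomalous patch the model has not seen
reconstruction = np.zeros((16, 16))  # model reproduces only the learned background

heatmap = (frame - reconstruction) ** 2
hot_y, hot_x = np.unravel_index(np.argmax(heatmap), heatmap.shape)
print((hot_y, hot_x))                # hottest pixel lies inside the patch
```

    Rendering this map over the video frame is what lets an operator see where the anomaly is, not just that one exists.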

    2. Risk: Proactive Reliability 

    Unsupervised models can drift over time if the environment changes. We mitigated this risk through: 

    • Synthetic Anomaly Injection: To rigorously test the system before deployment, we developed a methodology to inject synthetic temporal anomalies into the data, ensuring the model could catch threats that hadn’t happened yet. 
    • Bias Audits: We conducted formal audits of the training data to ensure the normal baseline didn’t inherently bias the model against specific demographics or harmless behaviours. 

    3. Security: ModelOps & Data Integrity 

    Security is paramount not just in the physical site, but in the digital pipeline. 

    • Data Protection: We implemented encrypted channels for all video streams and strict access controls for training data. 
    • ModelOps: A robust lifecycle management framework was established, including version control for model weights and automated drift detection to trigger retraining. This ensures the model adapts to new patterns of life securely and transparently. 

    Learn more in our AI TRiSM blog series.

    Conclusion 

    The T-DA project illustrates the power of the Decision Lab Innovation Lab to translate theoretical AI advances into robust, deployable security solutions. By treating time as a critical feature of reality, we moved surveillance from reactive monitoring to proactive threat detection. 

    Project T-DA Key Facts:

    • Developer: Decision Lab
    • Primary Tech: Vision Transformer (ViT) & Variational Autoencoder (VAE)
    • Innovation: Cyclic Time Encoding (Temporal Awareness)
    • Use Case: Unsupervised anomaly detection for critical infrastructure.
    • Performance: 0.855 ROC AUC with 50% model compression via FP16.

    Would you like to explore how Decision Lab can streamline your operations? Contact our Innovation Team today.

  • Optimised Production and Sustainable Capacity Planning

    Optimised Production and Sustainable Capacity Planning

    GSK’s Journey to Enhanced Supply Chain Antifragility


    The Challenge: Navigating Volatility in Pharma & Life Sciences Manufacturing

    Global pharmaceutical manufacturers seek to optimise production and capacity planning. Challenges such as changeover downtime, CAPEX tied up in production lines, and the need to rapidly scale or reconfigure capacity to meet fluctuating demand and forecasts are common. In a sector where agility and resilience are paramount, these pain points directly impact operational efficiency, financial performance, and the ability to achieve critical sustainability goals.

    GSK, a global biopharmaceutical leader with a vast manufacturing footprint, encountered a specific, yet broadly relevant, challenge at its Irvine site. While a clear correlation existed between batch production schedules and energy consumption, the unpredictable nature of machine usage made accurate energy forecasting incredibly difficult. GSK sought greater foresight to understand the true financial and environmental impact of its production planning decisions and to strategically integrate renewable energy sources. This scenario is a microcosm of the larger need for supply chain antifragility in the pharma sector – the ability not just to withstand disruption, but to improve and adapt.

    The Solution: Predictive Analytics & Simulation for Antifragile Operations

    Decision Lab partnered with GSK to address these complexities by developing a cutting-edge hybrid model. This innovative solution seamlessly integrated machine learning (ML) with advanced simulation, creating a powerful decision-support tool that connects production planning directly with energy consumption forecasting. This approach empowered GSK to achieve smarter, more sustainable, and antifragile operations.

    Key Components of the Decision Lab Solution:

    • Machine Learning (ML) for Precision Forecasting:
      • A Python-based model meticulously analysed historical energy data, establishing a precise baseline for consumption under various production plans.
      • A Random Forest Regressor was employed to predict electricity and steam usage with high accuracy, taking into account machine schedules and their inherent variability.
    • Simulation for What-If Scenario Planning:
      • Outputs from the ML model were fed into a dynamic AnyLogic simulation environment. This allowed GSK to virtually assess the impact of introducing additional renewable energy sources, such as wind and solar, into the site’s energy infrastructure. The simulation established a real-time link between production cycle plans and energy usage.
      • The model facilitated extensive “what-if” scenario testing, enabling GSK to explore alternative manufacturing plans and simulate unforeseen operational disruptions or system failures, using Monte Carlo simulations to account for inherent variability and uncertainty.
      • Detailed tracking of machine utilisation and energy consumption identified inefficiencies and provided actionable insights for energy reduction. The model incorporated variables such as electricity, steam, gas usage, and renewable energy generation for a holistic view.
      • AnyLogic’s robust capabilities and seamless Python integration via Pypeline were crucial, enabling real-time data exchange and dynamic, accurate modelling.
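    The forecasting step can be sketched with scikit-learn's RandomForestRegressor, the model family named above. The features, coefficients, and data below are invented placeholders for illustration, not GSK's operational data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Sketch of the forecasting step: predict hourly energy use from a
# machine schedule. Features and the "ground truth" relationship are
# synthetic placeholders, not GSK site data.

rng = np.random.default_rng(42)
n = 500
machines_running = rng.integers(0, 10, size=n)   # machines active per hour
batch_changeovers = rng.integers(0, 3, size=n)   # changeovers per hour
X = np.column_stack([machines_running, batch_changeovers])
# Hypothetical relationship: ~50 kWh per machine, ~30 kWh per changeover.
y = 50 * machines_running + 30 * batch_changeovers + rng.normal(0, 5, size=n)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
pred = model.predict([[8, 1]])[0]                # 8 machines, 1 changeover
print(f"forecast: {pred:.0f} kWh")
```

    Predictions of this form, generated for each hour of a candidate production plan, are what the simulation layer then consumes for what-if analysis.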

    The Impact: Building a Resilient, Sustainable, and Cost-Efficient Future

    The implementation of Decision Lab’s predictive analytics and simulation solution provided GSK with significant strategic advantages, directly contributing to a more antifragile supply chain and addressing core manufacturing pain points:

    • Enhanced Energy Consumption Forecasting: Achieved highly accurate predictions for energy usage, enabling better budgeting and proactive planning.
    • Optimised Renewable Energy Integration: Data-driven insights guided strategic investments in solar and wind energy, accelerating progress towards sustainability goals.
    • Streamlined Production Plans: The ability to link production schedules with energy impact allowed for optimising manufacturing plans, leading to reduced energy costs and environmental footprint.
    • Reduced Operational Costs and Emissions: Through intelligent planning and renewable energy adoption, GSK realised potential cost savings and significant reductions in greenhouse gas emissions.
    • Increased Operational Resilience: The simulation capabilities allowed GSK to proactively evaluate and prepare for potential system failures and disruptions, minimising downtime and energy waste. This directly mitigates the risk of CAPEX being tied up in underutilised or vulnerable lines.
    • Improved Capacity Planning Agility: By understanding the energy implications of different production plans, GSK gained greater flexibility in ramping up or changing capacity to meet demand, addressing a key pain point for global pharma manufacturers.

    This successful collaboration with GSK underscores Decision Lab’s expertise in delivering intelligent, data-driven solutions that not only meet immediate operational needs but also build a foundation for long-term supply chain antifragility in the demanding Pharma & Life Sciences sector. It provides a scalable blueprint for other mid-to-large global pharma manufacturers striving for optimised production, sustainable capacity planning, and enhanced resilience in an ever-evolving market.

    Want to learn how Decision Lab can help your organisation build an antifragile supply chain? Contact us today to discuss your specific challenges in production and capacity planning.

  • Forging an Antifragile Future for UK Medicines Manufacturing

    Forging an Antifragile Future for UK Medicines Manufacturing

    Executive Summary

    The UK’s world-leading pharmaceutical manufacturing sector faces a critical inflection point. Confronted with volatile energy markets, mounting operational costs, and the NHS’s ambitious decarbonisation targets, the industry requires a paradigm shift from rigid, energy-intensive processes to a more dynamic, resilient, and sustainable model. In response, Innovate UK launched its Sustainable Medicines Manufacturing Grand Challenge.

    The Decision Lab, leading a strategic consortium of industry giants and academic experts, was awarded funding to pioneer a solution. Our project, EcoSynth, introduces an AI-powered energy orchestrator that transforms production planning from a static, siloed function into an intelligent, integrated system. By synchronising production schedules with real-time energy data, carbon intensity forecasts, and on-site renewables, we are building an antifragile manufacturing ecosystem. Initial pilots have already demonstrated the potential for 7-12% energy savings, up to 20% reduction in Scope 2 CO₂ emissions, and a 10% uplift in asset utilisation. This initiative is not just about incremental improvements; it is about creating a replicable, scalable blueprint to secure the UK’s position as a global leader in low-carbon, resilient medicines manufacturing.

    The Challenge: A High-Stakes Environment

    Pharmaceutical manufacturing is the backbone of UK life sciences, yet it operates under immense pressure. The sector spends over £1 billion annually on energy and is responsible for an estimated 5% of Europe’s industrial CO₂ emissions. This environmental footprint is coupled with significant economic and regulatory challenges:

    • Energy Volatility: Unpredictable energy prices create major financial risks for manufacturers whose validated processes cannot be easily adjusted.
    • NHS Net Zero Targets: The NHS, a primary customer, has mandated that suppliers publish carbon reduction plans and footprints, making decarbonisation a commercial imperative.
    • Operational Rigidity: Traditional planning systems are blind to energy and carbon data. Schedules are optimised for throughput and compliance, leaving significant cost and carbon savings untapped and exposing the supply chain to disruption.

    The challenge was clear: to move beyond isolated efficiency measures and fundamentally redesign the operational DNA of medicines manufacturing to be smarter, more adaptive, and inherently sustainable.

    Our Solution: The EcoSynth AI-Powered Orchestrator

    Leveraging our deep expertise in energy management, strategic production planning, and optimisation, The Decision Lab convened a powerhouse consortium including GSK, AstraZeneca, the Centre for Process Innovation (CPI), Cell and Gene Therapy Catapult (CGTC), Aston University, Industrial Systems and Control, and Transformational Energy.

    Together, we are developing EcoSynth: an AI-powered energy orchestrator that serves as an intelligent co-pilot for production planners.

    EcoSynth integrates disparate data streams—from manufacturing execution systems (MES), grid carbon intensity forecasts, and on-site energy assets—into a single, dynamic decision-support environment. Its core innovation lies in its ability to:

    1. Forecast and Model: Predict energy demand and carbon impact at an asset level.
    2. Optimise Schedules: Recommend adjustments to production sequences to align with periods of low-cost, low-carbon energy, without compromising GMP compliance or throughput.
    3. Simulate Scenarios: Allow planners to test strategies and build resilience against market or grid disruptions, creating a truly antifragile operational capability.
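    The scheduling idea can be illustrated with a greedy sketch that places flexible batches into the lowest-carbon hours. The intensity forecast and batch parameters are hypothetical, and a production system would respect GMP, sequencing, and throughput constraints that this sketch ignores.

```python
# Greedy sketch of carbon-aware scheduling: place flexible one-hour
# batches in the lowest carbon-intensity hours. All figures below are
# hypothetical illustrations, not real grid or site data.

carbon_forecast = {        # hour -> forecast grid intensity (gCO2/kWh)
    0: 120, 1: 110, 2: 100, 3: 95, 4: 105,
    5: 140, 6: 180, 7: 230, 8: 260, 9: 240,
}
batches_needed = 4         # order-free one-hour batches to schedule
energy_per_batch = 500     # kWh per batch (hypothetical)

# Pick the cheapest-carbon hours first.
chosen = sorted(carbon_forecast, key=carbon_forecast.get)[:batches_needed]
scheduled_co2 = sum(carbon_forecast[h] for h in chosen) * energy_per_batch / 1000
# Compare with a naive daytime schedule in hours 6-9.
naive_co2 = sum(carbon_forecast[h] for h in range(6, 10)) * energy_per_batch / 1000

print(sorted(chosen))
print(f"{scheduled_co2:.0f} kg vs naive {naive_co2:.0f} kg CO2")
```

    Even this crude heuristic shifts the load into the overnight low-carbon window; the real orchestrator solves a constrained optimisation over the same kind of forecast.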

    This project, funded by Innovate UK, moves beyond theory. It involves live pilots at up to eight diverse manufacturing sites, including a pioneering National Readiness Demonstrator that integrates on-site hydrogen generation to stress-test the system in a next-generation energy environment.

    Business Outcomes and Strategic Impact

    The EcoSynth project is already delivering significant, measurable results that create a powerful business case for change.

    Quantifiable Performance Gains:

    • 7-12% reduction in total site energy consumption.
    • Up to 20% reduction in Scope 2 CO₂ emissions through intelligent scheduling.
    • 10% uplift in asset utilisation by smoothing production flow and reducing downtime.
    • An estimated £100M+ in potential annual savings if scaled across the UK pharma sector.

    Strategic Business Benefits:

    • Enhanced Resilience: By creating flexibility in energy use, the system makes the supply chain less vulnerable to price shocks and grid instability, enhancing its antifragility.
    • Competitive Advantage: Directly supports compliance with the NHS Net Zero Supplier Roadmap, securing and strengthening crucial commercial relationships.
    • Increased Productivity: Frees up production capacity and capital by optimising existing assets rather than requiring expensive new infrastructure.
    • UK Economic Strength: Anchors high-value jobs and intellectual property in the UK, aligning with the government’s Life Sciences Vision and Advanced Manufacturing Plan.

    By proving that sustainability and productivity are not mutually exclusive, we are providing a practical, scalable blueprint for deep industrial decarbonisation. This positions the UK to lead the global transition to cleaner, more efficient, and resilient medicines manufacturing.

  • Strategic Capacity Planning for a Revolutionary Pharmaceutical Development

    Strategic Capacity Planning for a Revolutionary Pharmaceutical Development

    Executive Summary

    A global pharmaceutical leader was on the verge of launching a revolutionary, disease-modifying treatment for a debilitating neurodegenerative condition. While the new therapy offered unprecedented hope, it also presented a monumental challenge: preparing a national health system for the surge in demand for diagnostics and treatment administration. The existing infrastructure was not equipped to handle the complex patient pathway required, threatening to create significant bottlenecks and delay patient access.

    Decision Lab partnered with the client to develop a sophisticated discrete-event simulation model. This powerful decision-support tool allowed stakeholders to visualise the entire patient journey, from initial GP referral to treatment. By modelling various scenarios, the tool identified critical constraints in diagnostic capacity (MRI, PET, CSF tests) and infusion services.

    The Challenge: Preparing for a Paradigm Shift in Neurological Care

    The introduction of the first-ever treatments designed to tackle the underlying causes of a progressive neurological condition marked a pivotal moment in medicine. Our client, a pioneer in this field, recognised that the success of their new drug depended not just on its efficacy, but on the healthcare system’s ability to deliver it.

    The new treatment required a complex and resource-intensive diagnostic process involving PET scans, MRI scans, and specialist consultations to confirm eligibility. Furthermore, ongoing monitoring was necessary to manage potential side effects. Projections indicated that up to 280,000 patients in England alone could be eligible, placing an unprecedented strain on a system already facing capacity constraints.

    The core challenge was to understand and mitigate the risks posed by these new demands. The client needed to:

    • Identify and quantify potential bottlenecks in the diagnostic and treatment pathway.
    • Forecast the impact of a significant increase in patient referrals on existing resources.
    • Develop strategies to optimise patient flow and build a resilient, or antifragile, healthcare ecosystem.
    • Communicate these complex challenges to healthcare payers and providers to facilitate proactive service redesign.

    Without a clear, evidence-based view of the future, the launch risked being hampered by long waiting lists, delayed diagnoses, and inequitable patient access.

    The Solution: A Strategic Partnership in Simulation

Decision Lab fosters strategic partnerships with our clients, taking the time to understand the intricate details of their challenges. In this case, we collaborated closely with the client and their data partners to design and build a bespoke discrete-event simulation model. This wasn’t just about delivering a tool; it was about co-creating a solution to a strategic problem.

    Our process involved:

    • Deep-Dive Discovery: We held intensive workshops with the client’s clinical and market access teams to map out the complex “as-is” patient pathway and a hypothetical optimised “ideal” pathway.
    • Agile Development: Using an agile methodology, we built the simulation in iterative sprints. This allowed for continuous feedback and ensured the model accurately reflected the nuances of the UK healthcare environment.
    • Data Integration: The model was populated with robust, real-world data, including primary care activity, hospital episode statistics, and findings from clinical literature, to provide a credible and reliable foundation for analysis.

    The resulting decision-support tool, integrated into a user-friendly web platform, empowered the client to:

    • Simulate Patient Flow: Model the journey of thousands of patients through the system over a three-year period.
    • Test Scenarios: Compare the “as-is” pathway against optimised models, adjusting over 50 variables, including patient numbers, resource availability (e.g., MRI hours per week), and pathway logic.
    • Visualise Outcomes: Generate clear, intuitive dashboards and reports that highlighted key metrics such as average time-to-diagnosis, waiting list sizes for specific tests, and total infusion hours required.

    This simulation provided a virtual sandbox where different strategies could be tested and their outcomes measured, turning uncertainty into actionable insight.
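The virtual-sandbox idea can be illustrated in a few lines of Python. The sketch below is not the client model; it is a minimal discrete-event queue with a single diagnostic step (MRI), a FIFO waiting list, and a configurable number of scanners. Every rate and scan time is invented for illustration.

```python
import heapq
import random

def run_scenario(referral_rate, n_scanners=2, scan_hours=1.0,
                 n_patients=20000, seed=42):
    """Mean referral-to-diagnosis time (hours) for a FIFO queue
    served by `n_scanners` identical MRI scanners."""
    rng = random.Random(seed)
    free_at = [0.0] * n_scanners       # when each scanner next becomes free
    heapq.heapify(free_at)
    t = 0.0                            # current clock time (hours)
    total = 0.0
    for _ in range(n_patients):
        t += rng.expovariate(referral_rate)     # next referral arrives
        start = max(t, heapq.heappop(free_at))  # earliest free scanner, FIFO
        heapq.heappush(free_at, start + scan_hours)
        total += (start + scan_hours) - t       # wait + scan = time to diagnosis
    return total / n_patients

baseline = run_scenario(referral_rate=1.5)
surge = run_scenario(referral_rate=1.5 * 1.25)  # 25% more referrals
print(f"baseline: {baseline:.2f} h, surge: {surge:.2f} h")
```

Even this toy model reproduces the qualitative finding above: a 25% rise in referrals lengthens the average time to diagnosis disproportionately, because waiting lists grow non-linearly as utilisation approaches capacity.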

    Results and Business Outcomes: Building a Resilient, Antifragile System

    The simulation model delivered immediate and significant value, transforming our client’s conversations with healthcare stakeholders from speculative to strategic. The key business outcome was the ability to build a compelling, data-driven narrative for change.

    Key Metrics and Outcomes:

    • Quantified Bottlenecks: The model precisely calculated the impact of increased demand. For one scenario, it showed that with a 25% increase in patient referrals, the waiting list for memory assessments would grow by over 200% within three years under the current system.
    • Evidence for Optimisation: By simulating an ‘ideal’ pathway that co-located diagnostic services, the model demonstrated a potential 47% reduction in the average time to diagnosis and a 35% reduction in the average time to treatment initiation.
    • Strategic Resource Planning: The tool provided clear data on resource utilisation, showing, for example, the exact number of additional weekly MRI hours and infusion clinic appointments needed to meet demand, enabling targeted investment discussions.

    By using the simulation, our client helped healthcare systems become more antifragile—not just robust enough to withstand the shock of new demand, but capable of adapting and growing stronger. They could proactively identify pressures and design more efficient, streamlined services. This strategic foresight ensured that the launch of their ground-breaking treatment would be defined by patient benefit, not by system failure, cementing their position as a true partner to the healthcare community.

  • Antifragile Pharmaceutical Production

    Antifragile Pharmaceutical Production

    Executive Brief

    The Challenge: AstraZeneca’s long-range strategic production planning was constrained by a fragmented and manual Excel-based process. This created a fragile system, slow to react to market volatility and unable to provide the data-driven confidence needed for multi-billion-pound investment decisions. The key business risk was inefficient capital allocation and a potential inability to meet future global demand in a complex, uncertain environment.

    The Solution: Decision Lab partnered with AstraZeneca in a deeply collaborative engagement to co-create a dynamic simulation twin of their global manufacturing network. This unified, web-based platform integrates three powerful simulation models (Portfolio, Demand, and Capacity) into a single, seamless user experience. By automating data flows and enabling rapid, sophisticated scenario analysis, the solution replaced a reactive process with a proactive, strategic capability.

    The Outcome: The partnership delivered an antifragile production strategy, empowering AstraZeneca to not just withstand uncertainty, but to gain advantage from it. The platform achieved a high System Usability Scale (SUS) score of 75.0, ensuring strong adoption. The company can now model complex “what-if” scenarios in a fraction of the time, turning strategic planning into a source of competitive advantage and ensuring confident, optimised decision-making for the next decade.

    Building an Antifragile Future: How AstraZeneca Partnered with Decision Lab to Revolutionise Production Strategy

    The pharmaceutical landscape is defined by volatility—patent cliffs, pipeline uncertainties, and fluctuating global demand. For a leader like AstraZeneca, navigating this requires more than just resilience; it demands an antifragile strategy. An antifragile system is one that doesn’t just resist shocks, but learns and improves from them. Faced with the limitations of traditional, static planning tools, AstraZeneca partnered with Decision Lab to transform their long-range production and capacity planning, embedding agility and data-driven confidence into the core of their global operations. This partnership moved them beyond simple forecasting towards a future-proofed manufacturing network designed to thrive on uncertainty.

    The Challenge: From Reactive Planning to Proactive Strategy

AstraZeneca’s existing strategic planning process relied on a complex web of disconnected, manually intensive spreadsheets. This approach was not only slow but also dangerously susceptible to data inconsistencies and errors, making it difficult to model future uncertainties effectively. The process involved multiple stakeholders across different business units and partner organisations, each with their own data and assumptions, making alignment a significant challenge.

    Answering critical “what-if” questions—such as the impact of a clinical trial success, a change in the R&D pipeline, or a supply chain disruption—was a time-consuming ordeal that could take weeks. This reactive posture created significant business risk, potentially leading to inefficient capital allocation, delayed product launches, and a fragile manufacturing network unable to adapt quickly to market shocks or seize emerging opportunities. The need was clear: a fundamental shift from a rigid, retrospective process to a dynamic, forward-looking strategic capability that could provide a single source of truth for the entire organisation.

    The Partnership & Solution: A Collaborative Simulation Twin

    Decision Lab’s philosophy is built on strategic partnership. We embedded a dedicated team of simulation, software, and AI experts to work in close, agile collaboration with key stakeholders from AstraZeneca and their partners, including GSK. This ensured the solution was not just technically robust, but also deeply aligned with their commercial and operational goals. Through iterative development and continuous user feedback, we co-created a solution that addressed their unique challenges head-on.

    The result is a sophisticated web-based platform—a dynamic simulation twin of their entire manufacturing network. This unified system seamlessly integrates three powerful, interconnected simulation models, hosted on the AnyLogic Cloud model management platform for scalability and performance:

    • The Portfolio Model: Looks at the current R&D pipeline and stochastically generates a range of plausible future asset lifecycles, accounting for the inherent uncertainty of clinical development.
    • The Demand Model: Takes the outputs from the Portfolio model and translates them into a detailed, 10-year quarterly forecast of active ingredient requirements across the globe.
    • The Capacity Model: This is the strategic core. It takes the demand forecast and evaluates a vast array of manufacturing options, investment plans, and supply chain configurations to determine the most efficient and robust way to meet that demand.

    Our team engineered a custom data pipeline using web sockets and a modern ReactJS front-end. This automates the flow of data between the models, eliminating manual errors and creating a single, trusted source of truth. The platform allows AstraZeneca to simulate, stress-test, and optimise their strategy against countless future scenarios. This builds an inherently antifragile operational backbone that doesn’t just withstand volatility, but allows the organisation to learn and improve from it.
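To illustrate how three chained models can hand data to one another, here is a heavily simplified sketch. The function names mirror the Portfolio, Demand, and Capacity models, but every number, probability, and formula in it is invented for the example.

```python
import math
import random

def portfolio_model(n_assets=5, seed=1):
    """Stochastically sample which pipeline assets reach market
    (illustrative 70% launch probability) and their launch quarter."""
    rng = random.Random(seed)
    return [{"asset": f"A{i}", "launch_q": rng.randrange(0, 8)}
            for i in range(n_assets) if rng.random() < 0.7]

def demand_model(launched, quarters=40, peak=100.0):
    """Translate launches into a quarterly active-ingredient forecast:
    a simple linear ramp to an assumed peak over 8 quarters."""
    demand = [0.0] * quarters
    for a in launched:
        for q in range(a["launch_q"], quarters):
            ramp = min(1.0, (q - a["launch_q"] + 1) / 8)
            demand[q] += peak * ramp
    return demand

def capacity_model(demand, line_capacity=150.0, installed_lines=2):
    """Report peak demand, lines required at peak, and the first quarter
    where installed capacity is exceeded (None if never)."""
    peak = max(demand)
    cap = installed_lines * line_capacity
    first_shortfall = next((q for q, d in enumerate(demand) if d > cap), None)
    return {"peak": peak,
            "lines_needed": math.ceil(peak / line_capacity),
            "first_shortfall_quarter": first_shortfall}

launched = portfolio_model()
forecast = demand_model(launched)
print(capacity_model(forecast))
```

In the real platform each stage is a full simulation model and the hand-off runs over an automated pipeline rather than in-process function calls, but the shape of the flow is the same: sampled futures in, demand forecast through, capacity plan out.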

    Business Outcomes & Impact: Confidence in the Face of Complexity

    The impact of this transformation is profound, providing AstraZeneca with unprecedented strategic agility and confidence. The solution’s immediate value was confirmed by its end-users, achieving a System Usability Scale (SUS) score of 75.0. This rating is well above the industry average of 68, signifying high acceptance and usability across both technical and senior leadership teams, ensuring the tool is actively used to drive critical decisions.
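For context, SUS scores are derived from ten Likert-scale responses (1 to 5): each odd-numbered (positively worded) item contributes its response minus 1, each even-numbered (negatively worded) item contributes 5 minus its response, and the total is multiplied by 2.5 to give a 0–100 score. A minimal sketch:

```python
def sus_score(responses):
    """System Usability Scale from ten Likert responses (1-5).
    Odd items are positively worded, even items negatively worded."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses in the range 1-5")
    odd = sum(r - 1 for r in responses[0::2])    # items 1, 3, 5, 7, 9
    even = sum(5 - r for r in responses[1::2])   # items 2, 4, 6, 8, 10
    return (odd + even) * 2.5

# A respondent answering 4 on every positive item and 2 on every
# negative item scores 75.0, matching the platform's reported result.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```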

    By replacing a slow, manual process with an interactive simulation twin, AstraZeneca has drastically enhanced its workflow efficiency. They have minimised the risk of data errors and freed up their internal experts to shift their focus from laborious data wrangling to high-value strategic analysis and interpretation.

    Most importantly, they can now proactively model the future, turning uncertainty into a competitive advantage. The leadership team can explore the long-term impact of M&A activity, pressure-test their network against potential supply disruptions, and confidently make multi-billion-pound capital investment decisions. This strategic capability ensures their production network is not just prepared for the future, but is actively shaped to thrive in it.

  • Building an Antifragile Pharmaceutical Production Facility

    Building an Antifragile Pharmaceutical Production Facility

    GSK Lab Capacity Planning: Saving Millions, Building Resilience


    In a world of volatile demand and stringent regulatory standards, GSK embarked on a lab capacity planning project for a multi-million-pound new sterile manufacturing facility in Parma, Italy.

    The challenge was immense: how to ensure a continuous supply of critical biopharmaceutical drugs during the construction phase while designing a facility robust enough to handle future uncertainties. This case study explores how Decision Lab partnered with GSK to create a dynamic simulation model, a digital twin that not only optimised the facility’s design for efficiency and cost-effectiveness but also laid the foundation for an antifragile production system.

    The Lab Capacity Planning Challenge: Navigating Complexity and Uncertainty

    The GSK Parma facility is a cornerstone of their global manufacturing network, specialising in bringing new biopharmaceutical products to market. The need to upgrade their sterile production line to meet new regulatory requirements presented a significant operational and financial challenge. The primary concerns were:

    • Minimising Supply Chain Risk: The construction of the new facility could not disrupt the supply of crucial drugs to patients worldwide.
    • Managing Volatile Demand: The demand for biopharmaceutical products is notoriously difficult to predict, and the new facility needed to be able to adapt to these fluctuations.
    • Optimising Investment: With a multi-million-pound investment at stake, every decision regarding equipment, staffing, and operational procedures needed to be thoroughly vetted to maximise return and minimise waste.

    To address these challenges, GSK recognised the need for a powerful decision-support tool that could provide a clear vision of the future facility’s performance before a single brick was laid. They turned to Decision Lab and the power of simulation modelling.

    The Solution: A Digital Twin for a Dynamic Future

    Decision Lab, in close collaboration with over 30 experts from GSK, developed a sophisticated simulation model using AnyLogic software (see their case study). This wasn’t just a static blueprint; it was a living, breathing digital twin of the future production facility. The solution was delivered in two phases:

    • Phase 1: Process Modelling: This phase focused on creating a detailed discrete-event simulation of the entire production process, from compounding to crimping. This allowed for the analysis of complex interactions between machinery, staff, and various production scenarios. The model accounted for a vast array of parameters, including equipment size and capacity, staffing levels, shift patterns, and cleaning schedules.
    • Phase 2: Visualisation and the Digital Twin: The second phase brought the model to life. By importing CAD files, Decision Lab created a visually rich, 2D and 3D representation of the facility. This digital twin included features like operator-activity heatmaps to optimise cleaning protocols in the sterile environment and Gantt charts to visualise production schedules. This enhanced visualisation was crucial for engaging stakeholders at all levels, transforming the model from a “black box” into a transparent and intuitive decision-making tool.
    Heat map from the simulation of the GSK Parma facility.

    This powerful simulation allowed GSK to:

    • Explore What-If Scenarios: The model was used to evaluate over 200 different business scenarios, testing various facility designs and operational strategies in a risk-free environment. This enabled them to identify potential bottlenecks and process sensitivities before they became costly realities.
    • Optimise Resource Allocation: The simulation provided precise answers to critical questions about the optimal number and type of machines, staffing levels, and batching patterns. This data-driven approach ensured that the facility was designed for maximum efficiency and throughput.
    • Build a Compelling Business Case: The detailed insights generated by the model provided a solid foundation for the business case, smoothing the approval process and allowing the project to move forward without delays.
    3D simulation view of the GSK Parma facility.
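Screening hundreds of scenarios amounts to sweeping a parameter grid through a model and ranking the feasible configurations. The sketch below stands in for the real simulation with an invented throughput formula and cost figures; only the sweep-and-rank pattern is the point.

```python
import itertools

def throughput(machines, staff, shift_hours):
    """Stand-in for the simulation: batches/week limited by the
    scarcest resource (entirely illustrative numbers)."""
    return min(machines * 10, staff * 6) * shift_hours / 8

def capex(machines, staff):
    """Illustrative cost of a configuration."""
    return machines * 500 + staff * 120

target = 90  # required batches per week
grid = itertools.product(range(4, 13),   # machines
                         range(6, 21),   # staff
                         (8, 16))        # shift hours
feasible = [(capex(m, s), m, s, h) for m, s, h in grid
            if throughput(m, s, h) >= target]
cost, m, s, h = min(feasible)  # cheapest configuration meeting the target
print(f"cheapest feasible: {m} machines, {s} staff, {h}h shifts (cost {cost})")
```

In practice each grid point is a full simulation run rather than a formula, and the ranking weighs throughput, CAPEX, and risk together, but the mechanics of eliminating infeasible designs before committing capital are the same.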

    The Results: A 20% Reduction in CAPEX and a Foundation for Antifragility

    The collaboration between Decision Lab and GSK yielded impressive results:

    • 20% Projected Reduction in Capital Expenditure: By accurately sizing equipment and avoiding over-purchasing, the simulation model is expected to reduce capital expenditure by 20%, a double-digit saving and a testament to the power of data-driven decision-making.
    • Increased Confidence and Faster Approval: The robustness of the model and the clarity of its outputs instilled a high level of confidence in the project, leading to a faster approval timeline.
    • A Tool for the Future: The digital twin is not a one-off solution. GSK plans to continue using the model for ongoing planning and to explore the potential of machine learning for further process optimisation. The model can also be extended to cover people and material flow for the final operational layout.

    This lab capacity planning project is a prime example of how simulation can be used to build antifragility into a manufacturing system. Antifragility, a term coined by Nassim Nicholas Taleb, is the property of systems that thrive and grow when exposed to volatility, randomness, disorder, and stressors. By creating a digital twin, GSK was able to subject their future facility to a wide range of stressors and scenarios in a virtual environment. This allowed them to identify and address potential weaknesses, designing a system that is not just robust to change but can actually benefit from it. The ability to quickly re-evaluate capacity requirements in response to fluctuating demand forecasts is a key characteristic of an antifragile system.

    In the broader context of production and capacity planning, this project highlights a paradigm shift. Traditional, static planning methods are no longer sufficient in today’s dynamic and uncertain world. The future belongs to dynamic, data-driven approaches that embrace complexity and provide the foresight needed to make informed decisions. The GSK Parma lab capacity planning project is a powerful demonstration of how simulation and digital twin technology can empower organisations to not only survive but thrive in the face of uncertainty, building the resilient and antifragile production systems of the future.