
What-If Analysis Using Simulation

What-if analysis is one of the most powerful applications of process simulation. By creating multiple simulation scenarios and comparing their results, you can make informed decisions about process changes before investing time and resources in implementation.


The Power of Comparison

Simulation alone provides valuable predictions, but the real insights come from comparison:

  • Compare before and after a proposed change
  • Compare simulated results with actual historical data
  • Compare multiple alternative scenarios

Comparing Datasets in ProcessMind

Because simulation output is a standard event log dataset, ProcessMind provides powerful tools for comparing any datasets—whether simulated, historical, or both.

How to Compare Datasets

  1. Navigate to the Compare view in your process
  2. Select the datasets you want to compare
  3. ProcessMind displays side-by-side metrics and visualizations
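The side-by-side comparison in step 3 can be approximated by hand. The sketch below computes average throughput time for two event-log datasets, assuming a minimal log shape of `case_id`, `activity`, and an ISO timestamp per event; the field names are illustrative and may differ from ProcessMind's actual export schema.

```python
from datetime import datetime
from statistics import mean

def avg_throughput_days(event_log):
    """Average case duration in days: last event minus first event per case."""
    cases = {}
    for e in event_log:
        ts = datetime.fromisoformat(e["timestamp"])
        first, last = cases.get(e["case_id"], (ts, ts))
        cases[e["case_id"]] = (min(first, ts), max(last, ts))
    return mean((last - first).total_seconds() / 86400
                for first, last in cases.values())

# Two tiny illustrative datasets: a baseline log and a simulated run.
baseline = [
    {"case_id": "A", "activity": "Start",   "timestamp": "2024-01-01T09:00"},
    {"case_id": "A", "activity": "Approve", "timestamp": "2024-01-06T09:00"},
]
simulated = [
    {"case_id": "A", "activity": "Start",   "timestamp": "2024-01-01T09:00"},
    {"case_id": "A", "activity": "Approve", "timestamp": "2024-01-04T21:00"},
]
print(f"baseline: {avg_throughput_days(baseline):.1f} days, "
      f"simulated: {avg_throughput_days(simulated):.1f} days")
```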

What Can You Compare?

| Metric | Description |
| --- | --- |
| Throughput Time | Total time from case start to completion |
| Waiting Time | Time cases spend waiting for resources |
| Processing Time | Actual work time for activities |
| Case Count | Number of completed cases |
| Resource Utilization | How busy resources are |
| Path Distribution | Which paths cases take through the process |
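As a rough illustration of how the first few metrics derive from an event log, the sketch below computes total processing time, total waiting time, and average throughput time from a hypothetical log where each activity has start and end timestamps (here simplified to hours since an arbitrary origin; the tuple layout is an assumption, not ProcessMind's format).

```python
# Hypothetical flattened event log: (case_id, activity, start_h, end_h).
events = [
    ("C1", "Register", 0.0, 0.5),
    ("C1", "Approve",  4.0, 5.0),
    ("C2", "Register", 1.0, 1.5),
    ("C2", "Approve",  2.0, 3.0),
]

by_case = {}
for case, _, start, end in events:
    by_case.setdefault(case, []).append((start, end))

processing = sum(end - start for _, _, start, end in events)
waiting = 0.0
throughput = 0.0
for spans in by_case.values():
    spans.sort()
    throughput += spans[-1][1] - spans[0][0]   # case duration, start to finish
    for (_, e1), (s2, _) in zip(spans, spans[1:]):
        waiting += max(0.0, s2 - e1)           # idle gap between activities

print(f"cases={len(by_case)}, processing={processing}h, "
      f"waiting={waiting}h, avg throughput={throughput / len(by_case)}h")
```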

Before/After Analysis

The most common what-if scenario is testing a process change:

Scenario: Adding Resources

Question: What happens if we add one more approval staff member?

Approach:

  1. Create a baseline simulation with current staffing
  2. Create a modified simulation with increased capacity
  3. Compare the results

Example Results

| Metric | Baseline | +1 Staff | Improvement |
| --- | --- | --- | --- |
| Avg. Throughput Time | 5.2 days | 3.8 days | 27% faster |
| Avg. Waiting Time | 2.1 days | 0.9 days | 57% reduction |
| Cases/Week | 150 | 195 | 30% more |
| Staff Utilization | 95% | 78% | Less overloaded |
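The improvement column in a comparison like this is simple arithmetic over the two runs. A minimal sketch using the example figures above (negative percentages mean a reduction):

```python
# (baseline, +1 staff) pairs from the example results
results = {
    "Avg. Throughput Time (days)": (5.2, 3.8),
    "Avg. Waiting Time (days)":    (2.1, 0.9),
    "Cases/Week":                  (150, 195),
}
for metric, (base, plus1) in results.items():
    change = (plus1 - base) / base * 100
    print(f"{metric}: {base} -> {plus1} ({change:+.0f}%)")
```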

Cost-Benefit Analysis

Combine simulation results with cost data to calculate ROI. If adding one staff member costs $60,000/year but increases throughput by 30%, you can quantify the business impact.
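A back-of-the-envelope version of that calculation might look like the sketch below. The $60,000 cost and the case counts come from the example above; the value per completed case is a hypothetical figure added purely for illustration.

```python
staff_cost = 60_000           # annual cost of one extra staff member (from the text)
cases_per_week_base = 150     # baseline throughput (from the example)
cases_per_week_new = 195      # +30% with the extra staff member
value_per_case = 40           # hypothetical value per completed case (assumption)

extra_cases = (cases_per_week_new - cases_per_week_base) * 52
annual_benefit = extra_cases * value_per_case
roi = (annual_benefit - staff_cost) / staff_cost
print(f"extra cases/year: {extra_cases}, "
      f"benefit: ${annual_benefit:,}, ROI: {roi:.0%}")
```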


Comparing with Actual Data

One of ProcessMind’s most powerful features is comparing simulated predictions with real historical data.

Why Compare to Actual Data?

  1. Validate your model: If simulated results closely match historical data, your model is likely accurate
  2. Identify gaps: Differences reveal where your model needs refinement
  3. Measure improvements: Compare new actual data against simulated predictions

How to Compare Simulated vs. Actual

  1. Run a simulation for a historical time period
  2. Upload or select your actual historical data for the same period
  3. Use the Compare feature to view differences
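The gap between simulated and actual metrics can also be checked mechanically. The sketch below flags any metric that deviates by more than a 10% tolerance; the threshold, metric names, and figures are all illustrative assumptions, not ProcessMind output.

```python
actual    = {"throughput_days": 5.2, "waiting_days": 2.1, "cases_per_week": 150}
simulated = {"throughput_days": 4.9, "waiting_days": 1.7, "cases_per_week": 156}

TOLERANCE = 0.10   # arbitrary cut-off chosen for illustration
verdict = {}
for metric, real in actual.items():
    off = abs(simulated[metric] - real) / real
    verdict[metric] = "ok" if off <= TOLERANCE else "refine model"
    print(f"{metric}: actual={real}, simulated={simulated[metric]}, "
          f"off by {off:.0%} -> {verdict[metric]}")
```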

Interpreting Differences

| If Simulation Shows… | Possible Cause |
| --- | --- |
| Faster than actual | Model missing delays, underestimated complexity |
| Slower than actual | Overestimated processing times, unnecessary constraints |
| Different paths | Gateway probabilities don’t match reality |
| Resource overload | Capacity constraints configured incorrectly |

Multi-Scenario Comparison

Compare more than two scenarios to find the optimal configuration:

Scenario: Finding Optimal Staffing

Question: How many approval staff do we need?

Approach: Create multiple simulations with varying staff levels

Example: Staffing Level Analysis

| Staff | Throughput | Utilization | Wait Time | Cost |
| --- | --- | --- | --- | --- |
| 2 | 100 cases/wk | 98% | 4.5 days | $120K |
| 3 | 145 cases/wk | 85% | 1.8 days | $180K |
| 4 | 160 cases/wk | 68% | 0.5 days | $240K |
| 5 | 165 cases/wk | 55% | 0.2 days | $300K |

Insight: Adding a 3rd staff member provides the best value. The 4th and 5th show diminishing returns.
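The diminishing returns are easiest to see as a marginal cost per extra weekly case. A small sketch over the table's figures, assuming the cost column is annual:

```python
# (staff, cases/week, cost) taken from the staffing table above
levels = [(2, 100, 120_000), (3, 145, 180_000), (4, 160, 240_000), (5, 165, 300_000)]

marginal = {}   # cost per extra weekly case when adding one more person
for (s1, t1, c1), (s2, t2, c2) in zip(levels, levels[1:]):
    marginal[s2] = (c2 - c1) / (t2 - t1)
    print(f"{s1}->{s2} staff: +{t2 - t1} cases/wk for ${c2 - c1:,} "
          f"(${marginal[s2]:,.0f} per extra weekly case)")
```

Each step costs the same $60K, but the 3rd staff member buys three times as many extra cases per dollar as the 4th, and nine times as many as the 5th.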


Common What-If Scenarios

Process Redesign

Test structural changes to your process:

  • What if we remove a manual approval step?
  • What if we add a quality check?
  • What if we parallelize sequential activities?

Demand Changes

Prepare for different demand levels:

  • What if demand increases by 50%?
  • What if we experience seasonal peaks?
  • What if a marketing campaign doubles incoming cases?

Resource Changes

Optimize your workforce and systems:

  • What if we cross-train employees for multiple activities?
  • What if we automate a manual task?
  • What if we outsource certain activities?

Process Timing

Explore schedule and timing impacts:

  • What if we extend operating hours?
  • What if we add a weekend shift?
  • What if we stagger lunch breaks?

Best Practices for What-If Analysis

1. Establish a Baseline

Always create a validated baseline simulation before testing changes. This provides your point of comparison.

2. Change One Variable at a Time

When possible, isolate variables to understand their individual impact. Changing multiple things simultaneously makes it hard to attribute improvements.

3. Use Realistic Parameters

Base your simulation parameters on actual data when available. Unrealistic inputs lead to unreliable outputs.

4. Consider Multiple Metrics

Don’t optimize for just one metric. Faster throughput might come at the cost of quality or employee burnout.

5. Document Your Scenarios

Keep clear records of what you changed in each scenario. This makes results reproducible and shareable.

6. Validate Before Acting

When possible, validate simulation predictions with small-scale tests before full implementation.


Presenting Results

When sharing what-if analysis results with stakeholders:

  • Lead with the question: What business problem are we solving?
  • Show the comparison: Visual side-by-side metrics are most impactful
  • Quantify the impact: Translate to business value (time saved, cost reduced, revenue gained)
  • Acknowledge uncertainty: Simulation is a prediction, not a guarantee

Next Steps

Interface Reference
Complete reference for all simulation settings and options.