Simulated NPV with Time-Series

Load Environment

Let's start by making sure all the tools we need are loaded. You will almost always need these packages whenever you build a Monte-Carlo model.

using Distributions
using Dates
using Gadfly
using StatsBase
using MCHammer
using DataFrames

Setup Inputs and Outputs

The next critical step is to set up the key inputs, arrays and other important model parameters.

#Setup the Date Range for the analysis
dr = collect(Date(2019,1,01):Dates.Year(1):Date(2023,01,01))

#Setup Global Inputs
ForecastYrs = 5
Trials = 10000
Units = [5000, 10000, 17000, 18000, 20000]
InitialInvestment = 250000
Investment = [100000, 0, 0, 25000,0] #fill(0,ForecastYrs)

#Setup Outputs
Sensitivity_Tbl = []

ProjectNPV = []
USP = []
USC = []
DR = []
OP = []
Annual_CashFlows = []
0-element Array{Any,1}

Build Simulation Model

Monte-Carlo simulation generates a table of scenarios known as ***Trials***. A trial records, in the form of a row, all of the inputs and calculated outputs for a particular scenario.

This results table lets you run all sorts of analyses, including sensitivity analysis and assigning probabilities to outcomes. To generate it, you loop your equation/function as many times as you need, varying the inputs with probability distributions.

Another challenge to account for is that our example is a 5-year NPV model, which requires building and analyzing results over multiple periods. To extend the model over time, we use MCHammer's GBMM function, which projects a random-walk forecast over however many periods you need.
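To see what a call like GBMM(80, 0.2, 0.1, ForecastYrs) produces, here is a minimal sketch of a GBM-style random walk. The gbm_walk function and its signature are illustrative only, not MCHammer's actual implementation; it assumes each period's value grows by a normally distributed rate compounded multiplicatively.

```julia
using Distributions, Random

# Illustrative sketch of a geometric random walk (not MCHammer's GBMM itself):
# each period the value grows by a Normal(mu, sigma) rate, compounded.
function gbm_walk(initial, mu, sigma, periods; rng = Random.default_rng())
    growth = 1 .+ rand(rng, Normal(mu, sigma), periods)  # per-period growth factors
    return initial .* cumprod(growth)                    # compounded price path
end

prices = gbm_walk(80, 0.2, 0.1, 5)  # one 5-year path, analogous to GBMM(80, 0.2, 0.1, 5)
```

The result is one vector per trial with a value for each forecast year, which is exactly the shape the loop below relies on.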

for i = 1:Trials

    UnitSellPrice = GBMM(80, 0.2, 0.1, ForecastYrs)
    UnitCost = GBMM(40, 0.1, 0.05, ForecastYrs)

    #Each period's discount rate is independent. An additive method instead of a multiplicative one can produce differences, which may or may not impact the decision. For simulation it is best to use the risk-free rate.

        #Multiplicative Method
        DiscountRate = cumprod(1 .+ rand(Normal(0.02,0.0075), ForecastYrs))

        #Additive Method
        #DiscountRate = accumulate(+,rand(Normal(0.02,0.0075),ForecastYrs))+fill(1,ForecastYrs)

        #a static DR
        #DiscountRate = accumulate(+, fill(0.02,ForecastYrs))+fill(1,ForecastYrs)

    #print(DiscountRate)

    #DCF Elements
    Annual_Sales = UnitSellPrice .* Units
    Annual_COGS = UnitCost .* Units
    OPEX = rand(TriangularDist(.2,0.5,0.35),ForecastYrs) .* Annual_Sales

    #Constant Dollar Cashflow
    #CashFlow_C = (Annual_Sales - Annual_COGS - OPEX - Investment)

    #Discounted cash flow over multiple periods. This calculation uses arrays and dot (broadcast) operations.

    CashFlow = (Annual_Sales - Annual_COGS - OPEX - Investment) ./ DiscountRate

    #Calculated Output
    Trial_NPV = sum(CashFlow)-InitialInvestment

    #Convert arrays to scalars for sensitivity analysis
    push!(ProjectNPV, Trial_NPV)
    push!(USC, mean(UnitCost))
    push!(USP, mean(UnitSellPrice))
    push!(DR,  mean(DiscountRate))
    push!(OP,  mean(OPEX))
    push!(Annual_CashFlows,  CashFlow)

end

Setting up data for analysis and charting

Set up the inputs/outputs (above) and the output tables (below) for sensitivity analysis and charting. Since correlation-based sensitivity uses the same math as regression, the only way to calculate sensitivity on a multi-period array (in this case, multiple years) is to condense each array into a scalar using mean, sum or another transform; whichever you pick will generate a similar or identical result.
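The reason the choice of transform barely matters: for a fixed-length array, the sum is just a constant multiple of the mean, and correlation is invariant to positive linear scaling. A quick check with hypothetical data:

```julia
using Statistics

# Hypothetical data: 1,000 trials, each with a 5-period input vector
trials = [rand(5) for _ in 1:1000]
output = [sum(v) + 0.1 * randn() for v in trials]  # some scalar output per trial

# Condensing with mean or with sum yields the same correlation with the output
c_mean = cor(mean.(trials), output)
c_sum  = cor(sum.(trials), output)
isapprox(c_mean, c_sum)  # true
```

So mean is used below purely for readability; sum would rank the inputs identically.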

Sensitivity_Tbl = DataFrame(hcat(ProjectNPV, USC, USP, DR, OP), [:ProjectNPV, :USC, :USP, :DR, :OP])
NPV_Sensitivity = cormat(Sensitivity_Tbl,1)
5×5 Array{Float64,2}:
  1.0        -0.268666     0.903604    -0.0151814    0.602799  
 -0.268666    1.0         -0.00907278  -0.00261068  -0.00344   
  0.903604   -0.00907278   1.0          0.0132132    0.835428  
 -0.0151814  -0.00261068   0.0132132    1.0          0.00932542
  0.602799   -0.00344      0.835428     0.00932542   1.0       

Stats

Generate model results and list all the outputs for your charting and analysis functions.

print("Project Mean: ", mean(ProjectNPV),"\n")
print("Project Std.Dev: ", std(ProjectNPV),"\n")
print("Prob. of Neg. NPV: ", GetCertainty(ProjectNPV,0,0),"\n")
print("NPV p10, p50, p90 : ", quantile(collect(Float64, ProjectNPV),[0.1,0.5,0.9]),"\n")
println("")
println("OUTPUTS: Annual_CashFlows, ProjectNPV, Sensitivity_Tbl")
println("date range = dr")
Project Mean: 2.526111648765865e6
Project Std.Dev: 1.0687607354120135e6
Prob. of Neg. NPV: 0.0017
NPV p10, p50, p90 : [1.20038e6, 2.46242e6, 3.91369e6]

OUTPUTS: Annual_CashFlows, ProjectNPV, Sensitivity_Tbl
date range = dr

To generate a complete list of percentiles, use the fractiles() function.

fractiles(ProjectNPV)
11×2 Array{Any,2}:
 "P0.0"    -5.90221e5
 "P10.0"    1.20038e6
 "P20.0"    1.6041e6 
 "P30.0"    1.93198e6
 "P40.0"    2.19735e6
 "P50.0"    2.46242e6
 "P60.0"    2.72582e6
 "P70.0"    3.03298e6
 "P80.0"    3.38188e6
 "P90.0"    3.91369e6
 "P100.0"   7.52754e6

Visualizing the Outputs

Looking at the Probability Distribution

histogram_chrt(ProjectNPV, "Five Year NPV")
[Histogram: frequency distribution of the Five Year NPV]
density_chrt(ProjectNPV, "Five Year NPV")
[Density chart: smoothed distribution of the Five Year NPV]

What variables are most influential on my output distribution?

sensitivity_chrt(Sensitivity_Tbl, 1, 3)
[Tornado chart: % contribution to variance, input variables ranked by impact: USP, OP, USC, DR]

What does my cashflow look like over time?

The trend chart is a median-centered chart that establishes a 90% confidence interval for each period. Remember that dr, the date range, was defined at the top.

CashFlow forecast

trend_chrt(Annual_CashFlows, dr)
[Trend chart: median (p50) annual cash flow with LowerBound/UpperBound 90% interval plotted over the date range dr]