Gillespie Stochastic Simulation Algorithm
Algorithm 1
For a continuous-time Markov chain $\left\{ X(t) \right\}$ whose state changes through events $E_{1} , \dots , E_{m}$, where event $E_{i}$ occurs at rate $\lambda_{i}$, the method for simulating a realization of the chain is called the Gillespie Stochastic Simulation Algorithm, or simply SSA.
Input
Receives the initial values $t = 0$ and $X(0) = x_{0}$.
Step 1.
Calculate the total rate $\lambda := \lambda_{1} + \cdots + \lambda_{m}$.
Step 2.
Sample $\tau$ from the exponential distribution $\exp \left( \lambda \right)$. The larger $\lambda$ is, the smaller the expected value $E[ \tau ] = 1 / \lambda$ becomes, and events occur more frequently.
Step 3.
Select $i$ with the probability $\lambda_{i} / \lambda$. The larger $\lambda_{i}$ is, the higher the probability that event $E_{i}$ will occur.
Step 4.
Record $X(t + \tau)$ and update $t$ as $t \leftarrow t + \tau$. Since $\left\{ X(t) \right\}$ is a continuous-time Markov chain, one can return to Step 1 and repeat. Stop the algorithm when enough data has been obtained.
Output
Obtains a realization of $\left\{ X(t) \right\}$ given $X(0) = x_{0}$.
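As a sketch, the four steps above can be written directly in Julia. The function name `gillespie`, the representation of events as a rate function `rates` plus a vector of state changes `jumps`, and the use of `Categorical` for Step 3 are choices made here for illustration, not part of the algorithm itself:

```julia
using Distributions  # Exponential, Categorical

# Minimal SSA sketch: `rates(x)` returns (λ₁, …, λₘ) for the current
# state x, and `jumps[i]` is the state change caused by event Eᵢ.
function gillespie(x0, rates, jumps, T)
    t, x = 0.0, x0
    ts, xs = [t], [x]
    while t < T
        λs = rates(x)                    # Step 1: event rates λᵢ
        λ = sum(λs)
        λ == 0 && break                  # absorbing state: nothing can happen
        t += rand(Exponential(1 / λ))    # Step 2: waiting time τ ~ Exp(λ)
        i = rand(Categorical(λs ./ λ))   # Step 3: pick event i with prob. λᵢ/λ
        x += jumps[i]                    # Step 4: record and update
        push!(ts, t); push!(xs, x)
    end
    return ts, xs
end

# Birth–death instance: rates (bn, dn), jumps (+1, −1)
b, d = 2.0, 1.0
ts, xs = gillespie(2, n -> [b * n, d * n], [+1, -1], 2.0)
```

Note that `Exponential` in Distributions.jl takes the scale (mean) `1/λ`, not the rate, which is why the reciprocal appears.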
Explanation
Example
The algorithm might look complicated with all the general notation, but it is actually quite simple.
Consider a simple birth–death process. This process starts with $n_{0}$ individuals, and the population increases or decreases according to the birth rate $b$ and the death rate $d$. Whether $b$ is greater or smaller than $d$ determines the overall pattern, but how frequently events occur depends on their absolute sizes. It is plausible that events, whether births or deaths, happen more frequently in proportion to the current number of individuals, so when the population is $n$, the interval between events follows $\exp \left( (b + d) n \right)$. Once the timing of an event is decided, we choose which event, birth or death, occurs. At time $t$, if the population is $n$, a birth means $n \to n + 1$ and a death means $n \to n - 1$. These occur at rates proportional to the rates and the population, $bn$ and $dn$ respectively, so $\lambda = (b + d) n$. According to Step 3, decide which event occurs with the probabilities obtained by normalizing these: $bn / \lambda = b / (b + d)$ for a birth and $d / (b + d)$ for a death.
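As a quick sanity check, here are these quantities evaluated for the parameters $b = 2$, $d = 1$ used in the Code section, with a current population of $n = 2$:

```julia
b, d, n = 2.0, 1.0, 2    # birth rate, death rate, current population

λ_birth = b * n          # rate of the birth event: 4.0
λ_death = d * n          # rate of the death event: 2.0
λ = λ_birth + λ_death    # total rate: 6.0 → waiting time ~ Exp(6)

p_birth = λ_birth / λ    # probability the next event is a birth: 2/3
p_death = λ_death / λ    # probability it is a death: 1/3
```

Since $n$ cancels in the ratio, $p_{\text{birth}} = b / (b + d)$ does not depend on the population, which is why the code can precompute this probability once as a constant $\theta$.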
Code
The following Julia code directly translates the above example. With the parameters $b = 2$ and $d = 1$, the simulation is repeated ten thousand times starting from the initial population $X(0) = 2$, and the sample mean at $t = 2$ is compared with the theoretical prediction. The code is written to be easy to read rather than efficient, so check that the algorithm is implemented exactly as described in the example.
using Distributions
using Statistics
using Plots

b = 2.0          # birth rate
d = 1.0          # death rate
θ = b/(b+d)      # probability that the next event is a birth

result = plot(xlims = (0, 2), ylims = (0, Inf))
Y = Int64[]      # population at t = 2 from each run

@time for _ in 1:10000
    y = [2]      # population trajectory, starting from X(0) = 2
    t = [0.0]    # event times
    while t[end] < 2 && y[end] > 0
        # Step 2: waiting time ~ Exp with rate (b+d)n, i.e. scale 1/((b+d)n)
        push!(t, t[end] + rand(Exponential(1/((b+d) * y[end]))))
        # Step 3: birth with probability b/(b+d), death otherwise
        if rand() < θ
            push!(y, y[end] + 1)
        else
            push!(y, y[end] - 1)
        end
    end
    if t[end] > 2
        push!(Y, y[end-1])   # last event fell past t = 2, so take the state before it
    else
        push!(Y, 0)          # the population went extinct before t = 2
    end
    plot!(result, t, y, label = :none, alpha = 0.01, color = :black)
end

# theoretical mean 2e^{(b-d)t}; the parentheses around the range are required,
# since (b-d) .* 0:0.1:2 would parse as ((b-d)*0):0.1:2
plot!(result, 0:0.1:2, 2exp.((b-d) .* (0:0.1:2)), label = :none, color = :red, linewidth = 4, linestyle = :dot)
result

abs(mean(Y) - 2exp(2(b-d)))
Visualizing the result looks like the above. The difference from the theoretical prediction is about 0.1, confirming that the simulation is quite accurate.
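For reference, the theoretical prediction being compared against is the mean of the birth–death process, $E[X(t)] = n_{0} e^{(b-d)t}$, which with $n_{0} = 2$, $b = 2$, $d = 1$ at $t = 2$ comes out to:

```julia
n0, b, d, t = 2, 2.0, 1.0, 2.0
theory = n0 * exp((b - d) * t)   # 2e² ≈ 14.778
```

So a deviation of about 0.1 corresponds to a relative error under one percent.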