Lecture 10
October 6, 2025
using Distributions
using Statistics
using Plots

# set up the true distribution
mean_true = 0.4
n_cis = 100 # number of CIs to compute
dist = Normal(mean_true, 2)
# use a sample size of 100: draw n_cis samples of size 100 (one per column)
samples = rand(dist, (100, n_cis))
# mapslices applies a function along a matrix dimension; a loop would also work
sample_means = mapslices(mean, samples; dims=1)
sample_sd = mapslices(std, samples; dims=1)
# 95% CI half-width: 1.96 is the 97.5th quantile of the standard normal
mc_sd = 1.96 * sample_sd / sqrt(100)
mc_ci = zeros(n_cis, 2) # preallocate CI bounds
for i = 1:n_cis
    mc_ci[i, 1] = sample_means[i] - mc_sd[i] # lower bound
    mc_ci[i, 2] = sample_means[i] + mc_sd[i] # upper bound
end
# find which CIs contain the true value
ci_true = (mc_ci[:, 1] .< mean_true) .&& (mc_ci[:, 2] .> mean_true)
# compute percentage of CIs which contain the true value
ci_frac1 = 100 * sum(ci_true) / n_cis
# plot CIs
p1 = plot(mc_ci[1, :], [1, 1], linewidth=3, color=:deepskyblue, label="95% Confidence Interval", title="Sample Size 100", yticks=false, legend=false)
for i = 2:n_cis
    if ci_true[i] # color each CI by whether it contains the true value
        plot!(p1, mc_ci[i, :], [i, i], linewidth=2, color=:deepskyblue, label=false)
    else
        plot!(p1, mc_ci[i, :], [i, i], linewidth=2, color=:red, label=false)
    end
end
vline!(p1, [mean_true], color=:black, linewidth=2, linestyle=:dash, label="True Value") # plot true value as a vertical line
xaxis!(p1, "Estimate")
plot!(p1, size=(500, 500)) # resize to fit slide
# repeat with a sample size of 1,000
samples = rand(dist, (1000, n_cis))
sample_means = mapslices(mean, samples; dims=1)
sample_sd = mapslices(std, samples; dims=1)
# the 95% CI half-width shrinks as the sample size grows
mc_sd = 1.96 * sample_sd / sqrt(1000)
mc_ci = zeros(n_cis, 2) # preallocate CI bounds
for i = 1:n_cis
    mc_ci[i, 1] = sample_means[i] - mc_sd[i] # lower bound
    mc_ci[i, 2] = sample_means[i] + mc_sd[i] # upper bound
end
# find which CIs contain the true value
ci_true = (mc_ci[:, 1] .< mean_true) .&& (mc_ci[:, 2] .> mean_true)
# compute percentage of CIs which contain the true value
ci_frac2 = 100 * sum(ci_true) / n_cis
# plot CIs
p2 = plot(mc_ci[1, :], [1, 1], linewidth=3, color=:deepskyblue, label="95% Confidence Interval", title="Sample Size 1,000", yticks=false, legend=false)
for i = 2:n_cis
    if ci_true[i] # color each CI by whether it contains the true value
        plot!(p2, mc_ci[i, :], [i, i], linewidth=2, color=:deepskyblue, label=false)
    else
        plot!(p2, mc_ci[i, :], [i, i], linewidth=2, color=:red, label=false)
    end
end
vline!(p2, [mean_true], color=:black, linewidth=2, linestyle=:dash, label="True Value") # plot true value as a vertical line
xaxis!(p2, "Estimate")
plot!(p2, size=(500, 500)) # resize to fit slide
display(p1)
display(p2)
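As a quick sanity check, we can print the empirical coverage computed above; with a valid 95% CI construction, both fractions should be close to 95% regardless of sample size:
println("Coverage with sample size 100: $(ci_frac1)%")
println("Coverage with sample size 1,000: $(ci_frac2)%")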
If we want to design a treatment strategy, we are now in the world of prescriptive modeling.
Recall: Prescriptive modeling is intended to specify an action, policy, or decision.
To make a decision, we need certain pieces of information.
The optimal solution of a model is not an optimal solution of a problem unless the model is a perfect representation of the problem, which it never is.
— Ackoff, R. L. (1979). “The Future of Operational Research is Past.” The Journal of the Operational Research Society, 30(2), 93–104. https://doi.org/10.1057/jors.1979.22
Typical objectives can include minimizing costs, maximizing benefits, or limiting environmental impacts.
Most search algorithms look for critical points to find candidate optima. Then the “best” of the critical points is the global optimum.
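For example, for a toy objective \(f(x) = (x - 3)^2 + 1\) (an illustrative choice, not from the lecture), setting the derivative to zero yields the single critical point
\[f'(x) = 2(x - 3) = 0 \implies x^* = 3,\]
which in this case is also the global minimum.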
Two common approaches:
Find an estimate of the gradient near the current point and step in the positive/negative direction (depending on whether maximizing or minimizing):
\[x_{n+1} = x_n \pm \alpha_n \nabla f(x_n)\]
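For concreteness, here is a minimal sketch of fixed-step gradient descent for minimization; the quadratic objective, analytical gradient, step size, and iteration count are all illustrative assumptions, not the lecture's code:
f(x) = (x - 3)^2 + 1 # toy objective with minimum at x = 3
grad_f(x) = 2 * (x - 3) # analytical gradient of f
function gradient_descent(grad_f, x0; alpha=0.1, n_steps=100)
    x = x0
    for _ in 1:n_steps
        x -= alpha * grad_f(x) # minus sign: step downhill to minimize
    end
    return x
end
gradient_descent(grad_f, 0.0) # converges to ≈ 3.0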
Computing the gradient is more problematic, since it may not exist or may be impractical to obtain directly: in some cases, methods like stochastic gradient approximation or automatic differentiation can be used.
Use a sampling strategy to generate a new proposal, then evaluate it and keep it if it is an improvement (a minimal sketch appears below).
Evolutionary Algorithms fall into this category.
Can also incorporate constraints into search.
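A minimal random-search sketch of this propose-evaluate-keep loop; the objective, Gaussian proposal scale, and iteration count are illustrative assumptions:
using Distributions
function random_search(f, x0; sigma=0.5, n_iter=10_000)
    x_best, f_best = x0, f(x0)
    for _ in 1:n_iter
        x_prop = x_best + rand(Normal(0, sigma)) # sample a nearby proposal
        if f(x_prop) < f_best # keep the proposal only if it improves
            x_best, f_best = x_prop, f(x_prop)
        end
    end
    return x_best, f_best
end
random_search(x -> (x - 3)^2 + 1, 0.0) # ≈ (3.0, 1.0)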
These methods work pretty well, but can be computationally expensive and offer no guarantee of finding a global optimum in a reasonable amount of time.
It can be convenient (and sometimes appropriate!) to formulate problems where we can guarantee finding a solution in a reasonable amount of time.
These approaches are called mathematical programs.
We will spend the next few weeks talking about linear programming and its variants.
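As a preview, a tiny linear program might look like the following sketch, assuming JuMP.jl with the HiGHS solver (the objective and constraint are made up for illustration):
using JuMP, HiGHS
model = Model(HiGHS.Optimizer)
@variable(model, x >= 0) # decision variables
@variable(model, y >= 0)
@constraint(model, x + y <= 10) # a resource constraint
@objective(model, Max, 3x + 2y) # linear objective
optimize!(model)
value(x), value(y) # optimal decision: (10.0, 0.0)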
Wednesday: Prelim 1 (does not include material from today's lecture).
Monday: Fall Break
Next Wednesday: Linear Programming
HW3: Due next Thursday (10/16) at 9pm.