Uncertainty of Computer Vision: A Bayesian's Perspective

  1. Formulation: What's in a Bayesian's mind?
  2. References

Abstract:

[Outline]

  • Bayesian modeling and inference
    • 统计学习方法 (Statistical Learning Methods), Chapters 1, 4, 7, 9, 10, 11, 19, 20
  • From Bayes' theorem to Bayesian deep learning
    • Structure of a Bayesian neural network (BNN)
    • Training and inference

Formulation: What's in a Bayesian's mind?

The posterior represents our belief, and our remaining uncertainty, about the value of each parameter setting after observing the data.

Bayes' theorem:

$$p(\theta \mid \mathcal{D}) = \frac{p(\mathcal{D} \mid \theta)\, p(\theta)}{p(\mathcal{D})}$$

where $p(\theta)$ is the prior, $p(\mathcal{D} \mid \theta)$ is the likelihood, and $p(\mathcal{D}) = \int p(\mathcal{D} \mid \theta)\, p(\theta)\, d\theta$ is the evidence.
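
To see the update rule in action, here is a toy grid example in Python; the coin-flip setup and all the numbers in it are illustrative assumptions, not part of the original post. The posterior over a coin's heads probability is simply prior times likelihood, renormalized by the evidence.

import numpy as np
from scipy.stats import binom

theta = np.linspace(0, 1, 101)               # candidate parameter values
prior = np.ones_like(theta) / theta.size     # flat prior belief
heads, flips = 7, 10                         # observed data (illustrative)
likelihood = binom.pmf(heads, flips, theta)  # p(D | theta)
posterior = prior * likelihood               # numerator of Bayes' theorem
posterior /= posterior.sum()                 # divide by the evidence p(D)
print(f"MAP estimate: {theta[np.argmax(posterior)]:.2f}")  # ~0.70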

EXAMPLE: Fitting a normal distribution.

You've observed a set of data points drawn from a normal distribution with an unknown mean $\mu$ and an unknown standard deviation $\sigma$. Your goal is to find the marginal posterior distribution of each parameter.

The process of obtaining the individual posterior of each parameter, $\mu$ or $\sigma$, by integrating the joint posterior over the other parameter is called "marginalization"; see the integrals below.
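
Written out, with $\mathcal{D}$ denoting the observed data as in the theorem above:

$$p(\mu \mid \mathcal{D}) = \int p(\mu, \sigma \mid \mathcal{D})\, d\sigma, \qquad p(\sigma \mid \mathcal{D}) = \int p(\mu, \sigma \mid \mathcal{D})\, d\mu$$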

Python Solution:

Note that the script below computes a maximum-likelihood point estimate of $\mu$ and $\sigma$ by minimizing the negative log-likelihood; this coincides with the MAP estimate under a flat prior. A sketch of the full posterior follows after the code.

import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm
from scipy.optimize import minimize

# Generate some data points that are normally distributed
np.random.seed(0)  # For reproducibility
true_mu, true_sigma = 0, 0.1  # True parameters
data = np.random.normal(true_mu, true_sigma, 1000)

# Define the negative log-likelihood
def negative_log_likelihood(params):
    # Our parameters to estimate are the mean and standard deviation
    mu, sigma = params[0], params[1]
    if sigma <= 0:
        return np.inf  # keep the optimizer in the valid region

    # We minimize the negative log-likelihood; norm.logpdf includes the
    # full density, and constant terms do not move the minimizer anyway.
    nll = -np.sum(norm.logpdf(data, loc=mu, scale=sigma))
    return nll

# Initial guesses for mu and sigma
initial_params = [0, 1]

# Minimize the negative log-likelihood
result = minimize(negative_log_likelihood, initial_params, method='nelder-mead')

# Estimated parameters
mu_est, sigma_est = result.x
print(f"Estimated mu: {mu_est}, Estimated sigma: {sigma_est}")

# Plot the histogram of the data
plt.hist(data, bins=30, density=True, alpha=0.5, label='Data histogram')

# Plot the PDF of the normal distribution with the estimated parameters
x = np.linspace(min(data), max(data), 100)
plt.plot(x, norm.pdf(x, mu_est, sigma_est), 'r-', label='Fitted normal distribution')

# Show the plot with legend
plt.legend()
plt.show()
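
The script above returns only a point estimate. As a minimal sketch of the fully Bayesian treatment, the snippet below assumes a flat prior over $(\mu, \sigma)$ and approximates the posterior on a grid; the grid bounds and resolution are illustrative choices, not part of the original post. The marginal posterior of $\mu$ then falls out by summing the joint posterior over $\sigma$, exactly the marginalization described above.

import numpy as np

np.random.seed(0)  # Same synthetic data as above
data = np.random.normal(0, 0.1, 1000)
n = data.size
sx, sxx = data.sum(), np.sum(data**2)

# Grid of candidate parameter values (bounds chosen to bracket the MLE)
mu_grid = np.linspace(-0.02, 0.02, 201)
sigma_grid = np.linspace(0.08, 0.12, 201)
M, S = np.meshgrid(mu_grid, sigma_grid, indexing='ij')

# Log joint posterior up to an additive constant: with a flat prior it equals
# the log-likelihood. The sum of squares is expanded via sufficient statistics:
# sum_i (x_i - mu)^2 = sxx - 2*mu*sx + n*mu^2
sum_sq = sxx - 2 * M * sx + n * M**2
log_post = -0.5 * n * np.log(2 * np.pi * S**2) - sum_sq / (2 * S**2)

# Exponentiate stably and normalize so the grid cells sum to 1
post = np.exp(log_post - log_post.max())
post /= post.sum()

# Marginalization: sum the joint posterior over sigma (axis 1)
post_mu = post.sum(axis=1)

# Posterior mean and a 95% credible interval for mu
mu_mean = np.sum(mu_grid * post_mu)
cdf = np.cumsum(post_mu)
lo, hi = mu_grid[np.searchsorted(cdf, 0.025)], mu_grid[np.searchsorted(cdf, 0.975)]
print(f"Posterior mean of mu: {mu_mean:.5f}, 95% interval: [{lo:.5f}, {hi:.5f}]")

With a conjugate (or Jeffreys) prior, the same marginal is available in closed form as a Student-t distribution, which makes a useful sanity check on the grid approximation.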

References

A Comprehensive Introduction to Bayesian Deep Learning, Towards Data Science: https://towardsdatascience.com/a-comprehensive-introduction-to-bayesian-deep-learning-1221d9a051de