See also:

  • Discrete math / combinatorics

Graphical models: in Bayesian networks, directed edges express conditional probability. The graph should be acyclic (a DAG).


Student's t, chi-squared, p-values

I should have more to say here? Probability. Experimental design. Hypothesis testing. Goodness-of-fit metrics. Bayes' rule. Regularization. Bayes' rule and regularization can be seen to be related: regularization corresponds to a prior that the values of your parameters aren't going to be ridiculous. A Gaussian prior together with a Gaussian distribution of errors yields L2 (ridge) regularization via the MAP estimate.

\(p(\epsilon) \propto e^{ -\frac{\epsilon^2}{2\sigma^2} }\), with the model \(y_j = \epsilon_j + \sum_i a_i f_i(x_j)\)
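In code, the prior-as-regularizer correspondence is just ridge regression: a Gaussian prior \(a \sim N(0, \tau^2 I)\) and Gaussian noise of variance \(\sigma^2\) make the MAP estimate the ridge solution with \(\lambda = \sigma^2/\tau^2\). A minimal numpy sketch (the polynomial basis and all constants are made-up choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data for y_j = sum_i a_i f_i(x_j) + eps_j,
# with basis functions f_i(x) = x**i (an illustrative choice).
x = rng.uniform(-1, 1, 50)
F = np.vander(x, 4, increasing=True)      # design matrix, columns f_i(x_j)
a_true = np.array([1.0, -2.0, 0.5, 3.0])
y = F @ a_true + rng.normal(0, 0.1, 50)   # Gaussian noise eps_j

# MAP with prior a ~ N(0, tau^2 I) and noise N(0, sigma^2) minimizes
# ||y - F a||^2 / sigma^2 + ||a||^2 / tau^2, i.e. ridge with lam = sigma^2/tau^2.
sigma, tau = 0.1, 1.0
lam = sigma**2 / tau**2
a_map = np.linalg.solve(F.T @ F + lam * np.eye(4), F.T @ y)
```

With a tight noise scale and a loose prior, `lam` is small and `a_map` stays close to the true coefficients; shrinking `tau` pulls the estimate toward zero.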

Machine learning

Cumulants Paradoxes

Measure theory stochastic calculus


Markov decision processes. Monte Carlo algos, Las Vegas algos


Bayesian vs frequentist. Priors as regularization


Gaussian Poisson Binomial


Entropy. MacKay, Information Theory, Inference, and Learning Algorithms


from z3 import *
E = DeclareSort("Event")
P = Function("P", E, RealSort())

# Proof system for probability theory? Kolmogorov axioms

Sets and probability. You need to know the ambient sample space X you are working in.

Law of Total Probability: $P(A) = \sum_i P(A \cap B_i) = \sum_i P(A | B_i) P(B_i)$ when $\{B_i\}$ is a partition of the sample space.
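A quick sanity check with made-up numbers (two urns B1, B2 partitioning the space, A = drawing a red ball):

```python
from fractions import Fraction

# Partition of the sample space by which urn was picked.
P_B = {"B1": Fraction(1, 3), "B2": Fraction(2, 3)}
# Conditional probability of red given each urn.
P_A_given_B = {"B1": Fraction(1, 2), "B2": Fraction(1, 4)}

# Total probability: sum over the partition.
P_A = sum(P_A_given_B[b] * P_B[b] for b in P_B)
print(P_A)  # 1/2 * 1/3 + 1/4 * 2/3 = 1/3
```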

Central limit theorem
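The statement is easy to eyeball numerically: standardized sums of iid variables have moments approaching those of N(0, 1). A sketch with uniforms (sample sizes chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)

# Uniform(0, 1) has mean 1/2 and variance 1/12.
n, trials = 1000, 20000
sums = rng.uniform(0, 1, (trials, n)).sum(axis=1)
z = (sums - n / 2) / np.sqrt(n / 12)  # standardize: mean 0, variance 1

# Empirical mean and variance should be near 0 and 1.
print(z.mean(), z.var())
```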

Markov bound Chernoff bound
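A quick comparison on a binomial tail, Markov's inequality versus the Hoeffding form of the Chernoff bound (the n = 100, 3n/4 threshold is an arbitrary example):

```python
import math

# Tail bounds for X ~ Binomial(n, 1/2), bounding P(X >= 3n/4).
n = 100
mean = n / 2
a = 3 * n / 4

# Markov: P(X >= a) <= E[X] / a. Uses only the mean; very loose.
markov = mean / a
# Chernoff-Hoeffding: P(X >= E[X] + t) <= exp(-2 t^2 / n). Exponentially sharper.
chernoff = math.exp(-2 * (a - mean) ** 2 / n)

print(markov, chernoff)  # 2/3 vs exp(-12.5), about 3.7e-6
```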