## Deep Learning Coursera Notes

cross-entropy – the expectation of −log q(x) under the true distribution p: H(p, q) = −Σ p(x) log q(x). Minimized when q matches p.
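As a concrete check, a minimal numpy sketch of the discrete cross-entropy (the function name is mine, not from the course):

```python
import numpy as np

def cross_entropy(p, q):
    """Cross-entropy H(p, q) = -sum_x p(x) * log q(x): the expected
    negative log-likelihood of q under the true distribution p."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q))

# When q == p this reduces to the entropy of p.
print(cross_entropy([0.5, 0.5], [0.5, 0.5]))  # ln 2 ≈ 0.693
```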

initialization – draw weights with randn, scaled by sqrt(2 / input size) when using ReLU (He initialization). Keeps activations from blowing up or vanishing as depth grows.
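A sketch of He initialization for one dense layer, assuming `n_in` inputs and `n_out` units (layer sizes below are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

def he_init(n_in, n_out):
    """He initialization for ReLU layers: Gaussian weights scaled by
    sqrt(2 / n_in) so activation variance stays roughly constant."""
    return rng.standard_normal((n_out, n_in)) * np.sqrt(2.0 / n_in)

W = he_init(512, 256)
print(W.std())  # close to sqrt(2/512) ≈ 0.0625
```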

epoch – one full pass through the training data.

mini-batch – split the training data into chunks that each fit on one GPU. Worth trying different sizes to see what trains fastest.
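A minimal way to cut a dataset into shuffled mini-batches (function name and sizes are my own):

```python
import numpy as np

def mini_batches(X, y, batch_size, rng):
    """Shuffle once per epoch, then yield batch_size chunks
    (the last batch may be smaller)."""
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        sel = idx[start:start + batch_size]
        yield X[sel], y[sel]

X = np.arange(10).reshape(10, 1)
y = np.arange(10)
batches = list(mini_batches(X, y, 4, np.random.default_rng(0)))
print([len(b[1]) for b in batches])  # [4, 4, 2]
```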

momentum – an exponentially weighted average of past gradients: oscillating components cancel out, while consistent directions build up speed.
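The momentum update can be sketched as follows (the learning rate and beta are just common defaults):

```python
def momentum_step(w, v, grad, lr=0.1, beta=0.9):
    """v accumulates an exponentially weighted average of gradients;
    the parameter moves along v instead of the raw gradient."""
    v = beta * v + (1 - beta) * grad
    return w - lr * v, v

# Gradients that flip sign average out to a small velocity.
w, v = 0.0, 0.0
for g in [1.0, -1.0, 1.0, -1.0]:
    w, v = momentum_step(w, v, g)
print(v)  # much smaller in magnitude than any single gradient
```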

Adam – combines momentum and RMSProp. Often works better; beta1 = 0.9 and beta2 = 0.999 are common defaults.
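A sketch of one Adam step with the betas above, including the bias correction for the zero-initialized averages (scalar case for clarity):

```python
import numpy as np

def adam_step(w, m, v, grad, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum (m) plus RMSProp-style scaling (v),
    with bias correction since m and v start at zero."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

w, m, v = 0.0, 0.0, 0.0
w, m, v = adam_step(w, m, v, grad=100.0, t=1)
print(w)  # ≈ -0.001: the first step size is ~lr, independent of gradient scale
```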

Hyperparameter search – sample random points rather than a grid, and use log scales for some things (e.g. the learning rate).
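Sampling a learning rate on a log scale, per the note above (the range is a hypothetical example):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_learning_rate(low=1e-4, high=1e-1):
    """Sample uniformly in the exponent, so the decade 1e-4..1e-3
    gets as many draws as 1e-2..1e-1."""
    r = rng.uniform(np.log10(low), np.log10(high))
    return 10.0 ** r

lrs = [sample_learning_rate() for _ in range(1000)]
print(min(lrs), max(lrs))
```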

re-evaluate your hyperparameters occasionally.

batch normalization – adds mean subtraction and variance normalization at every hidden layer (with a learnable scale and shift). Makes later layers less susceptible to shifts in earlier layers.
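The forward pass of batch norm for one layer, as a sketch (gamma and beta are the learnable scale and shift; the input stats are made up):

```python
import numpy as np

def batch_norm(Z, gamma, beta, eps=1e-5):
    """Normalize each hidden unit's pre-activations over the batch,
    then rescale with the learnable gamma and beta."""
    mu = Z.mean(axis=0)
    var = Z.var(axis=0)
    Z_norm = (Z - mu) / np.sqrt(var + eps)
    return gamma * Z_norm + beta

Z = np.random.default_rng(0).normal(5.0, 3.0, size=(64, 8))
out = batch_norm(Z, gamma=1.0, beta=0.0)
print(out.mean(), out.std())  # ≈ 0 and ≈ 1
```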

tensorflow – declare variables and placeholders, make a session, run a training op.

strategy – in order: fit the training set, then the dev set, then the test set, then the real world.

use a better optimizer or a bigger network if you're not fitting the training set.

use more data or regularize if you're not fitting the dev set.

satisficing metric – optimize a single metric; for the others, just require that they clear a threshold.

add weight to really important examples in the cost function.

bias – performance on the training set; human-level performance is a good benchmark for how low it can go.

error analysis – puts a ceiling on possible improvement. Manually count how many errors of each kind are happening to figure out what is worth fixing.

deep nets are reasonably robust to random errors in the training set.

build a first system fast, then iterate fast.

if you need to train on a different distribution from the one you care about, put the real-world data mostly in your dev and test sets.

Break the data into train, train-dev, and dev sets, so you can tell whether a performance gap is due to overfitting (train vs. train-dev) or distribution mismatch (train-dev vs. dev).
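The gap arithmetic can be sketched as follows (the error rates below are made up for illustration):

```python
def diagnose(human, train, train_dev, dev):
    """Decompose dev error into avoidable bias, variance, and data
    mismatch, using error rates from the four evaluations."""
    return {
        "avoidable_bias": train - human,
        "variance": train_dev - train,
        "data_mismatch": dev - train_dev,
    }

# Hypothetical error rates (%): here data mismatch dominates.
print(diagnose(human=1.0, train=1.5, train_dev=2.0, dev=7.0))
```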

manually make the training set more like the dev set on problem cases – e.g. add noise, or find more examples of the error-prone thing.

Transfer learning – reuse the early layers of a network trained on a big dataset, and retrain the later layers on the new task.

end to end – learn the whole mapping directly when you have enough data; break it into subproblems if you only have data for the subproblems.

And… I can’t access the two last ones yet. Poo.

## Fixing up some Jekyll problems for Jupyter

the jupyter jekyll plugin supposedly won't work on GitHub Pages

https://briancaffey.github.io/2016/03/14/ipynb-with-jekyll.html

```shell
jupyter nbconvert --to markdown jekyll_test.ipynb
```

To get LaTeX (including the `$` tags) to work on the minima layout I added the following `<head>`:

```html
<head>
  <meta charset="utf-8">
  <meta http-equiv="X-UA-Compatible" content="IE=edge">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <link rel="stylesheet" href="{{ "/assets/main.css" | relative_url }}">
  <link rel="alternate" type="application/rss+xml" title="{{ site.title | escape }}" href="{{ "/feed.xml" | relative_url }}">
  {% if jekyll.environment == 'production' and site.google_analytics %}
    {% include google-analytics.html %}
  {% endif %}
  <script type="text/x-mathjax-config">
    MathJax.Hub.Config({
      tex2jax: {inlineMath: [['$','$'], ['\\(','\\)']]}
    });
  </script>
  <script type="text/javascript" async
    src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.2/MathJax.js?config=TeX-MML-AM_CHTML">
  </script>
</head>
```

I added a pynb directory and added the following to my _config file? Not sure this was necessary.

```yaml
defaults:
  - scope:
      path: "pynb"
    values:
      image: true
```

replace all `fermions_part_1_files` with `/pynb/fermions_part_1_files` in the markdown file.
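One way to do that replacement from the command line, assuming GNU sed (the printf line just fabricates a demo input like the ones nbconvert emits):

```shell
# Demo input: an image link as nbconvert writes it.
printf '![png](fermions_part_1_files/output_3_0.png)\n' > fermions_part_1.md

# Prefix the image directory with /pynb/ (GNU sed, in-place).
sed -i 's|fermions_part_1_files|/pynb/fermions_part_1_files|g' fermions_part_1.md

cat fermions_part_1.md
```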

could also add syntax highlighting, but maybe this is good enough.