You asked for this:
(See the attached PDF, but I added the questions as text as well)
So, here goes:
1) Are we supposed to be able to solve deep differentiation of neural networks by hand? (See the example in the attached PDF.)
2) Do we need to know biology stuff (pre-synaptic and post-synaptic factors)?
3) Do we need to do any form of coding on the exam?
4) What does bootstrapping mean in the context of ANNs? (Is it simply using random variables in the network and updating them until we reach some sort of convergence, is it the weird thing from statistics, https://en.wikipedia.org/wiki/Bootstrapping_(statistics), or is it something else?)
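To make my first guess concrete: I suspect it is the reinforcement learning sense, where a value estimate is updated towards a target built from other current estimates rather than a complete return. A minimal TD(0) sketch of that idea on a toy chain (the environment and all names here are my own toy setup, not from the course):

```python
# Toy 5-state chain: always step right; reward 1 on reaching the terminal state.
N_STATES = 5
V = [0.0] * (N_STATES + 1)  # V[N_STATES] is the terminal state, fixed at 0
alpha, gamma = 0.1, 0.9

for episode in range(1000):
    s = 0
    while s < N_STATES:
        s_next = s + 1
        r = 1.0 if s_next == N_STATES else 0.0
        # Bootstrapping: the target r + gamma * V[s_next] uses the *current
        # estimate* of the next state's value, not the full observed return.
        V[s] += alpha * (r + gamma * V[s_next] - V[s])
        s = s_next

# For this chain, V[s] should approach gamma ** (N_STATES - 1 - s).
```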
5) Eligibility traces. (I don’t know enough to know what to ask here, so you might skip this one.)
6) Why is it OK that we rewrite … to … when we find the online update rule in reinforcement learning? (This might have been answered on the forums; I did not understand it, but I can probably go back and see if I can figure it out.)
7) How does the backprop differentiation algorithm work? (I might figure this one out when I go through the slides, though.)
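In case it helps to state what I currently believe: backprop seems to be just the chain rule applied layer by layer, reusing values cached during the forward pass. Here is a tiny one-hidden-unit example I wrote to check my understanding (the network and numbers are my own, verified against finite differences):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w1, w2):
    # Forward pass: cache the intermediates needed by the backward pass.
    z = w1 * x
    h = sigmoid(z)
    y = w2 * h
    return z, h, y

def loss_and_grads(x, t, w1, w2):
    z, h, y = forward(x, w1, w2)
    L = 0.5 * (y - t) ** 2
    # Backward pass: chain rule, outermost factor first.
    dL_dy = y - t
    dL_dw2 = dL_dy * h               # y = w2 * h
    dL_dh = dL_dy * w2
    dL_dz = dL_dh * h * (1.0 - h)    # sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z))
    dL_dw1 = dL_dz * x               # z = w1 * x
    return L, dL_dw1, dL_dw2
```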
8) If there are online rules that can handle gradient descent, are there any mathematical reasons for using mini-batches in the training of networks, or is the reasoning mostly empirical? (I guess the answer might just be “Because the mini-batch has less variance”…)
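To partly check my own guess, I verified numerically that averaging B per-example gradients shrinks the variance of the gradient estimate by roughly a factor of B (a toy setup of my own, with made-up per-example gradients):

```python
import random
import statistics

random.seed(0)
# Per-example "gradients": noisy observations of a true gradient of 2.0.
grads = [2.0 + random.gauss(0.0, 1.0) for _ in range(100_000)]

def minibatch_estimates(batch_size, n_batches=2000):
    # Each estimate is the mean gradient over one random mini-batch.
    return [
        statistics.fmean(random.sample(grads, batch_size))
        for _ in range(n_batches)
    ]

var_b1 = statistics.pvariance(minibatch_estimates(1))
var_b32 = statistics.pvariance(minibatch_estimates(32))
# var_b1 / var_b32 should come out close to 32.
```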
9) Do we need to know ADAM, and how it differs from SGD?
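For reference while revising, this is my current understanding of the two update rules — plain SGD versus Adam with the usual default hyperparameters — compared on a 1-D quadratic (the toy problem is my own, so please correct me if the updates are wrong):

```python
def sgd_step(w, grad, lr=0.1):
    # Plain SGD: step against the raw gradient.
    return w - lr * grad

def adam_step(w, grad, state, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: step scaled by bias-corrected first and second moment estimates.
    m, v, t = state
    t += 1
    m = beta1 * m + (1 - beta1) * grad        # running mean of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2   # running mean of squared gradients
    m_hat = m / (1 - beta1 ** t)              # bias correction
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (v_hat ** 0.5 + eps), (m, v, t)

# Minimize f(w) = (w - 3)^2 with both; the gradient is f'(w) = 2 * (w - 3).
w_sgd, w_adam, state = 0.0, 0.0, (0.0, 0.0, 0)
for _ in range(500):
    w_sgd = sgd_step(w_sgd, 2 * (w_sgd - 3))
    w_adam, state = adam_step(w_adam, 2 * (w_adam - 3), state)
```

The way I read it, the key difference is that Adam's step size is roughly gradient-scale-invariant per parameter, while SGD's step is proportional to the raw gradient.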
10) Will the exam tend more towards:
a. Text-based questions, where we have to know and reason about different things that have been explained about neural networks?
b. Understanding the math behind neural networks sufficiently well to be able to deduce other new concepts if we are asked to do so during the exam?
c. Knowing the material well enough by heart to be able to solve a lot of tasks very quickly?
11) The vanishing gradient problem. (This one might just be in the slides/exercises, so it might be fine.)
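For my own notes, this is how I currently understand the problem: each sigmoid layer contributes a factor sigmoid'(z) <= 0.25 to the backward pass, so the gradient at the input shrinks geometrically with depth. A small numeric check (my own toy setup, one scalar unit per layer):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def gradient_at_input(depth, w=1.0, x=0.5):
    # Forward through `depth` layers of h -> sigmoid(w * h), accumulating the
    # product of per-layer derivatives d(h_out)/d(h_in) = sigmoid'(z) * w.
    h, grad = x, 1.0
    for _ in range(depth):
        h_new = sigmoid(w * h)
        grad *= h_new * (1.0 - h_new) * w  # sigmoid'(z) is at most 0.25
        h = h_new
    return grad

shallow = gradient_at_input(2)
deep = gradient_at_input(30)
# `deep` should be many orders of magnitude smaller than `shallow`.
```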