Questions for the Q&A-session

by Jørn Bøni Hofstad -
Number of replies: 2

You asked for this:

(See the attached PDF, but I added the questions as text as well)

So, here goes:

1)     Are we supposed to be able to carry out the differentiation of deep neural networks by hand?

2)     Do we need to know the biology (e.g. pre-synaptic and post-synaptic factors)?

3)     Do we need to do any form of coding on the exam?

4)     What does bootstrapping mean in the context of ANNs? Is it simply using random variables in the network and updating them until we reach some sort of convergence, is it the resampling technique from statistics (https://en.wikipedia.org/wiki/Bootstrapping_(statistics)), or is it something else? (A sketch of the statistics version is at the end of this post.)

5)     Eligibility traces (I don’t know enough to know what to ask here, so you might skip this one; I put a rough sketch of what I found in a textbook at the end of this post).

6)    Why is it OK that we rewrite [...] as [...] when we find the online update rule in reinforcement learning? (This might have been answered on the forums; I did not understand it, but I can probably go back and see if I can figure it out.)

7)     How does the backprop differentiation algorithm work? (I might figure this one out when I go through the slides, though; there is a minimal worked example at the end of this post.)

8)     If there are online rules that can handle gradient descent, are there any mathematical reasons for using mini-batches when training networks, or is the reasoning mostly empirical? (I guess the answer might just be "because the mini-batch gradient has less variance"; see the sketch at the end of this post.)

9)     Do we need to know ADAM, and how it differs from SGD? (Sketch at the end of this post.)

10)  Will the exam tend more towards:

a.      Text-based questions, where we have to know and reason about the different things that have been explained about neural networks

b.      Being able to understand the math behind neural networks sufficiently well to deduce new concepts if we are asked to do so during the exam?

c.      Knowing the material by heart well enough to solve a lot of tasks quickly?

11)  The vanishing gradient problem. (This one might just be in the slides/exercises, so it might be fine; there is a toy calculation at the end of this post.)


(More to come if I can think of anything.)
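
For question 4, to make clear which of the two meanings I am asking about, here is a minimal sketch of the statistics-style bootstrap I linked to: resample the data with replacement many times and look at the spread of a statistic. All data and numbers here are made up.

import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=1.0, size=200)   # a made-up sample

# Resample with replacement many times and recompute the mean each time;
# the spread of these bootstrap means estimates the standard error of the mean.
boot_means = np.array([
    rng.choice(data, size=len(data), replace=True).mean()
    for _ in range(1000)
])
print(data.mean(), boot_means.std())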
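
For question 5, here is a rough sketch of what I found in the textbook (Sutton & Barto): eligibility traces are a decaying per-state memory used in TD(lambda) so that recently visited states also receive credit for the current TD error. This only shows the tabular, accumulating-trace variant; the course's exact formulation may differ.

import numpy as np

def td_lambda_step(V, e, s, r, s_next, done,
                   alpha=0.1, gamma=0.9, lam=0.8):
    """One online TD(lambda) update after observing (s, r, s_next)."""
    delta = r + (0.0 if done else gamma * V[s_next]) - V[s]  # TD error
    e = gamma * lam * e        # all traces decay ...
    e[s] += 1.0                # ... and the visited state's trace is bumped
    V = V + alpha * delta * e  # recently visited states share the credit
    if done:
        e = np.zeros_like(e)   # traces are reset between episodes
    return V, e

V, e = np.zeros(5), np.zeros(5)   # value estimates and traces for 5 states
V, e = td_lambda_step(V, e, s=0, r=1.0, s_next=1, done=False)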
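
For question 7 (and question 1), this is the level at which I currently understand the backprop bookkeeping, written out for a toy network with one hidden sigmoid unit and scalar weights. All names and numbers are invented; please correct it if it misses the point.

import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

x, t = 0.5, 1.0           # input and target
w1, w2 = 0.8, -0.3        # weights

# Forward pass, keeping the intermediate values.
a1 = w1 * x               # pre-activation of the hidden unit
h = sigmoid(a1)           # hidden activation
y = w2 * h                # linear output
L = 0.5 * (y - t) ** 2    # squared-error loss

# Backward pass: apply the chain rule layer by layer.
dL_dy = y - t                    # dL/dy
dL_dw2 = dL_dy * h               # dL/dw2 = dL/dy * dy/dw2
dL_dh = dL_dy * w2               # propagate the error to the hidden activation
dL_da1 = dL_dh * h * (1.0 - h)   # sigmoid'(a1) = h * (1 - h)
dL_dw1 = dL_da1 * x              # dL/dw1

# Sanity check against a finite-difference approximation.
eps = 1e-6
L_plus = 0.5 * (w2 * sigmoid((w1 + eps) * x) - t) ** 2
print(dL_dw1, (L_plus - L) / eps)   # the two numbers should agree closely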
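
For question 8, my guess about the variance made concrete on a toy linear model (everything here is invented for illustration): a mini-batch gradient is just the average of per-sample gradients, so it points at the same full-batch gradient in expectation but fluctuates roughly B times less.

import numpy as np

rng = np.random.default_rng(0)
w_true, w = 2.0, 0.0
x = rng.normal(size=10_000)
y = w_true * x + 0.5 * rng.normal(size=10_000)

def grad(w, xb, yb):
    # Gradient of 0.5 * mean((w*x - y)^2) over a (mini-)batch.
    return np.mean((w * xb - yb) * xb)

B = 32
single = np.array([grad(w, x[i:i + 1], y[i:i + 1]) for i in range(len(x))])
batched = np.array([grad(w, x[i:i + B], y[i:i + B])
                    for i in range(0, len(x) - B + 1, B)])

print(single.mean(), batched.mean())   # the same gradient in expectation
print(single.var(), batched.var())     # the batch variance is roughly var / B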
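
For question 9, the difference as I currently understand it, using the default constants from the Adam paper (these are not course-specific): plain SGD steps along the raw gradient, while Adam keeps running estimates of the gradient's first and second moments and rescales the step per parameter.

import numpy as np

def sgd_step(w, g, lr=0.1):
    return w - lr * g

def adam_step(w, g, state, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * g         # running mean of gradients (1st moment)
    v = b2 * v + (1 - b2) * g ** 2    # running mean of squared gradients (2nd moment)
    m_hat = m / (1 - b1 ** t)         # bias correction for the zero initialisation
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), (m, v, t)

# Minimise f(w) = w^2 (gradient 2w) from the same starting point.
w_sgd, w_adam, state = 5.0, 5.0, (0.0, 0.0, 0)
for _ in range(100):
    w_sgd = sgd_step(w_sgd, 2 * w_sgd)
    w_adam, state = adam_step(w_adam, 2 * w_adam, state)
print(w_sgd, w_adam)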
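
For question 11, my current picture as a toy calculation (made-up numbers): each sigmoid layer multiplies the backpropagated gradient by sigmoid'(a) <= 0.25 (times a weight factor), so in a deep chain the gradient shrinks roughly geometrically with depth unless the weights compensate.

import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

rng = np.random.default_rng(0)
grad = 1.0
for depth in range(1, 21):        # a 20-layer chain of scalar sigmoid units
    a = rng.normal()              # some pre-activation
    s = sigmoid(a)
    grad *= s * (1.0 - s)         # each layer contributes sigmoid'(a) <= 0.25
    print(depth, grad)            # shrinks roughly geometrically with depth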

In reply to Jørn Bøni Hofstad

Re: Questions for the Q&A-session

by Jørn Bøni Hofstad -

I forgot to mention that these are just most of the questions I had after reading quickly through the exercises, so not all of them are necessarily relevant. I apologize if the number of questions is a bit too big.

In reply to Jørn Bøni Hofstad

Re: Questions for the Q&A-session

by Florian François Colombo -

Hi. 

As mentioned today, here are some references for your questions:

1) Ex 2.1 Lecture 2

4) Ex 4.2 Lecture 4

7) Ex 2.1 Lecture 2

8) Ex 4.3

9) Ex 5.2 Lecture 5

10) Exam 2018

11) Lecture 6