HW10

by Valentin Denis Alexis Parisot -

Hello,

In the forward function of the UNet class, the line x2 = self.dc2(x2) doesn't make sense to me. Shouldn't it be x3 = ...?

Thanks!

In reply to Valentin Denis Alexis Parisot

Re: HW10

by Nikita Durasov -

Hey,

Not really: you're saving the x2 features in order to reuse them further down the network, i.e. you'll later do something like x = torch.cat([x2, x], dim=1). The same thing happens with x3 and x4. Consider the following lines:

x1 = self.dc1(x)
x2 = self.mp1(x1)
x2 = self.dc2(x2)

After x2 = self.mp1(x1), x2 holds the x1 features at reduced resolution (because of the max pooling), so they first need to be processed by convolutions, hence x2 = self.dc2(x2).
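The pattern above can be sketched with a tiny U-Net fragment. This is a minimal illustration, not the HW10 code: single Conv2d layers stand in for the double convolutions, and the layer names (dc1, mp1, dc2, mp2, dc3, up) and channel counts are assumptions.

```python
import torch
import torch.nn as nn

class MiniUNet(nn.Module):
    """Tiny U-Net fragment: just enough layers to show why
    x2 = self.dc2(x2) comes right after pooling, and how the
    saved x2 is later reused via torch.cat."""
    def __init__(self):
        super().__init__()
        self.dc1 = nn.Conv2d(1, 8, 3, padding=1)   # stand-in for a double conv
        self.mp1 = nn.MaxPool2d(2)
        self.dc2 = nn.Conv2d(8, 16, 3, padding=1)
        self.mp2 = nn.MaxPool2d(2)
        self.dc3 = nn.Conv2d(16, 32, 3, padding=1)
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)

    def forward(self, x):
        x1 = self.dc1(x)
        x2 = self.mp1(x1)      # x1 features at half resolution
        x2 = self.dc2(x2)      # convolve the pooled features; x2 is saved
        x3 = self.mp2(x2)
        x3 = self.dc3(x3)
        x = self.up(x3)        # upsample back to x2's resolution
        x = torch.cat([x2, x], dim=1)  # the skip connection from the thread
        return x

y = MiniUNet()(torch.randn(1, 1, 16, 16))
print(y.shape)  # torch.Size([1, 32, 8, 8]): 16 skip channels + 16 upsampled
```

Without the x2 = self.dc2(x2) step, the tensor saved for the skip connection would just be max-pooled x1 features with no extra processing at that scale.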

Hope this helps!

In reply to Valentin Denis Alexis Parisot

Re: HW10

by Sylvain Maurice Tuari Lugeon -
Hey,
The name of the variable is not important here. What's important is that the output of a layer is fed as an input to the next layer. Here, the full sequence of modules is:

x1 = self.dc1(x)
x2 = self.mp1(x1)
x2 = self.dc2(x2)
x3 = self.mp2(x2)

So the input of mp2 is indeed the output of dc2, and everything works fine.

They decided to keep the same variable name for the tensor before and after the double convolution, but that is just a naming choice. We could equally imagine a forward method that reuses a single name for every input and output:

x = self.dc1(x)
x = self.mp1(x)
x = self.dc2(x)
x = self.mp2(x)
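As a quick check that the naming really doesn't matter, here is a hedged sketch (the layer definitions are placeholders, not the HW10 ones) running the same four modules with distinct names and with a single reused name:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Placeholder layers standing in for the HW10 modules.
dc1 = nn.Conv2d(1, 4, 3, padding=1)
mp1 = nn.MaxPool2d(2)
dc2 = nn.Conv2d(4, 8, 3, padding=1)
mp2 = nn.MaxPool2d(2)

inp = torch.randn(1, 1, 8, 8)

# Distinct names, as in the homework code.
x1 = dc1(inp)
x2 = mp1(x1)
x2 = dc2(x2)
x3 = mp2(x2)

# A single name reused throughout.
x = dc1(inp)
x = mp1(x)
x = dc2(x)
x = mp2(x)

print(torch.equal(x3, x))  # True: variable names don't change the computation
```

Of course, in the actual U-Net the distinct names do serve a purpose: x1 and x2 must stay around so they can be concatenated back in on the upsampling path.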

Hope that helps.