In the PointNet++ model there are three SetAbstraction layers (each with three convolution, BatchNorm and ReLU blocks), and in the last two the number of input channels is increased by 3 (the xyz coordinates are concatenated).
I am trying to do the backpropagation manually to understand how the network actually trains.
I am stuck at the first convolutional layer of the third SetAbstraction layer. The gradient arriving from backpropagation has shape (BatchSize, 256, 128, 1). The input to this convolutional layer is the output of the second SetAbstraction after the max operation and the channel increase by 3, so it has shape (BatchSize, 259, 128, 1), and the layer's weights have shape (256, 259, 1, 1). When I compute the weight gradients they come out correctly with shape (256, 259, 1, 1), and the input gradients come out with shape (BatchSize, 259, 128, 1). However, the output of the third ReLU of the second SetAbstraction has shape (BatchSize, 256, 64, 128), and its max operation reduces that to (BatchSize, 256, 128). How should I carry the gradient I calculated, of shape (BatchSize, 259, 128, 1), back through the max operation and then the ReLU?
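My current understanding of the routing, as a minimal NumPy sketch with small stand-in sizes (I am assuming the 3 extra xyz channels sit in front of the 256 feature channels, and that the max is taken over the sample dimension):

```python
import numpy as np

B, C, S, N = 2, 4, 5, 6  # small stand-ins for (BatchSize, 256, 64, 128)

rng = np.random.default_rng(0)
pre_relu = rng.standard_normal((B, C, S, N))  # conv/BN output in SA2
relu_out = np.maximum(pre_relu, 0.0)          # (B, C, S, N)
pooled = relu_out.max(axis=2)                 # (B, C, N), max over samples
argmax = relu_out.argmax(axis=2)              # (B, C, N), winner indices

# incoming gradient w.r.t. the concatenated input of SA3's first conv:
# shape (B, C + 3, N, 1); assumed: the 3 xyz channels come first
grad_in = rng.standard_normal((B, C + 3, N, 1))

# 1) drop the 3 xyz channels (they do not flow back into the feature
#    path) and squeeze the trailing singleton dim -> (B, C, N)
grad_pooled = grad_in[:, 3:, :, 0]

# 2) route through the max: only the argmax position of each
#    (batch, channel, point) slice receives the gradient
grad_relu_out = np.zeros_like(relu_out)       # (B, C, S, N)
b_idx, c_idx, n_idx = np.meshgrid(
    np.arange(B), np.arange(C), np.arange(N), indexing="ij")
grad_relu_out[b_idx, c_idx, argmax, n_idx] = grad_pooled

# 3) route through ReLU: zero wherever the pre-activation was <= 0
grad_pre_relu = grad_relu_out * (pre_relu > 0)

print(grad_pre_relu.shape)  # (B, C, S, N)
```

Is this the right idea, i.e. the 3 coordinate channels are simply sliced off (their gradient only matters for the coordinate computation, not the feature path), and the remaining (BatchSize, 256, 128) gradient is scattered to the argmax positions before applying the ReLU mask?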
Please help me with this step. Thank you