
How does backpropagation work through layers like max pooling and padding? [duplicate]


This question already has an answer here:



  • Back-propagation through max pooling layers

    1 answer



I know that backpropagation takes derivatives (the change of one quantity with respect to another). But how is this applied when there is a max-pooling layer between two Conv2D layers? And how does the gradient regain its original shape when padding has been added?













marked as duplicate by Esmailian, Ethan, Siong Thye Goh, Sean Owen yesterday


This question has been asked before and already has an answer. If those answers do not fully address your question, please ask a new question.






















      machine-learning deep-learning keras tensorflow backpropagation






asked 2 days ago by Arshad_221b





1 Answer



















Max pooling zeroes the gradient contribution of the values that were not selected by the pooling window: only the position that held the maximum receives the upstream gradient. Padded values likewise receive no gradient, since they are constants rather than inputs.

A nice property of convolution is that it is essentially reducible to a matrix multiplication, and its backward pass is simply multiplication by the transpose of that matrix. So the backward pass is already encoded in the forward pass.
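
To make the max-pooling part concrete, here is a minimal NumPy sketch (an illustration only, not how Keras/TensorFlow implement it internally): the forward pass records which position in each 2×2 window held the maximum, and the backward pass routes each upstream gradient to exactly that position, leaving zeros everywhere else.

```python
import numpy as np

def maxpool2x2_forward(x):
    """Forward pass of 2x2 max pooling (stride 2) on a 2D array.
    Returns the pooled output and a mask marking the argmax positions."""
    h, w = x.shape
    out = np.zeros((h // 2, w // 2))
    mask = np.zeros_like(x, dtype=bool)
    for i in range(0, h, 2):
        for j in range(0, w, 2):
            window = x[i:i + 2, j:j + 2]
            r, c = np.unravel_index(np.argmax(window), window.shape)
            out[i // 2, j // 2] = window[r, c]
            mask[i + r, j + c] = True
    return out, mask

def maxpool2x2_backward(dout, mask):
    """Backward pass: the gradient w.r.t. each pooled output flows only
    to the input position that produced the maximum; all others get 0."""
    # Repeat each upstream gradient over its 2x2 window, then keep only
    # the entries where the forward pass recorded the maximum.
    dx_full = np.kron(dout, np.ones((2, 2)))
    dx = np.zeros(mask.shape)
    dx[mask] = dx_full[mask]
    return dx

x = np.array([[1., 3., 2., 1.],
              [4., 2., 0., 5.],
              [1., 1., 3., 2.],
              [0., 2., 1., 4.]])
out, mask = maxpool2x2_forward(x)
dout = np.ones_like(out)              # pretend the upstream gradient is all ones
dx = maxpool2x2_backward(dout, mask)
print(dx)                             # 1s exactly where the maxima were, 0s elsewhere
```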
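
The convolution-as-matrix point can be sketched the same way. Assuming a 1-D "same" convolution with a size-3 kernel and zero padding of 1 (a hypothetical toy setup, chosen for brevity), the forward pass is `y = W @ x` for a matrix `W` built from the kernel, and the input gradient is just `W.T @ dy`. Because the padded positions never appear as columns of `W`, the gradient automatically comes back in the unpadded input's shape, which answers the padding part of the question.

```python
import numpy as np

def conv_matrix(kernel, n):
    """Build the (n x n) matrix W of a 1-D 'same' convolution
    (kernel size 3, zero padding of 1 on each side) so that
    conv(x) == W @ x.  Padded positions never appear as columns
    of W, so gradients keep the unpadded input's shape."""
    W = np.zeros((n, n))
    for i in range(n):
        for j, kj in enumerate(kernel):   # kernel tap j reads input index i + j - 1
            col = i + j - 1
            if 0 <= col < n:              # out-of-range taps hit the zero padding
                W[i, col] = kj
    return W

kernel = np.array([1., 2., 3.])
x = np.array([4., 5., 6., 7.])
W = conv_matrix(kernel, len(x))

y = W @ x                # forward pass: convolution as a matrix product
dy = np.ones_like(y)     # some upstream gradient
dx = W.T @ dy            # backward pass: just the transposed matrix
print(y, dx)             # dx already has the unpadded input's shape (4,)
```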

























answered 2 days ago by Andreas Look












