What are the criteria for updating bias values in back propagation?
During back propagation, the algorithm can modify the weight values or the bias values to reduce the loss.
How does it decide whether to modify the weights or the biases to reduce the loss?
Does it modify the weights in one pass and the biases in another pass?
Thanks!
deep-learning cnn backpropagation
asked Apr 9 at 4:12
Maanu
1 Answer
Actually, the weight values and the bias values are updated simultaneously in every pass of backpropagation. The direction of the loss gradient vector is determined by the partial derivatives of the loss function with respect to all of the weights and biases. So if, in each pass, you want to move toward the minimum of the loss function, you must update both the weights and the biases at the same time, following that full gradient.
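The simultaneous update can be sketched with a tiny fully connected layer (a hypothetical toy example in plain NumPy, not from the original post): the weight gradient and the bias gradient both come out of the same backward pass, and both parameters are stepped together.

```python
import numpy as np

# Toy single-layer model: y_hat = x @ W + b, with squared-error loss.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))          # batch of 4 inputs, 3 features each
y = rng.normal(size=(4, 2))          # 2 targets per input
W = rng.normal(size=(3, 2))
b = np.zeros(2)
lr = 0.01

def loss(W, b):
    return 0.5 * np.square(x @ W + b - y).sum()

loss_init = loss(W, b)
for _ in range(1000):
    err = x @ W + b - y              # dL/dy_hat
    grad_W = x.T @ err               # dL/dW, computed in the same backward pass
    grad_b = err.sum(axis=0)         # dL/db, computed in the same backward pass
    W -= lr * grad_W                 # both parameters move together,
    b -= lr * grad_b                 # along the full gradient direction
loss_final = loss(W, b)
```

There is no separate "weight pass" and "bias pass": one backward pass yields every partial derivative, and the optimizer steps all parameters at once.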
answered Apr 9 at 5:27
pythinker