
Can we use ReLU activation function as the output layer's non-linearity?


I have trained a model with a linear activation function for the last dense layer, but I have a constraint that forbids negative predictions: the target is a continuous, positive value.

Can I use ReLU as the activation of the output layer? I am hesitant to try it, since ReLU is generally used as a rectifier in hidden layers. I'm using Keras.
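For reference, a minimal sketch of the kind of setup described above. The layer sizes and input dimension are assumptions for illustration, not taken from the question; only the default linear output activation matters here.

    from tensorflow import keras

    # Small regression network as described: the last Dense layer uses a
    # linear activation, so predictions are unbounded and can be negative.
    model = keras.Sequential([
        keras.Input(shape=(10,)),                     # input size assumed
        keras.layers.Dense(64, activation="relu"),    # hidden layer (size assumed)
        keras.layers.Dense(1, activation="linear"),   # linear output
    ])
    model.compile(optimizer="adam", loss="mse")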










Tags: machine-learning, neural-network, deep-learning, keras, activation-function






asked by bacloud14, edited by Media
1 Answer


















Yes, you can. For regression tasks it is customary to use a linear activation in the output layer because it is differentiable and does not bound the output, so the network can produce any value the target may take. tanh and sigmoid are not used as the last-layer activation in regression because their bounded ranges cannot cover all the required values. The purpose of non-linearities in the hidden layers is to create non-linear decision boundaries; the output activation only needs to be able to produce every value the target can take. Since your target is a continuous, non-negative value, ReLU is the best choice for the output layer: it constrains predictions to be non-negative while leaving the positive range unbounded.
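A minimal sketch of the change this suggests, using the same assumed architecture as the sketch in the question: only the output layer's activation changes.

    from tensorflow import keras

    # Same assumed architecture, but with ReLU on the output layer so that
    # predictions are constrained to be >= 0 (the output is max(0, z)).
    model = keras.Sequential([
        keras.Input(shape=(10,)),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(1, activation="relu"),   # ReLU output
    ])
    model.compile(optimizer="adam", loss="mse")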






answered by Media

Comment: wonderful, even validation loss is getting lower. I hope it will work well on a ton of unseen data. – bacloud14










