Connection between piecewise linear basis functions and ReLU activation function
The ReLU activation is defined as
$$\sigma(x)=\max(0, x).$$
Assume I have a network with one hidden layer; the output of that layer then has the form
$$ f(x)= \sigma(Wx + b), $$
where the matrix $W$ holds the weights, $b$ is the bias, and $x$ is the input (the output of the previous layer).
What is the connection between $f(x)$ and a piecewise linear function, defined as
$$g(x_j)=\sum_{i=1}^{N} \alpha_i \psi_i(x_j),$$
where the $\alpha_i$ are the coefficients and the $\psi_i$ are the basis functions?
More precisely, I would like to identify the basis functions and the coefficients in the definition of $f(x)$.
deep-learning linear-algebra deep-network
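In one dimension this connection can be made concrete: each hidden unit $\sigma(w_i x + b_i)$ is itself a piecewise linear "ramp", and a standard finite-element hat (tent) basis function is an exact linear combination of three such ramps. A minimal numerical sketch, with hand-chosen (not learned) weights for a hat function with knots at 0, 1, and 2:

```python
import numpy as np

# hat(x) with knots 0, 1, 2 equals relu(x) - 2*relu(x - 1) + relu(x - 2),
# i.e. f(x) = alpha @ relu(w*x + b) for the hand-picked values below.
relu = lambda z: np.maximum(0.0, z)

w = np.array([1.0, 1.0, 1.0])      # hidden-layer weights (3 units, 1 input)
b = np.array([0.0, -1.0, -2.0])    # biases place the kinks at x = 0, 1, 2
alpha = np.array([1.0, -2.0, 1.0]) # output-layer coefficients

def f(x):
    """One-hidden-layer ReLU network, scalar input and output."""
    return alpha @ relu(w * x + b)

def hat(x):
    """Reference hat basis function centered at 1 with support [0, 2]."""
    return max(0.0, 1.0 - abs(x - 1.0))

# The two functions agree everywhere on a test grid.
xs = np.linspace(-1.0, 3.0, 201)
assert all(abs(f(x) - hat(x)) < 1e-12 for x in xs)
```

In this picture the hidden-unit outputs $\sigma(w_i x + b_i)$ play the role of the basis functions $\psi_i$, and the output-layer weights play the role of the coefficients $\alpha_i$; the biases determine where the kinks (the "knots") sit.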
Could you add more details in your question about the piecewise linear basis functions? The more details you add about what you are trying to compare, the more precise the answer you will get.
– anu
Mar 30 at 23:56
asked Mar 29 at 8:37 by user70435
edited Mar 31 at 19:49 by Community♦
Thanks for contributing an answer to Data Science Stack Exchange!