Multi-label classification and sigmoid function


I'm new to neural networks, so this may be a silly question.
I have built a standard CNN for image classification. I want a multi-label classification network, so I use binary_crossentropy as the loss function:

model.compile(loss='binary_crossentropy',
              optimizer=optimizers.RMSprop(lr=1e-4),
              metrics=['acc'])

and a sigmoid activation in the last layer (two neurons for two labels):

model.add(layers.Dense(2, activation='sigmoid'))

The output looks something like this:

[[0.000497834], [0.99942183]]

  1. Why do these two numbers add up to (nearly) 1? Isn't the sigmoid output supposed to be independent for each neuron?

  2. What should I do to get independent probabilities as output? For example, if an image doesn't belong to either of the two classes, the output should be close to 0 for both neurons, something like [[0.001], [0.001]].

Thanks in advance for any help.
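
For reference, here is a minimal, self-contained sketch of the kind of setup described above. The convolutional stack, input shape, and hidden-layer size are placeholder assumptions, not the asker's actual architecture; only the two-unit sigmoid output and the binary_crossentropy loss are taken from the question.

# Minimal sketch; the conv/dense sizes and input shape are illustrative assumptions.
from keras import layers, models, optimizers

model = models.Sequential()
model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(64, 64, 3)))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Flatten())
model.add(layers.Dense(64, activation='relu'))
# Two sigmoid units: each output independently models P(label_i = 1),
# unlike softmax, whose outputs are constrained to sum to 1.
model.add(layers.Dense(2, activation='sigmoid'))

model.compile(loss='binary_crossentropy',            # independent per-label binary loss
              optimizer=optimizers.RMSprop(lr=1e-4),
              metrics=['acc'])

With this setup nothing forces the two outputs to sum to 1; if they appear to, that usually reflects the training labels (mostly exactly one positive label per row) rather than the activation or the loss.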










Tags: keras convnet






asked yesterday by Kamil




  • 0.000497834 + 0.99942183 = 0.999919664 (not 1)
    – TitoOrt
    yesterday










  • Can you post the complete model definition? Your expectation with 2 sigmoid outputs is correct (the outputs should be independent).
    – Shamit Verma
    yesterday










  • Also, post a few rows from train_y. If most rows are not multi-label, the network might learn to predict only one output as 1 and the other as 0.
    – Shamit Verma
    yesterday










  • You are using binary cross-entropy and have two sigmoid outputs, so you get two outputs; there may be a mismatch between what you want and what your architecture actually does. Getting a third "neither" class isn't easy, because the outputs will then sit close to 0.5 for both classes, indicating the model isn't sure.
    – Aditya
    yesterday











  • @Shamit Verma you were right, most rows in train_y are not multi-label, thanks. What about the second question? @Aditya Is this a good approach to the problem if I want my output to be something like [0.01, 0.01] when an image doesn't belong to either class? (See the sketch after these comments.)
    – Kamil
    yesterday
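
Below is a minimal sketch of the two checks discussed in the comments above, assuming train_y is an (n_samples, 2) array of 0/1 labels and model is the trained network from the question; val_x and the 0.5 threshold are placeholder assumptions.

import numpy as np

# (a) How often do the two labels co-occur? If almost never, the data itself
# pushes the two sigmoid outputs to behave like a softmax pair.
labels_per_row = train_y.sum(axis=1).astype(int)
print(np.bincount(labels_per_row, minlength=3))   # counts of rows with 0, 1, 2 active labels

# (b) Threshold each output independently; an image whose probabilities are
# all below the threshold is assigned to neither class.
probs = model.predict(val_x)             # shape (n_samples, 2), values in [0, 1]
threshold = 0.5                          # arbitrary cut-off, tune on validation data
predicted = probs > threshold            # boolean per-label decisions
no_class = ~predicted.any(axis=1)        # True where the image matches neither label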















