


Confusion matrix logic



Can someone explain to me the logic behind the confusion matrix?



  • True Positive (TP): prediction is POSITIVE, actual outcome is POSITIVE, result is 'True Positive' - No questions.

  • False Negative (FN): prediction is NEGATIVE, actual outcome is POSITIVE, result is 'False Negative' - Why is that? Shouldn't it be 'False Positive'?

  • False Positive (FP): prediction is POSITIVE, actual outcome is NEGATIVE, result is 'False Positive' - Why is that? Shouldn't it be 'True Negative'?

  • True Negative (TN): prediction is NEGATIVE, actual outcome is NEGATIVE, result is 'True Negative' - Why is that? Shouldn't it be 'False Negative'?

[Image: confusion matrix diagram]










  • Stick to positive/negative for the test, and true/false for whether the test matches reality (the actual outcome). Then it should be clear. – Mitch, Mar 21 at 18:22















confusion-matrix

asked Mar 21 at 9:34
Tauno Tanilas
4 Answers




































A confusion matrix is a table that is often used to describe the performance of a classification model. The figure you provided shows the binary case, but confusion matrices are also used with more than two classes (there are simply more rows/columns).

The rows refer to the actual ground-truth label/class of the input, and the columns refer to the prediction made by the model.

The names of the different cases are taken from the predictor's point of view.

True/False indicates whether the prediction matches the ground truth, and Positive/Negative refers to what the prediction was.

The four cases in the confusion matrix:

True Positive (TP): The model's prediction is "Positive" and it matches the actual ground-truth class, which is "Positive", so this is a True Positive case.

False Negative (FN): The model's prediction is "Negative" and it is wrong because the actual ground-truth class is "Positive", so this is a False Negative case.

False Positive (FP): The model's prediction is "Positive" and it is wrong because the actual ground-truth class is "Negative", so this is a False Positive case.

True Negative (TN): The model's prediction is "Negative" and it matches the actual ground-truth class, which is "Negative", so this is a True Negative case.
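For instance, the four counts can be tallied directly in Python (a minimal sketch with made-up labels; 1 = positive, 0 = negative):

```python
# Counting the four confusion-matrix cells by hand for a binary problem.
y_true = [1, 1, 0, 0, 1, 0]  # actual ground-truth labels
y_pred = [1, 0, 0, 1, 1, 0]  # model predictions

tp = sum(1 for t, p in zip(y_true, y_pred) if p == 1 and t == 1)  # predicted P, correctly
fn = sum(1 for t, p in zip(y_true, y_pred) if p == 0 and t == 1)  # predicted N, wrongly
fp = sum(1 for t, p in zip(y_true, y_pred) if p == 1 and t == 0)  # predicted P, wrongly
tn = sum(1 for t, p in zip(y_true, y_pred) if p == 0 and t == 0)  # predicted N, correctly

# Rows = actual class, columns = predicted class, as in the figure:
matrix = [[tn, fp],
          [fn, tp]]
print(matrix)  # [[2, 1], [1, 2]]
```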






answered Mar 21 at 10:02
Mark.F








  • Thanks a lot! It's all clear now :) – Tauno Tanilas, Mar 21 at 11:04































Please see the answers below:

  • False Negative (FN): prediction is NEGATIVE, actual outcome is POSITIVE, result is 'False Negative' - Why is that? Shouldn't it be 'False Positive'?

    Answer: The model was supposed to predict 'Positive', but it predicted 'Negative'; it falsely predicted Negative, hence a False Negative.

  • False Positive (FP): prediction is POSITIVE, actual outcome is NEGATIVE, result is 'False Positive' - Why is that? Shouldn't it be 'True Negative'?

    Answer: The model was supposed to predict 'Negative', but it predicted 'Positive'; it falsely predicted Positive, hence a False Positive.

  • True Negative (TN): prediction is NEGATIVE, actual outcome is NEGATIVE, result is 'True Negative' - Why is that? Shouldn't it be 'False Negative'?

    Answer: The expected output was Negative, and the model also predicted Negative; it truly predicted Negative, hence a True Negative.

For a better understanding, you can run a simple binary classification model and analyze its confusion matrix.

Thank you,
KK
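As a concrete starting point for that suggestion, here is a minimal sketch assuming scikit-learn is installed (the dataset is synthetic):

```python
# Fit a simple binary classifier on synthetic data and inspect its
# confusion matrix (rows = actual class, columns = predicted class).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
cm = confusion_matrix(y_test, model.predict(X_test))

print(cm)  # 2x2 array laid out as [[TN, FP], [FN, TP]]
```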






answered Mar 21 at 10:07
KK2491
































    Seems like you understand the meaning of the confusion matrix, but not the logic used to name its entries!

    Here are my five cents:

    The names are all of this kind:

        <True/False> <Positive/Negative>
             |               |
           Part1           Part2

    1. The first part says whether the prediction was right or not. If you have only True Positives and True Negatives, your model is perfect. If you have only False Positives and False Negatives, your model is really bad.

    2. The second part says what the model predicted.

    So:

    • False Negative (FN): the prediction is NEGATIVE (0), but the first part is False, which means the prediction is wrong (it should have been POSITIVE (1)).

    • False Positive (FP): the prediction is POSITIVE (1), but the first part is False, which means the prediction is wrong (it should have been NEGATIVE (0)).

    • True Negative (TN): the prediction is NEGATIVE and the first part is True; the prediction is right (the model predicted NEGATIVE for NEGATIVE samples).
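    That two-part naming rule can be written down literally (a small illustrative helper, not from the answer itself):

    ```python
    # Build the case name from the two parts described above:
    # Part1 = was the prediction right?, Part2 = what was predicted?
    def case_name(actual, predicted):
        part1 = "True" if predicted == actual else "False"    # right or wrong?
        part2 = "Positive" if predicted == 1 else "Negative"  # what was predicted?
        return f"{part1} {part2}"

    print(case_name(actual=1, predicted=0))  # False Negative
    print(case_name(actual=0, predicted=0))  # True Negative
    ```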






    answered Mar 21 at 10:44
    Francesco Pegoraro
































      True means correct; False means incorrect.

      True Positive (TP): the model predicts P, which is correct.

      False Positive (FP): the model predicts P, which is incorrect; it should have predicted N.

      True Negative (TN): the model predicts N, which is correct.

      False Negative (FN): the model predicts N, which is incorrect; it should have predicted P.
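      Once the four cells are distinguished this way, the common metrics fall out directly (the counts below are made-up example numbers):

      ```python
      # Deriving standard metrics from the four counts (example numbers are made up).
      tp, fp, tn, fn = 40, 10, 45, 5

      accuracy  = (tp + tn) / (tp + fp + tn + fn)  # fraction of all predictions that were correct
      precision = tp / (tp + fp)                   # of the predicted P, how many were actually P
      recall    = tp / (tp + fn)                   # of the actual P, how many were found

      print(round(accuracy, 2), round(precision, 2), round(recall, 2))  # 0.85 0.8 0.89
      ```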


















        4 Answers
        4






        active

        oldest

        votes








        4 Answers
        4






        active

        oldest

        votes









        active

        oldest

        votes






        active

        oldest

        votes









        3












        $begingroup$

        A confusion matrix is a table that is often used to describe the performance of a classification model. The figure you have provided presents a binary case, but it is also used with more than 2 classes (there are just more rows/columns).



        The rows refer to the actual Ground-Truth label/class of the input and the columns refer to the prediction provided by the model.



        The name of the different cases are taken from the predictor's point of view.



        True/False means that the prediction is the same as the ground truth and Negative/Positive refers to what was the prediction.



        The 4 different cases in the confusion matrix:



        True Positive (TP): The model's prediction is "Positive" and it is the same as the actual ground-truth class, which is "Positive", so this is a True Positive case.



        False Negative (FN): The model's prediction is "Negative" and it is wrong because the actual ground-truth class is "Positive", so this is a False Negative case.



        False Positive (FP): The model's prediction is "Positive" and it is wrong because the actual ground-truth class is "Negative", so this is a False Positive case.



        True Negative (TN): The model's prediction is "Negative" and it is the same as the actual ground-truth class, which is "Negative", so this is a True Negative case.






        share|improve this answer









        $endgroup$








        • 2




          $begingroup$
          Thanks a lot! It's all clear now :)
          $endgroup$
          – Tauno Tanilas
          Mar 21 at 11:04
















        3












        $begingroup$

        A confusion matrix is a table that is often used to describe the performance of a classification model. The figure you have provided presents a binary case, but it is also used with more than 2 classes (there are just more rows/columns).



        The rows refer to the actual Ground-Truth label/class of the input and the columns refer to the prediction provided by the model.



        The name of the different cases are taken from the predictor's point of view.



        True/False means that the prediction is the same as the ground truth and Negative/Positive refers to what was the prediction.



        The 4 different cases in the confusion matrix:



        True Positive (TP): The model's prediction is "Positive" and it is the same as the actual ground-truth class, which is "Positive", so this is a True Positive case.



        False Negative (FN): The model's prediction is "Negative" and it is wrong because the actual ground-truth class is "Positive", so this is a False Negative case.



        False Positive (FP): The model's prediction is "Positive" and it is wrong because the actual ground-truth class is "Negative", so this is a False Positive case.



        True Negative (TN): The model's prediction is "Negative" and it is the same as the actual ground-truth class, which is "Negative", so this is a True Negative case.






        share|improve this answer









        $endgroup$








        • 2




          $begingroup$
          Thanks a lot! It's all clear now :)
          $endgroup$
          – Tauno Tanilas
          Mar 21 at 11:04














        3












        3








        3





        $begingroup$

        A confusion matrix is a table that is often used to describe the performance of a classification model. The figure you have provided presents a binary case, but it is also used with more than 2 classes (there are just more rows/columns).



        The rows refer to the actual Ground-Truth label/class of the input and the columns refer to the prediction provided by the model.



        The name of the different cases are taken from the predictor's point of view.



        True/False means that the prediction is the same as the ground truth and Negative/Positive refers to what was the prediction.



        The 4 different cases in the confusion matrix:



        True Positive (TP): The model's prediction is "Positive" and it is the same as the actual ground-truth class, which is "Positive", so this is a True Positive case.



        False Negative (FN): The model's prediction is "Negative" and it is wrong because the actual ground-truth class is "Positive", so this is a False Negative case.



        False Positive (FP): The model's prediction is "Positive" and it is wrong because the actual ground-truth class is "Negative", so this is a False Positive case.



        True Negative (TN): The model's prediction is "Negative" and it is the same as the actual ground-truth class, which is "Negative", so this is a True Negative case.






        share|improve this answer









        $endgroup$



        A confusion matrix is a table that is often used to describe the performance of a classification model. The figure you have provided presents a binary case, but it is also used with more than 2 classes (there are just more rows/columns).



        The rows refer to the actual Ground-Truth label/class of the input and the columns refer to the prediction provided by the model.



        The name of the different cases are taken from the predictor's point of view.



        True/False means that the prediction is the same as the ground truth and Negative/Positive refers to what was the prediction.



        The 4 different cases in the confusion matrix:



        True Positive (TP): The model's prediction is "Positive" and it is the same as the actual ground-truth class, which is "Positive", so this is a True Positive case.



        False Negative (FN): The model's prediction is "Negative" and it is wrong because the actual ground-truth class is "Positive", so this is a False Negative case.



        False Positive (FP): The model's prediction is "Positive" and it is wrong because the actual ground-truth class is "Negative", so this is a False Positive case.



        True Negative (TN): The model's prediction is "Negative" and it is the same as the actual ground-truth class, which is "Negative", so this is a True Negative case.







        share|improve this answer












        share|improve this answer



        share|improve this answer










        answered Mar 21 at 10:02









        Mark.FMark.F

        1,0191421




        1,0191421







        • 2




          $begingroup$
          Thanks a lot! It's all clear now :)
          $endgroup$
          – Tauno Tanilas
          Mar 21 at 11:04













        • 2




          $begingroup$
          Thanks a lot! It's all clear now :)
          $endgroup$
          – Tauno Tanilas
          Mar 21 at 11:04








        2




        2




        $begingroup$
        Thanks a lot! It's all clear now :)
        $endgroup$
        – Tauno Tanilas
        Mar 21 at 11:04





        $begingroup$
        Thanks a lot! It's all clear now :)
        $endgroup$
        – Tauno Tanilas
        Mar 21 at 11:04












        1












        $begingroup$

        Please find the below:



        • False Negative (FN): prediction is NEGATIVE, actual outcome is POSITIVE, result is 'False Negative' - Why is that? Shouldn't it be 'False Positive'?

          Answer : The predictive model supposed to give the answer as 'Positive', but it predicted as 'Negative', which means Falsely predicted as Negative aka False Negative.


        • False Positive (FP): prediction is POSITIVE, actual outcome is NEGATIVE, result is 'False Positive' - Why is that? Shouldn't it be 'True Negative'?

          Answer : The predictive model supposed to give the answer as 'Negative', but it predicted as 'Positive', which means Falsely predicted as Positive aka False Positive.


        • True Negative (TN): prediction is NEGATIVE, actual outcome is NEGATIVE, result is 'True Negative' - Why is that? Shouldn't it be 'False Negative'?

          Answer : The predicted output supposed to be Negative, and model also predicted as Negative.


        For better understanding, you can run a simple binary classfication model and analyze the confusion matrix.



        Thank you,
        KK






        share|improve this answer









        $endgroup$

















          1












          $begingroup$

          Please find the below:



          • False Negative (FN): prediction is NEGATIVE, actual outcome is POSITIVE, result is 'False Negative' - Why is that? Shouldn't it be 'False Positive'?

            Answer : The predictive model supposed to give the answer as 'Positive', but it predicted as 'Negative', which means Falsely predicted as Negative aka False Negative.


          • False Positive (FP): prediction is POSITIVE, actual outcome is NEGATIVE, result is 'False Positive' - Why is that? Shouldn't it be 'True Negative'?

            Answer : The predictive model supposed to give the answer as 'Negative', but it predicted as 'Positive', which means Falsely predicted as Positive aka False Positive.


          • True Negative (TN): prediction is NEGATIVE, actual outcome is NEGATIVE, result is 'True Negative' - Why is that? Shouldn't it be 'False Negative'?

            Answer : The predicted output supposed to be Negative, and model also predicted as Negative.


          For better understanding, you can run a simple binary classfication model and analyze the confusion matrix.



          Thank you,
          KK






          share|improve this answer









          $endgroup$















            1












            1








            1





            $begingroup$

            Please find the below:



            • False Negative (FN): prediction is NEGATIVE, actual outcome is POSITIVE, result is 'False Negative' - Why is that? Shouldn't it be 'False Positive'?

              Answer : The predictive model supposed to give the answer as 'Positive', but it predicted as 'Negative', which means Falsely predicted as Negative aka False Negative.


            • False Positive (FP): prediction is POSITIVE, actual outcome is NEGATIVE, result is 'False Positive' - Why is that? Shouldn't it be 'True Negative'?

              Answer : The predictive model supposed to give the answer as 'Negative', but it predicted as 'Positive', which means Falsely predicted as Positive aka False Positive.


            • True Negative (TN): prediction is NEGATIVE, actual outcome is NEGATIVE, result is 'True Negative' - Why is that? Shouldn't it be 'False Negative'?

              Answer : The predicted output supposed to be Negative, and model also predicted as Negative.


            For better understanding, you can run a simple binary classfication model and analyze the confusion matrix.



            Thank you,
            KK






            share|improve this answer









            $endgroup$



            Please find the below:



            • False Negative (FN): prediction is NEGATIVE, actual outcome is POSITIVE, result is 'False Negative' - Why is that? Shouldn't it be 'False Positive'?

              Answer : The predictive model supposed to give the answer as 'Positive', but it predicted as 'Negative', which means Falsely predicted as Negative aka False Negative.


            • False Positive (FP): prediction is POSITIVE, actual outcome is NEGATIVE, result is 'False Positive' - Why is that? Shouldn't it be 'True Negative'?

              Answer : The predictive model supposed to give the answer as 'Negative', but it predicted as 'Positive', which means Falsely predicted as Positive aka False Positive.


            • True Negative (TN): prediction is NEGATIVE, actual outcome is NEGATIVE, result is 'True Negative' - Why is that? Shouldn't it be 'False Negative'?

              Answer : The predicted output supposed to be Negative, and model also predicted as Negative.


            For better understanding, you can run a simple binary classfication model and analyze the confusion matrix.



            Thank you,
            KK







            share|improve this answer












            share|improve this answer



            share|improve this answer










            answered Mar 21 at 10:07









            KK2491KK2491

            345220




            345220





















                1












                $begingroup$

                Seems like you understand the meaning of the confusion matrix, nut not the logic used to name its entries!



                Here are my 5 cents:



                The names are all of this kind:



                <True/False> <Positive/Negative>
                | |
                Part1 Part2


                1. The first part explains if the prediction was right or not. If you have only True Positive and True Negative your model is perfect. If you have only False Positive and False Negative your model is really bad.


                2. The second part explains the prediction of the model.


                So:



                • False Negative (FN): the prediction is NEGATIVE (0) but the first part is False, this means that the prediction is wrong (should have been POSITIVE (1)).


                • False Positive (FP): the prediction is POSITIVE (1) but the first part is False, this means that the prediction is wrong (should have been NEGATIVE (0)).


                • True Negative (TN): prediction is NEGATIVE and the first part is True. The prediction is right (model predicted NEGATIVE, for NEGATIVE samples)






                share|improve this answer









                $endgroup$

















                  1












                  $begingroup$

                  Seems like you understand the meaning of the confusion matrix, nut not the logic used to name its entries!



                  Here are my 5 cents:



                  The names are all of this kind:



                  <True/False> <Positive/Negative>
                  | |
                  Part1 Part2


                  1. The first part explains if the prediction was right or not. If you have only True Positive and True Negative your model is perfect. If you have only False Positive and False Negative your model is really bad.


                  2. The second part explains the prediction of the model.


                  So:



                  • False Negative (FN): the prediction is NEGATIVE (0) but the first part is False, this means that the prediction is wrong (should have been POSITIVE (1)).


                  • False Positive (FP): the prediction is POSITIVE (1) but the first part is False, this means that the prediction is wrong (should have been NEGATIVE (0)).


                  • True Negative (TN): prediction is NEGATIVE and the first part is True. The prediction is right (model predicted NEGATIVE, for NEGATIVE samples)






                  share|improve this answer









                  $endgroup$















                    1












                    1








                    1





                    $begingroup$

                    Seems like you understand the meaning of the confusion matrix, nut not the logic used to name its entries!



                    Here are my 5 cents:



                    The names are all of this kind:



                    <True/False> <Positive/Negative>
                    | |
                    Part1 Part2


                    1. The first part explains if the prediction was right or not. If you have only True Positive and True Negative your model is perfect. If you have only False Positive and False Negative your model is really bad.


                    2. The second part explains the prediction of the model.


                    So:



                    • False Negative (FN): the prediction is NEGATIVE (0) but the first part is False, this means that the prediction is wrong (should have been POSITIVE (1)).


                    • False Positive (FP): the prediction is POSITIVE (1) but the first part is False, this means that the prediction is wrong (should have been NEGATIVE (0)).


                    • True Negative (TN): prediction is NEGATIVE and the first part is True. The prediction is right (model predicted NEGATIVE, for NEGATIVE samples)






                    answered Mar 21 at 10:44









                    Francesco Pegoraro





















                        True means Correct, False means Incorrect.




                        True Positive (TP): Model predicts P, which is Correct.


                        False Positive (FP): Model predicts P, which is Incorrect; it should have predicted N.


                        True Negative (TN): Model predicts N, which is Correct.


                        False Negative (FN): Model predicts N, which is Incorrect; it should have predicted P.
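Counting these four outcomes over a set of labels gives the confusion matrix itself. A minimal sketch (the helper name `confusion_counts` is made up for illustration):

```python
# Count TP/FP/TN/FN over paired true labels and predictions,
# following the True = Correct / False = Incorrect reading above.
def confusion_counts(y_true, y_pred):
    tp = fp = tn = fn = 0
    for t, p in zip(y_true, y_pred):
        if p == 1:
            tp += t == 1  # predicted P, correctly
            fp += t == 0  # predicted P, incorrectly
        else:
            tn += t == 0  # predicted N, correctly
            fn += t == 1  # predicted N, incorrectly
    return tp, fp, tn, fn

print(confusion_counts([1, 1, 0, 0, 1], [1, 0, 0, 1, 1]))  # (2, 1, 1, 1)
```

For binary 0/1 labels, `sklearn.metrics.confusion_matrix` returns the same counts arranged as `[[TN, FP], [FN, TP]]`.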






                            answered Mar 21 at 15:53









                            Esmailian




















                                Tauno Tanilas is a new contributor. Be nice, and check out our Code of Conduct.








