Certainty of a classifier





How can I build a classifier that predicts class 1 by default, but assigns class 0 only when it is at least 80% certain that the sample belongs to class 0? And how can I check how certain a classifier is about its prediction?










Tags: python, classifier






asked Apr 4 at 15:32 by Oman
edited Apr 4 at 15:43 by pcko1







Why don't you use a classifier that can export probabilities (like a Decision Tree) and make the prediction manually from there? If the probability of class 0 is > 0.8, return 0, else return 1. – Tasos, Apr 4 at 15:42














2 Answers
Many classifiers give you the option to get predicted probabilities, and then you can simply apply a threshold. Here is how it can be done with sklearn:



from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

# Make a dataset
X, y = make_classification(n_samples=1000, n_features=4,
                           n_informative=2, n_redundant=0,
                           random_state=0, shuffle=False)

clf = RandomForestClassifier(n_estimators=100, max_depth=2,
                             random_state=0)
clf.fit(X, y)

# Predict class 0 only when P(class 0) > 0.8, otherwise predict class 1
predictions = 1 - (clf.predict_proba(X)[:, 0] > 0.80)






answered Apr 4 at 15:42 by Simon Larsson, edited Apr 4 at 15:55
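For the second part of the question (how certain the classifier is about an individual prediction), the same predict_proba output can be inspected per sample. A minimal illustration using the clf fitted above; the printed probabilities are only placeholders, and the column order follows clf.classes_:

# Per-class certainty for a single sample (output values are illustrative)
sample = X[:1]
print(clf.classes_)               # [0 1]
print(clf.predict_proba(sample))  # e.g. [[0.85 0.15]] -> 85% certain it is class 0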





















You can build a neural network with softmax activation on the output layer, which gives you a probability in the range [0, 1] for each class. You can then post-process those predictions however you like, e.g. using a threshold of 0.8 for the binary decision between class 0 and class 1.







answered Apr 4 at 15:41 by pcko1
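As a rough sketch of this idea (assuming TensorFlow/Keras is available; the layer sizes and training settings below are arbitrary illustrations, not prescribed by the answer):

import numpy as np
from sklearn.datasets import make_classification
from tensorflow import keras

# Toy data, as in the other answer
X, y = make_classification(n_samples=1000, n_features=4,
                           n_informative=2, n_redundant=0,
                           random_state=0, shuffle=False)

# Small network whose softmax output gives one probability per class
model = keras.Sequential([
    keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=20, verbose=0)

proba = model.predict(X)                          # shape (n_samples, 2), rows sum to 1
predictions = np.where(proba[:, 0] > 0.80, 0, 1)  # class 0 only when > 80% certain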


























