When does fitting happen in KNN?






During training, model fitting usually happens in order to reduce error. But does KNN do this?

In KNN, error is reduced only by changing the value of K and the number of features, isn't it?

So the training and test sets are used only for the steps below, right?

  1. Train the model with the training set.

  2. Given a test sample, the model finds the K nearest neighbors in the training set.

  3. Perform classification or regression on the test sample.

  4. Measure accuracy with MSE or RMSE.
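For concreteness, here is how those four steps map onto scikit-learn (a sketch with made-up toy data, assuming `KNeighborsClassifier` is the estimator in question):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import mean_squared_error

# Toy data: two well-separated classes
X = np.array([[1, 2], [2, 3], [3, 1], [2, 2],
              [6, 5], [7, 7], [8, 6], [7, 5]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

# Step 1: "train" -- for KNN, fit() essentially just stores the data
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

# Steps 2-3: neighbors are found and the class assigned at prediction time
pred = knn.predict(X_test)

# Step 4: an error metric -- RMSE as in the question
# (for classification, accuracy would be more usual)
rmse = np.sqrt(mean_squared_error(y_test, pred))
```

Note that `fit` returns almost instantly here: no loss is minimized, which is exactly what the answer below addresses.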





















machine-learning-model










asked Mar 27 at 20:29 by Jinwoo Lee




















1 Answer

          Short version: this is just terminology, but arguably K-NN doesn't actually fit a model.



          Conceptually, K-NN is a lazy learner. This means that, in a sense, no fitting happens until a new instance arrives to be classified or a value has to be predicted (depending on whether you're using K-NN for classification or regression; both are possible). Even then, using the term "fitting the model" is a bit out of place, in my opinion.
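A minimal from-scratch sketch (with a hypothetical class name, not from any library) makes the laziness concrete: `fit` only memorizes the data, and all the distance computation is deferred to `predict`:

```python
import numpy as np
from collections import Counter

class LazyKNN:
    """k-NN classifier: 'fitting' is just memorizing the training data."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # No optimization, no loss minimization: store and return.
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y)
        return self

    def predict(self, X):
        # All the real work happens here, at query time.
        preds = []
        for x in np.asarray(X, dtype=float):
            dists = np.linalg.norm(self.X - x, axis=1)
            nearest = self.y[np.argsort(dists)[:self.k]]
            preds.append(Counter(nearest).most_common(1)[0][0])
        return np.array(preds)

model = LazyKNN(k=3).fit([[0, 0], [1, 0], [0, 1], [5, 5], [6, 5], [5, 6]],
                         [0, 0, 0, 1, 1, 1])
print(model.predict([[0.5, 0.5], [5.5, 5.5]]))  # → [0 1]
```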



          I think your confusion comes from the sentence "model fitting happens to reduce error". No, it does not. Model fitting is simply taking a model (for example, a family of data distributions) and finding the parameters that best describe the data (thus choosing one member from that family).



          Maybe you're thinking of a neural network's epochs as "fitting"? What happens there is that there is a family of possible models (all the possible values for each weight in the network), and fitting the model is just finding the best possible values. The fact that neural networks do so iteratively does not mean that fitting is inherently an iterative process; fitting is the end result.




          So the training and test sets are used only for the steps below, right?




          Well, that's true for every model, yes. Separating training and test sets serves the sole purpose of evaluating the model on the chosen metrics. At the end of the day, the final model you choose to deploy, be it lazy or eager, will use all the available data (or a portion/transformation of it, in some cases), not just the training set.
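That last point can be sketched like this (assuming scikit-learn; toy data and the candidate k values are illustrative): use held-out folds only to estimate generalization error and pick k, then refit the chosen model on all the available data.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Toy data: two clusters, 6 samples per class
X = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 0], [0, 0.5],
              [5, 5], [6, 5], [5, 6], [6, 6], [5.5, 5], [5, 5.5]])
y = np.array([0] * 6 + [1] * 6)

# Model selection: estimate generalization error on held-out folds
scores = {k: cross_val_score(KNeighborsClassifier(n_neighbors=k),
                             X, y, cv=3).mean()
          for k in (1, 3, 5)}
best_k = max(scores, key=scores.get)

# Final model for deployment: refit on ALL the available data
final_model = KNeighborsClassifier(n_neighbors=best_k).fit(X, y)
```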




















answered Mar 27 at 21:32 by Mephy; edited Mar 27 at 21:39 by Esmailian


























