SVM radial basis: generate equation for hyperplane

I would be very grateful for some help with generating the equation of an SVM hyperplane. I have two independent variables and one binary dependent variable.

The SVM decision function is

$f(x)=\operatorname{sgn}\left(\sum_i y_i \alpha_i K(\mathrm{sv}_i, x) + b\right)$

I have two independent variables (say P and Q) with 130 observations each. I fitted an RBF-kernel SVM for binary classification (0 and 1) using scikit-learn's SVC:

https://scikit-learn.org/stable/modules/svm.html

From the fitted model I now have one column of 51 dual coefficients ($y_i \alpha_i$), two columns of 51 support vectors (one for P, one for Q), and a single value for $b$.

So, how can I generate the equation now? Can I multiply the 51 dual coefficients with the 51 support-vector values for each variable, so that I end up with two coefficients and the equation $f(x)=\operatorname{sgn}(mP + nQ + b)$, where $m$ is the sum of the products of the 51 support-vector values of P with the 51 dual coefficients, and $n$ is the analogous sum for Q? I would be grateful for any suggestions. Many thanks in advance.
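For reference, here is a minimal sketch of how the pieces returned by scikit-learn's SVC fit together, using toy data as a stand-in for P and Q (the data and gamma value are hypothetical, not from the question). The hand-built decision function is checked against the library's own:

```python
import numpy as np
from sklearn.svm import SVC

# Toy stand-in for the 130 observations of P and Q (hypothetical data).
rng = np.random.RandomState(0)
X = rng.randn(130, 2)                    # columns play the roles of P and Q
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # binary target

gamma = 0.5                              # fixed so the kernel is explicit
clf = SVC(kernel="rbf", gamma=gamma).fit(X, y)

sv = clf.support_vectors_                # analogue of the 51 support vectors
dual = clf.dual_coef_.ravel()            # y_i * alpha_i for each support vector
b = clf.intercept_[0]                    # the single value b

# Rebuild f(x) = sum_i y_i alpha_i K(sv_i, x) + b by hand,
# with the RBF kernel K(u, v) = exp(-gamma * ||u - v||^2).
x_new = X[:5]
sq_dists = ((x_new[:, None, :] - sv[None, :, :]) ** 2).sum(axis=-1)
f = np.exp(-gamma * sq_dists) @ dual + b

# Agrees with scikit-learn's own decision function.
assert np.allclose(f, clf.decision_function(x_new))
```

Note that the sum runs over kernel evaluations, not over raw feature values, which is why it does not collapse into per-variable coefficients.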
      machine-learning python scikit-learn svm machine-learning-model
asked Feb 25 at 17:31 by Alejandro
2 Answers
I'm not sure I've fully understood you. The radial basis kernel implicitly transforms your features into an infinite-dimensional space, and the dot product of the transformed vectors is exactly the radial basis kernel:

$k(x,y)=\phi(x)\cdot \phi(y)$

where $\phi(x)$ is the feature mapping.

The main reason for using the kernel trick is the ability to work with features in a higher-dimensional space without knowing the mapping explicitly. As a consequence, your hyperplane has an infinite number of coefficients. You can, however, expand the radial basis kernel into a Taylor series and recover some of the leading coefficients.

answered Feb 25 at 20:47, edited Feb 25 at 20:52, by Michał Kardach
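To make the Taylor-series remark concrete: the RBF kernel factors as $k(x,y)=e^{-\gamma\|x\|^2}\,e^{-\gamma\|y\|^2}\,e^{2\gamma\, x\cdot y}$, and only the last factor needs expanding; each term of its series corresponds to a finite block of explicit polynomial features. A small numerical check of the factorisation (the points and gamma are arbitrary example values):

```python
import math
import numpy as np

# Arbitrary example points and gamma (hypothetical values).
g = 0.5
x = np.array([0.3, -0.7])
y = np.array([1.1, 0.4])

# Exact RBF kernel value.
exact = math.exp(-g * np.sum((x - y) ** 2))

# Factorised form: exp(-g||x||^2) * exp(-g||y||^2) * exp(2g x.y),
# with the last factor replaced by its truncated Taylor series.
prefix = math.exp(-g * (x @ x)) * math.exp(-g * (y @ y))
dot = float(x @ y)
series = sum((2 * g * dot) ** n / math.factorial(n) for n in range(10))
approx = prefix * series

# Ten terms already match the exact kernel to high precision here.
assert abs(exact - approx) < 1e-10
```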
• Let's say I have two independent variables (P and Q) and a binary variable C. I use logistic regression to calculate individual coefficients of P and Q (m, n) plus a constant b. The generalized linear model equation is then mP + nQ + b, and I can use it to calculate probabilities. Similarly, if I use a support vector machine, how do I get this kind of generalized linear model equation? I have used scikit-learn in Python and also R; all I get is the total number of support vectors, their values, and the values of $\alpha_i x_i$.
  – Alejandro, Feb 26 at 6:02
Let me explain my question in terms of logistic regression. Suppose I am using logistic regression to predict probabilities, with two independent variables (P and Q) and a binary dependent variable C. Logistic regression gives individual coefficients for P and Q (say m and n respectively) plus a constant (say b). The linear predictor of the generalized linear model is then mP + nQ + b, and I can use this equation to calculate probabilities.

Similarly, if I am using a support vector machine, how do I obtain this kind of generalized linear model equation? I have used scikit-learn in Python and also R; all I get is the total number of support vectors, their values, and the values of $\alpha_i x_i$. I need to assign an individual weight to each of the two variables P and Q, plus a bias b, so that I can use the result as a generalized linear model. I get the constant term from Python, but how am I supposed to generate the coefficients of P and Q with an RBF-kernel SVM? Is there some way I could assign weights to my two variables and create such a linear function? I would be very grateful for an explanation.

answered Feb 26 at 5:52 by Alejandro
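A practical note, not taken from the answers above: the collapse into per-variable weights asked for here does work for a *linear* kernel, where scikit-learn exposes the result as `coef_`; for the RBF kernel no finite pair (m, n) exists, as the accepted answer explains. A sketch with hypothetical toy data:

```python
import numpy as np
from sklearn.svm import SVC

# Toy stand-in for P, Q and the binary C (hypothetical data).
rng = np.random.RandomState(0)
X = rng.randn(130, 2)
y = (X[:, 0] - 0.5 * X[:, 1] > 0).astype(int)

clf = SVC(kernel="linear").fit(X, y)

# With a linear kernel, sum_i y_i alpha_i (sv_i . x) collapses to w . x,
# i.e. w = dual_coef_ @ support_vectors_, giving f(x) = sgn(m*P + n*Q + b).
w = clf.dual_coef_ @ clf.support_vectors_
m, n = w.ravel()
b = clf.intercept_[0]

# scikit-learn exposes the same weights as coef_ for linear kernels only;
# for kernel="rbf" the coef_ attribute is not available.
assert np.allclose(w, clf.coef_)
```

So if an explicit generalized-linear-model-style equation is required, one option is to fit a linear-kernel SVM (or logistic regression) and accept the loss of the RBF kernel's nonlinearity.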