SVM radial basis generate equation for hyperplane

I would be grateful for some help with generating a hyperplane equation. I have two independent variables and one binary dependent variable, and I need an explicit equation for the separating hyperplane.

I am starting from the standard SVM decision function:

$f(x) = \mathrm{sgn}\left( \sum_i \alpha_i K(sv_i, x) + b \right)$

My two independent variables (say P and Q) each have 130 observations. I fitted an SVM with a radial basis kernel for binary classification (0 and 1) using scikit-learn's SVC. From the fitted model I have one column of 51 values $y_i \alpha_i$ (the dual coefficients), two columns of 51 support vectors (one column for P, one for Q), and a single value for $b$.

https://scikit-learn.org/stable/modules/svm.html

How can I generate the equation from these? Can I multiply the 51 dual coefficients by the 51 support-vector values for each of P and Q, so that I end up with one coefficient per variable and an equation of the form $f(x) = \mathrm{sgn}(mP + nQ + b)$, where $m$ is the sum of the products of the 51 support-vector values of P with the 51 dual coefficients, and $n$ is the analogous sum for Q?

I would be grateful for any kind of suggestion. Many thanks in advance.
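To make the decision function concrete, here is a minimal sketch (editor's addition, synthetic data, not from the original post) that rebuilds the RBF decision function by hand from a fitted SVC's `dual_coef_`, `support_vectors_`, and `intercept_`, and checks it against the built-in `decision_function`:

```python
# Reconstruct f(x) = sum_i (y_i alpha_i) K(sv_i, x) + b for an RBF SVC
# and compare with scikit-learn's own decision_function().
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = rng.randn(130, 2)                      # 130 observations of P and Q (synthetic)
y = (X[:, 0] + X[:, 1] > 0).astype(int)    # binary target

gamma = 0.5
clf = SVC(kernel="rbf", gamma=gamma).fit(X, y)

def decision(x):
    """Manual decision value: sum_i (y_i alpha_i) K(sv_i, x) + b."""
    sq_dists = np.sum((clf.support_vectors_ - x) ** 2, axis=1)
    k = np.exp(-gamma * sq_dists)          # K(sv_i, x) for every support vector
    return float(k @ clf.dual_coef_[0] + clf.intercept_[0])

x_new = np.array([0.3, -0.1])
manual = decision(x_new)
builtin = float(clf.decision_function(x_new[None, :])[0])
assert np.isclose(manual, builtin)
```

Note how, for an RBF kernel, the sum cannot be collapsed into one coefficient per variable: each support vector enters through $K(sv_i, x)$, which depends nonlinearly on $x$.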










Tags: machine-learning, python, scikit-learn, svm, machine-learning-model






asked Feb 25 at 17:31 by Alejandro

2 Answers






I'm not sure I've fully understood you. The radial basis kernel implicitly transforms your features into an infinite-dimensional space, and the dot product of the transformed vectors is exactly the radial basis kernel:

$k(x,y) = \phi(x) \cdot \phi(y)$

where $\phi(x)$ is the feature mapping.

The main reason for using the kernel trick is that it lets you work in that higher-dimensional space without knowing the mapping explicitly. The price is that your hyperplane has an infinite number of coefficients, so there is no finite equation of the form you describe. You can, however, expand the radial basis kernel into its Taylor series and recover some of the leading coefficients.

answered Feb 25 at 20:47, edited Feb 25 at 20:52, by Michał Kardach






• Let's say I have two independent variables (P and Q) and a binary variable C. With logistic regression I can calculate individual coefficients for P and Q (m and n) plus a constant b; the generalized-linear-model equation is then mP + nQ + b, which I can use to calculate probabilities. Similarly, if I use a support vector machine, how do I get that kind of generalized linear model equation? Using scikit-learn in Python (and also R), all I get is the total number of support vectors, their values, and the value of $\alpha_i x_i$.
  – Alejandro, Feb 26 at 6:02
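One practical route related to the Taylor-expansion point in the answer above (editor's suggestion, not from the original answer): approximate the infinite-dimensional RBF feature map by a finite random feature map, then fit a linear SVM in that space. The resulting weight vector is explicit and finite, though its coefficients weight random features rather than P and Q directly. Synthetic data for illustration:

```python
# Finite-dimensional approximation of the RBF feature map via random
# Fourier features, followed by a linear SVM with an explicit coef_ vector.
import numpy as np
from sklearn.kernel_approximation import RBFSampler
from sklearn.svm import LinearSVC

rng = np.random.RandomState(0)
X = rng.randn(130, 2)                      # two variables, like P and Q (synthetic)
y = (X[:, 0] + X[:, 1] > 0).astype(int)

feature_map = RBFSampler(gamma=0.5, n_components=200, random_state=0)
Z = feature_map.fit_transform(X)           # shape (130, 200)
lin = LinearSVC(max_iter=10000).fit(Z, y)

# Explicit, finite weight vector: one coefficient per random feature.
print(lin.coef_.shape)                     # (1, 200)
```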


















Let me explain my question in terms of logistic regression. Suppose I use logistic regression to predict probabilities, with two independent variables (P and Q) and a binary dependent variable C. Logistic regression gives me individual coefficients for P and Q (say m and n respectively) plus a constant (say b), and the fundamental generalized-linear-model equation is mP + nQ + b, which I can then use to calculate probabilities.

Similarly, if I am using a support vector machine, how do I get that kind of generalized linear model equation? Using scikit-learn in Python (and also R), all I get is the total number of support vectors, their values, and the value of $\alpha_i x_i$. I need to assign an individual weight to each of the two variables P and Q, plus a bias b, so that I can use the equation as a generalized linear model. I am getting the constant term from Python, but how am I supposed to generate the coefficients of P and Q with an SVM radial kernel? Is there some way to assign weights to my two variables and create a linear function I can use? I would be very grateful for an explanation.

answered Feb 26 at 5:52 by Alejandro
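For context (editor's addition): the summation proposed in the original question, $m = \sum_i (y_i \alpha_i) \, sv_i$, is exactly how per-variable weights arise for a *linear* kernel; scikit-learn's SVC exposes the result as `coef_` in that case. It is only the RBF kernel that prevents this collapse. A quick check on synthetic data:

```python
# For a linear kernel, coef_ equals the sum over support vectors of
# (dual coefficient * support vector), i.e. the m, n of the question.
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = rng.randn(130, 2)                      # P and Q (synthetic)
y = (X[:, 0] + X[:, 1] > 0).astype(int)

clf = SVC(kernel="linear").fit(X, y)

w = clf.dual_coef_ @ clf.support_vectors_  # shape (1, 2): weights for P, Q
assert np.allclose(w, clf.coef_)           # identical to scikit-learn's coef_
```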





