Data Visualization with multiple dimension, and linear separability



I have a dataset of two classes with several features. How can I visualise this data in Matlab in order to decide whether it is linearly separable or not?










Tags: dataset, visualization, matlab, plotting






asked Mar 23 at 20:41 by gin




















          2 Answers


















Answer (score 5)
You can use a t-SNE plot: t-SNE embeds each high-dimensional feature vector (your several features) into a 2-D point, which you can then plot with matplotlib, passing the class label of each sample so that the two classes get different colours. Once plotted, you can judge visually whether the classes look separable.



In Python, the code goes as follows:



import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

# Embed the high-dimensional features into two dimensions
tsne = TSNE(n_components=2, random_state=0)
transformed_data = tsne.fit_transform(features)  # features: (n_samples, n_features)

k = np.array(transformed_data)
# Colour each 2-D point by its class label
plt.scatter(k[:, 0], k[:, 1], c=class_labels, zorder=10, s=0.4)
plt.show()


Running this produces a 2-D scatter plot of the embedded points, coloured by class (image omitted).
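The snippet assumes `features` and `class_labels` already exist in the workspace. A minimal, self-contained way to try it end-to-end on synthetic two-class data (the `make_classification` call and its parameters are my illustration, not part of the original answer):

```python
from sklearn.datasets import make_classification
from sklearn.manifold import TSNE

# Hypothetical stand-ins for the answer's `features` / `class_labels`:
# 200 samples with 10 features, split into two classes
features, class_labels = make_classification(
    n_samples=200, n_features=10, n_informative=5,
    n_classes=2, random_state=0)

tsne = TSNE(n_components=2, random_state=0)
transformed_data = tsne.fit_transform(features)
print(transformed_data.shape)  # (200, 2): one 2-D point per sample
```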



Since you're looking for MATLAB code: MATLAB provides t-SNE through the built-in tsne(X) function (see its documentation), and the equivalent scatter plot is just as easy with MATLAB's scatter; the documentation is extensive.



answered Mar 24 at 0:25 by William Scott; edited Mar 24 at 8:24 by Pedro Henrique Monforte




















Answer (score 4)
When class labels are known, you can use Linear Discriminant Analysis (LDA) for visualization to see whether the classes are linearly separable. LDA is similar to PCA but supervised: it projects the data in a way that maximizes the separation between classes (how-to posts exist for Matlab, R, and Python, and there is a multi-class LDA implementation for Matlab).
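As a sketch of that idea with scikit-learn (the synthetic data and parameter choices are mine, for illustration): note that with two classes, LDA has at most one discriminant direction, so the "visualization" is a 1-D projection, e.g. two overlaid histograms.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic two-class data: 300 samples, 10 features
X, y = make_classification(n_samples=300, n_features=10,
                           n_informative=5, n_classes=2, random_state=0)

# For c classes LDA yields at most c-1 components; two classes -> a 1-D projection
lda = LinearDiscriminantAnalysis(n_components=1)
X_1d = lda.fit_transform(X, y)
print(X_1d.shape)  # (300, 1)
```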





Also, we can get a sense of linear separability without any visualization. To this end, train and test a linear SVM on [a sample of] the data: if the test accuracy (assuming the classes are balanced) is 90%, 95%, 99%, that is an increasingly strong indication that the classes are almost linearly separable.
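A minimal sketch of that check with scikit-learn; the `make_blobs` dataset is a hypothetical stand-in for your own data:

```python
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Two well-separated Gaussian blobs: (almost) linearly separable by construction
X, y = make_blobs(n_samples=400, centers=2, cluster_std=1.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LinearSVC().fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(acc)  # near 1.0 here; a much lower score would suggest non-separability
```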



Please note that we cannot use non-linear dimensionality reduction methods such as Isomap or t-SNE for this check, since they can produce a linearly separated visualization even for classes that are not linearly separable in the original space; hence the name non-linear. The famous Swiss roll example illustrates this (image omitted).
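A quick way to construct such a dataset and confirm the point (my example, using scikit-learn's `make_circles`): concentric circles are not linearly separable in the original 2-D space, yet they are cleanly separable non-linearly, which is exactly the structure a non-linear embedding can pull apart.

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC, LinearSVC

# Two concentric circles: not linearly separable in the original space
X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

linear_acc = LinearSVC().fit(X, y).score(X, y)
rbf_acc = SVC(kernel="rbf").fit(X, y).score(X, y)
print(linear_acc, rbf_acc)  # the linear SVM stays near 0.5; the RBF kernel is near 1.0
```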








answered Mar 24 at 6:48 by Esmailian; edited Mar 24 at 18:16












