Pulling the principal components out of a DimensionReducerFunction?
$begingroup$
Suppose I perform dimension reduction using PCA:
dr = DimensionReduction[{{1, 2, 3}, {2, 3, 5}, {3, 5, 8}, {4, 5, 8.5}},
  Method -> "PrincipalComponentsAnalysis"]
If I want to see the principal components themselves, in the original space, one thought is to apply the "OriginalData" property of the DimensionReducerFunction to basis vectors in the new space:
In[8]:= dr[{1.0, 0.0}, "OriginalData"]
dr[{0.0, 1.0}, "OriginalData"]
Out[8]= {1.86006, 2.9998, 4.81724}
Out[9]= {3.38701, 3.01026, 5.64163}
Is this a reasonable thing to do, or am I misinterpreting how the "OriginalData" feature works? And is there a better way to pull out the principal components themselves? People often want to visualize these for various reasons.
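As a sanity check on this idea, here is a minimal sketch; it assumes "OriginalData" acts affinely on reduced vectors, so subtracting the reconstruction of the zero vector isolates the linear part. The resulting directions live in the original (unstandardized) coordinates, so they may differ from components of the standardized data by a per-coordinate scaling.
origin = dr[{0., 0.}, "OriginalData"];  (* reconstruction of the reduced-space origin *)
dir1 = dr[{1., 0.}, "OriginalData"] - origin;
dir2 = dr[{0., 1.}, "OriginalData"] - origin;
Normalize /@ {dir1, dir2}  (* candidate principal directions, up to sign and scaling *)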
(There are several other questions about how to solve a similar problem with the PrincipalComponents function; this is a question about a different function.)
machine-learning dimension-reduction
$endgroup$
edited Mar 31 at 22:00 by J. M. is away♦
asked Mar 31 at 17:23 by Michael Curry
1 Answer
$begingroup$
Here's your data:
data = {{1, 2, 3}, {2, 3, 5}, {3, 5, 8}, {4, 5, 8.5}};
dr = DimensionReduction[data, Method -> "PrincipalComponentsAnalysis"];
This is not exactly a top-level solution, but we can pick apart the DimensionReducerFunction and see inside (try dr[[1]] to see many internal properties). Looking further, in there we have a matrix:
Transpose[dr[[1, "Model", "Matrix"]]]
{{-0.572383, -0.577502, -0.582125}, {0.793367, -0.56945, -0.215163}}
I think these are the components. We can try to verify:
Transpose[Last[SingularValueDecomposition[Standardize[data], 2]]]
{{-0.572383, -0.577502, -0.582125}, {0.793367, -0.56945, -0.215163}}
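As a further check (a sketch; the Part path dr[[1, "Model", "Matrix"]] is an internal detail and may change between versions), these rows should be orthonormal, and they should agree with the SVD result up to the usual sign ambiguity of singular vectors:
pcs = Transpose[dr[[1, "Model", "Matrix"]]];
Norm /@ pcs                  (* each component should have unit length *)
Chop[pcs[[1]].pcs[[2]]]      (* the two components should be orthogonal *)
svdV = Transpose[Last[SingularValueDecomposition[Standardize[data], 2]]];
Chop[Abs[pcs] - Abs[svdV]]   (* agreement up to column signs *)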
$endgroup$
edited Mar 31 at 22:56; answered Mar 31 at 18:58 by Chip Hurst
$begingroup$
It doesn't look like you need the pre-multiplication by 2/Sqrt[3]; after all, any such rescaling would show itself in the singular values and not the orthogonal factors. The important thing is the shift, which Standardize[] of course does.
$endgroup$
– J. M. is away♦
Mar 31 at 22:12
$begingroup$
Good catch, I've edited the post. I can't remember what I ran into that led me to such a conclusion.
$endgroup$
– Chip Hurst
Mar 31 at 22:58
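To illustrate the point about the prefactor, a minimal sketch: a uniform positive rescaling of the centered data scales the singular values but leaves the orthogonal factor unchanged (up to column signs), so only the shift matters for recovering the component directions.
centered = # - Mean[data] & /@ data;
v1 = Last[SingularValueDecomposition[centered, 2]];
v2 = Last[SingularValueDecomposition[(2/Sqrt[3]) centered, 2]];
Chop[Abs[v1] - Abs[v2]]  (* zero matrix: the right singular vectors are unaffected *)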