Naive Monte Carlo, MCMC and their use in Bayesian Theory
So let's suppose I have a random variable $X$ which follows a known PDF $f_X(x)$. I can use the naive Monte Carlo method (with unfiltered random sampling) to draw $n$ samples from $f_X(x)$ and obtain an empirical PDF and estimates of its parameters.
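For concreteness, a minimal sketch of this step, assuming for illustration that $X$ follows a Gamma distribution (the particular choice of $f_X$ and of the parameter estimates is mine, purely illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Naive Monte Carlo: draw n i.i.d. samples directly from the known PDF f_X.
# Here f_X is taken to be Gamma(shape=2, scale=1.5) purely as an example.
n = 100_000
x = rng.gamma(shape=2.0, scale=1.5, size=n)

# Empirical PDF via a histogram (a kernel density estimate would also work).
density, edges = np.histogram(x, bins=100, density=True)

# Estimates of the distribution's parameters from the sample.
print("mean ≈", x.mean(), " std ≈", x.std(ddof=1))
print("MLE fit (shape, loc, scale):", stats.gamma.fit(x, floc=0.0))
```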
Now suppose we have three random variables ($X$, $Y$ and $Z$).

First case:

Suppose we know the relationship (function) $M$ between $X$, $Y$ and $Z$, so that $Z = M(X,Y)$. Suppose we also know the marginals of $X$ and $Y$ and the covariance matrix between $X$ and $Y$. In this case we can use the naive Monte Carlo method to get the empirical PDF of $Z$ and estimates of its parameters.
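A sketch of what this case looks like in practice. Note that the marginals plus a covariance matrix do not by themselves uniquely determine the joint of $(X,Y)$, so the code below additionally assumes a Gaussian copula to fix the dependence structure; $M$, the marginals and the correlation are all illustrative choices, not part of the question:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000

# Assumed dependence: Gaussian copula with correlation rho (an extra
# modelling assumption: marginals + covariance alone do not fix the joint).
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])
u, v = stats.norm.cdf(rng.multivariate_normal([0, 0], cov, size=n)).T

# Transform the uniforms to the known marginals (illustrative choices).
x = stats.gamma.ppf(u, a=2.0, scale=1.5)   # marginal of X
y = stats.lognorm.ppf(v, s=0.5)            # marginal of Y

# Known deterministic relationship Z = M(X, Y) (hypothetical example).
def M(x, y):
    return x * np.sqrt(y) + 0.1 * y

z = M(x, y)
density, edges = np.histogram(z, bins=100, density=True)
print("E[Z] ≈", z.mean(), " Var[Z] ≈", z.var(ddof=1))
```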
Second case:

2.A Suppose we only know the marginals of $X$ and $Y$ and how the PDF of $Z$ is related to the PDFs of $X$ and $Y$. In this case we can get the empirical PDF of $Z$ and estimates of its parameters directly (despite not knowing $M$), although I'm not sure how the dependency between $X$ and $Y$ is taken into account...
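If "knowing how the PDF of $Z$ is related to those of $X$ and $Y$" is read as meaning that $p_Z(z)$ can be evaluated pointwise and is properly normalized (one possible reading only), then direct simulation is still feasible, e.g. by rejection sampling. The target and proposal below are hypothetical placeholders:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical target: suppose the known relation yields this normalized pdf.
def p_z(z):
    return 0.7 * stats.norm.pdf(z, 0.0, 1.0) + 0.3 * stats.norm.pdf(z, 4.0, 0.7)

# Proposal q and envelope constant c with p_z(z) <= c * q(z) everywhere.
q = stats.norm(1.5, 3.0)
c = 3.0

def rejection_sample(n):
    out = []
    while len(out) < n:
        z = q.rvs(size=n, random_state=rng)
        accept = rng.uniform(size=n) < p_z(z) / (c * q.pdf(z))
        out.extend(z[accept])
    return np.array(out[:n])

z = rejection_sample(50_000)
print("E[Z] ≈", z.mean())
```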
2.B Suppose we only know the marginals of $X$ and $Y$ and how the PDF of $Z$ is related to the PDFs of $X$ and $Y$ up to a normalizing constant. In the context of Bayesian theory this is equivalent to knowing the prior, the likelihood and Bayes' rule. In this case we can no longer use the naive Monte Carlo method to get the empirical PDF of $Z$ and estimates of its parameters, since we don't know $M(X,Y)$. Here we need to resort to MCMC.

Is the above reasoning right?
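For this case, a minimal random-walk Metropolis-Hastings sketch: it only ever evaluates the target up to its normalizing constant, which is exactly the situation described in 2.B. The unnormalized target is a made-up example:

```python
import numpy as np

rng = np.random.default_rng(2)

# Unnormalized target: e.g. prior * likelihood, known only up to a constant.
def log_target(z):
    return -0.5 * (z - 1.0) ** 2 - 0.5 * ((z + 2.0) / 2.0) ** 2  # made-up

def metropolis_hastings(n_iter, step=1.0, z0=0.0):
    z, lp = z0, log_target(z0)
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = z + step * rng.normal()            # symmetric proposal
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # MH acceptance rule
            z, lp = prop, lp_prop
        chain[i] = z
    return chain

chain = metropolis_hastings(50_000)
burned = chain[5_000:]                            # discard burn-in
print("posterior mean ≈", burned.mean())
```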
Third case:

Suppose we only know the marginals of $X$ and $Y$, and some observations of $Z$. What methods can one apply to estimate not only the PDF of $Z$ but also the function $M$, assuming the marginals of $X$ and $Y$ are general and representative?

bayesian mcmc monte-carlo
asked Mar 17 at 15:19 by jpcgandre, edited 2 days ago
1 Answer
As a preliminary, let me point out that the issue of reconstituting the joint from the marginals is a constant theme on this forum, the answer being invariably that it is not possible without further assumptions.
"Suppose we know the marginals of $X$ and $Y$ and the covariance matrix
between $X$ and $Y$."
This information is not enough for simulating $(X,Y)$, except in the bivariate Normal setting and in other parameterised cases [like exponential families] where the covariance matrix suffices to define the joint distribution. In general, the distribution of $Z$ is given by
$$\mathbb{P}_Z(Z\in \mathcal{A})=\mathbb{P}_{X,Y}(M(X,Y)\in \mathcal{A})=\mathbb{P}_{X,Y}\big((X,Y)\in M^{-1}(\mathcal{A})\big)=\int_{M^{-1}(\mathcal{A})} p_{X,Y}(x,y)\,\mathrm{d}(x,y)$$
and hence depends on the joint distribution of $(X,Y)$.
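To make the dependence on the joint concrete, here is a small numerical check (my own construction, not from the answer): two pairs $(X,Y)$ with identical standard Normal marginals and identical covariance, but different joints, give visibly different distributions for $Z=M(X,Y)$:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Joint 1: bivariate Normal, standard Normal marginals, corr = 0.5.
cov = [[1.0, 0.5], [0.5, 1.0]]
x1, y1 = rng.multivariate_normal([0, 0], cov, size=n).T

# Joint 2: same marginals, same covariance, different joint:
# Y = +X with prob 0.75, Y = -X with prob 0.25, so corr(X, Y) = 0.5 too.
x2 = rng.normal(size=n)
s = np.where(rng.uniform(size=n) < 0.75, 1.0, -1.0)
y2 = s * x2

M = lambda x, y: x * y  # some fixed transformation

print("P(Z<0), joint 1:", np.mean(M(x1, y1) < 0))  # about 1/3
print("P(Z<0), joint 2:", np.mean(M(x2, y2) < 0))  # 1/4 exactly
```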
"...we only know the marginals of $X$ and $Y$ and how the PDF of $Z$
is related to the PDFs of $X$ and $Y$. In this case we can get the
empirical PDF of $Z$"
This question is quite unclear or too vague, but the claim is in general wrong if $X$, $Y$, and $Z$ are dependent (for the same reason as above). If for instance it is known that $p_Z=\Phi(p_X,p_Y)$, in the sense that $p_Z(z)$ can be computed for all $z$'s, then it is possible to build a Monte Carlo strategy based on this information. Further, the empirical pdf of $Z$ is unrelated to the true pdfs of $X$ and $Y$, but requires a sample of $Z$'s.
"This in the context of Bayesian Theory is equivalent of knowing the
prior and the likelihood and the Bayes rule."
In Bayesian theory there are two random variables, the parameter $\theta$ and the experiment random variable $X$ (called the observation once realised as $x$). The likelihood function is a conditional density of the experiment random variable given the parameter random variable, not a marginal. And Bayes' rule gives the conditional density of the (same) parameter random variable $\theta$ given the experiment random variable, not a marginal.
"...we can no longer use the Naive Monte Carlo method to get empirical
PDF of $Z$"
This intuition is far from 100% correct as the knowledge of the posterior density up to a constant may be sufficient to run (a) analytical calculations (e.g., with conjugate priors) and (b) regular Monte Carlo simulations. MCMC is not a sure solution for all cases (as in the doubly intractable likelihood problem).
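As an illustration of (b), self-normalized importance sampling is a "regular" (non-Markovian) Monte Carlo method that works with a target known only up to a constant, since the unknown constant cancels in the weight normalization. The unnormalized target and proposal below are hypothetical:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 100_000

# Unnormalized posterior density (made-up example).
def unnorm_post(theta):
    return np.exp(-0.5 * (theta - 1.0) ** 2) * np.exp(-np.abs(theta) / 2.0)

# Proposal we can sample from; the weights absorb the unknown constant.
proposal = stats.norm(0.0, 2.0)
theta = proposal.rvs(size=n, random_state=rng)
w = unnorm_post(theta) / proposal.pdf(theta)
w /= w.sum()  # self-normalization cancels the missing constant

print("posterior mean ≈", np.sum(w * theta))
print("posterior P(theta > 2) ≈", np.sum(w * (theta > 2)))
```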
"Suppose we only know the marginals of $X$ and $Y$, and some observations
of $Z$. What methods can one apply to estimate not only the PDF of $Z$ but
also the function $M$ assuming the marginals of $X$ and $Y$ are general and
representative."
This question is once again too vague. Observing $Z$ allows for the estimation of its PDF by non-parametric tools, but if $X$ and $Y$ are not observed, it is difficult to imagine estimating $M$ solely from the $Z$'s and the marginal densities.
– Xi'an (answered 2 days ago, edited 2 days ago)
Comments:

– jpcgandre (2 days ago): Hi! Thank you for taking the time to help me surf this wave. Regarding your first answer, I'm surprised, since in all the structural engineering reliability problems I have studied and applied, this is exactly what is done: you have the mathematical/numerical model that relates input variables to output variables, and you estimate the distribution of the latter based on the marginals and the covariance matrix, even if the marginals are not normally distributed. If this is not valid in general, then I kindly ask you for a reference, since this will potentially have a deep impact on civil engineering.

– jpcgandre (2 days ago): Regarding the answer to Q2.A I agree; that is why I wrote "Although I'm not sure how the dependency between $X$ and $Y$ is taken into account...". Regarding Q2.B, I agree, although for the purposes of the question, for me a conditional probability is still a probability function, as the marginals are too. So given this, I believe the logic I put forward is still applicable. About what you describe concerning conjugate priors, can you comment on Greenparker's answer to stats.stackexchange.com/questions/275641/…

– jpcgandre (2 days ago): And if you have time, please comment on/criticize my Q #3. Thank you.

– jpcgandre (2 days ago): "It may be a matter of wording: if given only $f_X$, $f_Y$, and $\Sigma_{XY}$, with no further information, there is no single joint distribution with these characteristics." But (a big but indeed) if you know the function that relates observations of $X$ and $Y$ with outcomes of $Z$, then you can estimate $Z$ based on this function and the probabilistic marginals of $X$ and $Y$, right?

– Xi'an (2 days ago): No, since you first need to generate $(X,Y)$ pairwise before deducing $Z$.
Your Answer
StackExchange.ifUsing("editor", function ()
return StackExchange.using("mathjaxEditing", function ()
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix)
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
);
);
, "mathjax-editing");
StackExchange.ready(function()
var channelOptions =
tags: "".split(" "),
id: "65"
;
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function()
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled)
StackExchange.using("snippets", function()
createEditor();
);
else
createEditor();
);
function createEditor()
StackExchange.prepareEditor(
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: false,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: null,
bindNavPrevention: true,
postfix: "",
imageUploader:
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
,
onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
);
);
Sign up or log in
StackExchange.ready(function ()
StackExchange.helpers.onClickDraftSave('#login-link');
);
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstats.stackexchange.com%2fquestions%2f398003%2fnaive-monte-carlo-mcmc-and-their-use-in-bayesian-theory%23new-answer', 'question_page');
);
Post as a guest
Required, but never shown
1 Answer
1
active
oldest
votes
1 Answer
1
active
oldest
votes
active
oldest
votes
active
oldest
votes