Convergence in probability and convergence in distribution
I'm a little confused about the difference between these two concepts, especially convergence in probability. I understand that $X_n \overset{p}{\to} Z$ if $\lim_{n \rightarrow \infty} \Pr(|X_n - Z| > \epsilon) = 0$ for any $\epsilon > 0$.
I just need some clarification on what the subscript $n$ means and what $Z$ means. Is $n$ the sample size? Is $Z$ a specific value, or another random variable? If it is another random variable, wouldn't that mean that convergence in probability implies convergence in distribution? Also, could you please give me some examples of things that converge in distribution but not in probability?
Tags: econometrics statistics
asked yesterday by Martin (new contributor)
– afreelunch (yesterday): See: quora.com/…
1 Answer
I will attempt to explain the distinction using the simplest example: the sample mean. Suppose we have an iid sample of random variables $\{X_i\}_{i=1}^n$. Then define the sample mean as $\bar{X}_n$. As the sample size grows, the value of the sample mean changes, hence the subscript $n$ to emphasize that the sample mean depends on the sample size.
Noting that $\bar{X}_n$ is itself a random variable, we can define a sequence of random variables whose elements are indexed by the (growing) sample size, i.e. $\{\bar{X}_n\}_{n=1}^\infty$. The weak law of large numbers (WLLN) tells us that, so long as $E(X_1^2) < \infty$,
$$\operatorname{plim} \bar{X}_n = \mu,$$
or equivalently
$$\bar{X}_n \rightarrow_P \mu,$$
where $\mu = E(X_1)$. Formally, convergence in probability is defined as
$$\forall \epsilon > 0, \quad \lim_{n \rightarrow \infty} P(|\bar{X}_n - \mu| < \epsilon) = 1.$$
In other words, the probability of our estimate being within $\epsilon$ of the true value tends to 1 as $n \rightarrow \infty$. Convergence in probability gives us confidence that our estimators perform well with large samples.
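To make this concrete, here is a minimal simulation sketch (an illustration, not a formal argument; it assumes iid Exponential(1) draws, so $\mu = E(X_1) = 1$). It estimates $P(|\bar{X}_n - \mu| < \epsilon)$ by Monte Carlo for growing $n$:

```python
# Illustrative sketch: Monte Carlo estimate of P(|Xbar_n - mu| < eps)
# for iid Exponential(1) draws, where mu = E(X_1) = 1.
import numpy as np

rng = np.random.default_rng(0)
mu, eps, reps = 1.0, 0.1, 1000

for n in [10, 100, 1000, 10000]:
    # reps independent samples of size n; one sample mean per replication
    xbar = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
    prob = np.mean(np.abs(xbar - mu) < eps)
    print(f"n = {n:>6}: P(|Xbar_n - mu| < {eps}) ~ {prob:.3f}")
```

The estimated probability climbs toward 1 as $n$ grows, which is exactly what the definition above promises.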
Convergence in distribution tells us something very different and is primarily used for hypothesis testing. Under the same assumptions described above, the CLT gives us that
$$\sqrt{n}(\bar{X}_n - \mu) \rightarrow_D N(0, \operatorname{Var}(X_1)).$$
Convergence in distribution means that the cdf of the left-hand side converges to the cdf of the right-hand side at all of its continuity points, i.e.
$$\lim_{n \rightarrow \infty} F_n(x) = F(x),$$
where $F_n(x)$ is the cdf of $\sqrt{n}(\bar{X}_n - \mu)$ and $F(x)$ is the cdf of a $N(0, \operatorname{Var}(X_1))$ distribution. Knowing the limiting distribution allows us to test hypotheses about the sample mean (or whatever estimate we are generating).
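One can also check convergence in distribution numerically by comparing the empirical cdf of $\sqrt{n}(\bar{X}_n - \mu)$ with the limiting normal cdf at a few points. A minimal sketch, again assuming Exponential(1) draws (so $\mu = 1$ and $\operatorname{Var}(X_1) = 1$), using scipy only for the normal cdf:

```python
# Illustrative sketch: empirical cdf of sqrt(n)*(Xbar_n - mu) vs. the N(0,1) cdf,
# for iid Exponential(1) draws (mu = 1, Var(X_1) = 1).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
mu, reps, n = 1.0, 5000, 2000

xbar = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
z = np.sqrt(n) * (xbar - mu)   # approximately N(0, 1) by the CLT

for x in [-1.5, 0.0, 1.5]:
    emp = np.mean(z <= x)      # empirical cdf F_n(x)
    print(f"F_n({x:+.1f}) ~ {emp:.3f}  vs  F({x:+.1f}) = {norm.cdf(x):.3f}")
```

As for the last part of the question: a standard example of convergence in distribution without convergence in probability is $Z \sim N(0,1)$ with $X_n = -Z$ for every $n$. By symmetry, each $X_n$ has exactly the same $N(0,1)$ distribution as $Z$, so $X_n \rightarrow_D Z$ trivially; but $|X_n - Z| = 2|Z|$, so $P(|X_n - Z| > \epsilon)$ does not tend to $0$ and $X_n$ does not converge in probability to $Z$.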
answered yesterday by dlnB (new contributor), edited 14 hours ago
– Theoretical Economist (15 hours ago): Your definition of convergence in probability is more demanding than the standard definition. For example, suppose $X_n = 1$ with probability $1/n$ and $X_n = 0$ otherwise. It's clear that $X_n$ must converge in probability to $0$. However, $X_n$ does not converge to $0$ according to your definition, because we always have $P(|X_n| < \varepsilon) \neq 1$ for $\varepsilon < 1$ and any $n$.
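For completeness, the arithmetic behind this counterexample: under the standard definition,
$$P(|X_n - 0| > \varepsilon) = P(X_n = 1) = \frac{1}{n} \rightarrow 0 \quad \text{for any } 0 < \varepsilon < 1,$$
so $X_n \rightarrow_P 0$; yet $P(|X_n| < \varepsilon) = 1 - \frac{1}{n}$, which is never exactly $1$ at any finite $n$.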
– dlnB (14 hours ago): Yes, you are right. I posted my answer too quickly and made an error in writing the definition of weak convergence. I have corrected my post.