Mean and Variance of Continuous Random Variable


I have a problem on my homework about the continuous random variable $y$, where the CDF is $F(y)=\frac{1}{1+e^{-y}}$.

Part (a) asks for the PDF, which I found to be $\frac{e^y}{(e^y+1)^2}$.

Part (b) asks for the mean and variance of $y$, but when I tried to find $E(y)$, I got zero from the integral of $\frac{y\,e^y}{(e^y+1)^2}$ from $-\infty$ to $\infty$. I'm not sure where I'm going wrong with this problem.










Tags: variance, mean






edited Apr 8 at 2:10 by Noah
asked Apr 7 at 23:29 by EBusch

3 Answers

This distribution is the standard logistic distribution, and its moments and information are examined in deCani and Stine (1986). The distribution function can be written equivalently in either of the following two forms:

$$F_Y(y) = \frac{1}{1+e^{-y}} = \frac{e^y}{1+e^y}.$$

The density function can be written in either of the following two forms:

$$f_Y(y) = \frac{e^{-y}}{(1+e^{-y})^2} = \frac{e^y}{(1+e^y)^2}.$$

Since $f_Y(y) = f_Y(-y)$ for all $y \in \mathbb{R}$, the density is symmetric around zero, so it is trivial to show that $\mathbb{E}(Y)=0$. Deriving the variance is more difficult, but it can be done by a number of different methods. The simplest method is to derive the moment-generating function of the distribution and use this to get the variance. Another method is to derive the variance by direct application of the variance formula, using integration by parts in conjunction with the symmetry of the density function. (This leads to an integral involving the dilogarithm function, which then requires you to take limits of this function using an asymptotic form. The mathematics for this derivation gets pretty long.)

Since this is homework, I will not show you how to do either of these derivations, but I might come back to this question later and add them, once enough time has elapsed so that your homework has been submitted. I recommend attempting the derivation of the variance by first finding the moment-generating function. If you have trouble, you could consult the cited paper to review their method.







edited Apr 9 at 4:09
answered Apr 9 at 1:27 by Ben

Comment (BruceET, Apr 10 at 0:29): Thanks for name of distribution, which was eluding me. (+1)
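A quick numerical sanity check of the variance that this answer alludes to, not a derivation: the sketch below assumes that R's built-in dlogis() is the same standard logistic density $f_Y$ written above, and compares the result with the value $\pi^2/3$ quoted in BruceET's simulation answer further down.

    # Numerical check of the first two moments of the standard logistic density.
    # dlogis(y) is base R's logistic density exp(-y)/(1+exp(-y))^2 (location 0, scale 1).
    m1 <- integrate(function(y) y   * dlogis(y), -Inf, Inf)$value  # mean, approx 0
    m2 <- integrate(function(y) y^2 * dlogis(y), -Inf, Inf)$value  # E(Y^2)
    c(mean = m1, variance = m2 - m1^2, pi.sq.over.3 = pi^2/3)
    # variance and pi.sq.over.3 should both be approx 3.2899

Agreement here does not replace the moment-generating-function derivation; it only confirms the target value.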












What makes you think you did something wrong?

\begin{align}
& \Pr(Y\le y) = F(y) = \frac{1}{1+e^{-y}} \\[10pt]
\text{and } & \Pr(Y\ge -y) = 1-F(-y) = 1- \frac{1}{1+e^{y}} \\[8pt]
= {} & \frac{e^y}{1+e^y} = \frac{e^y\cdot e^{-y}}{(1+e^y)\cdot e^{-y}} = \frac{1}{e^{-y}+1},
\end{align}

and therefore
$$
\Pr(Y\le y) = \Pr(Y \ge -y).
$$

So this distribution is symmetric about $0.$

Therefore, if the expected value exists, it is $0.$

You can also show that the density function is an even function:
\begin{align}
f(y) & = \frac{e^y}{(1+e^y)^2}. \\[12pt]
f(-y) & = \frac{e^{-y}}{(1+e^{-y})^2} = \frac{e^{-y}\cdot\left( e^y \right)^2}{\Big((1+e^{-y}) \cdot e^y \Big)^2} = \frac{e^y}{(e^y+1)^2} = f(y).
\end{align}

Since the density is an even function, the expected value must be $0$ if it exists.

The expected value $\operatorname{E}(Y)$ exists if $\operatorname{E}(|Y|) < +\infty.$







answered Apr 8 at 2:44 by Michael Hardy
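One way to verify that last condition, as a small supplement to the answer above (this bound is not part of the original post): since the density is even and $(1+e^{-y})^2 \ge 1$,

$$\operatorname{E}(|Y|) = 2\int_0^\infty y\,\frac{e^{-y}}{(1+e^{-y})^2}\,dy \le 2\int_0^\infty y\,e^{-y}\,dy = 2 < +\infty,$$

so the expected value exists and, by the symmetry argument above, equals $0$.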
























Comment:

Setting what I take to be your CDF equal to $U \sim \mathsf{Unif}(0,1),$ and solving for the quantile function (inverse CDF) in terms of $U,$ I simulate a sample of ten million observations as shown below. [Thanks to @Noah for recent clarification of notation in the Problem.]

Then, when I plot your PDF through the histogram of the large sample, that density function seems to fit pretty well.

    set.seed(1019)  # for reproducibility
    u = runif(10^7);  x = -log(1/u - 1)
    mean(x);  sd(x);  sqrt(pi^2/3);  2*sd(x)/sqrt(10^7)
    [1] -0.000594651  # aprx E(X) = 0
    [1] 1.81335       # aprx SD(X) = 1.813799
    [1] 1.813799      # exact SD(X) per Wikipedia on 'logistic distn'
    [1] 0.003626701   # aprx 95% margin of simulation error for E(X)

    hist(x, prob=T, br=100, col="skyblue2")
    curve(exp(x)/(exp(x)+1)^2, -10, 10, add=T, lwd=2, col="red")

[Figure: histogram of the simulated sample with the logistic PDF overlaid.]

I don't pretend that this is a 'worked answer' to your problem, but I hope it may give you enough clues to improve the version of the problem you posted and to finish the problem on your own.







edited Apr 10 at 0:39
answered Apr 8 at 0:53 by BruceET
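For reference, the inverse-CDF step used in the simulation above comes from solving $u = \frac{1}{1+e^{-x}}$ for $x$ (a brief gloss, not part of the original answer):

$$u = \frac{1}{1+e^{-x}} \iff e^{-x} = \frac{1}{u} - 1 \iff x = -\log\!\left(\frac{1}{u} - 1\right),$$

which is exactly the transformation x = -log(1/u - 1) applied to u = runif(10^7) in the code.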


























