Expectation value in terms of cumulative distribution
Randy Poe
science forum Guru


Joined: 24 Mar 2005
Posts: 2485

Posted: Wed Jul 19, 2006 9:34 pm    Post subject: Expectation value in terms of cumulative distribution

I have to estimate some means and variances from empirical
data, and for various reasons what I have to work with are
estimates of cumulative distributions rather than densities.
Actually, what I have are quantile levels, a set of values
q(p) = x such that P(X<=x) = F(x) = p at evenly spaced values of p
from 0 to 1, which gives me in effect estimates of F(x) at unevenly
spaced intervals of x.

So I started wondering if I could estimate E[x] and E[x^2] directly
from F(x) and came up with an elementary theorem in about
30 seconds which I don't remember seeing. The proof is only
a couple of lines. Even though the proof seems pretty straightforward,
I'd like to know if this is in fact a theorem anyone has seen before.

Theorem: Let X be a random variable with bounded support [a,b],
density function f(x) and cumulative distribution function F(x).
Then E[x] = b - integral(a,b) F(x) dx and
E[x^2] = b^2 - 2*integral(a,b) x F(x) dx.

Proof: Integrate by parts.

E[x] = integral(a,b) x f(x) dx

Choose u = x, dv = f(x) dx => du = dx, v = F(x)

E[x] = (b*F(b) - a*F(a)) - integral(a,b) F(x) dx, which gives the first
result since F(a) = 0 and F(b) = 1.

E[x^2] = integral(a,b) x^2 f(x) dx
= (b^2*F(b) - a^2*F(a)) - integral(a,b) 2x F(x) dx

which gives the second result.
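
A minimal numerical sanity check of these two identities (assuming SciPy
is available; the Beta(2, 5) distribution rescaled to [a, b] is just an
arbitrary test case, not anything special):

from scipy import stats
from scipy.integrate import quad

a, b = 1.0, 4.0
X = stats.beta(2, 5, loc=a, scale=b - a)   # arbitrary test distribution supported on [a, b]

# E[x] = b - integral(a,b) F(x) dx
mean_from_cdf = b - quad(X.cdf, a, b)[0]

# E[x^2] = b^2 - 2 * integral(a,b) x F(x) dx
m2_from_cdf = b**2 - 2 * quad(lambda x: x * X.cdf(x), a, b)[0]

print(mean_from_cdf, X.mean())                # the two should agree
print(m2_from_cdf, X.var() + X.mean()**2)     # likewise for E[x^2]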

- Randy
Stephen J. Herschkorn
science forum Guru


Joined: 24 Mar 2005
Posts: 641

Posted: Wed Jul 19, 2006 9:45 pm    Post subject: Re: Expectation value in terms of cumulative distribution

It is well known that if X is a nonnegative random variable, then
EX = integral(x=0..infty, P{X > x}). The proof is easy to follow:

Let I(x) be the indicator variable for {X > x}. Then
X = int(x=0..infty, I(x)). Hence,
EX = E int(x=0..infty, I(x)) = int(x=0..infty, E I(x)) by Tonelli's
theorem. The integrand E I(x) = P{X > x}.

Note that X need not be continuous.

As a corollary, for nonnegative X and a > 0,
E[X^a] = int(u=0..infty, P{X^a > u}) = int(u=0..infty, P{X > u^(1/a)}).
Thus,
E[X^a] = a int(x=0..infty, x^(a-1) P{X > x}) by change of variable in
the integral.
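
A rough numerical check of that corollary (assuming SciPy; Exp(1) and
a = 2.5 are arbitrary choices, and for Exp(1) one has E[X^a] = Gamma(a+1)):

import numpy as np
from scipy import stats
from scipy.integrate import quad
from scipy.special import gamma

a = 2.5
X = stats.expon()                        # Exp(1), a nonnegative test case

exact = gamma(a + 1)                     # E[X^a] for Exp(1)
from_tail = a * quad(lambda x: x**(a - 1) * X.sf(x), 0, np.inf)[0]

print(exact, from_tail)                  # should agree to quadrature accuracy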

--
Stephen J. Herschkorn sjherschko@netscape.net
Math Tutor on the Internet and in Central New Jersey and Manhattan
C6L1V@shaw.ca
science forum Guru


Joined: 23 May 2005
Posts: 628

Posted: Wed Jul 19, 2006 9:47 pm    Post subject: Re: Expectation value in terms of cumulative distribution

Randy Poe wrote:
Quote:
I have to estimate some means and variances from empirical
data, and for various reasons what I have to work with are
estimates of cumulative distributions rather than densities.
Actually, what I have are quantile levels, a set of values
q(p) = x such that P(X<=x) = F(x) = p at evenly spaced values of p
from 0 to 1, which gives me in effect estimates of F(x) at unevenly
spaced intervals of x.

So I started wondering if I could estimate E[x] and E[x^2] directly
from F(x) and came up with an elementary theorem in about
30 seconds which I don't remember seeing.

For any non-negative r.v., EX = int_0^infinity G(x) dx, where G(x) =
P{X > x}. This is true, even if X does not have a density function f;
it is an old result that appears in just about every probability
textbook; see, e.g., Ross, Introduction to Probability Models. (The
result holds in the sense that EX = infinity iff the integral diverges;
and if one side is finite, so is the other, and then they are equal.) I
guess you can reduce the [a,b] result to the >= 0 result by considering
X = a + Y, with 0 <= Y <= b-a, and I think your E(X) expression follows
from that, whenever X has a density. However, I guess your actual
expression is new. If memory serves, there is a similar well-known
expression for E(X^2), but my memory does not serve far enough to
retrieve the actual formula.
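
Spelling out that E(X) reduction (my reading of it): write X = a + Y with
0 <= Y <= b-a, so P(Y > y) = 1 - F(a + y); then

E[X] = a + E[Y]
     = a + \int_0^{b-a} P(Y > y) dy
     = a + \int_0^{b-a} (1 - F(a+y)) dy
     = a + \int_a^b (1 - F(x)) dx        (substituting x = a + y)
     = a + (b - a) - \int_a^b F(x) dx
     = b - \int_a^b F(x) dx,

which is exactly the posted formula.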

R.G. Vickson


Ray Koopman
science forum Guru Wannabe


Joined: 25 Mar 2005
Posts: 216

Posted: Wed Jul 19, 2006 11:12 pm    Post subject: Re: Expectation value in terms of cumulative distribution

Randy Poe wrote:
Quote:
I have to estimate some means and variances from empirical
data, and for various reasons what I have to work with are
estimates of cumulative distributions rather than densities.
Actually, what I have are quantile levels, a set of values
q(p) = x such that P(X<=x) = F(x) = p at evenly spaced values of p
from 0 to 1, which gives me in effect estimates of F(x) at unevenly
spaced intervals of x.

If the p-interval is small enough, couldn't you estimate the first
and second raw moments by simple quadrature (e.g., trapezoids) on
q(p) and q(p)^2?
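
A minimal sketch of that idea (assuming the data come as an array
q[i] = q(p[i]) at evenly spaced p[i]; since E[X] = int_0^1 q(p) dp and
E[X^2] = int_0^1 q(p)^2 dp, the trapezoid rule on the p grid gives the raw
moments directly; the Beta(2, 5) quantiles below are only a stand-in for
the empirical ones, and truncating p at 0.01 and 0.99 biases the tails
slightly):

import numpy as np
from scipy import stats

p = np.linspace(0.01, 0.99, 99)          # evenly spaced probability levels
q = stats.beta(2, 5).ppf(p)              # stand-in for the empirical quantiles q(p)

dp = np.diff(p)
m1 = np.sum(0.5 * (q[:-1] + q[1:]) * dp)          # trapezoid estimate of E[X]
m2 = np.sum(0.5 * (q[:-1]**2 + q[1:]**2) * dp)    # trapezoid estimate of E[X^2]
var = m2 - m1**2

print(m1, var)
print(stats.beta(2, 5).mean(), stats.beta(2, 5).var())   # exact values for comparison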
ArtflDodgr
science forum beginner


Joined: 09 May 2005
Posts: 45

Posted: Thu Jul 20, 2006 5:01 pm    Post subject: Re: Expectation value in terms of cumulative distribution

In article <1153344847.277486.315790@h48g2000cwc.googlegroups.com>,
"Randy Poe" <poespam-trap@yahoo.com> wrote:

Quote:
I have to estimate some means and variances from empirical
data, and for various reasons what I have to work with are
estimates of cumulative distributions rather than densities.
Actually, what I have are quantile levels, a set of values
q(p) = x such that P(X<=x) = F(x) = p at evenly spaced values of p
from 0 to 1, which gives me in effect estimates of F(x) at unevenly
spaced intervals of x.

So I started wondering if I could estimate E[x] and E[x^2] directly
from F(x) and came up with an elementary theorem in about
30 seconds which I don't remember seeing. The proof is only
a couple of lines. Even though the proof seems pretty straightforward,
I'd like to know if this is in fact a theorem anyone has seen before.

It's well known. You can find the formula (valid for a non-negative
random variable X)

E[X^b] = int_0^infty P[X>t] b*t^{b-1} dt,

for b>0, on page 150 of the second edition of vol. II of Feller's
"An Introduction to Probability Theory and its Applications".
Without doubt, the formula predates the 1971 publication of that book by
many years. (For example, it appears as an exercise in Chung's
"Course", which first appeared in 1968; the inequality

E[X^b] <= int_0^infty P[X>t] b*t^{b-1} dt

plays a role in Doob's 1953 proof of his L^p maximal inequality for
submartingales.)

Your formula results from applying the above formula to the r.v. X - a.

Feller's proof is simply integration by parts, and yields the more
general expression (still for non-negative X)

E[G(X)] = int_0^infty P[X>t] dG(t)

provided G: [0,infty) --> [0,infty) is non-decreasing with G(0) = 0.
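
For what it's worth, a quick numerical illustration of that last identity
(assuming SciPy; G(t) = log(1+t), which is non-decreasing with G(0) = 0,
and X ~ Exp(1) are arbitrary choices):

import numpy as np
from scipy import stats
from scipy.integrate import quad

X = stats.expon()

lhs = quad(lambda x: np.log1p(x) * X.pdf(x), 0, np.inf)[0]   # E[G(X)] computed directly
rhs = quad(lambda t: X.sf(t) / (1.0 + t), 0, np.inf)[0]      # int_0^infty P[X > t] G'(t) dt

print(lhs, rhs)                                              # should agree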

--
A.
Herman Rubin
science forum Guru


Joined: 25 Mar 2005
Posts: 730

Posted: Thu Jul 20, 2006 5:03 pm    Post subject: Re: Expectation value in terms of cumulative distribution

In article <1153344847.277486.315790@h48g2000cwc.googlegroups.com>,
Randy Poe <poespam-trap@yahoo.com> wrote:
Quote:
I have to estimate some means and variances from empirical
data, and for various reasons what I have to work with are
estimates of cumulative distributions rather than densities.
Actually, what I have are quantile levels, a set of values
q(p) = x such that P(X<=x) = F(x) = p at evenly spaced values of p
from 0 to 1, which gives me in effect estimates of F(x) at unevenly
spaced intervals of x.

So I started wondering if I could estimate E[x] and E[x^2] directly
from F(x) and came up with an elementary theorem in about
30 seconds which I don't remember seeing. The proof is only
a couple of lines. Even though the proof seems pretty straightforward,
I'd like to know if this is in fact a theorem anyone has seen before.

It is well known that if X is non-negative with cdf F,
then E(X) = \int_0^infty (1 - F(t)) dt. It is less well known
that E(X) = -\int_{-infty}^0 F(t) dt + \int_0^infty (1 - F(t)) dt,
with the first integral over the negative half-line and the second
over the positive half-line. This can be extended to integrals
with arbitrary measures, and can even be used as a
definition of the Lebesgue-Stieltjes integral.

Suppose X >= 0. This can easily be extended to the
expectation of X^2, which is \int_0^infty P(X^2 > t) dt =
\int_0^infty P(X > sqrt(t)) dt = \int_0^infty (1 - F(u)) * 2u du
(substituting t = u^2).

As you see, lots more can be done with this.
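
A small check of the two-sided version (assuming SciPy; Normal(0.7, 1) is
just a test case with mass on both sides of zero):

import numpy as np
from scipy import stats
from scipy.integrate import quad

X = stats.norm(loc=0.7, scale=1.0)

neg_part = quad(X.cdf, -np.inf, 0)[0]        # \int_{-infty}^0 F(t) dt
pos_part = quad(X.sf, 0, np.inf)[0]          # \int_0^infty (1 - F(t)) dt

print(pos_part - neg_part, X.mean())         # both should be 0.7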
--
This address is for information only. I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Department of Statistics, Purdue University
hrubin@stat.purdue.edu Phone: (765)494-6054 FAX: (765)494-0558
Randy Poe
science forum Guru


Joined: 24 Mar 2005
Posts: 2485

Posted: Thu Jul 20, 2006 5:46 pm    Post subject: Re: Expectation value in terms of cumulative distribution

C6L1V@shaw.ca wrote:
Quote:
Randy Poe wrote:
I have to estimate some means and variances from empirical
data, and for various reasons what I have to work with are
estimates of cumulative distributions rather than densities.
Actually, what I have are quantile levels, a set of values
q(p) = x such that P(X<=x) = F(x) = p at evenly spaced values of p
from 0 to 1, which gives me in effect estimates of F(x) at unevenly
spaced intervals of x.

So I started wondering if I could estimate E[x] and E[x^2] directly
from F(x) and came up with an elementary theorem in about
30 seconds which I don't remember seeing.

For any non-negative r.v., EX = int_0^infinity G(x) dx, where G(x) =
P{X > x}.

OK. I did remember something like that for non-negative rvs, but
couldn't remember the exact theorem and didn't have a
reference at hand. As I was scribbling this down and remembering
there was some sort of result with non-negative rvs, I was
thinking -- why do I need the non-negative requirement?

Quote:
This is true, even if X does not have a density function f;
it is an old result that appears in just about every probability
textbook; see, eg., Ross, Introduction to Probability Models. (The
result holds in the sense that EX = infinity iff the integral diverges;
and if one side is finite, so is the other, and then they are equal.) I
guess you can reduce the [a,b] result to the >= 0 result by considering
X = a + Y,

Yes, I missed that trivial point. Others pointed out this connection
as well. I was thinking that this was, if not especially interesting,
at least different because it drops the non-negativity requirement.
Oops.

So if I'm going to rival JSH for earthshaking new mathematics, I guess
it won't be on the basis of this little theorem.

Quote:
with 0 <= Y <= b-a, and I think your E(X) expression follows
from that, whenever X has a density. However, I guess your actual
expression is new. If memory serves, there is a similar well-known
expression for E(X^2), but my memory does not serve far enough to
retrieve the actual formula.

Since X^2 is non-negative, it's probably a straightforward corollary.

Thanks all for the responses.

- Randy
