Sum and difference of independent variables
Robert B. Israel
Posted: Sun Jul 09, 2006 11:44 pm    Subject: Re: Sum and difference of independent variables

In article <1152458138.800021.138380@m73g2000cwd.googlegroups.com>,
<rtonyreeder@yahoo.com> wrote:
Quote:

Do you know the definition of independent? You might check before
posting.

Thanks, Robert, for the clarification on the definition of
independence. Yes, I am being sarcastic. I think I was clear on what I
was saying, and did not pretend to be an expert.

I didn't know it at the time, and I don't know for sure that I'm
correct now. Is it that P(x,y) = Px(x)Py(y), which I put in a later
post?

No, it's that the events {X <= x} and {Y <= y} are independent for
every x and y, i.e. Prob(X <= x and Y <= y) = Prob(X <= x) Prob(Y <= y).
In this case you might think about, say, how it could happen that
X+Y <= 1/3 and X-Y <= -2/3.
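
For the uniform-[0,1] setup of the MathCad experiment, the point is
easy to see: X+Y <= 1/3 forces Y <= 1/3, while X-Y <= -2/3 forces
Y >= X + 2/3 >= 2/3, so the two events can never occur together, even
though each one alone has positive probability. A rough numerical
check (a Python/NumPy sketch, offered only as an illustration of this
case):

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(size=n)   # X uniform on [0,1]
y = rng.uniform(size=n)   # Y uniform on [0,1], independent of X
u, v = x + y, x - y

p_u  = np.mean(u <= 1/3)                   # P(X+Y <= 1/3), about 1/18
p_v  = np.mean(v <= -2/3)                  # P(X-Y <= -2/3), about 1/18
p_uv = np.mean((u <= 1/3) & (v <= -2/3))   # joint probability: exactly 0 here
print(p_u * p_v, p_uv)                     # product is positive, joint probability is 0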

Robert Israel israel@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada
C6L1V@shaw.ca
Posted: Sun Jul 09, 2006 7:54 pm    Subject: Re: Sum and difference of independent variables

rtonyree...@yahoo.com wrote:
Quote:
I haven't given a lot of thought to the difference between independence
and being uncorrelated. I thought that they were similar, i.e.,
independence requires being uncorrelated; being uncorrelated implies
independence; being dependent requires correlation; and being
correlated implies dependence.

Absolutely not. Lack of correlation and independence are different
concepts. Independent random variables are uncorrelated, but not
conversely. The pair X+Y and X-Y are a perfect example: they are
dependent (i.e., not independent) but are, nevertheless, uncorrelated.

Thanks. That is not obvious.

OK, here is one more reason that you may or may not find convincing.
Independence refers to the whole (joint) probability distribution,
while mean, variance and correlation refer only to "first moments" and
"second moments"---things like expected values of X and X^2. To say
that U = X+Y and V = X-Y are uncorrelated just makes a claim about
properties up to the second order. However, higher order objects (like
expectations of U^3 or V^4 or U^2 V^3, etc.) are not covered by the
"no-correlation" result.

Note, however, that for normally-distributed random variables, the
first and second moments determine the distribution completely, so that
is why U and V *are* independent *for the special case of independent,
identically distributed normal random variables*. In fact,
'independence = no correlation' for normal distributions, but not for
most others.
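
A quick numerical illustration of this point (a Python/NumPy sketch,
assuming X and Y uniform on [0,1]): the second-order cross moment of
U = X+Y and V = X-Y behaves as if they were independent, but a
fourth-order mixed moment does not factor.

import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.uniform(size=n)
y = rng.uniform(size=n)
u, v = x + y, x - y

print(np.cov(u, v)[0, 1])              # roughly 0: U and V are uncorrelated
print(np.mean(u**2 * v**2))            # roughly 8/45 = 0.178, i.e. E[U^2 V^2]
print(np.mean(u**2) * np.mean(v**2))   # roughly 7/36 = 0.194, i.e. E[U^2] E[V^2]; the two differ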

RGV

rtonyreeder@yahoo.com
Posted: Sun Jul 09, 2006 7:40 pm    Subject: Re: Sum and difference of independent variables

C6L1V@shaw.ca wrote:
Quote:
Absolutely not. Lack of correlation and independence are different
concepts. Independent random variables are uncorrelated, but not
conversely. The pair X+Y and X-Y are a perfect example: they are
dependent (i.e., not independent) but are, nevertheless, uncorrelated.

Thanks. That is not obvious.

Quote:


In light of this, what are you saying? That there is no correlation
between X+Y and X-Y because they are independent?

No. He said in a previous posting that they are DEPENDENT.

R.G. Vickson
Adjunct Professor, University of Waterloo


They are not independent, if P(x,y) = Px(x)Py(y) is the definition of
independence.


C6L1V@shaw.ca
Posted: Sun Jul 09, 2006 4:31 pm    Subject: Re: Sum and difference of independent variables

rtonyreeder@yahoo.com wrote:
Quote:

It's very easy to see that X+Y and X-Y are uncorrelated whenever
X and Y are iid and have variances.

Are you saying that there is no correlation between X+Y and X-Y because
they are independent?

No, he is not saying that. He is saying they are uncorrelated. Period.
They ARE dependent (except in the normal distribution case), but they
are also uncorrelated.

Quote:

iid = independent and identically distributed ?

Yes.

Quote:

The "variances" part, are you talking about probability density
functions without variances, like the Cauchy (Lorentz) distribution?

I haven't given a lot of thought to the difference between independence
and being uncorrelated. I thought that they were similar, i.e.,
independence requires being uncorrelated; being uncorrelated implies
independence; being dependent requires correlation; and being
correlated implies dependence.

Absolutely not. Lack of correlation and independence are different
concepts. Independent random variables are uncorrelated, but not
conversely. The pair X+Y and X-Y are a perfect example: they are
dependent (i.e., not independent) but are, nevertheless, uncorrelated.

Quote:

In light of this, what are you saying? That there is no correlation
between X+Y and X-Y because they are independent?

No. He said in a previous posting that they are DEPENDENT.

R.G. Vickson
Adjunct Professor, University of Waterloo

rtonyreeder@yahoo.com
Posted: Sun Jul 09, 2006 3:15 pm    Subject: Re: Sum and difference of independent variables

Robert Israel wrote:
Quote:
Do you know the definition of independent? You might check before
posting.

Thanks, Robert, for the clarification on the definition of
independence. Yes, I am being sarcastic. I think I was clear on what I
was saying, and did not pretend to be an expert.

I didn't know it at the time, and I don't know for sure that I'm
correct now. Is it that P(x,y) = Px(x)Py(y), which I put in a later
post?

Quote:


It's very easy to see that X+Y and X-Y are uncorrelated whenever
X and Y are iid and have variances.

Are you saying that there is no correlation between X+Y and X-Y because
they are independent?

iid = independent and identically distributed ?

The "variances" part, are you talking about probability density
functions without variances, like the Cauchy (Lorentz) distribution?

I haven't given a lot of thought to the difference between independence
and being uncorrelated. I thought that they were similar, i.e.,
independence requires being uncorrelated; being uncorrelated implies
independence; being dependent requires correlation; and being
correlated implies dependence.

In light of this, what are you saying? That there is no correlation
between X+Y and X-Y because they are independent?

They are not independent, if P(x,y) = Px(x)Py(y) is the definition of
independence.

Robert B. Israel
Posted: Sun Jul 09, 2006 6:35 am    Subject: Re: Sum and difference of independent variables

In article <1152400862.078527.136160@m79g2000cwm.googlegroups.com>,
Pubkeybreaker <Robert_silverman@raytheon.com> wrote:
Quote:

Another simple explanation. Suppose X and Y are independent and
uniform on [0,1]. When X and Y are large, then X+Y will also be large,
but X-Y will tend to be small, showing that X+Y and X-Y are negatively
correlated.

Nope. The correlation of X+Y and X-Y is 0 whenever E[X] = E[Y]
and E[X^2] = E[Y^2].
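
For reference, the covariance algebra behind this claim (in the
thread's notation; no independence of X and Y is needed for this
step):

Cov(X+Y, X-Y) = E[(X+Y)(X-Y)] - E[X+Y] E[X-Y]
             = (E[X^2] - E[Y^2]) - (E[X]^2 - E[Y]^2)
             = Var(X) - Var(Y),

which is 0 whenever E[X] = E[Y] and E[X^2] = E[Y^2], and more
generally whenever X and Y have equal variances.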

Robert Israel israel@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada
Robert B. Israel
Posted: Sun Jul 09, 2006 6:31 am    Subject: Re: Sum and difference of independent variables

In article <1152332626.391378.211230@s53g2000cws.googlegroups.com>,
<rtonyreeder@yahoo.com> wrote:
Quote:

I plotted Y vs. X and got what you would expect, 1000 points randomly
spattered on a square. I then plotted Y-X vs Y+X and got 1000 points
randomly spattered on a 45 degree tilted square. It seemed as though
Y-X was independent of Y+X, just like X is independent of Y.

Since the square is tilted at 45 deg, the value of Y-X is limited in
range, depending on the value of Y+X, and vice versa. Maybe this means
they aren't independent, I am just not sure.

Do you know the definition of independent? You might check before
posting.

Quote:
I decided to run a correlation. First, X correlated with Y (a product
of the FFT of X with the conjugate of the FFT of Y, transformed back
with an IFFT). The result looked like white noise, i.e., no
correlation. I then did the same with Y-X and Y+X, and again, the
result looked like white noise, though with about twice the RMS
amplitude. I am not sure if that implies lack of independence.

It's very easy to see that X+Y and X-Y are uncorrelated whenever
X and Y are iid and have variances.

Robert Israel israel@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada
rtonyreeder@yahoo.com
Posted: Sun Jul 09, 2006 4:36 am    Subject: Re: Sum and difference of independent variables

Stephen J. Herschkorn wrote:
Quote:
I don't understand why all the explanations in this thread are so
complicated. Here is the simplest possible counterexample: Let X
and Y be i.i.d. Bernoulli (1/2). Then
P{X+Y = 2, X-Y = 0} != P{X+Y = 2} P{X-Y = 0}.

Since this really isn't my field of expertise, my previous post told of
some things I thought were interesting. I wasn't sure how to determine
independence of random variables.

My current understanding is that, for X and Y to be independent,
Pxy(X,Y) = Px(X)Py(Y). This seems to be the point of your answer, that
if it isn't true, then the variables aren't independent.

I discussed plotting Y-X vs Y+X, for uniformly distributed X and
independent Y, and getting an even spattering of points on a 45 deg
tilted square. Such a pair can't be independent, since the joint density
can't be written as P3(Y+X,Y-X) = P1(Y+X)P2(Y-X), i.e., the range of Y-X
depends on Y+X.

At least, that's the way it seems to me.
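
One way to make this precise, for X and Y independent and uniform on
[0,1]: the pair (U, V) = (Y+X, Y-X) is uniformly distributed on the
tilted square {(u, v) : 0 <= u <= 2, |v| <= min(u, 2-u)}. That support
is not a rectangle (a product set A x B), so the joint density cannot
split as P1(Y+X) P2(Y-X), which is exactly the factorization
independence would require.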
Pubkeybreaker
Posted: Sat Jul 08, 2006 11:21 pm    Subject: Re: Sum and difference of independent variables

Stephen J. Herschkorn wrote:
Quote:
I don't understand why all the explanations in this thread are so
complicated. Here is the simplest possible counterexample: Let X
and Y be i.i.d. Bernoulli (1/2). Then
P{X+Y = 2, X-Y = 0} != P{X+Y = 2} P{X-Y = 0}.


Another simple explanation. Suppose X and Y are independent and
uniform on [0,1]. When X and Y are large, then X+Y will also be large,
but X-Y will tend to be small, showing that X+Y and X-Y are negatively
correlated.
Stephen J. Herschkorn
Posted: Sat Jul 08, 2006 6:19 pm    Subject: Re: Sum and difference of independent variables

Lenore wrote:

Quote:
Is there a proof - or, indeed, does the statement hold at all - that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed?


I don't understand why all the explanations in this thread are so
complicated. Here is the simplest possible counterexample: Let X
and Y be i.i.d. Bernoulli (1/2). Then
P{X+Y = 2, X-Y = 0} != P{X+Y = 2} P{X-Y = 0}.
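
Spelling out the numbers: P{X+Y = 2, X-Y = 0} = P{X = 1, Y = 1} = 1/4,
while P{X+Y = 2} = 1/4 and P{X-Y = 0} = P{X = Y} = 1/2, so the
right-hand side is 1/8, not 1/4.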

--
Stephen J. Herschkorn sjherschko@netscape.net
Math Tutor on the Internet and in Central New Jersey and Manhattan
rtonyreeder@yahoo.com
Posted: Sat Jul 08, 2006 4:23 am    Subject: Re: Sum and difference of independent variables

Lenore wrote:
Quote:
Hello everyone,

Is there a proof - or, indeed, does the statement hold at all - that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed?

Thank you.

Lenore

I am not an expert on this subject, but do use random numbers quite
often. I thought your question was interesting enough that I wrote a
MathCad sheet that generated 1000 uniformly distributed random numbers
X and Y.

I plotted Y vs. X and got what you would expect, 1000 points randomly
spattered on a square. I then plotted Y-X vs Y+X and got 1000 points
randomly spattered on a 45 degree tilted square. It seemed as though
Y-X was independent of Y+X, just like X is independent of Y.

Since the square is tilted at 45 deg, the value of Y-X is limited in
range, depending on the value of Y+X, and vice versa. Maybe this means
they aren't independent, I am just not sure.

I decided to run a correlation. First, X correlated with Y (a product
of the FFT of X with the conjugate of the FFT of Y, transformed back
with an IFFT). The result looked like white noise, i.e., no
correlation. I then did the same with Y-X and Y+X, and again, the
result looked like white noise, though with about twice the RMS
amplitude. I am not sure if that implies lack of independence.

Anyway, it was an interesting problem. I will have to give it more
thought.
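
For readers without MathCad, a rough equivalent of the experiment
described above (a sketch in Python using NumPy and Matplotlib; the
second panel shows the 45-degree tilted square):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
x = rng.uniform(size=1000)   # 1000 uniform X values
y = rng.uniform(size=1000)   # 1000 uniform Y values, independent of X

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 4))
ax1.scatter(x, y, s=5)            # points spread over the unit square
ax1.set(xlabel="X", ylabel="Y")
ax2.scatter(y + x, y - x, s=5)    # points spread over a 45-degree tilted square
ax2.set(xlabel="Y+X", ylabel="Y-X")
plt.show()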
Robert B. Israel
Posted: Fri Jul 07, 2006 11:45 pm    Subject: Re: Sum and difference of independent variables

In article <1152300087.051551.190110@h48g2000cwc.googlegroups.com>,
C6L1V@shaw.ca <C6L1V@shaw.ca> wrote:
Quote:
Lenore wrote:
Hello everyone,

Is there a proof - or, indeed, does the statement hold at all - that
the sum and difference of two independent variables X and Y, (X+Y) and
(X-Y), are themselves independent variables, if X and Y are identically
distributed?

It's only true if X and Y are normal.

Quote:
For iid X and Y, let M(c) = E exp(cX) be the moment-generating function
(mgf).

It's better, I think, to use the characteristic function
phi(c) = E exp(icX), since the mgf for real c may not exist.
Your argument is the same.

Quote:
IF U = X+Y and V = X-Y were independent, their joint mgf
U(a,b) = E exp(aU + bV) would have the form f(a) g(b), where
f(a) = E exp (aU) = E exp(aX) exp(aY) = M(a)^2, and
g(b) = E exp(bV) = E exp(bX) exp(-bY) = M(b) M(-b).
However, we have
U(a,b) = E exp[(a+b)X + (a-b)Y] = E exp[(a+b)X] exp[(a-b)Y]
       = M(a+b) M(a-b).

In general, we do not have M(a+b) M(a-b) = M(a)^2 M(b) M(-b) for all
a,b, so U and V need not be independent; for some M(.) they are not.

Consider the equation phi(a+b) phi(a-b) = phi(a)^2 phi(b) phi(-b).
Note that phi(0) = 1, phi(x) is continuous, and
phi(-x) = conjugate(phi(x)). From this it's possible to show
that for some real constants c and d with c > 0,
phi(x) = exp(- c x^2 + i d x)
and then use the inverse Fourier transform to show that the distribution
of X is normal.
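
A small numerical illustration of this identity (a Python sketch
using the exact characteristic functions of a normal distribution and
of Uniform[0,1], offered as a sanity check): the relation holds in the
normal case and fails in the uniform case, consistent with the
conclusion that only normal X and Y make X+Y and X-Y independent.

import numpy as np

def phi_normal(t, mu=0.0, sigma=1.0):
    # characteristic function of N(mu, sigma^2)
    return np.exp(1j * mu * t - 0.5 * (sigma * t) ** 2)

def phi_uniform(t):
    # characteristic function of Uniform[0,1] (valid for t != 0)
    return (np.exp(1j * t) - 1) / (1j * t)

def gap(phi, a, b):
    # phi(a+b) phi(a-b) - phi(a)^2 phi(b) phi(-b); zero when the identity holds at (a, b)
    return phi(a + b) * phi(a - b) - phi(a) ** 2 * phi(b) * phi(-b)

a, b = 3.0, 5.0
print(abs(gap(phi_normal, a, b)))   # essentially 0 (rounding error only)
print(abs(gap(phi_uniform, a, b)))  # clearly nonzero (about 0.18)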

Robert Israel israel@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada
C6L1V@shaw.ca
Posted: Fri Jul 07, 2006 7:21 pm    Subject: Re: Sum and difference of independent variables

Lenore wrote:
Quote:
Hello everyone,

Is there a proof - or, indeed, does the statement hold at all - that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed?

For iid X and Y, let M(c) = E exp(cX) be the moment-generating function
(mgf). IF U = X+Y and V = X-Y were independent, their joint mgf
U(a,b) = E exp(aU + bV) would have the form f(a) g(b), where
f(a) = E exp (aU) = E exp(aX) exp(aY) = M(a)^2, and
g(b) = E exp(bV) = E exp(bX) exp(-bY) = M(b) M(-b).
However, we have
U(a,b) = E exp[(a+b)X + (a-b)Y] = E exp[(a+b)X] exp[(a-b)Y]
       = M(a+b) M(a-b).

In general, we do not have M(a+b) M(a-b) = M(a)^2 M(b) M(-b) for all
a,b, so U and V need not be independent; for some M(.) they are not.

R.G. Vickson


A N Niel
Posted: Fri Jul 07, 2006 6:38 pm    Subject: Re: Sum and difference of independent variables

In article <1152296149.525812.46090@s13g2000cwa.googlegroups.com>,
Randy Poe <poespam-trap@yahoo.com> wrote:

Quote:
Lenore wrote:
Hello everyone,

Is there a proof - or, indeed, does the statement hold at all - that the
sum and difference of two independent variables X and Y, (X+Y) and (X-Y),
are themselves independent variables, if X and Y are identically
distributed?


False if X and Y are uniformly distributed on [0,1], as you may
easily compute.
Randy Poe
Posted: Fri Jul 07, 2006 6:15 pm    Subject: Re: Sum and difference of independent variables

Lenore wrote:
Quote:
Hello everyone,

Is there a proof - or, indeed, does the statement hold at all - that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed?

I think they aren't in general.

Here's my reasoning (I'm using prime to mean matrix transpose):

Suppose X and Y have zero mean. Consider the random variable
Z = (X,Y)' and the linearly transformed random variable
W = (X+Y,X-Y)' = AZ where
A = ( 1  1 )
    ( 1 -1 )

The covariance matrix of Z is E[ZZ'] and is diagonal by
assumption.

The covariance matrix of W is A*E[ZZ']*A' and is not
diagonal.

- Randy