Robert B. Israel
science forum Guru

Joined: 24 Mar 2005
Posts: 2151

Posted: Sun Jul 09, 2006 11:44 pm    Post subject: Re: Sum and difference of independent variables

<rtonyreeder@yahoo.com> wrote:
 Quote: [...] Thanks, Robert, for the clarification on the definition of independence. Yes, I am being sarcastic. I think I was clear on what I was saying, and did not pretend to be an expert. I didn't know it at the time, and I don't know for sure that I'm correct now. Is it that P(x,y) = Px(x)Py(y), which I put in a later post?

No, it's that the events {X <= x} and {Y <= y} are independent for
every x and y, i.e. Prob(X <= x and Y <= y) = Prob(X <= x) Prob(Y <= y).
In this case you might think about, say, how it could happen that
X+Y <= 1/3 and X-Y <= -2/3.
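
For anyone who wants to see the definition fail numerically, here is a minimal Monte Carlo sketch (an illustration only, assuming X and Y are iid Uniform(0,1) as in the MathCad experiment elsewhere in the thread, and using NumPy):

import numpy as np

# Sketch: iid Uniform(0,1) samples standing in for X and Y.
rng = np.random.default_rng(0)
x = rng.random(1_000_000)
y = rng.random(1_000_000)

A = (x + y) <= 1/3      # event {X+Y <= 1/3}
B = (x - y) <= -2/3     # event {X-Y <= -2/3}

# Independence would require P(A and B) = P(A) * P(B).
print("P(A and B) ~", np.mean(A & B))            # essentially 0: the two events cannot both happen
print("P(A)*P(B)  ~", np.mean(A) * np.mean(B))   # roughly (1/18)^2, about 0.003

The joint event has probability 0 (the two constraints are incompatible on the unit square), while each event alone has probability 1/18, so the product rule fails and X+Y, X-Y are not independent.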

Robert Israel israel@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada
C6L1V@shaw.ca
science forum Guru

Joined: 23 May 2005
Posts: 628

Posted: Sun Jul 09, 2006 7:54 pm    Post subject: Re: Sum and difference of independent variables

rtonyree...@yahoo.com wrote:
 Quote: [...] Absolutely not. Lack of correlation and independence are different concepts. Independent random variables are uncorrelated, but not conversely. The pair X+Y and X-Y are a perfect example: they are dependent (i.e., not independent) but are, nevertheless, uncorrelated. Thanks. That is not obvious.

OK, here is one more reason that you may, or may not find convincing.
Independence refers to the whole (joint) probability distribution,
while mean, variance and correlation refer only to "first moments" and
"second moments"---things like expected values of X and X^2. To say
that U = X+Y and V = X-Y are uncorrelated just makes a claim about
properties up to the second order. However, higher order objects (like
expectations of U^3 or V^4 or U^2 V^3, etc.) are not covered by the
"no-correlation" result.

Note, however, that for normally-distributed random variables, the
first and second moments determine the distribution completely, so that
is why U and V *are* independent *for the special case of independent,
identically distributed normal random variables*. In fact,
'independence = no correlation' for normal distributions, but not for
most others.
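
To make the "higher order" point concrete, here is a small numerical sketch of my own (not part of the argument above), again taking X and Y to be iid Uniform(0,1): the second-order quantity is zero, but a fourth-order one already betrays the dependence.

import numpy as np

# Sketch: iid Uniform(0,1); U = X+Y, V = X-Y.
rng = np.random.default_rng(0)
x = rng.random(1_000_000)
y = rng.random(1_000_000)
u, v = x + y, x - y

print("Cov(U,V)      ~", np.mean(u * v) - np.mean(u) * np.mean(v))   # ~0: uncorrelated
print("E[U^2 V^2]    ~", np.mean(u**2 * v**2))                       # ~8/45 = 0.178
print("E[U^2]E[V^2]  ~", np.mean(u**2) * np.mean(v**2))              # ~7/36 = 0.194

If U and V were independent, the last two numbers would agree; they do not.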

RGV

 Quote: [...] They are not independent, if P(x,y) = Px(x)Py(y) is the definition of independence.
rtonyreeder@yahoo.com
science forum beginner

Joined: 25 May 2006
Posts: 32

Posted: Sun Jul 09, 2006 7:40 pm    Post subject: Re: Sum and difference of independent variables

C6L1V@shaw.ca wrote:
 Quote: [...] I haven't given a lot of thought to the difference between independence and being uncorrelated. I thought that they were similar, i.e., independence requires being uncorrelated; being uncorrelated implies independence; being dependent requires correlation; and being correlated implies dependence. Absolutely not. Lack of correlation and independence are different concepts. Independent random variables are uncorrelated, but not conversely. The pair X+Y and X-Y are a perfect example: they are dependent (i.e., not independent) but are, nevertheless, uncorrelated.

Thanks. That is not obvious.

 Quote: In light of this, what are you saying? That there is no correlation between X+Y and X-Y because they are independent? No. He said in a previous posting that they are DEPENDENT. R.G. Vickson Adjunct Professor, University of Waterloo They are not independent, if P(x,y) = Px(x)Py(y) is the definition of independence.
C6L1V@shaw.ca
science forum Guru

Joined: 23 May 2005
Posts: 628

Posted: Sun Jul 09, 2006 4:31 pm    Post subject: Re: Sum and difference of independent variables

rtonyreeder@yahoo.com wrote:
 Quote: [...] I decided to run a correlation. First, X correlated with Y (a product of the FFT of X with the conjugate of the FFT of Y, transformed back with an IFFT). The result looked like white noise, i.e., no correlation. I then did the same with Y-X and Y+X, and again, the result looked like white noise, though with about twice the RMS amplitude. I am not sure if that implies lack of independence. It's very easy to see that X+Y and X-Y are uncorrelated whenever X and Y are iid and have variances. Are you saying that there is no correlation between X+Y and X-Y because they are independent?

No, he is not saying that. He is saying they are uncorrelated. Period.
They ARE dependent (except in the normal distribution case), but they
are also uncorrelated.

 Quote: iid = independent and identically distributed ?

Yes.

 Quote: The "variances" part, are you talking about probability density functions without variances, like the Cauchy (Lorentz) distribution? I haven't given a lot of thought to the difference between independence and being uncorrelated. I thought that they were similar, i.e., independence requires being uncorrelated; being uncorrelated implies independence; being dependent requires correlation; and being correlated implies dependence.

Absolutely not. Lack of correlation and independence are different
concepts. Independent random variables are uncorrelated, but not
conversely. The pair X+Y and X-Y are a perfect example: they are
dependent (i.e., not independent) but are, nevertheless, uncorrelated.

 Quote: In light of this, what are you saying? That there is no correlation between X+Y and X-Y because they are independent?

No. He said in a previous posting that they are DEPENDENT.

R.G. Vickson

 Quote: They are not independent, if P(x,y) = Px(x)Py(y) is the definition of independence.
rtonyreeder@yahoo.com
science forum beginner

Joined: 25 May 2006
Posts: 32

Posted: Sun Jul 09, 2006 3:15 pm    Post subject: Re: Sum and difference of independent variables

Robert Israel wrote:
 Quote: [...] Since the square is tilted at 45 deg, the value of Y-X is limited in range, depending on the value of Y+X, and vice versa. Maybe this means they aren't independent, I am just not sure. Do you know the definition of independent? You might check before posting.

Thanks, Robert, for the clarification on the definition of
independence. Yes, I am being sarcastic. I think I was clear on what I
was saying, and did not pretend to be an expert.

I didn't know it at the time, and I don't know for sure that I'm
correct now. Is it that P(x,y) = Px(x)Py(y), which I put in a later
post?

 Quote: I decided to run a correlation. First, X correlated with Y (a product of the FFT of X with the conjugate of the FFT of Y, transformed back with an IFFT). The result looked like white noise, i.e., no correlation. I then did the same with Y-X and Y+X, and again, the result looked like white noise, though with about twice the RMS amplitude. I am not sure if that implies lack of independence. It's very easy to see that X+Y and X-Y are uncorrelated whenever X and Y are iid and have variances.

Are you saying that there is no correlation between X+Y and X-Y because
they are independent?

iid = independent and identically distributed ?

The "variances" part, are you talking about probability density
functions without variances, like the Cauchy (Lorentz) distribution?

I haven't given a lot of thought to the difference between independence
and being uncorrelated. I thought that they were similar, i.e.,
independence requires being uncorrelated; being uncorrelated implies
independence; being dependent requires correlation; and being
correlated implies dependence.

In light of this, what are you saying? That there is no correlation
between X+Y and X-Y because they are independent?

They are not independent, if P(x,y) = Px(x)Py(y) is the definition of
independence.

Robert B. Israel
science forum Guru

Joined: 24 Mar 2005
Posts: 2151

Posted: Sun Jul 09, 2006 6:35 am    Post subject: Re: Sum and difference of independent variables

Pubkeybreaker <Robert_silverman@raytheon.com> wrote:
 Quote: [...] Another simple explanation. Suppose X and Y are independent and uniform on [0,1]. When X and Y are large, then X+Y will also be large, but X-Y will tend to be small, showing that X+Y and X-Y are negatively correlated.

Nope. The correlation of X+Y and X-Y is 0 whenever E[X] = E[Y]
and E[X^2] = E[Y^2].
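
Spelling out the algebra, which needs nothing beyond finite second moments:
Cov(X+Y, X-Y) = E[(X+Y)(X-Y)] - E[X+Y] E[X-Y] = (E[X^2] - E[Y^2]) - (E[X]^2 - E[Y]^2),
and both parentheses vanish under the stated conditions.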

Robert Israel israel@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada
Robert B. Israel
science forum Guru

Joined: 24 Mar 2005
Posts: 2151

Posted: Sun Jul 09, 2006 6:31 am    Post subject: Re: Sum and difference of independent variables

<rtonyreeder@yahoo.com> wrote:
 Quote: [...] I am not an expert on this subject, but do use random numbers quite often. I thought your question was interesting enough that I wrote a MathCad sheet that generated 1000 uniformly distributed random numbers X and Y. I plotted Y vs. X and got what you would expect, 1000 points randomly scattered on a square. I then plotted Y-X vs Y+X and got 1000 points randomly scattered on a 45 degree tilted square. It seemed as though Y-X was independent of Y+X, just like X is independent of Y. Since the square is tilted at 45 deg, the value of Y-X is limited in range, depending on the value of Y+X, and vice versa. Maybe this means they aren't independent, I am just not sure.

Do you know the definition of independent? You might check before
posting.

 Quote: I decided to run a correlation. First, X correlated with Y (a product of the FFT of X with the conjugate of the FFT of Y, transformed back with an IFFT). The result looked like white noise, i.e., no correlation. I then did the same with Y-X and Y+X, and again, the result looked like white noise, though with about twice the RMS amplitude. I am not sure if that implies lack of independence.

It's very easy to see that X+Y and X-Y are uncorrelated whenever
X and Y are iid and have variances.

Robert Israel israel@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada
rtonyreeder@yahoo.com
science forum beginner

Joined: 25 May 2006
Posts: 32

Posted: Sun Jul 09, 2006 4:36 am    Post subject: Re: Sum and difference of independent variables

Stephen J. Herschkorn wrote:
 Quote: I don't understand why all the explanations in this thread are so complicated. Here is the simplest possible counterexample: Let X and Y be i.i.d. Bernoulli (1/2). Then P{X+Y = 2, X-Y = 0} != P{X+Y = 2} P{X-Y = 0}.

Since this really isn't my field of expertise, my previous post told of
some things I thought were interesting. I wasn't sure how to determine
independence of random variables.

My current understanding is that, for X and Y to be independent,
Pxy(X,Y) = Px(X)Py(Y). This seems to be the point of your answer, that
if it isn't true, then the variables aren't independent.

I discussed plotting Y-X vs Y+X, for uniformly distributed X and
independent Y, and getting an even scattering of points on a 45 deg
tilted square. Such a case can't be independent, since it can't be
written as P3(Y+X,Y-X) = P1(Y+X)P2(Y-X), i.e., the range of Y-X depends
on Y+X.

At least, that's the way it seems to me.
Pubkeybreaker
science forum Guru

Joined: 24 Mar 2005
Posts: 333

Posted: Sat Jul 08, 2006 11:21 pm    Post subject: Re: Sum and difference of independent variables

Stephen J. Herschkorn wrote:
 Quote: [...] I don't understand why all the explanations in this thread are so complicated. Here is the simplest possible counterexample: Let X and Y be i.i.d. Bernoulli (1/2). Then P{X+Y = 2, X-Y = 0} != P{X+Y = 2} P{X-Y = 0}.

Another simple explanation. Suppose X and Y are independent and
uniform on [0,1].
When X and Y are large, then X+Y will also be large, but X-Y will
tend to be small,
showing that X+Y and X-Y are negatively correlated.
Stephen J. Herschkorn
science forum Guru

Joined: 24 Mar 2005
Posts: 641

Posted: Sat Jul 08, 2006 6:19 pm    Post subject: Re: Sum and difference of independent variables

Lenore wrote:

 Quote: Is there a proof - or, indeed, does the statement hold at all - that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed?

I don't understand why all the explanations in this thread are so
complicated. Here is the simplest possible counterexample: Let X
and Y be i.i.d. Bernoulli (1/2). Then
P{X+Y = 2, X-Y = 0} != P{X+Y = 2} P{X-Y = 0}.
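
For the record, the numbers are: P{X+Y = 2, X-Y = 0} = P{X = 1, Y = 1} = 1/4, while
P{X+Y = 2} P{X-Y = 0} = (1/4)(1/2) = 1/8.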

--
Stephen J. Herschkorn sjherschko@netscape.net
Math Tutor on the Internet and in Central New Jersey and Manhattan
rtonyreeder@yahoo.com
science forum beginner

Joined: 25 May 2006
Posts: 32

Posted: Sat Jul 08, 2006 4:23 am    Post subject: Re: Sum and difference of independent variables

Lenore wrote:
 Quote: Hello everyone, Is there a proof - or, indeed, does the statement hold at all - that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed? Thank you. Lenore

I am not an expert on this subject, but do use random numbers quite
often. I thought your question was interesting enough that I wrote a
MathCad sheet that generated 1000 uniformly distributed random numbers
X and Y.

I plotted Y vs. X and got what you would expect, 1000 points randomly
scattered on a square. I then plotted Y-X vs Y+X and got 1000 points
randomly scattered on a 45 degree tilted square. It seemed as though
Y-X was independent of Y+X, just like X is independent of Y.

Since the square is tilted at 45 deg, the value of Y-X is limited in
range, depending on the value of Y+X, and vice versa. Maybe this means
they aren't independent, I am just not sure.

I decided to run a correlation. First, X correlated with Y (a product
of the FFT of X with the conjugate of the FFT of Y, transformed back
with an IFFT). The result looked like white noise, i.e., no
correlation. I then did the same with Y-X and Y+X, and again, the
result looked like white noise, though with about twice the RMS
amplitude. I am not sure if that implies lack of independence.
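
For comparison, the "correlation" the statisticians in the thread have in mind is the ordinary Pearson correlation coefficient of the paired samples, not the FFT cross-correlation sequence. A minimal check of that (a sketch in NumPy rather than MathCad, with the same kind of 1000 uniform samples) looks like this:

import numpy as np

# Sketch: 1000 iid Uniform(0,1) pairs, mirroring the experiment above.
rng = np.random.default_rng(0)
x = rng.random(1000)
y = rng.random(1000)

# Pearson correlation of the paired samples (Y+X, Y-X); comes out near 0.
print(np.corrcoef(y + x, y - x)[0, 1])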

Anyway, it was an interesting problem. I will have to give it more
thought.
Robert B. Israel
science forum Guru

Joined: 24 Mar 2005
Posts: 2151

Posted: Fri Jul 07, 2006 11:45 pm    Post subject: Re: Sum and difference of independent variables

C6L1V@shaw.ca <C6L1V@shaw.ca> wrote:
 Quote: Lenore wrote: Hello everyone, Is there a proof - or, indeed, does the statement hold at all - that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed?

It's only true if X and Y are normal.

 Quote: For iid X and Y, let M(c) = E exp(cX) be the moment-generating function (mgf).

It's better, I think, to use the characteristic function
phi(c) = E exp(icX), since the mgf for real c may not exist.

 Quote: IF U = X+Y and V = X-Y were independent, their joint mgf U(a,b) = E exp(aU + bV) would have the form f(a) g(b), where f(a) = E exp (aU) = E exp(aX) exp(aY) = M(a)^2, and g(b) = E exp(bV) = E exp(bX) exp(-bY) = M(b) M(-b). However, we have U(a,b) = E exp[(a+b)X + (a-b)Y] = E exp[(a+b)X] exp[(a-b)Y] = M(a+b) M(a-b). In general, we do not have M(a+b) M(a-b) = M(a)^2 M(b) M(-b) for all a,b, so U and V are not independent, at least, not for some M(.).

Consider the equation phi(a+b) phi(a-b) = phi(a)^2 phi(b) phi(-b).
Note that phi(0) = 1, phi(x) is continuous, and
phi(-x) = conjugate(phi(x)). From this it's possible to show
that for some real constants c and d with c > 0,
phi(x) = exp(- c x^2 + i d x)
and then use the inverse Fourier transform to show the distribution
of X is normal.
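
As a quick sanity check on the normal case (my own sketch, using the standard normal characteristic function phi(t) = exp(-t^2/2) and arbitrary test points), the identity does hold:

import numpy as np

def phi(t):
    # characteristic function of a standard normal: E[exp(i t X)] = exp(-t^2 / 2)
    return np.exp(-t**2 / 2)

a, b = 0.7, 1.3                        # arbitrary test points
print(phi(a + b) * phi(a - b))         # exp(-(a^2 + b^2))
print(phi(a)**2 * phi(b) * phi(-b))    # the same value

Both sides equal exp(-(a^2 + b^2)), consistent with X+Y and X-Y being independent in the normal case.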

Robert Israel israel@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada
C6L1V@shaw.ca
science forum Guru

Joined: 23 May 2005
Posts: 628

Posted: Fri Jul 07, 2006 7:21 pm    Post subject: Re: Sum and difference of independent variables

Lenore wrote:
 Quote: Hello everyone, Is there a proof - or, indeed, does the statement hold at all - that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed?

For iid X and Y, let M(c) = E exp(cX) be the moment-generating function
(mgf). IF U = X+Y and V = X-Y were independent, their joint mgf
U(a,b) = E exp(aU + bV) would have the form f(a) g(b), where
f(a) = E exp (aU) = E exp(aX) exp(aY) = M(a)^2, and
g(b) = E exp(bV) = E exp(bX) exp(-bY) = M(b) M(-b).
However, we have
U(a,b) = E exp[(a+b)X + (a-b)Y] = E exp[(a+b)X] exp[(a-b)Y] = M(a+b) M(a-b).

In general, we do not have M(a+b) M(a-b) = M(a)^2 M(b) M(-b) for all
a,b, so U and V are not independent, at least, not for some M(.).
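
To see the failure concretely, here is a small numerical sketch (mine, not RGV's) using the Uniform(0,1) case that runs through the rest of the thread; its mgf M(c) = (exp(c) - 1)/c exists for all real c (with M(0) = 1):

import numpy as np

def M(c):
    # moment-generating function of Uniform(0,1): E[exp(c X)] = (exp(c) - 1) / c, for c != 0
    return (np.exp(c) - 1) / c

a, b = 5.0, 2.0                   # arbitrary nonzero test points
print(M(a + b) * M(a - b))        # about 996
print(M(a)**2 * M(b) * M(-b))     # about 1200

The two sides differ, so for uniform X and Y the pair U = X+Y, V = X-Y is not independent.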

R.G. Vickson

 Quote: Thank you. Lenore
A N Niel
science forum Guru

Joined: 28 Apr 2005
Posts: 475

Posted: Fri Jul 07, 2006 6:38 pm    Post subject: Re: Sum and difference of independent variables

Randy Poe <poespam-trap@yahoo.com> wrote:

 Quote: Lenore wrote: Hello everyone, Is there a proof - or, indeed, does the statement hold at all - that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed?

False if X and Y are uniformly distributed on [0,1], as you may
easily compute.
Randy Poe
science forum Guru

Joined: 24 Mar 2005
Posts: 2485

Posted: Fri Jul 07, 2006 6:15 pm    Post subject: Re: Sum and difference of independent variables

Lenore wrote:
 Quote: Hello everyone, Is there a proof - or, indeed, does the statement hold at all - that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed?

I think they aren't in general.

Here's my reasoning (I'm using prime to mean matrix transpose):

Suppose X and Y have zero mean. Consider the random variable
Z = (X,Y)' and the linearly transformed random variable
W = (X+Y,X-Y)' = AZ where
A = ( 1  1 )
    ( 1 -1 )

The covariance matrix of Z is E[ZZ'] and is diagonal by
assumption.

The covariance matrix of W is A*E[ZZ']*A', which need not be
diagonal.
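
Carrying the matrix product through explicitly:

A E[ZZ'] A' = ( Var X + Var Y   Var X - Var Y )
              ( Var X - Var Y   Var X + Var Y )

so the off-diagonal entry is Var X - Var Y, which is 0 when X and Y are identically distributed. That matches Robert Israel's point elsewhere in the thread that X+Y and X-Y are uncorrelated whenever the first two moments of X and Y agree; the dependence shows up in the joint distribution (higher moments, or the shape of the support), not in the covariance.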

- Randy
