Robert B. Israel science forum Guru
Joined: 24 Mar 2005
Posts: 2151

Posted: Sun Jul 09, 2006 11:44 pm Post subject:
Re: Sum and difference of independent variables



In article <1152458138.800021.138380@m73g2000cwd.googlegroups.com>,
<rtonyreeder@yahoo.com> wrote:
Quote: 
Robert Israel wrote:
In article <1152332626.391378.211230@s53g2000cws.googlegroups.com>,
rtonyreeder@yahoo.com> wrote:
Lenore wrote:
Hello everyone,
Is there a proof -- or, indeed, does the statement hold at all -- that
the sum and difference of two independent variables X and Y, (X+Y) and
(X-Y), are themselves independent variables, if X and Y are identically
distributed?
Thank you.
Lenore
I am not an expert on this subject, but do use random numbers quite
often. I thought your question was interesting enough that I wrote a
MathCad sheet that generated 1000 uniformly distributed random numbers
X and Y.
I plotted Y vs. X and got what you would expect, 1000 points randomly
scattered on a square. I then plotted Y-X vs. Y+X and got 1000 points
randomly scattered on a 45-degree tilted square. It seemed as though
Y-X was independent of Y+X, just like X is independent of Y.
Since the square is tilted at 45 deg, the value of Y-X is limited in
range, depending on the value of Y+X, and vice versa. Maybe this means
they aren't independent; I am just not sure.
Do you know the definition of independent? You might check before
posting.
Thanks, Robert, for the clarification on the definition of
independence. Yes, I am being sarcastic. I think I was clear on what I
was saying, and did not pretend to be an expert.
I didn't know it at the time, and I don't know for sure that I'm
correct now. Is it that P(x,y) = Px(x)Py(y), which I put in a later
post?

No, it's that the events {X <= x} and {Y <= y} are independent for
every x and y, i.e. Prob(X <= x and Y <= y) = Prob(X <= x) Prob(Y <= y).
In this case you might think about, say, how it could happen that
X+Y <= 1/3 and X-Y <= -2/3.
Robert Israel israel@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada 
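Israel's impossibility hint is easy to check by simulation. Here is a minimal sketch (assuming, as in the MathCad experiment earlier in the thread, that X and Y are iid uniform on [0,1]); the joint event never occurs, while each marginal event has probability 1/18:

```python
import random

random.seed(0)
N = 200_000

# Empirically compare Prob(X+Y <= 1/3 and X-Y <= -2/3) with the
# product of the marginal probabilities, for X, Y iid uniform on [0,1].
joint = sum_p = diff_p = 0
for _ in range(N):
    x, y = random.random(), random.random()
    if x + y <= 1/3:
        sum_p += 1
    if x - y <= -2/3:
        diff_p += 1
    if x + y <= 1/3 and x - y <= -2/3:
        joint += 1

# The joint event is impossible: x - y <= -2/3 forces y >= 2/3,
# so x + y >= 2/3 > 1/3.  Each marginal event alone has probability
# (1/3)^2 / 2 = 1/18, so the product of the marginals is positive.
print(joint / N, (sum_p / N) * (diff_p / N))
```

Since the joint probability is 0 while the product of the marginals is not, X+Y and X-Y cannot be independent.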



C6L1V@shaw.ca science forum Guru
Joined: 23 May 2005
Posts: 628

Posted: Sun Jul 09, 2006 7:54 pm Post subject:
Re: Sum and difference of independent variables



rtonyree...@yahoo.com wrote:
Quote:  C6L1V@shaw.ca wrote:
rtonyreeder@yahoo.com wrote:
Robert Israel wrote:
[earlier quoted text snipped]
Absolutely not. Lack of correlation and independence are different
concepts. Independent random variables are uncorrelated, but not
conversely. The pair X+Y and X-Y are a perfect example: they are
dependent (i.e., not independent) but are, nevertheless, uncorrelated.
Thanks. That is not obvious.

OK, here is one more reason that you may, or may not, find convincing.
Independence refers to the whole (joint) probability distribution,
while mean, variance and correlation refer only to "first moments" and
"second moments" -- things like expected values of X and X^2. To say
that U = X+Y and V = X-Y are uncorrelated just makes a claim about
properties up to the second order. However, higher-order objects (like
expectations of U^3 or V^4 or U^2 V^3, etc.) are not covered by the
"no-correlation" result.
Note, however, that for normally-distributed random variables, the
first and second moments determine the distribution completely, so that
is why U and V *are* independent *for the special case of independent,
identically distributed normal random variables*. In fact,
'independence = no correlation' for normal distributions, but not for
most others.
RGV
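RGV's point about higher-order moments can be illustrated numerically. A sketch (pure Python; X and Y iid uniform on [0,1], as in the earlier experiment): U and V are uncorrelated to within sampling error, yet U^2 and V^2 are clearly correlated, which already rules out independence:

```python
import random
from math import sqrt

def corr(us, vs):
    """Sample (Pearson) correlation of two equal-length lists."""
    n = len(us)
    mu = sum(us) / n
    mv = sum(vs) / n
    cov = sum((u - mu) * (v - mv) for u, v in zip(us, vs)) / n
    su = sqrt(sum((u - mu) ** 2 for u in us) / n)
    sv = sqrt(sum((v - mv) ** 2 for v in vs) / n)
    return cov / (su * sv)

random.seed(1)
N = 200_000
xs = [random.random() for _ in range(N)]
ys = [random.random() for _ in range(N)]
U = [x + y for x, y in zip(xs, ys)]
V = [x - y for x, y in zip(xs, ys)]

# Second-order check: U and V are (nearly) uncorrelated.
c1 = corr(U, V)
# Higher-order check: U^2 and V^2 are visibly (negatively) correlated
# -- extreme values of U force V^2 to be small.
c2 = corr([u * u for u in U], [v * v for v in V])
print(c1, c2)
```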




rtonyreeder@yahoo.com science forum beginner
Joined: 25 May 2006
Posts: 32

Posted: Sun Jul 09, 2006 7:40 pm Post subject:
Re: Sum and difference of independent variables



C6L1V@shaw.ca wrote:
Quote:  [earlier quoted text snipped]
Absolutely not. Lack of correlation and independence are different
concepts. Independent random variables are uncorrelated, but not
conversely. The pair X+Y and X-Y are a perfect example: they are
dependent (i.e., not independent) but are, nevertheless, uncorrelated.

Thanks. That is not obvious.




C6L1V@shaw.ca science forum Guru
Joined: 23 May 2005
Posts: 628

Posted: Sun Jul 09, 2006 4:31 pm Post subject:
Re: Sum and difference of independent variables



rtonyreeder@yahoo.com wrote:
Quote:  [earlier quoted text snipped]
I decided to run a correlation. First, X correlated with Y (a product
of the FFT of X with the conjugate of the FFT of Y, transformed back
with an IFFT). The result looked like white noise, i.e., no
correlation. I then did the same with Y-X and Y+X, and again, the
result looked like white noise, though with about twice the RMS
amplitude. I am not sure if that implies lack of independence.
It's very easy to see that X+Y and X-Y are uncorrelated whenever
X and Y are iid and have variances.
Are you saying that there is no correlation between X+Y and X-Y because
they are independent?

No, he is not saying that. He is saying they are uncorrelated. Period.
They ARE dependent (except in the normal distribution case), but they
are also uncorrelated.
Quote: 
iid = independent and identically distributed ?

Yes.
Quote: 
The "variances" part -- are you talking about probability density
functions without variances, like the Cauchy (Lorentz) distribution?
I haven't given a lot of thought to the difference between independence
and being uncorrelated. I thought that they were similar, i.e.,
independence requires being uncorrelated; being uncorrelated implies
independence; being dependent requires correlation; and being
correlated implies dependence.

Absolutely not. Lack of correlation and independence are different
concepts. Independent random variables are uncorrelated, but not
conversely. The pair X+Y and X-Y are a perfect example: they are
dependent (i.e., not independent) but are, nevertheless, uncorrelated.
Quote: 
In light of this, what are you saying? That there is no correlation
between X+Y and X-Y because they are independent?

No. He said in a previous posting that they are DEPENDENT.
R.G. Vickson
Adjunct Professor, University of Waterloo




rtonyreeder@yahoo.com science forum beginner
Joined: 25 May 2006
Posts: 32

Posted: Sun Jul 09, 2006 3:15 pm Post subject:
Re: Sum and difference of independent variables



Robert Israel wrote:
Quote:  [earlier quoted text snipped]
Do you know the definition of independent? You might check before
posting.

Thanks, Robert, for the clarification on the definition of
independence. Yes, I am being sarcastic. I think I was clear on what I
was saying, and did not pretend to be an expert.
I didn't know it at the time, and I don't know for sure that I'm
correct now. Is it that P(x,y) = Px(x)Py(y), which I put in a later
post?
Quote: 
[earlier quoted text snipped]
It's very easy to see that X+Y and X-Y are uncorrelated whenever
X and Y are iid and have variances.

Are you saying that there is no correlation between X+Y and X-Y because
they are independent?
iid = independent and identically distributed?
The "variances" part -- are you talking about probability density
functions without variances, like the Cauchy (Lorentz) distribution?
I haven't given a lot of thought to the difference between independence
and being uncorrelated. I thought that they were similar, i.e.,
independence requires being uncorrelated; being uncorrelated implies
independence; being dependent requires correlation; and being
correlated implies dependence.
In light of this, what are you saying? That there is no correlation
between X+Y and X-Y because they are independent?
They are not independent, if P(x,y) = Px(x)Py(y) is the definition of
independence.




Robert B. Israel science forum Guru
Joined: 24 Mar 2005
Posts: 2151

Posted: Sun Jul 09, 2006 6:35 am Post subject:
Re: Sum and difference of independent variables



In article <1152400862.078527.136160@m79g2000cwm.googlegroups.com>,
Pubkeybreaker <Robert_silverman@raytheon.com> wrote:
Quote: 
Stephen J. Herschkorn wrote:
Lenore wrote:
Is there a proof -- or, indeed, does the statement hold at all -- that
the sum and difference of two independent variables X and Y, (X+Y) and
(X-Y), are themselves independent variables, if X and Y are identically
distributed?
I don't understand why all the explanations in this thread are so
complicated. Here is the simplest possible counterexample: Let X
and Y be i.i.d. Bernoulli (1/2). Then
P{X+Y = 2, X-Y = 0} != P{X+Y = 2} P{X-Y = 0}.

Stephen J. Herschkorn sjherschko@netscape.net
Math Tutor on the Internet and in Central New Jersey and Manhattan
Another simple explanation. Suppose X and Y are independent and
uniform on [0,1]. When X and Y are large, then X+Y will also be
large, but X-Y will tend to be small, showing that X+Y and X-Y
are negatively correlated.

Nope. The correlation of X+Y and X-Y is 0 whenever E[X] = E[Y]
and E[X^2] = E[Y^2].
Robert Israel israel@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada 
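Israel's moment condition can be checked exactly, even for X and Y with different distributions, as long as the first two moments match. A sketch using Python's exact rational arithmetic (the two distributions below are illustrative choices, not from the thread):

```python
from fractions import Fraction

# Two different distributions with matching first and second moments:
# X = +-1 with probability 1/2 each; Y in {-2, 0, 2} with P(+-2) = 1/8.
# Both have expectation 0 and second moment 1.
X = {Fraction(-1): Fraction(1, 2), Fraction(1): Fraction(1, 2)}
Y = {Fraction(-2): Fraction(1, 8), Fraction(0): Fraction(3, 4),
     Fraction(2): Fraction(1, 8)}

def E(f):
    # Expectation of f(x, y) under the product (independent) distribution.
    return sum(px * py * f(x, y) for x, px in X.items() for y, py in Y.items())

# Cov(X+Y, X-Y) = E[(X+Y)(X-Y)] - E[X+Y] E[X-Y]
cov = (E(lambda x, y: (x + y) * (x - y))
       - E(lambda x, y: x + y) * E(lambda x, y: x - y))
print(cov)  # 0
```

The covariance reduces to E[X^2] - E[Y^2] - (E[X] + E[Y])(E[X] - E[Y]), which vanishes exactly under Israel's two moment conditions.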



Robert B. Israel science forum Guru
Joined: 24 Mar 2005
Posts: 2151

Posted: Sun Jul 09, 2006 6:31 am Post subject:
Re: Sum and difference of independent variables



In article <1152332626.391378.211230@s53g2000cws.googlegroups.com>,
<rtonyreeder@yahoo.com> wrote:
Quote: 
Lenore wrote:
Hello everyone,
Is there a proof -- or, indeed, does the statement hold at all -- that
the sum and difference of two independent variables X and Y, (X+Y) and
(X-Y), are themselves independent variables, if X and Y are identically
distributed?
Thank you.
Lenore
I am not an expert on this subject, but do use random numbers quite
often. I thought your question was interesting enough that I wrote a
MathCad sheet that generated 1000 uniformly distributed random numbers
X and Y.
I plotted Y vs. X and got what you would expect, 1000 points randomly
scattered on a square. I then plotted Y-X vs. Y+X and got 1000 points
randomly scattered on a 45-degree tilted square. It seemed as though
Y-X was independent of Y+X, just like X is independent of Y.
Since the square is tilted at 45 deg, the value of Y-X is limited in
range, depending on the value of Y+X, and vice versa. Maybe this means
they aren't independent; I am just not sure.

Do you know the definition of independent? You might check before
posting.
Quote:  I decided to run a correlation. First, X correlated with Y (a product
of the FFT of X with the conjugate of the FFT of Y, transformed back
with an IFFT). The result looked like white noise, i.e., no
correlation. I then did the same with Y-X and Y+X, and again, the
result looked like white noise, though with about twice the RMS
amplitude. I am not sure if that implies lack of independence.

It's very easy to see that X+Y and X-Y are uncorrelated whenever
X and Y are iid and have variances.
Robert Israel israel@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada 



rtonyreeder@yahoo.com science forum beginner
Joined: 25 May 2006
Posts: 32

Posted: Sun Jul 09, 2006 4:36 am Post subject:
Re: Sum and difference of independent variables



Stephen J. Herschkorn wrote:
Quote:  I don't understand why all the explanations in this thread are so
complicated. Here is the simplest possible counterexample: Let X
and Y be i.i.d. Bernoulli (1/2). Then
P{X+Y = 2, X-Y = 0} != P{X+Y = 2} P{X-Y = 0}.

Since this really isn't my field of expertise, my previous post told of
some things I thought were interesting. I wasn't sure how to determine
independence of random variables.
My current understanding is that, for X and Y to be independent,
Pxy(X,Y) = Px(X)Py(Y). This seems to be the point of your answer: that
if it isn't true, then the variables aren't independent.
I discussed plotting Y-X vs. Y+X, for uniformly distributed X and
independent Y, and getting an even scattering of points on a 45-deg
tilted square. Such a case can't be independent, since it can't be
written as P3(Y+X, Y-X) = P1(Y+X) P2(Y-X), i.e., the range of Y-X
depends on Y+X.
At least, that's the way it seems to me.



Pubkeybreaker science forum Guru
Joined: 24 Mar 2005
Posts: 333

Posted: Sat Jul 08, 2006 11:21 pm Post subject:
Re: Sum and difference of independent variables



Stephen J. Herschkorn wrote:
Quote:  Lenore wrote:
Is there a proof -- or, indeed, does the statement hold at all -- that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed?
I don't understand why all the explanations in this thread are so
complicated. Here is the simplest possible counterexample: Let X
and Y be i.i.d. Bernoulli (1/2). Then
P{X+Y = 2, X-Y = 0} != P{X+Y = 2} P{X-Y = 0}.

Stephen J. Herschkorn sjherschko@netscape.net
Math Tutor on the Internet and in Central New Jersey and Manhattan

Another simple explanation. Suppose X and Y are independent and
uniform on [0,1]. When X and Y are large, then X+Y will also be
large, but X-Y will tend to be small, showing that X+Y and X-Y
are negatively correlated.



Stephen J. Herschkorn science forum Guru
Joined: 24 Mar 2005
Posts: 641

Posted: Sat Jul 08, 2006 6:19 pm Post subject:
Re: Sum and difference of independent variables



Lenore wrote:
Quote:  Is there a proof -- or, indeed, does the statement hold at all -- that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed?

I don't understand why all the explanations in this thread are so
complicated. Here is the simplest possible counterexample: Let X
and Y be i.i.d. Bernoulli (1/2). Then
P{X+Y = 2, X-Y = 0} != P{X+Y = 2} P{X-Y = 0}.

Stephen J. Herschkorn sjherschko@netscape.net
Math Tutor on the Internet and in Central New Jersey and Manhattan 
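Herschkorn's counterexample can be verified by exact enumeration over the four equally likely outcomes (a sketch using Python's fractions module):

```python
from fractions import Fraction
from itertools import product

# Enumerate the four equally likely outcomes of X, Y iid Bernoulli(1/2).
half = Fraction(1, 2)
outcomes = [(x, y, half * half) for x, y in product((0, 1), repeat=2)]

p_joint = sum(p for x, y, p in outcomes if x + y == 2 and x - y == 0)
p_sum   = sum(p for x, y, p in outcomes if x + y == 2)
p_diff  = sum(p for x, y, p in outcomes if x - y == 0)

print(p_joint, p_sum * p_diff)  # 1/4 versus 1/8
```

P{X+Y = 2} = 1/4 and P{X-Y = 0} = 1/2, so independence would require the joint probability to be 1/8; in fact it is 1/4, since X+Y = 2 forces X = Y = 1 and hence X-Y = 0.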



rtonyreeder@yahoo.com science forum beginner
Joined: 25 May 2006
Posts: 32

Posted: Sat Jul 08, 2006 4:23 am Post subject:
Re: Sum and difference of independent variables



Lenore wrote:
Quote:  Hello everyone,
Is there a proof -- or, indeed, does the statement hold at all -- that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed?
Thank you.
Lenore

I am not an expert on this subject, but do use random numbers quite
often. I thought your question was interesting enough that I wrote a
MathCad sheet that generated 1000 uniformly distributed random numbers
X and Y.
I plotted Y vs. X and got what you would expect, 1000 points randomly
scattered on a square. I then plotted Y-X vs. Y+X and got 1000 points
randomly scattered on a 45-degree tilted square. It seemed as though
Y-X was independent of Y+X, just like X is independent of Y.
Since the square is tilted at 45 deg, the value of Y-X is limited in
range, depending on the value of Y+X, and vice versa. Maybe this means
they aren't independent; I am just not sure.
I decided to run a correlation. First, X correlated with Y (a product
of the FFT of X with the conjugate of the FFT of Y, transformed back
with an IFFT). The result looked like white noise, i.e., no
correlation. I then did the same with Y-X and Y+X, and again, the
result looked like white noise, though with about twice the RMS
amplitude. I am not sure if that implies lack of independence.
Anyway, it was an interesting problem. I will have to give it more
thought. 



Robert B. Israel science forum Guru
Joined: 24 Mar 2005
Posts: 2151

Posted: Fri Jul 07, 2006 11:45 pm Post subject:
Re: Sum and difference of independent variables



In article <1152300087.051551.190110@h48g2000cwc.googlegroups.com>,
C6L1V@shaw.ca <C6L1V@shaw.ca> wrote:
Quote:  Lenore wrote:
Hello everyone,
Is there a proof -- or, indeed, does the statement hold at all -- that
the sum and difference of two independent variables X and Y, (X+Y) and
(X-Y), are themselves independent variables, if X and Y are identically
distributed?

It's only true if X and Y are normal.
Quote:  For iid X and Y, let M(c) = E exp(cX) be the moment-generating function
(mgf).

It's better, I think, to use the characteristic function
phi(c) = E exp(icX), since the mgf for real c may not exist.
Your argument is the same.
Quote:  IF U = X+Y and V = X-Y were independent, their joint mgf
U(a,b) = E exp(aU + bV) would have the form f(a) g(b), where
f(a) = E exp(aU) = E exp(aX) exp(aY) = M(a)^2, and
g(b) = E exp(bV) = E exp(bX) exp(-bY) = M(b) M(-b).
However, we have
U(a,b) = E exp[(a+b)X + (a-b)Y] = E exp[(a+b)X] exp[(a-b)Y] = M(a+b)
M(a-b).
In general, we do not have M(a+b) M(a-b) = M(a)^2 M(b) M(-b) for all
a,b, so U and V are not independent, at least, not for some M(.).

Consider the equation phi(a+b) phi(a-b) = phi(a)^2 phi(b) phi(-b).
Note that phi(0) = 1, phi(x) is continuous, and
phi(-x) = conjugate(phi(x)). From this it's possible to show
that for some real constants c and d with c > 0,
phi(x) = exp(-c x^2 + i d x),
and then use the inverse Fourier transform to show the distribution
of X is normal.
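For the converse direction, one can check directly that the Gaussian characteristic function satisfies this functional equation (a quick verification, not from the original post):

```latex
\phi(x) = e^{-c x^2 + i d x} \implies
\phi(a+b)\,\phi(a-b)
  = e^{-c\left[(a+b)^2 + (a-b)^2\right] + i d\left[(a+b) + (a-b)\right]}
  = e^{-2c(a^2 + b^2) + 2 i d a},
\qquad
\phi(a)^2\,\phi(b)\,\phi(-b)
  = e^{-2c a^2 + 2 i d a}\; e^{-c b^2 + i d b}\; e^{-c b^2 - i d b}
  = e^{-2c(a^2 + b^2) + 2 i d a}.
```

Both sides agree for every a and b, consistent with the fact that X+Y and X-Y *are* independent in the normal case.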
Robert Israel israel@math.ubc.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada 



C6L1V@shaw.ca science forum Guru
Joined: 23 May 2005
Posts: 628

Posted: Fri Jul 07, 2006 7:21 pm Post subject:
Re: Sum and difference of independent variables



Lenore wrote:
Quote:  Hello everyone,
Is there a proof  or, indeed, does the statement hold at all  that the sum and difference of two independent variables X and Y, (X+Y) and (XY), are themselves independent variables, if X and Y are identically distributed?

For iid X and Y, let M(c) = E exp(cX) be the moment-generating function
(mgf). IF U = X+Y and V = X-Y were independent, their joint mgf
U(a,b) = E exp(aU + bV) would have the form f(a) g(b), where
f(a) = E exp(aU) = E exp(aX) exp(aY) = M(a)^2, and
g(b) = E exp(bV) = E exp(bX) exp(-bY) = M(b) M(-b).
However, we have
U(a,b) = E exp[(a+b)X + (a-b)Y] = E exp[(a+b)X] exp[(a-b)Y] = M(a+b)
M(a-b).
In general, we do not have M(a+b) M(a-b) = M(a)^2 M(b) M(-b) for all
a,b, so U and V are not independent, at least, not for some M(.).
R.G. Vickson
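One concrete instance of the failure of M(a+b) M(a-b) = M(a)^2 M(b) M(-b): for a Bernoulli(1/2) variable, M(t) = (1 + e^t)/2, and the two sides already differ at a = b = 1. A quick numerical sketch:

```python
from math import exp

def M(t):
    # mgf of a Bernoulli(1/2) random variable: E exp(tX) = (1 + e^t)/2
    return (1 + exp(t)) / 2

a, b = 1.0, 1.0
lhs = M(a + b) * M(a - b)          # joint mgf of (X+Y, X-Y) at (a, b)
rhs = M(a) ** 2 * M(b) * M(-b)     # what independence would require
print(lhs, rhs)
```

The gap between the two values confirms, via the mgf route, the same dependence that Herschkorn's counterexample shows by direct enumeration.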




A N Niel science forum Guru
Joined: 28 Apr 2005
Posts: 475

Posted: Fri Jul 07, 2006 6:38 pm Post subject:
Re: Sum and difference of independent variables



In article <1152296149.525812.46090@s13g2000cwa.googlegroups.com>,
Randy Poe <poespamtrap@yahoo.com> wrote:
Quote:  Lenore wrote:
Hello everyone,
Is there a proof -- or, indeed, does the statement hold at all -- that the
sum and difference of two independent variables X and Y, (X+Y) and (X-Y),
are themselves independent variables, if X and Y are identically
distributed?

False if X, Y are uniformly distributed on [0,1], as you may
easily compute.



Randy Poe science forum Guru
Joined: 24 Mar 2005
Posts: 2485

Posted: Fri Jul 07, 2006 6:15 pm Post subject:
Re: Sum and difference of independent variables



Lenore wrote:
Quote:  Hello everyone,
Is there a proof -- or, indeed, does the statement hold at all -- that the sum and difference of two independent variables X and Y, (X+Y) and (X-Y), are themselves independent variables, if X and Y are identically distributed?

I think they aren't in general.
Here's my reasoning (I'm using prime to mean matrix transpose):
Suppose X and Y have zero mean. Consider the random variable
Z = (X,Y)' and the linearly transformed random variable
W = (X+Y, X-Y)' = AZ, where
A = (1  1)
    (1 -1)
The covariance matrix of Z is E[ZZ'] and is diagonal by
assumption.
The covariance matrix of W is A*E[ZZ']*A' and is not
diagonal.
 Randy 
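The covariance transform in the argument above can be carried out exactly; the off-diagonal entry of A E[ZZ'] A' works out to Var X - Var Y, so it vanishes whenever the variances are equal (as they are for identically distributed X and Y). A pure-Python sketch:

```python
from fractions import Fraction

def transformed_cov(vx, vy):
    """Compute A C A' for A = [[1, 1], [1, -1]] and C = diag(vx, vy),
    i.e., the covariance matrix of (X+Y, X-Y)' for uncorrelated X, Y."""
    A = [[1, 1], [1, -1]]
    C = [[vx, 0], [0, vy]]
    AC = [[sum(A[i][k] * C[k][j] for k in range(2)) for j in range(2)]
          for i in range(2)]
    # Multiply by A' (note A' indexed as A[j][k]).
    return [[sum(AC[i][k] * A[j][k] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Equal variances (e.g. both uniform on [0,1], variance 1/12):
W = transformed_cov(Fraction(1, 12), Fraction(1, 12))
print(W)  # off-diagonal entries are Var X - Var Y
```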


