yangyu05@gmail.com science forum beginner
Joined: 28 Oct 2005
Posts: 7

Posted: Tue Jun 06, 2006 7:52 pm Post subject:
Question on Kullback-Leibler Divergence



Hi all,
I have a question regarding the Kullback-Leibler divergence (KL) in
information theory. Given a true distribution p and an estimated
distribution q for a random variable X, KL(p, q) measures the
"distance" from p to q:
KL(p, q) = \sum_i p(i) * log(p(i) / q(i)).
We know that KL(p, q) is lower-bounded by 0. My question is: does
KL(p, q) have an upper bound? I.e., for a given distribution p, is
there a distribution q that maximizes KL(p, q)?
Thanks for the help.
Yang
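A quick way to get a feel for the definition is to compute it directly (a minimal sketch; the example distributions are arbitrary):

```python
import math

def kl_divergence(p, q):
    """Discrete KL divergence D(p || q) = sum_i p(i) * log(p(i) / q(i)).
    Terms with p(i) == 0 contribute 0; q(i) == 0 where p(i) > 0 gives +inf."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue  # convention: 0 * log(0/q) = 0
        if qi == 0.0:
            return math.inf  # p puts mass where q puts none
        total += pi * math.log(pi / qi)
    return total

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, p))  # 0.0 -- KL is zero iff the distributions agree
print(kl_divergence(p, q))  # > 0 -- and asymmetric: KL(p, q) != KL(q, p)
```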



Janusz Kawczak science forum beginner
Joined: 18 Feb 2006
Posts: 10

Posted: Wed Jun 07, 2006 6:16 am Post subject:
Re: Question on Kullback-Leibler Divergence



Yes: positive infinity is the only upper bound; KL(p, q) is unbounded above.
BTW, usually q is the theoretical measure. But of course it does not
matter for your question.
Janusz.
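To see the unboundedness concretely: fix p and let q put vanishing mass on a point where p has mass. A small sketch (the particular numbers are arbitrary):

```python
import math

def kl(p, q):
    # D(p || q), summed over the points where p has mass
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
# as eps -> 0, the term 0.5 * log(0.5 / eps) blows up, so KL(p, q) -> +inf
values = [kl(p, [1 - eps, eps]) for eps in (1e-1, 1e-3, 1e-6, 1e-9)]
print(values)  # strictly increasing, without bound
```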





yangyu05@gmail.com science forum beginner
Joined: 28 Oct 2005
Posts: 7

Posted: Thu Jun 08, 2006 4:18 pm Post subject:
Re: Question on Kullback-Leibler Divergence



Thanks. My next question: does there exist a q such that, for any p, q
maximizes KL(p, q) on average? I am not sure if my question makes sense
here. :)





Janusz Kawczak science forum beginner
Joined: 18 Feb 2006
Posts: 10

Posted: Thu Jun 08, 2006 5:58 pm Post subject:
Re: Question on Kullback-Leibler Divergence



You are right: your question makes no sense as you posed it. You need to
explain what you intend to achieve in a more understandable way. Maybe
an example would help?
Janusz.





yangyu05@gmail.com science forum beginner
Joined: 28 Oct 2005
Posts: 7

Posted: Thu Jun 08, 2006 6:50 pm Post subject:
Re: Question on Kullback-Leibler Divergence



OK, the basic situation is as follows.
Consider a sequence of numbers {a_1, a_2, ..., a_m}. Each a_i is in the
set [n] = {1, ..., n}. Numbers in the sequence do not need to be unique.
We want to hide the pattern of the sequence by mapping each a_i to a
subset of [n]. For example, mapping the sequence {1, 1, 2, 3} to {{1, 2,
3}, {1, 2, 3}, {1, 2, 3}, {1, 2, 3}} hides the pattern of the original
sequence.
We also want to minimize the length of the outcome of the mapping, so the
trivial mapping in the example above is not always preferred. Thus, we
need a way to measure the goodness of a mapping; that is why we came up
with KL divergence. The question, then, is what is the best mapping we
can achieve under this metric. Also, is this metric good enough?
Hope I am making more sense here. Thanks.
yang
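To make the setup concrete, here is one hypothetical way to score a mapping with KL. Taking p as the empirical distribution of the original sequence, and q as the distribution induced by picking a position uniformly and then a uniform element of that position's subset, is my own assumption for illustration, not something fixed by the problem:

```python
from collections import Counter
import math

def empirical(seq, n):
    # p: empirical distribution of the original sequence over [n]
    counts = Counter(seq)
    return [counts.get(i, 0) / len(seq) for i in range(1, n + 1)]

def induced(subsets, n):
    # q: distribution of a symbol obtained by picking a position uniformly,
    # then a uniform element of that position's subset (assumed observer model)
    dist = [0.0] * n
    for s in subsets:
        for x in s:
            dist[x - 1] += 1.0 / (len(subsets) * len(s))
    return dist

def kl(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

seq = [1, 1, 2, 3]                  # the example sequence from above
trivial = [{1, 2, 3}] * len(seq)    # the trivial pattern-hiding mapping
p = empirical(seq, 3)               # [0.5, 0.25, 0.25]
q = induced(trivial, 3)             # uniform: [1/3, 1/3, 1/3]
print(kl(p, q))                     # small but nonzero
```

Under this (assumed) scoring, the trivial mapping hides the pattern perfectly but still pays a nonzero KL cost because the induced uniform q differs from the skewed empirical p.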











