Forum index » Science and Technology » Math » Probability
Question on Kullback-Leibler Divergence
Author Message
yangyu05@gmail.com
science forum beginner


Joined: 28 Oct 2005
Posts: 7

Posted: Tue Jun 06, 2006 7:52 pm    Post subject: Question on Kullback-Leibler Divergence

hi all,

I have a question regarding the Kullback-Leibler divergence (KL) from
information theory. Given a true distribution p and an estimated
distribution q for a random variable X, KL(p, q) measures the
"distance" from p to q:

KL(p, q) = \sum_i p(i) * log (p(i) / q(i)).

We know that KL(p, q) is bounded below by 0. My question is: does KL(p,
q) have an upper bound? That is, for a given distribution p, is there a
distribution q that maximizes KL(p, q)?

Thanks for the help.

Yang
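For concreteness, the definition above can be checked numerically. Here is a minimal Python sketch (the helper name `kl_divergence` is my own, not from any standard library), using the usual conventions that terms with p(i) = 0 contribute nothing and that q(i) = 0 with p(i) > 0 gives an infinite divergence:

```python
import math

def kl_divergence(p, q):
    """KL(p || q) = sum_i p(i) * log(p(i) / q(i)) for discrete distributions.

    Convention: terms with p(i) == 0 contribute 0; if q(i) == 0 while
    p(i) > 0, the divergence is +infinity.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue              # 0 * log 0 is taken to be 0
        if qi == 0:
            return math.inf       # q puts no mass where p does
        total += pi * math.log(pi / qi)
    return total

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))   # a small positive number
print(kl_divergence(p, p))   # 0.0 -- the lower bound is attained at q = p
```

The lower bound 0 is attained exactly when q = p (Gibbs' inequality).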
Janusz Kawczak
science forum beginner


Joined: 18 Feb 2006
Posts: 10

Posted: Wed Jun 07, 2006 6:16 am    Post subject: Re: Question on Kullback-Leibler Divergence

Yes: positive infinity. There is no finite upper bound — if q(i) -> 0 at
some point i where p(i) > 0, the term p(i) * log(p(i) / q(i)) grows
without bound. BTW, usually q is the theoretical measure, but of course
that does not matter for your question.

Janusz.
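This unboundedness is easy to see numerically: push q's mass away from a point that p supports and KL(p, q) blows up. A quick sketch (same assumed `kl_divergence` helper as above, not a library call):

```python
import math

def kl_divergence(p, q):
    # Convention: terms with p(i) == 0 contribute nothing.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
for eps in [1e-1, 1e-3, 1e-6, 1e-9]:
    q = [eps, 1.0 - eps]
    # As eps -> 0, KL(p, q) grows roughly like -0.5 * log(eps) -> infinity.
    print(f"eps={eps:g}  KL={kl_divergence(p, q):.3f}")
```

Each factor-of-1000 shrink in eps adds a constant to the divergence, so no finite ceiling exists.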

yangyu05@gmail.com
science forum beginner


Joined: 28 Oct 2005
Posts: 7

Posted: Thu Jun 08, 2006 4:18 pm    Post subject: Re: Question on Kullback-Leibler Divergence

Thanks. My next question: does there exist a q such that, for any p, q
maximizes KL(p, q) on average? I am not sure if my question makes sense
here. :)

Janusz Kawczak
science forum beginner


Joined: 18 Feb 2006
Posts: 10

Posted: Thu Jun 08, 2006 5:58 pm    Post subject: Re: Question on Kullback-Leibler Divergence

You are right: as posed, the question is not well defined. You need to
explain what you intend to achieve in a more understandable way. Maybe
an example would help?

Janusz.


yangyu05@gmail.com
science forum beginner


Joined: 28 Oct 2005
Posts: 7

Posted: Thu Jun 08, 2006 6:50 pm    Post subject: Re: Question on Kullback-Leibler Divergence

Ok, the basic situation is as follows.

Consider a sequence of numbers {a_1, a_2, ..., a_m}, where each a_i is
in the set [n] = {1, ..., n}. Numbers in the sequence need not be
unique. We want to hide the pattern of the sequence by mapping each a_i
to a subset of [n]. For example, mapping the sequence {1, 1, 2, 3} to
{{1, 2, 3}, {1, 2, 3}, {1, 2, 3}, {1, 2, 3}} hides the pattern of the
original sequence.

We also want to minimize the total length of the mapping's output, so
the trivial mapping in the example is not always preferred. Thus we need
a way to measure the goodness of a mapping, which is why we came up with
KL divergence. The questions are then: what is the best mapping we can
achieve under this metric, and is this metric good enough?

Hope I am making more sense here. Thanks.

yang
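One way to start making that metric concrete: compare the empirical distribution of the (visible) sequence against a reference that reveals no pattern, e.g. uniform on [n]. This is only a sketch of one possible formalization — the thread never fixes the metric, and `empirical_dist` and the uniform reference are my assumptions:

```python
import math
from collections import Counter

def empirical_dist(seq, n):
    """Empirical distribution of a sequence over the alphabet {1, ..., n}."""
    counts = Counter(seq)
    m = len(seq)
    return [counts.get(i, 0) / m for i in range(1, n + 1)]

def kl_divergence(p, q):
    # Convention: terms with p(i) == 0 contribute nothing.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

n = 3
seq = [1, 1, 2, 3]                    # the example sequence from the post
p = empirical_dist(seq, n)            # [0.5, 0.25, 0.25]
uniform = [1.0 / n] * n
# Score 0 would mean the sequence's symbol frequencies look perfectly
# uniform, i.e. reveal nothing; larger scores mean more exposed pattern.
print(kl_divergence(p, uniform))
```

Under this reading, a "good" mapping drives the output's empirical distribution toward the reference while keeping the output short; whether frequency information alone captures "pattern" (it ignores ordering) is exactly the "is this metric good enough?" question.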

