
O(1) 的小樂


Kullback–Leibler divergence KL散度

In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. It measures the expected number of extra bits required to code samples from P when using a code based on Q rather than a code based on P. Typically P represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution, while Q represents a theory, model, description, or approximation of P.

Although it is often intuited as a distance metric, the KL divergence is not a true metric: for example, the KL divergence from P to Q is not necessarily the same as that from Q to P.

KL divergence is a special case of a broader class of divergences called f-divergences. Originally introduced by Solomon Kullback and Richard Leibler in 1951 as the directed divergence between two distributions, it is not the same as a divergence in calculus. However, the KL divergence can be derived from the Bregman divergence.

 

 

Note: P usually denotes the data set we already have, and Q denotes the theoretical result, so the physical meaning of the KL divergence is the number of extra bits needed to encode samples from P using a code based on Q, compared with a code based on P.
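This "extra bits" reading can be made concrete. The following Python sketch (illustrative only, not from the original post; the distributions are hypothetical, chosen so the numbers come out exact) computes the entropy of P (the optimal average code length), the cross-entropy of P under a code built for Q, and checks that their difference equals the KL divergence. Base-2 logarithms are used so everything is in bits:

```python
import numpy as np

# Illustrative distributions (hypothetical, chosen so the numbers are exact).
p = np.array([0.5, 0.25, 0.25])
q = np.array([0.25, 0.25, 0.5])

entropy = -np.sum(p * np.log2(p))        # optimal average code length for P
cross_entropy = -np.sum(p * np.log2(q))  # average length of a Q-based code on P
kl = np.sum(p * np.log2(p / q))

print(entropy, cross_entropy)  # 1.5 1.75
print(kl)                      # 0.25 extra bits = cross_entropy - entropy
```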

 

The KL divergence is sometimes called the KL distance, but it is not a distance in the strict sense: it does not satisfy the triangle inequality.

 

The KL divergence is asymmetric; if a symmetric version is desired, one can use:

Ds(p1, p2) = [D(p1, p2) + D(p2, p1)] / 2
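As a quick illustration (a Python sketch, not part of the original post; the helper names are mine), the two directions of D generally differ, while the averaged version Ds is symmetric by construction:

```python
import numpy as np

def kl(p, q):
    # Discrete KL divergence D(P||Q); assumes every entry of q is positive.
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = p > 0  # terms with P(i) = 0 contribute nothing
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

def kl_sym(p, q):
    # Symmetrized variant: Ds(P, Q) = [D(P||Q) + D(Q||P)] / 2
    return 0.5 * (kl(p, q) + kl(q, p))

p, q = [0.5, 0.5], [0.9, 0.1]
print(kl(p, q), kl(q, p))            # noticeably different values
print(kl_sym(p, q) == kl_sym(q, p))  # True
```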

 

Below are the discrete and continuous definitions of the KL divergence:

D_KL(P||Q) = Σ_i P(i) log( P(i) / Q(i) )              (discrete)

D_KL(P||Q) = ∫ p(x) log( p(x) / q(x) ) dx             (continuous, integrated over all x)

Note that p(x) and q(x) are the probability density functions of the two random variables P and Q, and that D(P||Q) is a single number, not a function; see the figure below.

 

[Figure: KL-Gauss-Example.png, showing the area to be integrated for the KL divergence between two Gaussian densities]
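The discrete definition can be sketched directly in Python (an illustrative helper of my own, not from the post); note how D(P||P) is zero and how a zero in Q where P has mass makes the divergence infinite:

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D(P||Q) = sum_i P(i) * log(P(i)/Q(i)).

    Terms with P(i) = 0 contribute 0 (convention 0 * log 0 = 0); a zero
    Q(i) where P(i) > 0 makes the divergence infinite.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = p > 0
    if np.any(q[m] == 0):
        return float("inf")
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # a small positive number
print(kl_divergence(p, p))  # 0.0
print(kl_divergence([0.5, 0.5], [1.0, 0.0]))  # inf
```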

 

A very useful property of the KL divergence:

The Kullback–Leibler divergence is always non-negative,

D_KL(P||Q) ≥ 0,

a result known as Gibbs' inequality, with D_KL(P||Q) = 0 if and only if P = Q.

 

When computing the KL divergence on sparse data sets, beware that the denominator Q(i) is often zero, which makes the computation blow up.
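One common workaround (the KLDIV documentation quoted below suggests the same trick) is to add a tiny eps to every probability and renormalize. A Python sketch with hypothetical sparse histograms:

```python
import numpy as np

# Hypothetical sparse histograms: q assigns zero probability to bins
# where p does not, so the ratio p/q blows up.
p = np.array([0.4, 0.3, 0.2, 0.1])
q = np.array([0.5, 0.5, 0.0, 0.0])

eps = np.finfo(float).eps            # machine epsilon
p_s = (p + eps) / (p + eps).sum()    # smooth and renormalize
q_s = (q + eps) / (q + eps).sum()

kl = float(np.sum(p_s * np.log(p_s / q_s)))
print(kl)  # finite, but large: dominated by the formerly-zero bins
```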

 

 

The MATLAB function KLDIV computes the KL divergence between two distributions:

Description

KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.

KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two distributions specified over the M variable values in vector X. P1 is a length-M vector of probabilities representing distribution 1, and P2 is a length-M vector of probabilities representing distribution 2. Thus, the probability of value X(i) is P1(i) for distribution 1 and P2(i) for distribution 2. The Kullback-Leibler divergence is given by:

   KL(P1(x),P2(x)) = sum[P1(x).log(P1(x)/P2(x))]

If X contains duplicate values, there will be a warning message, and these values will be treated as distinct values. (I.e., the actual values do not enter into the computation, but the probabilities for the two duplicate values will be considered as probabilities corresponding to two unique values.) The elements of probability vectors P1 and P2 must each sum to 1 +/- .00001.

A "log of zero" warning will be thrown for zero-valued probabilities. Handle this however you wish. Adding 'eps' or some other small value to all probabilities seems reasonable. (Renormalize if necessary.)

KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic (2001).

KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article for "Kullback–Leibler divergence". This is equal to 1/2 the so-called "Jeffrey divergence." See Rubner et al. (2000).
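For comparison, the Jensen-Shannon variant mentioned above is symmetric and bounded. A self-contained Python sketch (illustrative only, with natural logs so the upper bound is ln 2):

```python
import numpy as np

def kl(p, q):
    # Discrete KL divergence; terms with p[i] = 0 contribute nothing.
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

def js(p, q):
    # Jensen-Shannon divergence: [KL(P||M) + KL(Q||M)] / 2, with M = (P+Q)/2.
    m = (np.asarray(p, float) + np.asarray(q, float)) / 2
    return 0.5 * (kl(p, m) + kl(q, m))

p, q = [1.0, 0.0], [0.0, 1.0]
print(js(p, q))  # ln(2) ≈ 0.6931, the maximum possible value
print(js(q, p))  # same value: JS is symmetric
```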

EXAMPLE: Let the event set and probability sets be as follow:
   X = [1 2 3 3 4]';
   P1 = ones(5,1)/5;
   P2 = [0 0 .5 .2 .3]' + eps;
Note that the event set here has duplicate values (two 3's). These will be treated as DISTINCT events by KLDIV. If you want these to be treated as the SAME event, you will need to collapse their probabilities together before running KLDIV. One way to do this is to use UNIQUE to find the set of unique events, and then iterate over that set, summing probabilities for each instance of each unique event. Here, we just leave the duplicate values to be treated independently (the default):
   KL = kldiv(X,P1,P2);
   KL =
        19.4899

Note also that we avoided the log-of-zero warning by adding 'eps' to all probability values in P2. We didn't need to renormalize because we're still within the sum-to-one tolerance.
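The KLDIV example can be reproduced in Python. Replaying it with natural logarithms gives roughly 13.5, while base-2 logarithms recover the reported 19.4899, so KLDIV appears to compute the divergence in bits (this is my inference from the numbers, not something the documentation states):

```python
import numpy as np

eps = np.finfo(float).eps          # same machine epsilon as MATLAB's eps
p1 = np.ones(5) / 5                # uniform over the 5 events (duplicates kept)
p2 = np.array([0.0, 0.0, 0.5, 0.2, 0.3]) + eps

# Base-2 logs reproduce the KLDIV result above.
kl = float(np.sum(p1 * np.log2(p1 / p2)))
print(round(kl, 4))  # 19.4899
```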

REFERENCES:
1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley, 1991.
2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler distance." IEEE Transactions on Information Theory (Submitted).
3) Rubner, Y., Tomasi, C., and Guibas, L. J., 2000. "The Earth Mover's distance as a metric for image retrieval." International Journal of Computer Vision, 40(2): 99-121.
4) Kullback–Leibler divergence. Wikipedia, The Free Encyclopedia. http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence

posted on 2010-10-16 15:04 Sosi  Views (10034)  Comments (2)  Category: Taps in Research

Comments

# re: Kullback–Leibler divergence KL散度 2010-11-30 16:17 tintin0324

Blogger, my research direction requires understanding the KL distance. I have some questions I would like to ask; how can I contact you?

# re: Kullback–Leibler divergence KL散度 2010-12-05 22:37 Sosi

@tintin0324
The KL distance itself is quite simple: it is defined exactly as above, and its meaning is as described there. If you want a deeper understanding, you can read the related literature.