O(1) 的小樂

Kullback–Leibler divergence KL散度

In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also called information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. It measures the expected number of extra bits required to code samples from P when using a code based on Q, rather than a code based on P. Typically P represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution, while Q represents a theory, model, description, or approximation of P.

Although it is often intuited as a distance metric, the KL divergence is not a true metric: for example, the KL divergence from P to Q is not in general equal to the KL divergence from Q to P.

KL divergence is a special case of a broader class of divergences called f-divergences. Originally introduced by Solomon Kullback and Richard Leibler in 1951 as the directed divergence between two distributions, it is not the same as the divergence of vector calculus. However, the KL divergence can be derived from the Bregman divergence.

Note that P usually denotes the data set we already have, while Q denotes a theoretical result. The physical meaning of the KL divergence is therefore the number of extra bits needed when samples from P are encoded with a code based on Q, compared with encoding them with a code based on P itself.

 

The KL divergence is sometimes called the KL distance, but it is not a distance in the strict sense: it is not symmetric, and it does not satisfy the triangle inequality.

If a symmetric variant is desired, one can average the two directions:

D_s(P\|Q) = \frac{1}{2}\left[D_{\mathrm{KL}}(P\|Q) + D_{\mathrm{KL}}(Q\|P)\right]
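The asymmetry and the symmetrized average can be illustrated with a short pure-Python sketch; the distributions p and q here are made-up values chosen only for illustration:

```python
# Illustrative sketch (made-up distributions): the directed KL divergence is
# asymmetric, and averaging the two directions gives the symmetric variant.
from math import log

def kl(p, q):
    """Discrete KL divergence: sum_i p_i * log(p_i / q_i), in nats."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.1, 0.3, 0.6]

d_pq = kl(p, q)                 # D_KL(P||Q)
d_qp = kl(q, p)                 # D_KL(Q||P), different in general
d_sym = 0.5 * (d_pq + d_qp)     # symmetrized divergence
```

Note that both directed divergences are positive here, but they differ, which is exactly why the raw KL divergence is not a metric.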

 

The discrete and continuous definitions of the KL divergence are:

D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}

D_{\mathrm{KL}}(P\|Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx

Note that p(x) and q(x) are the probability density functions of the two random variables, and that D_{\mathrm{KL}}(P\|Q) is a single number, not a function; see the figure below.
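For the continuous case, a sketch: the KL divergence between two Gaussians has a well-known closed form, which can be checked against direct numerical integration of the definition above (the trapezoidal rule, grid bounds, and step count are arbitrary choices for this illustration):

```python
# Sketch: continuous KL divergence between two Gaussians, computed two ways.
# Closed form (in nats): KL(N(m1,s1^2) || N(m2,s2^2))
#   = ln(s2/s1) + (s1^2 + (m1 - m2)^2) / (2 * s2^2) - 1/2
from math import exp, log, pi, sqrt

def normal_pdf(x, m, s):
    return exp(-((x - m) ** 2) / (2 * s * s)) / (s * sqrt(2 * pi))

def kl_closed(m1, s1, m2, s2):
    return log(s2 / s1) + (s1 * s1 + (m1 - m2) ** 2) / (2 * s2 * s2) - 0.5

def kl_numeric(m1, s1, m2, s2, lo=-15.0, hi=15.0, n=30000):
    """Trapezoidal approximation of the integral of p(x)*log(p(x)/q(x))."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        x = lo + i * h
        p = normal_pdf(x, m1, s1)
        q = normal_pdf(x, m2, s2)
        f = p * log(p / q)
        total += f if 0 < i < n else 0.5 * f
    return total * h

# e.g. P = N(0, 1), Q = N(1, 2^2): both methods agree closely
```

Because the integrand decays rapidly, truncating the integral to a wide finite interval introduces negligible error.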

 

[Figure: KL-Gauss-Example.png — illustration of the area being integrated in the KL divergence between two Gaussian densities]

 

A very useful property of the KL divergence:

The Kullback–Leibler divergence is always non-negative,

D_{\mathrm{KL}}(P\|Q) \geq 0,

a result known as Gibbs' inequality, with D_{\mathrm{KL}}(P\|Q) = 0 if and only if P = Q.
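A quick empirical illustration of this non-negativity (an illustration only, not a proof; the distributions are randomly generated just for the check):

```python
# Sketch: empirically check Gibbs' inequality KL(P||Q) >= 0 on random
# discrete distributions, and KL(P||P) = 0. Illustration only, not a proof.
import random
from math import log

def kl(p, q):
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

random.seed(0)  # deterministic for reproducibility
divs = []
for _ in range(1000):
    p = [random.random() + 1e-9 for _ in range(8)]  # +1e-9 avoids zeros
    q = [random.random() + 1e-9 for _ in range(8)]
    sp, sq = sum(p), sum(q)
    p = [x / sp for x in p]   # normalize to proper distributions
    q = [x / sq for x in q]
    divs.append(kl(p, q))

min_div = min(divs)  # never negative, per Gibbs' inequality
```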

 

When computing the KL divergence in practice, note that on sparse data sets Q(i) is often zero for some i, so the ratio P(i)/Q(i) inside the logarithm produces a division by zero.
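One common workaround is to add a tiny epsilon to every probability and renormalize before computing the divergence (the helper names below are ours, chosen for this sketch):

```python
# Sketch of eps-smoothing for sparse distributions: add a tiny epsilon to
# every probability, renormalize, then compute KL. Helper names are ours.
import sys
from math import log

def kl(p, q):
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def smooth(p, eps=sys.float_info.epsilon):
    p = [x + eps for x in p]
    s = sum(p)
    return [x / s for x in p]   # renormalize so the vector still sums to 1

p = [0.5, 0.5, 0.0, 0.0]        # sparse: zeros present
q = [0.25, 0.25, 0.25, 0.25]
# kl(q, p) would divide by zero because p has zeros; smoothing keeps it finite
d = kl(smooth(q), smooth(p))
```

The smoothed result is finite but large, reflecting how severely the zero entries of p penalize samples that q considers likely.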

 

 

The MATLAB function KLDIV computes the KL divergence between two distributions:

Description

KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.

KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two distributions specified over the M variable values in vector X. P1 is a length-M vector of probabilities representing distribution 1, and P2 is a length-M vector of probabilities representing distribution 2. Thus, the probability of value X(i) is P1(i) for distribution 1 and P2(i) for distribution 2. The Kullback-Leibler divergence is given by:

   KL(P1(x),P2(x)) = sum[P1(x).log(P1(x)/P2(x))]

If X contains duplicate values, there will be a warning message, and these values will be treated as distinct values. (I.e., the actual values do not enter into the computation, but the probabilities for the two duplicate values will be considered as probabilities corresponding to two unique values.) The elements of probability vectors P1 and P2 must each sum to 1 +/- .00001.

A "log of zero" warning will be thrown for zero-valued probabilities. Handle this however you wish. Adding 'eps' or some other small value to all probabilities seems reasonable. (Renormalize if necessary.)

KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic (2001).

KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article for "Kullback–Leibler divergence". This is equal to 1/2 the so-called "Jeffrey divergence." See Rubner et al. (2000).
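The 'js' variant described above can be sketched in pure Python (natural log here, so the result is in nats; the log base only rescales the value):

```python
# Sketch of the Jensen-Shannon divergence [KL(P1,Q) + KL(P2,Q)] / 2 with
# Q = (P1+P2)/2, mirroring the 'js' option described above.
from math import log

def kl(p, q):
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p1, p2):
    q = [(a + b) / 2 for a, b in zip(p1, p2)]  # the mixture distribution
    return 0.5 * (kl(p1, q) + kl(p2, q))

p1 = [0.5, 0.5, 0.0]
p2 = [0.0, 0.5, 0.5]
d = js(p1, p2)   # finite and symmetric even though p1 and p2 contain zeros
```

Unlike the raw KL divergence, the Jensen–Shannon divergence stays finite in the presence of zeros, because the mixture Q is nonzero wherever either input is nonzero.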

EXAMPLE: Let the event set and probability sets be as follow:
   X = [1 2 3 3 4]';
   P1 = ones(5,1)/5;
   P2 = [0 0 .5 .2 .3]' + eps;
Note that the event set here has duplicate values (two 3's). These will be treated as DISTINCT events by KLDIV. If you want these to be treated as the SAME event, you will need to collapse their probabilities together before running KLDIV. One way to do this is to use UNIQUE to find the set of unique events, and then iterate over that set, summing probabilities for each instance of each unique event. Here, we just leave the duplicate values to be treated independently (the default):
   KL = kldiv(X,P1,P2);
   KL =
        19.4899

Note also that we avoided the log-of-zero warning by adding 'eps' to all probability values in P2. We didn't need to renormalize because we're still within the sum-to-one tolerance.
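For reference, a pure-Python sketch reproducing this example. Matching the printed 19.4899 requires log base 2, so KLDIV apparently reports bits; that base is inferred from the numbers above, not read from KLDIV's source:

```python
# Reproduce the KLDIV example in pure Python. Duplicate X values are treated
# as distinct events (the KLDIV default), so a plain element-wise sum matches.
import sys
from math import log2

eps = sys.float_info.epsilon              # 2^-52, same value as MATLAB's eps
P1 = [1 / 5] * 5                          # ones(5,1)/5
P2 = [0 + eps, 0 + eps, 0.5 + eps, 0.2 + eps, 0.3 + eps]

KL = sum(p1 * log2(p1 / p2) for p1, p2 in zip(P1, P2))
# KL is approximately 19.4899, matching the MATLAB output above
```

Nearly all of the total comes from the two entries where P2 is only eps, which is exactly the sparse-data pitfall discussed earlier.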

REFERENCES:
1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley, 1991.
2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler distance." IEEE Transactions on Information Theory (Submitted).
3) Rubner, Y., Tomasi, C., and Guibas, L. J., 2000. "The Earth Mover's distance as a metric for image retrieval." International Journal of Computer Vision, 40(2): 99-121.
4) Kullback–Leibler divergence. Wikipedia, The Free Encyclopedia. http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence

posted on 2010-10-16 15:04 by Sosi, filed under: Taps in Research

Comments

# re: Kullback–Leibler divergence KL散度 2010-11-30 16:17 tintin0324

My research requires an understanding of the KL distance, and I have some questions I'd like to ask. How can I contact you?

# re: Kullback–Leibler divergence KL散度 2010-12-05 22:37 Sosi

@tintin0324
The KL distance itself is quite simple; it is defined exactly as above, and its meaning is as described. If you want to go deeper, you can read the related references listed above.