
O(1) 的小樂

Kullback–Leibler divergence KL散度

In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. It measures the expected number of extra bits required to code samples from P when using a code based on Q rather than a code based on P. Typically P represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution, while Q represents a theory, model, description, or approximation of P.

Although it is often intuited as a distance metric, the KL divergence is not a true metric – for example, the KL from P to Q is not necessarily the same as the KL from Q to P.

KL divergence is a special case of a broader class of divergences called f-divergences. It was originally introduced by Solomon Kullback and Richard Leibler in 1951 as the directed divergence between two distributions; it is not the same as the notion of divergence in calculus. The KL divergence can, however, be derived from the Bregman divergence.

 

 

Note that P usually refers to the data set we already have, and Q to a theoretical model. The physical meaning of the KL divergence is therefore the number of extra bits needed when samples from P are encoded with a code based on Q instead of a code based on P.
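This coding interpretation can be verified directly: the expected code length for samples from P under a code matched to Q, minus the entropy of P, equals the KL divergence. A minimal NumPy sketch (the two distributions here are made up for illustration):

```python
import numpy as np

p = np.array([0.5, 0.25, 0.25])   # "true" distribution P
q = np.array([0.25, 0.25, 0.5])   # model Q used to design the code

# Expected code length in bits when encoding samples drawn from P:
bits_with_p_code = -np.sum(p * np.log2(p))   # optimal code for P: H(P)
bits_with_q_code = -np.sum(p * np.log2(q))   # code designed for Q: cross-entropy H(P, Q)

# The extra bits per sample equal the KL divergence D_KL(P || Q).
extra_bits = bits_with_q_code - bits_with_p_code
```

Here `extra_bits` equals `np.sum(p * np.log2(p / q))`, i.e. exactly D_KL(P‖Q) in bits.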

 

The KL divergence is sometimes called the KL distance, but it is not a distance in the strict sense: it does not satisfy the triangle inequality.

 

The KL divergence is also asymmetric; if a symmetric variant is desired, one can define

Ds(p1, p2) = [D(p1, p2) + D(p2, p1)] / 2
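The asymmetry, and this symmetrized variant, can be illustrated numerically; a minimal sketch using base-2 logarithms, with two arbitrarily chosen distributions:

```python
import numpy as np

def kl(p, q):
    # Discrete KL divergence in bits: sum_i p(i) * log2(p(i) / q(i))
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log2(p / q)))

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])

d_pq = kl(p, q)               # D(P||Q)
d_qp = kl(q, p)               # D(Q||P): different value, so KL is not symmetric
d_sym = (d_pq + d_qp) / 2     # symmetrized variant Ds(P, Q)
```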

 

Below are the discrete and continuous definitions of the KL divergence:

D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}

D_{\mathrm{KL}}(P\|Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx

Note that p(x) and q(x) are the probability density functions of the two random variables, and that D_KL(P‖Q) is a single number, not a function; see the figure below.

 

Note the area being integrated:

[Figure: KL-Gauss-Example.png — the integrand of the KL divergence between two Gaussian distributions]
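For two Gaussians, as in the figure, the KL divergence has a known closed form, KL(N(μ₁,σ₁²) ‖ N(μ₂,σ₂²)) = log(σ₂/σ₁) + (σ₁² + (μ₁−μ₂)²)/(2σ₂²) − 1/2 in nats, which can be checked against direct numerical integration of the continuous definition. A sketch (parameter values are arbitrary):

```python
import numpy as np

mu1, s1 = 0.0, 1.0    # parameters of p = N(mu1, s1^2)
mu2, s2 = 1.0, 2.0    # parameters of q = N(mu2, s2^2)

def gauss_pdf(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

# Closed form for KL between two Gaussians (in nats).
kl_closed = np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

# Numerical integration of p(x) * log(p(x)/q(x)) on a wide grid
# (trapezoidal rule written out by hand to stay NumPy-version independent).
x = np.linspace(-20.0, 20.0, 100001)
f = gauss_pdf(x, mu1, s1) * np.log(gauss_pdf(x, mu1, s1) / gauss_pdf(x, mu2, s2))
kl_numeric = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))
```

The two values agree to high precision, confirming that the integral really is a single scalar summary of how far q is from p.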

A very useful property of the KL divergence:

The Kullback–Leibler divergence is always non-negative,

D_{\mathrm{KL}}(P\|Q) \geq 0,

a result known as Gibbs' inequality, with D_KL(P‖Q) zero if and only if P = Q.
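A quick empirical check of non-negativity (a sketch; Dirichlet samples are just a convenient way to draw random, strictly positive probability vectors):

```python
import numpy as np

rng = np.random.default_rng(0)

def kl(p, q):
    # KL divergence in nats for strictly positive probability vectors.
    return float(np.sum(p * np.log(p / q)))

# Gibbs' inequality: D(P||Q) >= 0 for every pair, and D(P||P) == 0.
vals = []
for _ in range(1000):
    p = rng.dirichlet(np.ones(10))
    q = rng.dirichlet(np.ones(10))
    vals.append(kl(p, q))

min_kl = min(vals)    # never negative
self_kl = kl(p, p)    # identical distributions give zero
```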

 

When computing the KL divergence in practice, note that on sparse data sets the denominator Q(i) is often zero, which makes the computation blow up.
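One common workaround, also suggested in the KLDIV documentation below, is to add a small eps to every probability and renormalize. A sketch, assuming we are willing to perturb both distributions slightly:

```python
import numpy as np

def kl_smoothed(p, q, eps=1e-12):
    # Add a small eps everywhere to avoid division by zero / log of zero,
    # then renormalize so each vector still sums to 1.
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log2(p / q)))

p = [0.5, 0.5, 0.0]
q = [1.0, 0.0, 0.0]          # zero where p has mass: raw KL would be infinite
kl_val = kl_smoothed(p, q)   # finite, but very large
```

The result is finite but dominated by the terms where q was zero, which is the honest signal that q assigns no probability to events p considers likely.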

 

 

The MATLAB function KLDIV computes the KL divergence between two distributions:

Description

KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.

KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two distributions specified over the M variable values in vector X. P1 is a length-M vector of probabilities representing distribution 1, and P2 is a length-M vector of probabilities representing distribution 2. Thus, the probability of value X(i) is P1(i) for distribution 1 and P2(i) for distribution 2. The Kullback-Leibler divergence is given by:

   KL(P1(x),P2(x)) = sum[P1(x).log(P1(x)/P2(x))]

If X contains duplicate values, there will be a warning message, and these values will be treated as distinct values. (I.e., the actual values do not enter into the computation, but the probabilities for the two duplicate values will be considered as probabilities corresponding to two unique values.) The elements of probability vectors P1 and P2 must each sum to 1 +/- .00001.

A "log of zero" warning will be thrown for zero-valued probabilities. Handle this however you wish. Adding 'eps' or some other small value to all probabilities seems reasonable. (Renormalize if necessary.)

KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic (2001).

KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article for "Kullback–Leibler divergence". This is equal to 1/2 the so-called "Jeffrey divergence." See Rubner et al. (2000).
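The Jensen–Shannon variant can be sketched the same way (base-2 logarithms; unlike KL it is symmetric and, in bits, bounded by 1):

```python
import numpy as np

def kl(p, q):
    return float(np.sum(p * np.log2(p / q)))

def js(p, q):
    # Jensen-Shannon divergence: [KL(P||M) + KL(Q||M)] / 2 with M = (P+Q)/2.
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    return 0.5 * (kl(p, m) + kl(q, m))

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
d1 = js(p, q)
d2 = js(q, p)    # identical: JS is symmetric
```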

EXAMPLE: Let the event set and probability sets be as follow:
   X = [1 2 3 3 4]';
   P1 = ones(5,1)/5;
   P2 = [0 0 .5 .2 .3]' + eps;
Note that the event set here has duplicate values (two 3's). These will be treated as DISTINCT events by KLDIV. If you want these to be treated as the SAME event, you will need to collapse their probabilities together before running KLDIV. One way to do this is to use UNIQUE to find the set of unique events, and then iterate over that set, summing probabilities for each instance of each unique event. Here, we just leave the duplicate values to be treated independently (the default):
   KL = kldiv(X,P1,P2);
   KL =
        19.4899

Note also that we avoided the log-of-zero warning by adding 'eps' to all probability values in P2. We didn't need to renormalize because we're still within the sum-to-one tolerance.
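For readers without MATLAB, the example above can be replicated in NumPy. Note that the reported result of 19.4899 corresponds to base-2 logarithms (the divergence in bits), and that MATLAB's `eps` is 2^-52:

```python
import numpy as np

# Replicating the KLDIV example: P1 uniform over 5 events,
# P2 with two zero entries smoothed by adding eps.
eps = np.finfo(float).eps          # 2**-52, same as MATLAB's eps
P1 = np.ones(5) / 5
P2 = np.array([0.0, 0.0, 0.5, 0.2, 0.3]) + eps

KL = float(np.sum(P1 * np.log2(P1 / P2)))   # ~19.4899, matching KLDIV
```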

REFERENCES:
1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley, 1991.
2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler distance." IEEE Transactions on Information Theory (Submitted).
3) Rubner, Y., Tomasi, C., and Guibas, L. J., 2000. "The Earth Mover's distance as a metric for image retrieval." International Journal of Computer Vision, 40(2): 99-121.
4) Kullback–Leibler divergence. Wikipedia, The Free Encyclopedia. http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence

posted on 2010-10-16 15:04 Sosi 閱讀(10034) 評論(2)  編輯 收藏 引用 所屬分類: Taps in Research

評論

# re: Kullback–Leibler divergence KL散度 2010-11-30 16:17 tintin0324

Blogger, my research direction requires understanding the KL distance, and I have some questions I would like to ask. How can I contact you?

# re: Kullback–Leibler divergence KL散度 2010-12-05 22:37 Sosi

@tintin0324
The KL distance itself is quite simple; it is defined just as above, and its meaning is as described. If you want a deeper understanding, you can read the related literature.