
Kullback–Leibler divergence (KL divergence)

In probability theory and information theory, the Kullback–Leibler divergence [1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. The KL divergence measures the expected number of extra bits required to code samples from P when using a code based on Q, rather than using a code based on P. Typically P represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution, while Q represents a theory, model, description, or approximation of P.
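In coding terms, this corresponds to the standard identity relating the KL divergence to cross entropy and entropy:

D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{1}{Q(i)} - \sum_i P(i) \log \frac{1}{P(i)} = H(P, Q) - H(P)

where H(P, Q) is the expected code length when samples from P are coded with a code optimized for Q, and H(P) is the entropy of P, i.e. the optimal expected code length.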

Although it is often intuited as a distance metric, the KL divergence is not a true metric – for example, the KL from P to Q is not necessarily the same as the KL from Q to P.

KL divergence is a special case of a broader class of divergences called f-divergences. Originally introduced by Solomon Kullback and Richard Leibler in 1951 as the directed divergence between two distributions, it is not the same as a divergence in calculus. However, the KL divergence can be derived from the Bregman divergence.
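As a quick check of the f-divergence claim, choosing f(t) = t log t in the general form recovers the KL divergence:

D_f(P\|Q) = \sum_i Q(i)\, f\!\left(\frac{P(i)}{Q(i)}\right) = \sum_i P(i) \log \frac{P(i)}{Q(i)} = D_{\mathrm{KL}}(P\|Q)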

 

 

Note that P usually denotes the data set, i.e. the data we already have, while Q denotes the theoretical result. The physical meaning of the KL divergence is therefore the number of extra bits needed to encode samples drawn from P with a code based on Q, compared with encoding those same samples with a code based on P itself.

 

The KL divergence is sometimes also called the KL distance, but it is not a distance in the strict sense, since it does not satisfy the triangle inequality.

 

The KL divergence is also not symmetric. If a symmetric version is desired, a common choice is

D_{\mathrm{s}}(P\|Q) = \frac{1}{2}\left[ D_{\mathrm{KL}}(P\|Q) + D_{\mathrm{KL}}(Q\|P) \right]

 

The discrete and continuous definitions of the KL divergence are as follows:

D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}

D_{\mathrm{KL}}(P\|Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx

Note that p(x) and q(x) are the probability density functions of the two random variables P and Q, and that D_KL(P||Q) is a single number, not a function; see the figure below.

 

[Figure (KL-Gauss-Example.png): the KL area to be integrated, illustrated for two Gaussian densities.]
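For the Gaussian case shown in the figure, the integral has a well-known closed form (in nats): if P = N(\mu_1, \sigma_1^2) and Q = N(\mu_2, \sigma_2^2), then

D_{\mathrm{KL}}(P\|Q) = \log \frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2\sigma_2^2} - \frac{1}{2}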

 

A very useful property of the KL divergence:

The Kullback–Leibler divergence is always non-negative,

D_{\mathrm{KL}}(P\|Q) \geq 0

a result known as Gibbs' inequality, with D_KL(P||Q) = 0 if and only if P = Q.
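A one-line proof uses Jensen's inequality and the concavity of the logarithm:

-D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{Q(i)}{P(i)} \leq \log \sum_i P(i)\, \frac{Q(i)}{P(i)} \leq \log \sum_i Q(i) = \log 1 = 0

with equality if and only if P = Q.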

 

When computing the KL divergence in practice, note that on sparse data sets the computation commonly runs into zeros in the denominator, i.e. Q(i) = 0 while P(i) > 0.
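A minimal MATLAB sketch of the discrete definition with the common eps-smoothing workaround (the vectors p and q below are made up for illustration; adding eps and renormalizing is just one simple fix, as the KLDIV notes below also suggest):

   % Illustrative distributions; q has zero entries, so log(p./q) would be Inf.
   p = [0.4 0.3 0.2 0.1];
   q = [0.5 0.5 0.0 0.0];

   % Smooth the zeros with eps and renormalize so both vectors still sum to 1.
   p = (p + eps) / sum(p + eps);
   q = (q + eps) / sum(q + eps);

   kl_pq = sum(p .* log(p ./ q));   % D(P||Q) in nats (use log2 for bits)
   kl_qp = sum(q .* log(q ./ p));   % D(Q||P); generally different from D(P||Q)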

 

 

The MATLAB function KLDIV computes the KL divergence between two distributions:

Description

KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.

KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two distributions specified over the M variable values in vector X. P1 is a length-M vector of probabilities representing distribution 1, and P2 is a length-M vector of probabilities representing distribution 2. Thus, the probability of value X(i) is P1(i) for distribution 1 and P2(i) for distribution 2. The Kullback-Leibler divergence is given by:

   KL(P1(x),P2(x)) = sum[P1(x).log(P1(x)/P2(x))]

If X contains duplicate values, there will be a warning message, and these values will be treated as distinct values. (I.e., the actual values do not enter into the computation, but the probabilities for the two duplicate values will be considered as probabilities corresponding to two unique values.) The elements of probability vectors P1 and P2 must each sum to 1 +/- .00001.

A "log of zero" warning will be thrown for zero-valued probabilities. Handle this however you wish. Adding 'eps' or some other small value to all probabilities seems reasonable. (Renormalize if necessary.)

KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic (2001).

KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article for "Kullback–Leibler divergence". This is equal to 1/2 the so-called "Jeffrey divergence." See Rubner et al. (2000).

EXAMPLE: Let the event set and probability sets be as follows:
   X = [1 2 3 3 4]';
   P1 = ones(5,1)/5;
   P2 = [0 0 .5 .2 .3]' + eps;
Note that the event set here has duplicate values (two 3's). These will be treated as DISTINCT events by KLDIV. If you want these to be treated as the SAME event, you will need to collapse their probabilities together before running KLDIV. One way to do this is to use UNIQUE to find the set of unique events, and then iterate over that set, summing probabilities for each instance of each unique event. Here, we just leave the duplicate values to be treated independently (the default):
   KL = kldiv(X,P1,P2);
   KL =
        19.4899

Note also that we avoided the log-of-zero warning by adding 'eps' to all probability values in P2. We didn't need to renormalize because we're still within the sum-to-one tolerance.
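Continuing the same example, the symmetric and Jensen-Shannon variants use the flags documented above (this assumes the kldiv function described here is on the MATLAB path; output values are omitted):

   KLsym = kldiv(X,P1,P2,'sym');   % [KL(P1,P2)+KL(P2,P1)]/2
   JS    = kldiv(X,P1,P2,'js');    % [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2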

REFERENCES:
1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley, 1991.
2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler distance." IEEE Transactions on Information Theory (Submitted).
3) Rubner, Y., Tomasi, C., and Guibas, L. J., 2000. "The Earth Mover's distance as a metric for image retrieval." International Journal of Computer Vision, 40(2): 99-121.
4) Kullback–Leibler divergence. Wikipedia, The Free Encyclopedia. http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence

posted on 2010-10-16 15:04 by Sosi, in category: Taps in Research

Comments

# re: Kullback–Leibler divergence (KL divergence) 2010-11-30 16:17 tintin0324

Blogger, my research direction requires understanding the KL distance. I have a few questions I would like to ask; how can I contact you?

# re: Kullback–Leibler divergence (KL divergence) 2010-12-05 22:37 Sosi

@tintin0324
The KL distance itself is quite simple: it is defined just as above, and its meaning is as described. If you want to understand it more deeply, you can read the related literature.