
O(1) 的小樂


Kullback–Leibler divergence (KL divergence)

In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. It measures the expected number of extra bits required to code samples from P when using a code based on Q, rather than a code based on P. Typically P represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution, while Q represents a theory, model, description, or approximation of P.

Although it is often intuited as a distance metric, the KL divergence is not a true metric: for example, the divergence from P to Q is not necessarily the same as the divergence from Q to P.

KL divergence is a special case of a broader class of divergences called f-divergences. Originally introduced by Solomon Kullback and Richard Leibler in 1951 as the directed divergence between two distributions, it is not the same as the notion of divergence in calculus. However, the KL divergence can be derived from the Bregman divergence.

 

 

Note that P usually denotes the data we already have (the empirical distribution), while Q denotes a theoretical or model distribution. The physical meaning of the KL divergence is therefore the number of extra bits needed when samples drawn from P are encoded with a code built for Q, compared with a code built for P itself.
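This "extra bits" reading can be checked numerically. The sketch below uses a hypothetical four-symbol alphabet (illustrative values, not from the post) and the identity D_KL(P‖Q) = H(P, Q) - H(P), where H(P) is the entropy of P and H(P, Q) is the cross-entropy of coding P with Q's code:

```python
import math

# Hypothetical four-symbol alphabet (illustrative values, not from the post).
P = [0.5, 0.25, 0.125, 0.125]   # "true" data distribution
Q = [0.25, 0.25, 0.25, 0.25]    # model used to build the code

# Optimal bits/symbol when coding P with its own code (entropy).
entropy = -sum(p * math.log2(p) for p in P)
# Bits/symbol when coding P with a code built for Q (cross-entropy).
cross_entropy = -sum(p * math.log2(q) for p, q in zip(P, Q))
# KL divergence in bits: the extra cost of using Q's code.
kl = sum(p * math.log2(p / q) for p, q in zip(P, Q))

print(entropy, cross_entropy, kl)   # 1.75 2.0 0.25
```

Here the code built for Q wastes exactly 0.25 bits per symbol, and kl equals cross_entropy - entropy.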

 

The KL divergence is sometimes called the KL distance, but it is not a distance in the strict sense: it is not symmetric and does not satisfy the triangle inequality.

 

The KL divergence is asymmetric. If a symmetric version is desired, a common choice is to average the two directions:

D_s(P, Q) = [D_KL(P‖Q) + D_KL(Q‖P)] / 2

 

The discrete and continuous definitions of the KL divergence are:

D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}

D_{\mathrm{KL}}(P\|Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx

Note that p(x) and q(x) are the probability density functions of the two random variables P and Q, and that D_KL(P‖Q) is a single number, not a function of x; see the figure below.
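The discrete definition translates directly into code. A minimal sketch (natural log, so the result is in nats; use log base 2 for bits):

```python
import math

def kl_divergence(P, Q):
    """D_KL(P || Q) for discrete distributions given as probability lists.

    Terms with P(i) = 0 contribute 0 (the 0*log 0 convention); Q(i) is
    assumed nonzero wherever P(i) > 0.
    """
    return sum(p * math.log(p / q) for p, q in zip(P, Q) if p > 0)

print(kl_divergence([0.4, 0.6], [0.5, 0.5]))   # ~0.0201 nats
print(kl_divergence([0.5, 0.5], [0.4, 0.6]))   # a different value: not symmetric
```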

 

[Figure: the area to be integrated for the KL divergence]

[Figure: KL-Gauss-Example.png, the KL divergence between two Gaussian distributions]

 

A very useful property of the KL divergence:

The Kullback–Leibler divergence is always non-negative,

D_{\mathrm{KL}}(P\|Q) \geq 0, \,

a result known as Gibbs' inequality, with DKL(P||Q) zero if and only if P = Q.
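Non-negativity is easy to spot-check numerically. The sketch below draws random distribution pairs and confirms the divergence never goes below zero (an illustration, not a proof):

```python
import math
import random

def kl(P, Q):
    return sum(p * math.log(p / q) for p, q in zip(P, Q) if p > 0)

random.seed(0)
min_kl = float("inf")
for _ in range(1000):
    # Draw two random 5-bin distributions and normalize each to sum to 1.
    p_raw = [random.random() + 1e-9 for _ in range(5)]
    q_raw = [random.random() + 1e-9 for _ in range(5)]
    P = [x / sum(p_raw) for x in p_raw]
    Q = [x / sum(q_raw) for x in q_raw]
    min_kl = min(min_kl, kl(P, Q))

print("smallest D_KL over 1000 random pairs:", min_kl)   # never negative
```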

 

When computing the KL divergence in practice, note that on sparse data sets the denominator Q(i) is often zero, which makes the ratio P(i)/Q(i), and hence the divergence, blow up.
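One common workaround is additive (epsilon) smoothing: add a tiny constant to every cell and renormalize, which keeps every ratio finite at the cost of a small bias. A sketch:

```python
import math

def smooth(dist, eps=1e-10):
    """Add eps to every cell and renormalize so the result still sums to 1."""
    d = [x + eps for x in dist]
    s = sum(d)
    return [x / s for x in d]

def kl(P, Q):
    return sum(p * math.log(p / q) for p, q in zip(P, Q) if p > 0)

P = [0.7, 0.3, 0.0]
Q = [0.5, 0.0, 0.5]          # Q is zero where P is positive: raw KL is infinite

kl_pq = kl(smooth(P), smooth(Q))
print(kl_pq)                 # finite, but its size depends strongly on eps
```

Note that the smoothed value is very sensitive to the choice of eps, so it should be reported alongside the result.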

 

 

The Matlab function KLDIV computes the KL divergence between two distributions:

Description

KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.

KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two distributions specified over the M variable values in vector X. P1 is a length-M vector of probabilities representing distribution 1, and P2 is a length-M vector of probabilities representing distribution 2. Thus, the probability of value X(i) is P1(i) for distribution 1 and P2(i) for distribution 2. The Kullback-Leibler divergence is given by:

   KL(P1(x),P2(x)) = sum[P1(x).log(P1(x)/P2(x))]

If X contains duplicate values, a warning message will be issued, and these values will be treated as distinct values. (That is, the actual values do not enter into the computation, but the probabilities for the two duplicate values will be treated as probabilities corresponding to two unique values.) The elements of the probability vectors P1 and P2 must each sum to 1 +/- .00001.

A "log of zero" warning will be thrown for zero-valued probabilities. Handle this however you wish. Adding 'eps' or some other small value to all probabilities seems reasonable. (Renormalize if necessary.)

KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic (2001).

KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article for "Kullback–Leibler divergence". This is equal to 1/2 the so-called "Jeffrey divergence." See Rubner et al. (2000).
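Both symmetric variants are easy to express directly. A sketch in Python (base-2 logs, matching the bit interpretation; with log2 the Jensen-Shannon divergence is bounded by 1):

```python
import math

def kl(P, Q):
    return sum(p * math.log2(p / q) for p, q in zip(P, Q) if p > 0)

def sym_kl(P, Q):
    # Symmetrized KL: [KL(P,Q) + KL(Q,P)] / 2 (the 'sym' option above).
    return 0.5 * (kl(P, Q) + kl(Q, P))

def js(P, Q):
    # Jensen-Shannon: average KL to the midpoint M = (P+Q)/2 (the 'js' option).
    M = [(p + q) / 2 for p, q in zip(P, Q)]
    return 0.5 * (kl(P, M) + kl(Q, M))

P = [0.9, 0.1]
Q = [0.1, 0.9]
print(sym_kl(P, Q), js(P, Q))
```

Unlike the raw KL divergence, js stays finite even when one distribution has zeros the other does not, because the midpoint M is positive wherever either input is.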

EXAMPLE: Let the event set and probability sets be as follow:
   X = [1 2 3 3 4]';
   P1 = ones(5,1)/5;
   P2 = [0 0 .5 .2 .3]' + eps;
Note that the event set here has duplicate values (two 3's). These will be treated as DISTINCT events by KLDIV. If you want these to be treated as the SAME event, you will need to collapse their probabilities together before running KLDIV. One way to do this is to use UNIQUE to find the set of unique events, and then iterate over that set, summing probabilities for each instance of each unique event. Here, we just leave the duplicate values to be treated independently (the default):
   KL = kldiv(X,P1,P2);
   KL =
        19.4899

Note also that we avoided the log-of-zero warning by adding 'eps' to all probability values in P2. We didn't need to renormalize because we're still within the sum-to-one tolerance.
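The example is easy to reproduce outside Matlab. The sketch below assumes KLDIV uses base-2 logarithms and Matlab's eps = 2^-52, which is consistent with the 19.4899 reported above:

```python
import math

eps = 2.0 ** -52                 # Matlab's eps (double-precision machine epsilon)
P1 = [0.2] * 5                   # uniform over the 5 events
P2 = [0.0 + eps, 0.0 + eps, 0.5 + eps, 0.2 + eps, 0.3 + eps]

# KL(P1, P2) = sum P1(x) * log2(P1(x) / P2(x)); the two near-zero cells of
# P2 dominate the sum, which is why the divergence is so large.
KL = sum(p1 * math.log2(p1 / p2) for p1, p2 in zip(P1, P2))
print(round(KL, 4))              # 19.4899
```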

REFERENCES:
1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley, 1991.
2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler distance." IEEE Transactions on Information Theory (Submitted).
3) Rubner, Y., Tomasi, C., and Guibas, L. J., 2000. "The Earth Mover's distance as a metric for image retrieval." International Journal of Computer Vision, 40(2): 99-121.
4) "Kullback–Leibler divergence." Wikipedia, The Free Encyclopedia. http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence

posted on 2010-10-16 15:04 by Sosi, read (10034), comments (2), category: Taps in Research

Comments

# re: Kullback–Leibler divergence (KL divergence) 2010-11-30 16:17 tintin0324

Blogger, my research direction requires an understanding of the KL distance, and I have a few questions I would like to ask. How can I contact you?

# re: Kullback–Leibler divergence (KL divergence) 2010-12-05 22:37 Sosi

@tintin0324
The KL distance itself is quite simple: it is defined exactly as shown, and its meaning is as described above. If you want a deeper understanding, you can read the related references.