
O(1) 的小樂


Kullback–Leibler divergence KL散度

In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. KL measures the expected number of extra bits required to code samples from P when using a code based on Q rather than a code based on P. Typically P represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution, while Q represents a theory, model, description, or approximation of P.

Although it is often intuited as a distance metric, the KL divergence is not a true metric – for example, the KL from P to Q is not necessarily the same as the KL from Q to P.

KL divergence is a special case of a broader class of divergences called f-divergences. Originally introduced by Solomon Kullback and Richard Leibler in 1951 as the directed divergence between two distributions, it is not the same as a divergence in calculus. However, the KL divergence can be derived from the Bregman divergence.

Note that P usually denotes the data set (the data we already have), while Q denotes the theoretical result. The physical meaning of the KL divergence is therefore the number of extra bits needed when samples from P are encoded with a code based on Q instead of a code based on P.

The KL divergence is sometimes called the KL distance, but it is not a distance in the strict sense: it does not satisfy the triangle inequality.

The KL divergence is also asymmetric; if a symmetric version is desired, one can of course use

D_s(P_1, P_2) = [D_{\mathrm{KL}}(P_1\|P_2) + D_{\mathrm{KL}}(P_2\|P_1)] / 2

Below are the discrete and continuous definitions of the KL divergence:

D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}

D_{\mathrm{KL}}(P\|Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx

Note that p(x) and q(x) are the probability density functions of the two random variables P and Q, and that D_{KL}(P\|Q) is a single number, not a function; see the figure below.

[Figure: KL-Gauss-Example.png, illustrating the area to be integrated to obtain the KL divergence of two Gaussian densities]

A very powerful property of the KL divergence:

The Kullback–Leibler divergence is always non-negative,

D_{\mathrm{KL}}(P\|Q) \geq 0,

a result known as Gibbs' inequality, with D_{KL}(P\|Q) zero if and only if P = Q.
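The non-negativity is easy to check numerically (a sketch assuming NumPy; the random distributions are illustrative, not from the post):

```python
import numpy as np

def kl(p, q):
    # D_KL(P || Q) in nats; assumes strictly positive entries
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
for _ in range(1000):
    p = rng.random(8) + 1e-9; p /= p.sum()
    q = rng.random(8) + 1e-9; q /= q.sum()
    assert kl(p, q) >= 0.0  # Gibbs' inequality holds every time
assert kl(p, p) == 0.0      # equality exactly when P = Q
```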

When computing the KL divergence in practice, note that on sparse data sets the computation frequently runs into zero denominators (values i with Q(i) = 0)!
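A common workaround (a sketch, not from the post; the eps value is an arbitrary choice) is to add a small constant to every probability and renormalize, which is also what the KLDIV documentation below suggests:

```python
import numpy as np

def kl_smoothed(p, q, eps=1e-12):
    # Add a small constant to avoid zero denominators, then renormalize.
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# q is sparse: the raw formula would divide by zero here.
p = [0.5, 0.5, 0.0]
q = [1.0, 0.0, 0.0]
d = kl_smoothed(p, q)  # finite (though large) instead of inf or nan
```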

The MATLAB function KLDIV computes the KL divergence between two distributions:

Description

KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.

KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two distributions specified over the M variable values in vector X. P1 is a length-M vector of probabilities representing distribution 1, and P2 is a length-M vector of probabilities representing distribution 2. Thus, the probability of value X(i) is P1(i) for distribution 1 and P2(i) for distribution 2. The Kullback-Leibler divergence is given by:

   KL(P1(x),P2(x)) = sum[P1(x).log(P1(x)/P2(x))]

If X contains duplicate values, there will be a warning message, and these values will be treated as distinct values. (I.e., the actual values do not enter into the computation, but the probabilities for the two duplicate values will be considered as probabilities corresponding to two unique values.) The elements of probability vectors P1 and P2 must each sum to 1 +/- .00001.

A "log of zero" warning will be thrown for zero-valued probabilities. Handle this however you wish. Adding 'eps' or some other small value to all probabilities seems reasonable. (Renormalize if necessary.)

KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic (2001).

KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article for "Kullback–Leibler divergence". This is equal to 1/2 the so-called "Jeffrey divergence." See Rubner et al. (2000).
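The 'js' option corresponds to the following computation (a Python sketch for illustration, not the MATLAB code itself; NumPy assumed):

```python
import numpy as np

def kl(p, q):
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    # [KL(P||M) + KL(Q||M)] / 2, where M = (P + Q) / 2
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    return 0.5 * (kl(p, m) + kl(q, m))
```

Unlike plain KL, this quantity is symmetric in its arguments and bounded above by log 2 (in nats), which the disjoint-support case attains.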

EXAMPLE: Let the event set and probability sets be as follow:
   X = [1 2 3 3 4]';
   P1 = ones(5,1)/5;
   P2 = [0 0 .5 .2 .3]' + eps;
Note that the event set here has duplicate values (two 3's). These will be treated as DISTINCT events by KLDIV. If you want these to be treated as the SAME event, you will need to collapse their probabilities together before running KLDIV. One way to do this is to use UNIQUE to find the set of unique events, and then iterate over that set, summing probabilities for each instance of each unique event. Here, we just leave the duplicate values to be treated independently (the default):
   KL = kldiv(X,P1,P2);
   KL =
        19.4899

Note also that we avoided the log-of-zero warning by adding 'eps' to all probability values in P2. We didn't need to renormalize because we're still within the sum-to-one tolerance.
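The example above can be reproduced outside MATLAB (a Python sketch; note that the reported value 19.4899 comes out when the logarithm is taken base 2, i.e. the result is in bits, which is inferred from the numbers rather than stated in the KLDIV description):

```python
import numpy as np

# Reproduces the KLDIV example in Python. The duplicate X values are
# treated as distinct events, so X itself does not enter the computation.
eps = np.finfo(float).eps  # same magnitude as MATLAB's eps
P1 = np.ones(5) / 5
P2 = np.array([0.0, 0.0, 0.5, 0.2, 0.3]) + eps

kl_bits = float(np.sum(P1 * np.log2(P1 / P2)))  # approximately 19.4899
```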

REFERENCES:
1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley, 1991.
2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler distance." IEEE Transactions on Information Theory (Submitted).
3) Rubner, Y., Tomasi, C., and Guibas, L. J., 2000. "The Earth Mover's distance as a metric for image retrieval." International Journal of Computer Vision, 40(2): 99-121.
4) Kullback–Leibler divergence. Wikipedia, The Free Encyclopedia. http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence

posted on 2010-10-16 15:04 by Sosi, read 10051 times, 2 comments, filed under: Taps in Research

Comments

# re: Kullback–Leibler divergence KL散度 2010-11-30 16:17 tintin0324

Blogger, my research direction requires an understanding of the KL distance, and I have some questions I would like to ask. How can I contact you?

# re: Kullback–Leibler divergence KL散度 2010-12-05 22:37 Sosi

@tintin0324
The KL distance itself is quite simple; it is defined exactly as above, and its meaning is as described. If you want a deeper understanding, you can read the relevant literature.