
O(1) 的小樂


Kullback–Leibler divergence (KL divergence)

In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. KL measures the expected number of extra bits required to code samples from P when using a code based on Q, rather than using a code based on P. Typically P represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution. The measure Q typically represents a theory, model, description, or approximation of P.

Although it is often intuited as a distance metric, the KL divergence is not a true metric – for example, the KL from P to Q is not necessarily the same as the KL from Q to P.

KL divergence is a special case of a broader class of divergences called f-divergences. Originally introduced by Solomon Kullback and Richard Leibler in 1951 as the directed divergence between two distributions, it is not the same as a divergence in calculus. However, the KL divergence can be derived from the Bregman divergence.


Note that P usually denotes the data set (the data we already have), while Q denotes the theoretical result. The physical meaning of the KL divergence is therefore the number of extra bits needed to encode samples from P using a code based on Q, compared to using a code based on P itself.

 

The KL divergence is sometimes called the KL distance, but it is not a distance in the strict sense: it is not symmetric and does not satisfy the triangle inequality.

 

The KL divergence is asymmetric. If a symmetric version is desired, one can average the two directions:

Ds(p1, p2) = [D(p1, p2) + D(p2, p1)] / 2
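The symmetrized variant above can be sketched in a few lines. This is illustrative Python; the `kl` and `kl_sym` helper names are hypothetical, not from any library:

```python
import math

def kl(p, q):
    """Plain discrete KL divergence D(P||Q), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def kl_sym(p, q):
    """Symmetrized KL: Ds(p1, p2) = [D(p1, p2) + D(p2, p1)] / 2."""
    return 0.5 * (kl(p, q) + kl(q, p))

P = [0.5, 0.5]
Q = [0.9, 0.1]
# kl(P, Q) and kl(Q, P) differ, but kl_sym gives the same value in both directions
```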

 

Below are the discrete and continuous definitions of the KL divergence:

D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}

D_{\mathrm{KL}}(P\|Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx

Note that p(x) and q(x) are the PDFs of the two random variables P and Q, and that D(P||Q) is a single number, not a function; see the figure below.

 

Figure (KL-Gauss-Example.png): the KL area to be integrated.

 
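The discrete definition translates directly into code. A minimal sketch in Python, using base-2 logarithms so the result is in bits; `kl_div` is a hypothetical helper, and the zero-handling follows the standard conventions:

```python
import math

def kl_div(p, q, base=2.0):
    """Discrete KL divergence D(P||Q) = sum_i P(i) * log(P(i) / Q(i))."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue          # convention: 0 * log(0 / q) = 0
        if qi == 0.0:
            return math.inf   # P puts mass where Q has none
        total += pi * math.log(pi / qi, base)
    return total

P = [0.5, 0.25, 0.25]
Q = [0.8, 0.1, 0.1]
print(kl_div(P, Q))   # a single number, not a function of x
print(kl_div(Q, P))   # a different number: KL is asymmetric
```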

A very useful property of the KL divergence:

The Kullback–Leibler divergence is always non-negative,

D_{\mathrm{KL}}(P\|Q) \geq 0, \,

a result known as Gibbs' inequality, with DKL(P||Q) zero if and only if P = Q.
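The non-negativity is easy to check numerically. A quick sketch over random distributions (illustrative Python, a sanity check rather than a proof):

```python
import math
import random

def kl(p, q):
    """Discrete KL divergence in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

random.seed(0)
for _ in range(1000):
    # draw two random distributions over 5 outcomes and normalize them
    p = [random.random() for _ in range(5)]
    q = [random.random() for _ in range(5)]
    sp, sq = sum(p), sum(q)
    p = [x / sp for x in p]
    q = [x / sq for x in q]
    assert kl(p, q) >= 0.0   # Gibbs' inequality holds in every trial
    assert kl(p, p) == 0.0   # and equals zero when the distributions match
```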

 

When computing the KL divergence in practice, note that on sparse data sets the computation commonly runs into zero denominators!
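A common workaround for those zeros is to add a small constant to every probability and renormalize. A sketch; the eps value and helper names here are illustrative choices, not a standard:

```python
import math

def _smooth(v, eps):
    """Add eps to every entry, then renormalize so the vector sums to 1."""
    v = [x + eps for x in v]
    s = sum(v)
    return [x / s for x in v]

def kl_smoothed(p, q, eps=1e-12):
    """KL divergence with eps-smoothing, so zeros in q stay finite."""
    p, q = _smooth(p, eps), _smooth(q, eps)
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

P = [0.5, 0.5, 0.0]
Q = [0.5, 0.0, 0.5]        # Q has a zero exactly where P has mass
print(kl_smoothed(P, Q))   # large but finite, instead of infinity
```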


The MATLAB function KLDIV computes the KL divergence between two distributions:

Description

KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.

KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two distributions specified over the M variable values in vector X. P1 is a length-M vector of probabilities representing distribution 1, and P2 is a length-M vector of probabilities representing distribution 2. Thus, the probability of value X(i) is P1(i) for distribution 1 and P2(i) for distribution 2. The Kullback-Leibler divergence is given by:

   KL(P1(x),P2(x)) = sum[P1(x).log(P1(x)/P2(x))]

If X contains duplicate values, there will be a warning message, and these values will be treated as distinct values. (I.e., the actual values do not enter into the computation, but the probabilities for the two duplicate values will be considered as probabilities corresponding to two unique values.) The elements of probability vectors P1 and P2 must each sum to 1 +/- .00001.

A "log of zero" warning will be thrown for zero-valued probabilities. Handle this however you wish. Adding 'eps' or some other small value to all probabilities seems reasonable. (Renormalize if necessary.)

KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic (2001).

KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article for "Kullback–Leibler divergence". This is equal to 1/2 the so-called "Jeffrey divergence." See Rubner et al. (2000).
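The Jensen–Shannon variant can likewise be sketched outside MATLAB. With base-2 logs it is bounded by 1 bit, and it stays finite even when the inputs have disjoint support, because the midpoint Q is nonzero wherever either input is (illustrative Python; the `js` helper is hypothetical):

```python
import math

def kl(p, q):
    """Discrete KL divergence in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    """Jensen-Shannon divergence: average KL of each input to the midpoint."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

P = [1.0, 0.0]
Q = [0.0, 1.0]
print(js(P, Q))   # 1.0 -- the maximum, reached on disjoint distributions
print(js(P, P))   # 0.0
```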

EXAMPLE: Let the event set and probability sets be as follow:
   X = [1 2 3 3 4]';
   P1 = ones(5,1)/5;
   P2 = [0 0 .5 .2 .3]' + eps;
Note that the event set here has duplicate values (two 3's). These will be treated as DISTINCT events by KLDIV. If you want these to be treated as the SAME event, you will need to collapse their probabilities together before running KLDIV. One way to do this is to use UNIQUE to find the set of unique events, and then iterate over that set, summing probabilities for each instance of each unique event. Here, we just leave the duplicate values to be treated independently (the default):
   KL = kldiv(X,P1,P2);
   KL =
        19.4899

Note also that we avoided the log-of-zero warning by adding 'eps' to all probability values in P2. We didn't need to renormalize because we're still within the sum-to-one tolerance.
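The example above can be reproduced outside MATLAB. A sketch in Python, assuming (as the result 19.4899 implies) that KLDIV uses base-2 logarithms; MATLAB's `eps` for double precision is 2^-52:

```python
import math

EPS = 2.0 ** -52              # MATLAB's eps for double precision

P1 = [0.2] * 5                # uniform over the 5 (distinct) events
P2 = [0.0, 0.0, 0.5, 0.2, 0.3]
P2 = [p + EPS for p in P2]    # avoid log-of-zero, as in the example

kl = sum(p1 * math.log2(p1 / p2) for p1, p2 in zip(P1, P2))
print(round(kl, 4))           # 19.4899, matching KLDIV's result
```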

REFERENCES:
1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley, 1991.
2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler distance." IEEE Transactions on Information Theory (Submitted).
3) Rubner, Y., Tomasi, C., and Guibas, L. J., 2000. "The Earth Mover's distance as a metric for image retrieval." International Journal of Computer Vision, 40(2): 99-121.
4) Kullback–Leibler divergence. Wikipedia, The Free Encyclopedia. http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence

posted on 2010-10-16 15:04 by Sosi | Views (10051) | Comments (2) | Category: Taps in Research

Comments

# re: Kullback–Leibler divergence 2010-11-30 16:17 tintin0324

Blogger, my research direction requires an understanding of the KL distance. I have a few questions I would like to ask; how can I contact you?

# re: Kullback–Leibler divergence 2010-12-05 22:37 Sosi

@tintin0324
The KL distance itself is quite simple; it is defined just as above, and its meaning is as described. If you want a deeper understanding, you can read the related references.