Kullback–Leibler divergence (KL divergence)

In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. KL divergence measures the expected number of extra bits required to code samples from P when using a code based on Q, rather than using a code based on P. Typically P represents the "true" distribution of data, observations, or a precisely calculated theoretical distribution. The measure Q typically represents a theory, model, description, or approximation of P.

Although it is often intuited as a distance metric, the KL divergence is not a true metric – for example, the KL from P to Q is not necessarily the same as the KL from Q to P.

KL divergence is a special case of a broader class of divergences called f-divergences. Originally introduced by Solomon Kullback and Richard Leibler in 1951 as the directed divergence between two distributions, it is not the same as the notion of divergence in calculus. However, the KL divergence can be derived from the Bregman divergence.

Note: P usually denotes the data set we already have, and Q denotes the theoretical result, so the physical meaning of the KL divergence is the number of extra bits needed to encode samples drawn from P when using a code based on Q instead of a code based on P.
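To make this extra-bits reading concrete, here is a minimal sketch (the distributions are made up for illustration, not from the original post); using log base 2 expresses the result in bits:

   % Hypothetical "true" distribution P and model distribution Q over 3 symbols
   P = [0.5 0.25 0.25];
   Q = [1/3 1/3 1/3];
   % Discrete KL divergence in bits (log base 2)
   KL = sum(P .* log2(P ./ Q))   % ~0.085 extra bits per symbol when
                                 % coding samples from P with a Q-based code

Swapping the roles of P and Q gives a different number, which is exactly the asymmetry discussed next.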

 

The KL divergence is sometimes called the KL distance, but it is not a distance in the strict sense: it does not satisfy the triangle inequality.

The KL divergence is also asymmetric. If a symmetric version is desired, one can use

D_{\mathrm{sym}}(P\|Q) = \frac{1}{2}\left[ D_{\mathrm{KL}}(P\|Q) + D_{\mathrm{KL}}(Q\|P) \right]
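As a quick sketch (reusing the hypothetical P and Q from the snippet above), the symmetrized version is just the average of the two directed divergences:

   % Symmetrized KL divergence: average the two directions
   Dpq  = sum(P .* log2(P ./ Q));
   Dqp  = sum(Q .* log2(Q ./ P));
   Dsym = (Dpq + Dqp) / 2

Note that this symmetrized form is still not a metric: it is non-negative and symmetric, but it does not satisfy the triangle inequality.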

 

Below are the discrete and continuous definitions of the KL divergence:

D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}

D_{\mathrm{KL}}(P\|Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} \, dx

Note that p(x) and q(x) are the probability density functions of the two random variables P and Q, and that D(P‖Q) is a single number, not a function; see the figure below.

 

[Figure: KL-Gauss-Example.png — two probability densities with the shaded "KL area to be integrated".]

A very powerful property of the KL divergence:

The Kullback–Leibler divergence is always non-negative,

D_{\mathrm{KL}}(P\|Q) \geq 0

a result known as Gibbs' inequality, with D_{\mathrm{KL}}(P\|Q) zero if and only if P = Q.
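The standard one-line argument (not spelled out in the original post) applies Jensen's inequality to the strictly convex function -\log:

D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \left( -\log \frac{Q(i)}{P(i)} \right) \geq -\log \sum_i P(i) \, \frac{Q(i)}{P(i)} = -\log \sum_i Q(i) = 0

with equality exactly when Q(i)/P(i) is constant over the support of P, i.e. when P = Q.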

 

When computing the KL divergence, note that on sparse data sets the computation often runs into a zero denominator, i.e. outcomes where Q(i) = 0 while P(i) > 0.
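One common workaround, in the same spirit as the eps trick described in the KLDIV documentation below, is additive smoothing: shift a little probability mass onto every outcome and renormalize. A minimal sketch with made-up numbers:

   % Hypothetical sparse model Q with zero-probability outcomes
   P = [0.2 0.2 0.2 0.2 0.2];
   Q = [0   0   0.5 0.2 0.3];
   % Additive (Laplace-style) smoothing, then renormalize to sum to 1
   alpha = 1e-6;
   Qs = (Q + alpha) / sum(Q + alpha);
   KL = sum(P .* log2(P ./ Qs))   % finite now; no log-of-zero terms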

 

 

The MATLAB function KLDIV gives the KL divergence between two distributions:

Description

KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.

KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two distributions specified over the M variable values in vector X. P1 is a length-M vector of probabilities representing distribution 1, and P2 is a length-M vector of probabilities representing distribution 2. Thus, the probability of value X(i) is P1(i) for distribution 1 and P2(i) for distribution 2. The Kullback-Leibler divergence is given by:

   KL(P1(x),P2(x)) = sum[P1(x).log(P1(x)/P2(x))]

If X contains duplicate values, there will be a warning message, and these values will be treated as distinct values. (That is, the actual values do not enter into the computation, but the probabilities for the two duplicate values will be considered as probabilities corresponding to two unique values.) The elements of probability vectors P1 and P2 must each sum to 1 +/- .00001.

A "log of zero" warning will be thrown for zero-valued probabilities. Handle this however you wish. Adding 'eps' or some other small value to all probabilities seems reasonable. (Renormalize if necessary.)

KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic (2001).

KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article for "Kullback–Leibler divergence". This is equal to 1/2 the so-called "Jeffrey divergence." See Rubner et al. (2000).
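The Jensen–Shannon variant can also be computed directly from the formula just quoted. A minimal sketch (the distributions are made up for illustration):

   % Jensen-Shannon divergence via the mixture M = (P1 + P2)/2
   P1 = [0.5  0.25 0.25];
   P2 = [0.25 0.25 0.5];
   M  = (P1 + P2) / 2;
   JS = (sum(P1 .* log2(P1 ./ M)) + sum(P2 .* log2(P2 ./ M))) / 2
   % JS is symmetric and, with base-2 logs, always lies in [0, 1]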

EXAMPLE: Let the event set and probability sets be as follow:
   X = [1 2 3 3 4]';
   P1 = ones(5,1)/5;
   P2 = [0 0 .5 .2 .3]' + eps;
Note that the event set here has duplicate values (two 3's). These will be treated as DISTINCT events by KLDIV. If you want these to be treated as the SAME event, you will need to collapse their probabilities together before running KLDIV. One way to do this is to use UNIQUE to find the set of unique events, and then iterate over that set, summing probabilities for each instance of each unique event. Here, we just leave the duplicate values to be treated independently (the default):
   KL = kldiv(X,P1,P2);
   KL =
        19.4899

Note also that we avoided the log-of-zero warning by adding 'eps' to all probability values in P2. We didn't need to renormalize because we're still within the sum-to-one tolerance.
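For completeness, here is one way to do the duplicate-collapsing step the documentation describes, using UNIQUE plus ACCUMARRAY. This is only a sketch: the variable names are mine, and it assumes the kldiv function from this package is on the path:

   % Collapse duplicate events in X by summing their probabilities
   X  = [1 2 3 3 4]';
   P1 = ones(5,1)/5;
   P2 = [0 0 .5 .2 .3]' + eps;
   [Xu, ~, idx] = unique(X);     % unique events; idx maps X onto Xu
   P1u = accumarray(idx, P1);    % sum P1 over each unique event
   P2u = accumarray(idx, P2);
   KL  = kldiv(Xu, P1u, P2u)     % the two 3's now count as one event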

REFERENCES:
1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley, 1991.
2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler distance." IEEE Transactions on Information Theory (Submitted).
3) Rubner, Y., Tomasi, C., and Guibas, L. J., 2000. "The Earth Mover's distance as a metric for image retrieval." International Journal of Computer Vision, 40(2): 99-121.
4) "Kullback–Leibler divergence." Wikipedia, The Free Encyclopedia. http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence

posted on 2010-10-16 15:04 by Sosi, filed under: Taps in Research

Comments

# re: Kullback–Leibler divergence (KL divergence) 2010-11-30 16:17 tintin0324

Blogger, my research direction requires an understanding of the KL distance, and I have some questions I would like to ask. How can I contact you?

# re: Kullback–Leibler divergence (KL divergence) 2010-12-05 22:37 Sosi

@tintin0324
The KL distance itself is quite simple. It is defined exactly as above, and its meaning is as described. If you want to understand it more deeply, you can read the related literature.