
Expectation-Maximization (EM) Algorithm

     In statistics, an expectation-maximization (EM) algorithm is a method for finding maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models, where the model depends on unobserved latent variables. EM is an iterative method that alternates between an expectation (E) step, which computes the expectation of the log-likelihood evaluated using the current estimate of the parameters, and a maximization (M) step, which computes the parameters maximizing the expected log-likelihood found in the E step. These parameter estimates are then used to determine the distribution of the latent variables in the next E step.

 

  The EM algorithm provides a framework for many problems in which we need to estimate a set of parameters \boldsymbol\theta describing a probability distribution, given only the observable portion of the full data that the distribution generates.

  The EM algorithm is an iterative algorithm built from two basic steps:

  E step (expectation step):

  Use the current estimate of the parameters to compute the expected log-likelihood, where the expectation is taken over the latent variables Y given the observed data.

  M step (maximization step):

  Compute a better estimate of the parameters by maximizing the expected log-likelihood found in the E step. The newly computed parameters replace the previous estimate, and the next iteration begins.
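As a rough sketch of how this alternation looks in code (the helper names `e_step` and `m_step` are hypothetical placeholders; concrete versions for a Gaussian mixture appear further below):

```python
def run_em(x, theta, e_step, m_step, max_iter=100, tol=1e-6):
    """Alternate E and M steps until the parameters stop moving.

    theta is assumed to be a flat sequence of scalar parameters;
    e_step / m_step are model-specific callables supplied by the caller.
    """
    for _ in range(max_iter):
        expectations = e_step(x, theta)        # E step: expected statistics of Y
        theta_new = m_step(x, expectations)    # M step: re-estimate the parameters
        converged = max(abs(a - b) for a, b in zip(theta_new, theta)) < tol
        theta = theta_new
        if converged:
            break
    return theta
```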

 

 

Observed data: i.i.d. samples of the observed random variable X:

X = \{x_1, x_2, \ldots, x_N\}

Missing data: values of the unobserved latent variable Y:

Y = \{y_1, y_2, \ldots, y_N\}

Complete data: the data containing both the observed random variable X and the unobserved random variable Y, Z = (X, Y).
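To make the observed/missing distinction concrete, here is a small simulated example; the two-component one-dimensional Gaussian mixture (its weights, means, and standard deviations) is my own illustrative assumption, not something specified in these notes:

```python
import random

random.seed(0)

# Assumed two-component 1-D Gaussian mixture (illustrative values only).
mixing, means, stds = [0.4, 0.6], [-2.0, 3.0], [1.0, 1.5]

# Complete data Z = (X, Y): Y is the component label, X the value drawn from it.
Y = [0 if random.random() < mixing[0] else 1 for _ in range(200)]  # latent, unobserved
X = [random.gauss(means[y], stds[y]) for y in Y]                   # observed samples

# In practice only X is available; EM must reason about the unknown Y.
```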

 

Likelihood function (several equivalent ways of writing it):

L(\boldsymbol\theta; X) = p(X \mid \boldsymbol\theta) = \sum_{Y} p(X, Y \mid \boldsymbol\theta) = \sum_{Y} p(X \mid Y, \boldsymbol\theta)\, p(Y \mid \boldsymbol\theta)

The log-likelihood is then:

\log L(\boldsymbol\theta; X) = \log \sum_{Y} p(X, Y \mid \boldsymbol\theta)
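Under the same assumed mixture model, the observed-data log-likelihood sums over the possible values of Y inside the logarithm for each sample:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def log_likelihood(X, mixing, means, stds):
    """log L(theta; X) = sum_i log sum_k pi_k * N(x_i | mu_k, sigma_k)."""
    return sum(math.log(sum(p * gaussian_pdf(x, m, s)
                            for p, m, s in zip(mixing, means, stds)))
               for x in X)
```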

E step: using the current parameter estimate \boldsymbol\theta^{(t)}, compute the expectation of the complete-data log-likelihood with respect to the latent variable Y:

  Q(\boldsymbol\theta \mid \boldsymbol\theta^{(t)}) = \operatorname{E}_{Y \mid X, \boldsymbol\theta^{(t)}}\left[ \log L(\boldsymbol\theta; X, Y) \right]

where Bayes' rule is needed to obtain the posterior of Y:

  p(Y \mid X, \boldsymbol\theta^{(t)}) = \frac{p(X, Y \mid \boldsymbol\theta^{(t)})}{p(X \mid \boldsymbol\theta^{(t)})} = \frac{p(X \mid Y, \boldsymbol\theta^{(t)})\, p(Y \mid \boldsymbol\theta^{(t)})}{\sum_{Y'} p(X \mid Y', \boldsymbol\theta^{(t)})\, p(Y' \mid \boldsymbol\theta^{(t)})}
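For the assumed mixture, this posterior is simply the per-sample responsibility of each component, computed exactly as in the Bayes formula above (reusing the `gaussian_pdf` sketch from earlier):

```python
def e_step(X, mixing, means, stds):
    """E step: posterior p(y = k | x, theta) for every sample, via Bayes' rule."""
    responsibilities = []
    for x in X:
        joint = [p * gaussian_pdf(x, m, s)                # p(x, y=k | theta)
                 for p, m, s in zip(mixing, means, stds)]
        evidence = sum(joint)                             # p(x | theta)
        responsibilities.append([j / evidence for j in joint])
    return responsibilities
```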

M step: maximize this expectation to obtain a better estimate of the parameters:

  \boldsymbol\theta^{(t+1)} = \underset{\boldsymbol\theta}{\operatorname{arg\,max}}\; Q(\boldsymbol\theta \mid \boldsymbol\theta^{(t)})
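For the assumed Gaussian mixture, maximizing Q has a closed form: the parameters are re-estimated as weighted averages with the E-step responsibilities as weights (a sketch, not the notes' own code):

```python
def m_step(X, responsibilities):
    """M step: re-estimate (mixing, means, stds) from the soft assignments."""
    K = len(responsibilities[0])
    counts = [sum(r[k] for r in responsibilities) for k in range(K)]   # soft counts N_k
    mixing = [c / len(X) for c in counts]
    means = [sum(r[k] * x for r, x in zip(responsibilities, X)) / counts[k]
             for k in range(K)]
    stds = [math.sqrt(sum(r[k] * (x - means[k]) ** 2
                          for r, x in zip(responsibilities, X)) / counts[k])
            for k in range(K)]
    return mixing, means, stds
```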

 

Wikipedia states it as follows:

Given a statistical model consisting of a set \mathbf{X} of observed data, a set of unobserved latent data or missing values Y, and a vector of unknown parameters \boldsymbol\theta, along with a likelihood function L(\boldsymbol\theta; \mathbf{X}, Y) = p(\mathbf{X}, Y \mid \boldsymbol\theta), the maximum likelihood estimate (MLE) of the unknown parameters is determined by the marginal likelihood of the observed data:

       L(\boldsymbol\theta; \mathbf{X}) = p(\mathbf{X} \mid \boldsymbol\theta) = \sum_{Y} p(\mathbf{X}, Y \mid \boldsymbol\theta)

However, this quantity is often intractable.

The EM algorithm seeks to find the MLE of the marginal likelihood by iteratively applying the following two steps:

Expectation step (E-step): Calculate the expected value of the log likelihood function, with respect to the conditional distribution of Y given \mathbf{X} under the current estimate of the parameters \boldsymbol\theta^{(t)}:

       Q(\boldsymbol\theta \mid \boldsymbol\theta^{(t)}) = \operatorname{E}_{Y \mid \mathbf{X}, \boldsymbol\theta^{(t)}}\left[ \log L(\boldsymbol\theta; \mathbf{X}, Y) \right]

Maximization step (M-step): Find the parameter that maximizes this quantity:
       \boldsymbol\theta^{(t+1)} = \underset{\boldsymbol\theta}{\operatorname{arg\,max}}\; Q(\boldsymbol\theta \mid \boldsymbol\theta^{(t)})

Note that in typical models to which EM is applied:

  1. The observed data points \mathbf{X} may be discrete (taking one of a fixed number of values, or taking values that must be integers) or continuous (taking a continuous range of real numbers, possibly infinite). There may in fact be a vector of observations associated with each data point.
  2. The missing values (aka latent variables) Y are discrete, drawn from a fixed number of values, and there is one latent variable per observed data point.
  3. The parameters are continuous, and are of two kinds: Parameters that are associated with all data points, and parameters associated with a particular value of a latent variable (i.e. associated with all data points whose corresponding latent variable has a particular value).

However, it is possible to apply EM to other sorts of models.

The motivation is as follows. If we know the value of the parameters \boldsymbol\theta, we can usually find the value of the latent variables Y by maximizing the log-likelihood over all possible values of Y, either simply by iterating over Y or through an algorithm such as the Viterbi algorithm for hidden Markov models. Conversely, if we know the value of the latent variables Y, we can find an estimate of the parameters \boldsymbol\theta fairly easily, typically by simply grouping the observed data points according to the value of the associated latent variable and averaging the values, or some function of the values, of the points in each group. This suggests an iterative algorithm, in the case where both \boldsymbol\theta and Y are unknown:

  1. First, initialize the parameters \boldsymbol\theta to some random values.
  2. Compute the best value for Y given these parameter values.
  3. Then, use the just-computed values of Y to compute a better estimate for the parameters \boldsymbol\theta. Parameters associated with a particular value of Y will use only those data points whose associated latent variable has that value.
  4. Finally, iterate until convergence.

The algorithm as just described will in fact work, and is commonly called hard EM. The K-means algorithm is an example of this class of algorithms.
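A hard-EM variant of the earlier E-step sketch would simply commit to the single most probable component for each sample (a 0/1 indicator instead of a posterior); K-means is the special case with fixed, equal spherical variances:

```python
def hard_e_step(X, mixing, means, stds):
    """Hard EM: assign each sample entirely to its most probable component."""
    assignments = []
    for x in X:
        joint = [p * gaussian_pdf(x, m, s) for p, m, s in zip(mixing, means, stds)]
        best = max(range(len(joint)), key=lambda k: joint[k])
        assignments.append([1.0 if k == best else 0.0 for k in range(len(joint))])
    return assignments
```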

However, we can do somewhat better: rather than making a hard choice for Y given the current parameter values and averaging only over the set of data points associated with a particular value of Y, we can determine the probability of each possible value of Y for each data point, and then use these probabilities to compute a weighted average over the entire set of data points. The resulting algorithm is commonly called soft EM, and is the type of algorithm normally associated with EM. The counts used to compute these weighted averages are called soft counts (as opposed to the hard counts used in a hard-EM-type algorithm such as K-means). The probabilities computed for Y are posterior probabilities and are what is computed in the E-step. The soft counts used to compute new parameter values are what is computed in the M-step.

Summary:

EM is frequently used for data clustering in machine learning and computer vision.

EM converges to a local optimum, but convergence to the global optimum is not guaranteed.

EM is fairly sensitive to the initial values, so a good, fast initialization procedure is usually needed.
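One common workaround (my addition, not from the notes) is to run EM from several random initializations and keep the run with the highest final log-likelihood, reusing the sketches above (no numerical safeguards; illustration only):

```python
def fit_best_of(X, n_restarts=5, n_iter=50):
    """Run soft EM several times with random starts; keep the best fit found."""
    best, best_ll = None, float("-inf")
    for _ in range(n_restarts):
        # Crude random initialization: two data points as means, unit variances.
        mixing, means, stds = [0.5, 0.5], [random.choice(X), random.choice(X)], [1.0, 1.0]
        for _ in range(n_iter):
            mixing, means, stds = m_step(X, e_step(X, mixing, means, stds))
        ll = log_likelihood(X, mixing, means, stds)
        if ll > best_ll:
            best, best_ll = (mixing, means, stds), ll
    return best, best_ll
```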

 

These notes are from my Machine Learning course; I'll stop the summary here. The next task is a write-up of GM_EM: multivariate Gaussian density estimation.

posted on 2010-10-20 14:44 by Sosi · Views (2524) · Comments (0) · Category: Courses
