O(1) 的小樂

k-means clustering

      In statistics and machine learning, k-means clustering is a method of cluster analysis which aims to partition n observations into k clusters in which each observation belongs to the cluster with the nearest mean. It is similar to the expectation-maximization algorithm for mixtures of Gaussians in that they both attempt to find the centers of natural clusters in the data, and in the iterative refinement approach employed by both algorithms.

 

Description

Given a set of observations (x1, x2, …, xn), where each observation is a d-dimensional real vector, k-means clustering aims to partition the n observations into k sets (k < n) S = {S1, S2, …, Sk} so as to minimize the within-cluster sum of squares (WCSS):

\underset{\mathbf{S}} \operatorname{arg\,min} \sum_{i=1}^{k} \sum_{\mathbf x_j \in S_i} \left\| \mathbf x_j - \boldsymbol\mu_i \right\|^2

where μi is the mean of points in Si.
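The WCSS objective above can be evaluated directly in code. A minimal sketch in Python/NumPy (the function name `wcss` and the sample data are my own, for illustration):

```python
import numpy as np

def wcss(X, labels, centers):
    """Within-cluster sum of squares: total squared Euclidean
    distance from each point to its assigned cluster mean."""
    return sum(
        np.sum((X[labels == i] - c) ** 2)
        for i, c in enumerate(centers)
    )

# Four points in two tight clusters around means (0, 1) and (10, 1):
X = np.array([[0.0, 0], [0, 2], [10, 0], [10, 2]])
labels = np.array([0, 0, 1, 1])
centers = np.array([[0.0, 1], [10, 1]])
print(wcss(X, labels, centers))  # each point is at distance 1 -> 4.0
```

This is exactly the quantity the arg min above asks us to minimize over all partitions S.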

 

Algorithms

Regarding computational complexity, the k-means clustering problem is:

  • NP-hard in a general Euclidean space (of dimension d), even for 2 clusters [4][5]
  • NP-hard for a general number of clusters k, even in the plane [6]
  • If k and d are fixed, the problem can be solved exactly in time O(n^{dk+1} log n), where n is the number of entities to be clustered [7]

Thus, a variety of heuristic algorithms are generally used.

 

Note, then, that the problem is a classic NP-hard problem, which is why heuristic methods are usually used in practice.

Standard algorithm

The most common algorithm uses an iterative refinement technique.

Due to its ubiquity it is often called the k-means algorithm; it is also referred to as Lloyd's algorithm, particularly in the computer science community.

Given an initial set of k means m1(1),…,mk(1), which may be specified randomly or by some heuristic, the algorithm proceeds by alternating between two steps:[8]

Assignment step: Assign each observation to the cluster with the closest mean (i.e. partition the observations according to the Voronoi diagram generated by the means; the norm here is the 2-norm, i.e. Euclidean distance, which is what makes this partition a Voronoi diagram).
S_i^{(t)} = \left\{ \mathbf x_j : \big\| \mathbf x_j - \mathbf m^{(t)}_i \big\| \leq \big\| \mathbf x_j - \mathbf m^{(t)}_{i^*} \big\| \text{ for all }i^*=1,\ldots,k \right\}
 
Update step: Calculate the new means to be the centroid of the observations in the cluster.
\mathbf m^{(t+1)}_i = \frac{1}{|S^{(t)}_i|} \sum_{\mathbf x_j \in S^{(t)}_i} \mathbf x_j

The algorithm is deemed to have converged when the assignments no longer change.
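The two steps above translate almost line-for-line into code. A minimal sketch in Python/NumPy (the function name `k_means` and the optional `init` parameter, which pins down the initial centers so the example is deterministic, are my own):

```python
import numpy as np

def k_means(X, k, init=None, n_iter=100, seed=0):
    """Plain Lloyd iteration: alternate the assignment and update
    steps until the assignments no longer change."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    idx = init if init is not None else rng.choice(len(X), size=k, replace=False)
    centers = X[np.asarray(idx)].copy()
    labels = None
    for _ in range(n_iter):
        # Assignment step: nearest center in the Euclidean norm
        # (this is the Voronoi partition induced by the centers).
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        new_labels = d.argmin(axis=1)
        if labels is not None and np.array_equal(new_labels, labels):
            break  # converged: assignments no longer change
        labels = new_labels
        # Update step: each center becomes the centroid of its cluster
        # (an empty cluster keeps its old center).
        for i in range(k):
            pts = X[labels == i]
            if len(pts):
                centers[i] = pts.mean(axis=0)
    return labels, centers

# Two well-separated pairs of points; seeding the centers with one
# point from each pair makes convergence immediate and deterministic.
X = [[0, 0], [0, 1], [10, 0], [10, 1]]
labels, centers = k_means(X, k=2, init=[0, 2])
print(labels)   # [0 0 1 1]
print(centers)  # rows: [0, 0.5] and [10, 0.5]
```

Because the result depends on the initial centers, it is common to run this several times with different seeds and keep the run with the lowest WCSS.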

 

The entire algorithm is simply this alternation of the two steps until convergence.

 

As it is a heuristic algorithm, there is no guarantee that it will converge to the global optimum, and the result may depend on the initial clusters. As the algorithm is usually very fast, it is common to run it multiple times with different starting conditions. However, in the worst case, k-means can be very slow to converge: in particular it has been shown that there exist certain point sets, even in 2 dimensions, on which k-means takes exponential time, that is 2^{Ω(n)}, to converge[9][10]. These point sets do not seem to arise in practice: this is corroborated by the fact that the smoothed running time of k-means is polynomial[11].

最壞的時(shí)間復(fù)雜度是O(2Ω(n)),但是在實(shí)踐中,一般表現(xiàn)是一個(gè)多項(xiàng)式算法。

The "assignment" step is also referred to as expectation step, the "update step" as maximization step, making this algorithm a variant of the generalized expectation-maximization algorithm.

Variations

  • The expectation-maximization algorithm (EM algorithm) maintains probabilistic assignments to clusters, instead of deterministic assignments, and multivariate Gaussian distributions instead of means.
  • k-means++ seeks to choose better starting clusters.
  • The filtering algorithm uses kd-trees to speed up each k-means step.[12]
  • Some methods attempt to speed up each k-means step using coresets[13] or the triangle inequality.[14]
  • Escape local optima by swapping points between clusters.[15]
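For example, the k-means++ seeding mentioned above spreads the initial centers out by sampling each new center with probability proportional to its squared distance from the nearest center already chosen. A minimal sketch in Python/NumPy (the function name `kmeans_pp_init` is my own):

```python
import numpy as np

def kmeans_pp_init(X, k, seed=0):
    """k-means++ seeding: pick the first center uniformly at random,
    then each further center with probability proportional to the
    squared distance from the nearest center chosen so far."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    centers = [X[rng.integers(len(X))]]
    for _ in range(1, k):
        # Squared distance from each point to its nearest chosen center.
        d2 = np.min(
            [np.sum((X - c) ** 2, axis=1) for c in centers], axis=0
        )
        # Already-chosen points have d2 = 0, hence probability 0.
        centers.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    return np.array(centers)
```

The returned centers can then be fed to the standard Lloyd iteration in place of uniformly random starting points.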

Discussion

[Figure: k-means clustering result for the Iris flower data set and actual species visualized using ELKI. Cluster means are marked using larger, semi-transparent symbols.]

[Figure: k-means clustering and EM clustering on an artificial dataset ("mouse"). The tendency of k-means to produce equal-sized clusters leads to bad results, while EM benefits from the Gaussian distributions present in the data set.]

The two key features of k-means which make it efficient are often regarded as its biggest drawbacks:

A key limitation of k-means is its cluster model. The concept is based on spherical clusters that are separable in a way so that the mean value converges towards the cluster center. The clusters are expected to be of similar size, so that the assignment to the nearest cluster center is the correct assignment. When for example applying k-means with a value of k = 3 onto the well-known Iris flower data set, the result often fails to separate the three Iris species contained in the data set. With k = 2, the two visible clusters (one containing two species) will be discovered, whereas with k = 3 one of the two clusters will be split into two even parts. In fact, k = 2 is more appropriate for this data set, despite the data set containing 3 classes. As with any other clustering algorithm, the k-means result relies on the data set to satisfy the assumptions made by the clustering algorithm. It works very well on some data sets, while failing miserably on others.

The result of k-means can also be seen as the Voronoi cells of the cluster means. Since data is split halfway between cluster means, this can lead to suboptimal splits as can be seen in the "mouse" example. The Gaussian models used by the Expectation-maximization algorithm (which can be seen as a generalization of k-means) are more flexible here by having both variances and covariances. The EM result is thus able to accommodate clusters of variable size much better than k-means as well as correlated clusters (not in this example).

 

This post introduces the concepts; code and a paper on an optimized k-means will follow:

Fast Hierarchical Clustering Algorithm Using Locality-Sensitive Hashing

posted on 2010-10-19 18:57 by Sosi · views (1606) · comments (0) · category: Courses
