

Standard random forest: pick K random features, enumerate all observed values of each as candidate splits, and take the best split.

            LINKS:https://en.wikipedia.org/wiki/Random_forest
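To make the "random K features, enumerate all values" step concrete, here is a minimal sketch in Python/NumPy. The function name and the weighted-variance (regression) criterion are my own choices for illustration, not taken from any particular library.

    import numpy as np

    def best_split_random_forest(X, y, k):
        """Random-forest style split: try K random features, enumerate every
        observed value of each feature as a threshold, and keep the split with
        the lowest weighted variance of the two children (regression criterion)."""
        n_samples, n_features = X.shape
        features = np.random.choice(n_features, size=k, replace=False)
        best = (None, None, np.inf)                # (feature, threshold, score)
        for f in features:
            for threshold in np.unique(X[:, f]):   # enumerate all observed values
                left, right = y[X[:, f] <= threshold], y[X[:, f] > threshold]
                if len(left) == 0 or len(right) == 0:
                    continue
                # weighted variance of the children: smaller is better
                score = (len(left) * left.var() + len(right) * right.var()) / n_samples
                if score < best[2]:
                    best = (f, threshold, score)
        return best

    # tiny usage example on random data
    X, y = np.random.rand(100, 10), np.random.rand(100)
    print(best_split_random_forest(X, y, k=3))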


Extremely randomized trees: pick K random features, draw one random split value per feature, and take the best of those splits.
An ensemble of extremely randomized trees uses all of the training data (no bagging).

            LINKS:http://docs.opencv.org/2.4/modules/ml/doc/ertrees.html
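For contrast, the same sketch with the extremely randomized split: the only change is that each of the K features gets one threshold drawn uniformly at random between its minimum and maximum, instead of an exhaustive search over all observed values. Same hypothetical function style and criterion as above.

    import numpy as np

    def best_split_extra_trees(X, y, k):
        """Extra-trees style split: K random features, ONE random threshold per
        feature (uniform between the feature's min and max), keep the best."""
        n_samples, n_features = X.shape
        features = np.random.choice(n_features, size=k, replace=False)
        best = (None, None, np.inf)                # (feature, threshold, score)
        for f in features:
            lo, hi = X[:, f].min(), X[:, f].max()
            if lo == hi:                           # constant feature, cannot split
                continue
            threshold = np.random.uniform(lo, hi)  # random split value, no search
            left, right = y[X[:, f] <= threshold], y[X[:, f] > threshold]
            if len(left) == 0 or len(right) == 0:
                continue
            score = (len(left) * left.var() + len(right) * right.var()) / n_samples
            if score < best[2]:
                best = (f, threshold, score)
        return best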

            1. Extremely randomized trees don’t apply the bagging procedure to construct a set of the training samples for each tree. The same input training set is used to train all trees.
            2. Extremely randomized trees pick a node split very extremely (both a variable index and variable splitting value are chosen randomly), whereas Random Forest finds the best split (optimal one by variable index and variable splitting value) among random subset of variables.

Extremely randomized trees use all of the samples as the training set; extremely randomized trees randomly pick one feature and one value as the split criterion.

              LINKS:http://scikit-learn.org/stable/modules/generated/sklearn.tree.ExtraTreeRegressor.html#sklearn.tree.ExtraTreeRegressor

This class implements a meta estimator that fits a number of randomized decision trees (a.k.a. extra-trees) on various sub-samples of the dataset and uses averaging to improve the predictive accuracy and control over-fitting.

Extra-trees differ from classic decision trees in the way they are built. When looking for the best split to separate the samples of a node into two groups, random splits are drawn for each of the max_features randomly selected features and the best split among those is chosen. When max_features is set to 1, this amounts to building a totally random decision tree.
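A minimal usage sketch of the estimators being described (the data here is random and only for illustration). In scikit-learn, ExtraTreesRegressor defaults to bootstrap=False, so every tree sees the full training set, while RandomForestRegressor bootstraps by default.

    import numpy as np
    from sklearn.ensemble import ExtraTreesRegressor, RandomForestRegressor

    X, y = np.random.rand(200, 10), np.random.rand(200)

    # extra-trees: random thresholds, all samples per tree (bootstrap=False by default)
    et = ExtraTreesRegressor(n_estimators=100, max_features=3, random_state=0).fit(X, y)

    # random forest: exhaustive threshold search, bootstrap samples by default
    rf = RandomForestRegressor(n_estimators=100, max_features=3, random_state=0).fit(X, y)

    print(et.predict(X[:5]), rf.predict(X[:5]))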

The extra-trees ensemble uses bagging; several features are then selected, and for each feature a random value is chosen as the split criterion to build the tree.

One way to implement it: bag the samples, pick n random features and k random values, take the best of those splits, and build the tree (see the sketch below).
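One way to get exactly this combination in scikit-learn is to wrap a single ExtraTreeRegressor (random features, one random threshold each) in a BaggingRegressor (bootstrap sample per tree). This pairing is only an illustrative sketch, not the canonical implementation; the keyword is estimator in scikit-learn >= 1.2 and base_estimator in older releases.

    import numpy as np
    from sklearn.ensemble import BaggingRegressor
    from sklearn.tree import ExtraTreeRegressor

    X, y = np.random.rand(200, 10), np.random.rand(200)

    # each base tree uses random features and one random threshold per feature;
    # BaggingRegressor draws a bootstrap sample of the rows for every tree
    model = BaggingRegressor(
        estimator=ExtraTreeRegressor(max_features=3),  # base_estimator= before sklearn 1.2
        n_estimators=50,
        bootstrap=True,
        random_state=0,
    ).fit(X, y)

    print(model.predict(X[:5]))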

posted on 2016-02-28 21:01 by bigrabbit