            Standard random forest: pick K features at random, enumerate all observed values of each feature as candidate splits, and find the best split.

            LINKS:https://en.wikipedia.org/wiki/Random_forest
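
As a concrete illustration, here is a minimal NumPy sketch of that node-splitting procedure for the regression case (my own code, not taken from the linked article; the function name and the weighted-variance impurity are illustrative choices):

import numpy as np

def best_split_random_forest(X, y, K, seed=0):
    """Standard random-forest split: K random features, every observed value
    of each feature tried as a threshold, best (lowest impurity) split kept."""
    rng = np.random.default_rng(seed)
    features = rng.choice(X.shape[1], size=K, replace=False)   # random K features
    best_feat, best_thr, best_imp = None, None, np.inf
    for f in features:
        for thr in np.unique(X[:, f]):                         # enumerate all values as splits
            left, right = y[X[:, f] <= thr], y[X[:, f] > thr]
            if len(left) == 0 or len(right) == 0:
                continue
            imp = len(left) * left.var() + len(right) * right.var()  # weighted variance
            if imp < best_imp:
                best_feat, best_thr, best_imp = f, thr, imp
    return best_feat, best_thr, best_imp

X = np.random.rand(200, 5)
y = X[:, 0] + 0.1 * np.random.randn(200)
print(best_split_random_forest(X, y, K=3))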


            Extremely randomized trees: pick K features at random, draw one random split value for each, and take the best split among them.
            An ensemble of extremely randomized trees trains every tree on all of the data (no bagging).

            LINKS:http://docs.opencv.org/2.4/modules/ml/doc/ertrees.html

            1. Extremely randomized trees don’t apply the bagging procedure to construct a set of the training samples for each tree. The same input training set is used to train all trees.
            2. Extremely randomized trees pick a node split very extremely (both a variable index and variable splitting value are chosen randomly), whereas Random Forest finds the best split (optimal one by variable index and variable splitting value) among a random subset of variables.

              Extremely randomized trees use all of the samples as the training set; they pick a feature and a split value at random as the splitting criterion.
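
For contrast, a matching sketch of the extra-trees split (again my own illustration, following the descriptions above): only one random threshold is drawn per randomly chosen feature, and the best of those K candidates is kept. With K = 1 both the feature and the value end up random, which is the fully "extreme" case described in the OpenCV text.

import numpy as np

def best_split_extra_trees(X, y, K, seed=0):
    """Extra-trees split: K random features, ONE random threshold per feature,
    best of those K candidates kept.  No bagging: callers pass the full data."""
    rng = np.random.default_rng(seed)
    features = rng.choice(X.shape[1], size=K, replace=False)
    best_feat, best_thr, best_imp = None, None, np.inf
    for f in features:
        thr = rng.uniform(X[:, f].min(), X[:, f].max())        # random split value
        left, right = y[X[:, f] <= thr], y[X[:, f] > thr]
        if len(left) == 0 or len(right) == 0:
            continue
        imp = len(left) * left.var() + len(right) * right.var()
        if imp < best_imp:
            best_feat, best_thr, best_imp = f, thr, imp
    return best_feat, best_thr, best_imp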

              LINKS:http://scikit-learn.org/stable/modules/generated/sklearn.tree.ExtraTreeRegressor.html#sklearn.tree.ExtraTreeRegressor

              This class implements a meta estimator that fits a number of randomized decision trees (a.k.a. extra-trees) on various sub-samples of the dataset and uses averaging to improve the predictive accuracy and control over-fitting.

              Extra-trees differ from classic decision trees in the way they are built. When looking for the best split to separate the samples of a node into two groups, random splits are drawn for each of the max_features randomly selected features and the best split among those is chosen. When max_features is set to 1, this amounts to building a totally random decision tree.
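
A brief usage sketch of the scikit-learn estimators referenced above (assuming scikit-learn is installed; make_regression only supplies toy data):

from sklearn.datasets import make_regression
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.tree import ExtraTreeRegressor

X, y = make_regression(n_samples=300, n_features=10, random_state=0)

# Ensemble of extra-trees: at each node, one random split is drawn for each of
# max_features randomly selected features and the best of those is used.
ensemble = ExtraTreesRegressor(n_estimators=100, max_features=5, random_state=0)
ensemble.fit(X, y)

# max_features=1: every node gets a single random candidate, i.e. a totally random tree.
single = ExtraTreeRegressor(max_features=1, random_state=0).fit(X, y)

print(ensemble.score(X, y), single.score(X, y))

Averaging the predictions of the many randomized trees is what the documentation means by "averaging to improve the predictive accuracy and control over-fitting".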

              The extra-trees ensemble uses bagging, then selects several features and, for each feature, picks one random value as the splitting criterion to build the tree.

              One way to implement it:
                     bag the samples, pick n random features and k random values per feature, take the best candidate split, and build the tree.
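
A rough sketch of that recipe for a single node (my own illustration; bagged_random_value_split is a hypothetical helper, and a full tree would recurse on the two resulting sides):

import numpy as np

def bagged_random_value_split(X, y, n_feat, k_vals, seed=0):
    """Bootstrap-sample the rows (bagging), then draw n_feat random features and
    k_vals random thresholds per feature, and keep the best candidate split."""
    rng = np.random.default_rng(seed)
    rows = rng.integers(0, len(X), size=len(X))                # bagging: sample rows with replacement
    Xb, yb = X[rows], y[rows]
    features = rng.choice(X.shape[1], size=n_feat, replace=False)
    best_feat, best_thr, best_imp = None, None, np.inf
    for f in features:
        for thr in rng.uniform(Xb[:, f].min(), Xb[:, f].max(), size=k_vals):
            left, right = yb[Xb[:, f] <= thr], yb[Xb[:, f] > thr]
            if len(left) == 0 or len(right) == 0:
                continue
            imp = len(left) * left.var() + len(right) * right.var()
            if imp < best_imp:
                best_feat, best_thr, best_imp = f, thr, imp
    return best_feat, best_thr, best_imp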

            posted on 2016-02-28 21:01 by bigrabbit