

Standard random forest: randomly select K features, enumerate all observed values of each as candidate splits, and take the best split (see the sketch after the link below).

            LINKS:https://en.wikipedia.org/wiki/Random_forest
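A minimal sketch of that split search for a regression node, assuming NumPy and a variance-based impurity (the helper name and criterion are illustrative, not library code): K features are drawn at random, every observed value of each is tried as a threshold, and the lowest-impurity split wins.

import numpy as np

def rf_best_split(X, y, k, rng=np.random.default_rng(0)):
    """Standard RF node split: exhaustive threshold search over K random features."""
    features = rng.choice(X.shape[1], size=min(k, X.shape[1]), replace=False)
    best_feature, best_threshold, best_score = None, None, np.inf
    for f in features:
        for t in np.unique(X[:, f]):                 # enumerate all observed values
            left, right = y[X[:, f] <= t], y[X[:, f] > t]
            if len(left) == 0 or len(right) == 0:    # skip degenerate splits
                continue
            score = len(left) * left.var() + len(right) * right.var()
            if score < best_score:
                best_feature, best_threshold, best_score = f, t, score
    return best_feature, best_threshold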


Extremely randomized trees: randomly select K features, draw one random split value per feature, and take the best of those candidate splits (sketched after the link below).
Ensembles of extremely randomized trees: each tree uses all of the training data.

            LINKS:http://docs.opencv.org/2.4/modules/ml/doc/ertrees.html
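For contrast, a sketch of the extremely randomized split under the same assumptions as the previous snippet: one threshold is drawn uniformly at random per candidate feature, and only those K candidates are compared.

import numpy as np

def ert_split(X, y, k, rng=np.random.default_rng(0)):
    """Extra-trees node split: one random threshold per feature, best of K kept."""
    features = rng.choice(X.shape[1], size=min(k, X.shape[1]), replace=False)
    best_feature, best_threshold, best_score = None, None, np.inf
    for f in features:
        lo, hi = X[:, f].min(), X[:, f].max()
        if lo == hi:                                  # constant feature, nothing to split
            continue
        t = rng.uniform(lo, hi)                       # random cut point, no exhaustive search
        left, right = y[X[:, f] <= t], y[X[:, f] > t]
        score = len(left) * left.var() + len(right) * right.var()
        if score < best_score:
            best_feature, best_threshold, best_score = f, t, score
    return best_feature, best_threshold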

1. Extremely randomized trees do not apply the bagging procedure to construct a training set for each tree; the same input training set is used to train all trees.
2. Extremely randomized trees pick the node split extremely randomly (both the variable index and the splitting value are chosen at random), whereas Random Forest finds the best split (the optimal variable index and splitting value) among a random subset of variables.

Extremely randomized trees use all of the samples as the training set; extremely randomized trees randomly choose a feature and a value as the split criterion (a short sketch of the first point follows).
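A tiny illustration of difference 1, assuming NumPy and an arbitrary training-set size: Random Forest draws a bootstrap sample per tree, while extremely randomized trees reuse the full training set for every tree.

import numpy as np

rng = np.random.default_rng(0)
n_samples = 1000                                   # illustrative training-set size

rf_rows  = rng.integers(0, n_samples, n_samples)   # bagging: sample rows with replacement
ert_rows = np.arange(n_samples)                    # extra-trees: every tree sees all rows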

              LINKS:http://scikit-learn.org/stable/modules/generated/sklearn.tree.ExtraTreeRegressor.html#sklearn.tree.ExtraTreeRegressor

This class implements a meta estimator that fits a number of randomized decision trees (a.k.a. extra-trees) on various sub-samples of the dataset and uses averaging to improve the predictive accuracy and control over-fitting.

Extra-trees differ from classic decision trees in the way they are built. When looking for the best split to separate the samples of a node into two groups, random splits are drawn for each of the max_features randomly selected features and the best split among those is chosen. When max_features is set to 1, this amounts to building a totally random decision tree.

The extra-trees ensemble uses bagging (sub-samples of the dataset), then selects several features and, for each feature, randomly draws one value as the split threshold to build each tree (a usage sketch follows).
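A usage sketch with scikit-learn's ensemble estimator ExtraTreesRegressor (the ensemble counterpart of the ExtraTreeRegressor linked above); the dataset and parameter values are illustrative. Note that whether the bagged sub-samples described above are actually drawn is controlled by the bootstrap flag.

from sklearn.datasets import make_regression
from sklearn.ensemble import ExtraTreesRegressor

X, y = make_regression(n_samples=500, n_features=10, noise=0.1, random_state=0)

# max_features controls how many features are drawn at each split;
# bootstrap=True enables the bagged sub-samples described above.
model = ExtraTreesRegressor(n_estimators=100, max_features="sqrt",
                            bootstrap=True, random_state=0)
model.fit(X, y)
print(model.predict(X[:3]))

# max_features=1 draws a single random feature per split,
# i.e. a totally random decision tree, as the docs note.
totally_random = ExtraTreesRegressor(n_estimators=100, max_features=1,
                                     random_state=0).fit(X, y)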

One way to implement this:
bag the samples, then draw n random features & k random values per feature, find the best of those splits, and build the tree (sketched below).
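A sketch of that implementation approach, in the same style as the earlier snippets (helper names such as build_tree are hypothetical): bootstrap the samples per tree, and at each node draw n features and k random thresholds per feature, keeping the best of those n*k candidates.

import numpy as np

def random_k_values_split(X, y, n_feats, k_vals, rng):
    """Best of n_feats * k_vals random (feature, threshold) candidates."""
    best_feature, best_threshold, best_score = None, None, np.inf
    for f in rng.choice(X.shape[1], size=min(n_feats, X.shape[1]), replace=False):
        lo, hi = X[:, f].min(), X[:, f].max()
        if lo == hi:
            continue
        for t in rng.uniform(lo, hi, size=k_vals):        # k random cut points per feature
            left, right = y[X[:, f] <= t], y[X[:, f] > t]
            score = len(left) * left.var() + len(right) * right.var()
            if score < best_score:
                best_feature, best_threshold, best_score = f, t, score
    return best_feature, best_threshold

def bagged_forest(X, y, n_trees, build_tree, rng=np.random.default_rng(0)):
    """Bagging wrapper: each tree is built (via a user-supplied build_tree) on a bootstrap sample."""
    trees = []
    for _ in range(n_trees):
        rows = rng.integers(0, len(X), size=len(X))       # sample with replacement
        trees.append(build_tree(X[rows], y[rows]))        # build_tree would use the split above
    return trees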

posted on 2016-02-28 21:01 by bigrabbit