## sklearn practice 02: random forest

1 RandomForestClassifier
1.1 Parameters that control the base estimator
1.2 n_estimators
The larger n_estimators is, the better the model tends to perform. But every model has a decision boundary: once n_estimators reaches a certain level, the accuracy of the random forest often stops rising or begins to fluctuate, and the larger n_estimators ...
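The plateau described above can be checked empirically. A minimal sketch, assuming scikit-learn is installed (the dataset and the parameter grid are my own choice, not from the post):

```python
# Sweep n_estimators and watch cross-validated accuracy level off.
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_wine(return_X_y=True)
for n in (1, 10, 50, 100):
    clf = RandomForestClassifier(n_estimators=n, random_state=0)
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"n_estimators={n:>3}: mean CV accuracy = {score:.3f}")
```

Beyond a few dozen trees the score typically changes very little, while fitting time keeps growing linearly with `n_estimators`.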

Posted on *Wed, 17 Nov 2021 04:29:32 -0500* by **faydra92**

## Decision tree picking out good watermelon: pure algorithm

1, Theoretical knowledge
Purity: for a branch node, if all the samples it contains belong to the same category, its purity is 1. We always want purity to be as high as possible, that is, as many samples as possible should belong to the same category. How, then, is "purity" measured? This is where the concept of "information ...
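As a minimal sketch of the purity idea above (toy labels assumed): Shannon entropy is 0 for a perfectly pure node and maximal, 1 bit, for an even binary split.

```python
# Shannon entropy as a purity measure: 0 = pure node, 1 = 50/50 binary split.
from collections import Counter
from math import log2

def entropy(labels):
    """Entropy of a label sequence; lower entropy means higher purity."""
    n = len(labels)
    return sum(-(c / n) * log2(c / n) for c in Counter(labels).values())

print(entropy(["good"] * 8))                # pure node -> 0.0
print(entropy(["good"] * 4 + ["bad"] * 4))  # even split -> 1.0
```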

Posted on *Sat, 06 Nov 2021 08:05:11 -0400* by **ricroma**

## Decision tree -- ID3 algorithm, C4.5 algorithm, CART algorithm

Contents
Steps of decision tree learning
Advantages and disadvantages of decision trees
Example code for generating a decision tree
A decision tree is a tree structure in which each internal node represents a test on an attribute, each branch represents the outcome of that test, and each leaf node represents the result of a classif ...
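A minimal way to generate such a tree with scikit-learn and print the internal-node/branch/leaf structure the text describes (the dataset and `max_depth` are my own choice, not from the post):

```python
# Fit a small tree and print its node/branch/leaf structure as text rules.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_iris()
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(data.data, data.target)
print(export_text(tree, feature_names=list(data.feature_names)))
```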

Posted on *Thu, 04 Nov 2021 09:55:03 -0400* by **darksniperx**

## The decision tree picks out the good watermelon

1, Decision tree
1.1 Introduction
A decision tree is a decision-analysis method: given the known probabilities of various outcomes, it builds a tree to compute the probability that the expected net present value is greater than or equal to zero, and uses this to evaluate project risk and judge the project's feasibility. ...
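The expected-value computation behind that definition, with entirely hypothetical probabilities and NPVs for a single decision branch:

```python
# Expected net present value of one branch (probabilities/values are made up).
outcomes = [(0.6, 120.0), (0.3, 20.0), (0.1, -80.0)]  # (probability, NPV)
expected_npv = sum(p * v for p, v in outcomes)
print(expected_npv)       # 0.6*120 + 0.3*20 - 0.1*80 = 70
print(expected_npv >= 0)  # branch looks feasible under these assumptions
```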

Posted on *Sun, 31 Oct 2021 08:18:47 -0400* by **iknownothing**

## Code implementation of the sklearn decision tree algorithms ID3, C4.5 and CART

1, ID3 algorithm
1. Pseudocode
ID3 (Examples, Target_Attribute, Attributes)
Create a root node for the tree
If all examples are positive, Return the single-node tree Root, with label = +.
If all examples are negative, Return the single-node tree Root, with label = -.
If number of predicting attributes is empty, then Retur ...
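A compact Python sketch of the ID3 pseudocode above: pick the attribute with the highest information gain and recurse on each of its values. The toy weather-style data and its labels are my own illustration, and attributes are assumed categorical.

```python
# Minimal ID3: handles the pure-node and empty-attribute base cases, then
# splits on the attribute with the largest information gain.
from collections import Counter
from math import log2

def entropy(rows, target):
    n = len(rows)
    return sum(-(c / n) * log2(c / n)
               for c in Counter(r[target] for r in rows).values())

def id3(rows, target, attributes):
    labels = [r[target] for r in rows]
    if len(set(labels)) == 1:              # all examples share one label
        return labels[0]
    if not attributes:                     # no attributes left: majority vote
        return Counter(labels).most_common(1)[0][0]

    def gain(a):                           # information gain of splitting on a
        rem = sum(len(sub) / len(rows) * entropy(sub, target)
                  for v in set(r[a] for r in rows)
                  for sub in [[r for r in rows if r[a] == v]])
        return entropy(rows, target) - rem

    best = max(attributes, key=gain)
    rest = [a for a in attributes if a != best]
    return {best: {v: id3([r for r in rows if r[best] == v], target, rest)
                   for v in set(r[best] for r in rows)}}

data = [  # toy data, labels assumed for illustration
    {"outlook": "sunny", "windy": "no",  "play": "yes"},
    {"outlook": "sunny", "windy": "yes", "play": "no"},
    {"outlook": "rain",  "windy": "no",  "play": "yes"},
    {"outlook": "rain",  "windy": "yes", "play": "no"},
]
print(id3(data, "play", ["outlook", "windy"]))
```

On this data `windy` perfectly predicts the label, so it has the highest gain and becomes the root.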

Posted on *Sat, 30 Oct 2021 11:34:47 -0400* by **djelica**

## Machine learning_3: Construction and application of decision trees

Experimental background
In the previous experiments (Machine learning_1: K-nearest neighbor algorithm and Machine learning_2: K-nearest neighbor algorithm) we learned that the K-nearest neighbor algorithm can be used for classification without training, but it also has many disadvantages. The biggest disadvantage is that it ca ...

Posted on *Wed, 27 Oct 2021 14:08:46 -0400* by **The_Black_Knight**

## [machine learning] decision tree

This experiment implements a simple binary decision tree. I wanted to finish the homework without really knowing the theory, and ran straight into a bottleneck... so I went back and organized my ideas carefully from the beginning. It seems every shortcut taken must eventually be repaid with interest; knowledge has to be accumulated honestly. It's ...

Posted on *Thu, 21 Oct 2021 11:46:06 -0400* by **dancingbear**

## Random forest [machine learning notes]

In machine learning, a random forest is a classifier made up of multiple decision trees. It is an ensemble algorithm whose output class is the mode of the classes output by the individual trees.
Random forest = Bagging + decision tree
Principle of Bagging ensembles
The bagging process: 1. Sampling: draw a subset of all samples 2 ...
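The two ingredients named above can be sketched in pure Python (the toy samples and "predictions" are my own illustration): bootstrap sampling with replacement, then a majority (mode) vote over the individual trees' outputs.

```python
# Bagging's two ideas: bootstrap sampling and majority voting.
import random
from collections import Counter

def bootstrap(samples, rng):
    """Draw len(samples) items with replacement (one bootstrap sample)."""
    return [rng.choice(samples) for _ in samples]

def majority_vote(predictions):
    """Final class = mode of the individual trees' outputs."""
    return Counter(predictions).most_common(1)[0][0]

rng = random.Random(0)
print(bootstrap([1, 2, 3, 4, 5], rng))       # one resampled training set
print(majority_vote(["cat", "dog", "cat"]))  # -> 'cat'
```

In a random forest, each tree is trained on its own bootstrap sample, which is what makes the trees differ from one another.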

Posted on *Thu, 21 Oct 2021 09:39:05 -0400* by **jlh3590**

## [machine learning] contact lens selection based on a decision tree

Experimental introduction
1. Experimental contents
This experiment learns and implements the decision tree algorithm.
2. Experimental objectives
Through this experiment, master the basic principle of decision tree algorithm.
3. Experimental knowledge points
Shannon entropy, information gain
4. Experimental environment
python 3.6.5
5. Pr ...
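The two knowledge points listed above can be sketched in a few lines. The toy lens labels and the split are my own illustration: information gain is the parent node's entropy minus the weighted entropy of its children.

```python
# Information gain = entropy(parent) - weighted entropy of the child nodes.
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return sum(-(c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(parent, children):
    n = len(parent)
    weighted = sum(len(ch) / n * entropy(ch) for ch in children)
    return entropy(parent) - weighted

parent = ["hard", "hard", "soft", "soft"]
children = [["hard", "hard"], ["soft", "soft"]]  # a perfect split
print(information_gain(parent, children))        # 1.0 - 0.0 = 1.0
```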

Posted on *Tue, 12 Oct 2021 19:57:50 -0400* by **JasonHarper**

## Principles and common parameters of decision tree and random forest of machine learning algorithm

Summary: like a decision tree, a random forest can be used for both classification and regression, but the random forest's results are often better than a single decision tree's. This article explains the principles and common parameters of these two ML algorithms.
1, Principle
1.1 decision tree
1.1.1 definition of decision tree
Decision tree is ...
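The summary's claim that the forest usually beats a single tree can be checked directly; a sketch assuming scikit-learn, with a dataset and settings of my own choosing:

```python
# Compare cross-validated accuracy of one tree vs. a random forest.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
tree_score = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=5).mean()
forest_score = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean()
print(f"decision tree: {tree_score:.3f}, random forest: {forest_score:.3f}")
```

Averaging many decorrelated trees reduces variance, which is where the forest's usual edge comes from.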

Posted on *Sat, 02 Oct 2021 17:21:30 -0400* by **garblar**