Gradient Boosting and XGBoost in Decision Trees

 
Gradient Boosting Decision Tree / eXtreme Gradient Boosting
2021.12.29
Jialin Li
 
Boosting

Successive learners are strongly dependent: each new model is fit to the errors of the current ensemble.

Y = f(x) + L
L: residual function
 
CART (regression tree):
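A CART regression tree can be sketched with scikit-learn, whose DecisionTreeRegressor implements CART-style regression trees (the data below is synthetic, chosen only for illustration):

```python
# Minimal CART regression-tree sketch; the sine data is synthetic.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 2 * np.pi, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

# A shallow tree gives a piecewise-constant fit: one constant per leaf.
tree = DecisionTreeRegressor(max_depth=3)
tree.fit(X, y)
pred = tree.predict(X)
print("training MSE:", np.mean((y - pred) ** 2))
```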
 
 
 
 
Gradient Boosting Decision Tree

f_M(x) = sum_{m=1}^{M} T_m(x), where T_m is the m-th tree and M is the number of trees.
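GBDT builds this sum one tree at a time; for squared loss, each new tree is fit to the residuals (the negative gradient) of the current model. A minimal sketch using scikit-learn trees on synthetic data:

```python
# Sketch of the additive GBDT model f_M(x) = sum_m T_m(x) for squared loss:
# each tree is fit to the residuals of the current model.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 2 * np.pi, size=(300, 1))
y = np.sin(X[:, 0])

M, lr = 20, 0.5          # number of trees and learning rate (illustrative values)
f = np.zeros(len(y))     # current model prediction, starts at 0
trees = []
for m in range(M):
    residual = y - f                     # negative gradient of squared loss
    t = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    trees.append(t)
    f += lr * t.predict(X)               # f_{m+1} = f_m + lr * T_m

print("final training MSE:", np.mean((y - f) ** 2))
```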
 
XGBoost

Optimized GBDT:
1. Regularization added to the loss function L (prevents overfitting)
2. Second-order Taylor expansion of the loss function L
3. Support for parallel processing
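The second-order expansion in point 2 refers to XGBoost's per-round objective: with g_i and h_i the first and second derivatives of the loss with respect to the previous round's prediction, the objective at round t is approximated as

```latex
\mathrm{Obj}^{(t)} \approx \sum_{i=1}^{n} \Big[ g_i\, f_t(x_i) + \tfrac{1}{2} h_i\, f_t^{2}(x_i) \Big] + \Omega(f_t),
\qquad
g_i = \partial_{\hat{y}^{(t-1)}}\, l\big(y_i, \hat{y}^{(t-1)}_i\big),\quad
h_i = \partial^{2}_{\hat{y}^{(t-1)}}\, l\big(y_i, \hat{y}^{(t-1)}_i\big),
```

where the regularizer from point 1 is \(\Omega(f) = \gamma T + \tfrac{1}{2}\lambda \lVert w \rVert^{2}\), with T the number of leaves and w the leaf weights.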
 
Application
 
 
import xgboost
1. Import the data
2. Build the model (train and test)
3. Evaluate
 
Some examples
https://github.com/dmlc/xgboost
 
Next steps
 
1.
Try to train with our samples
2.
Learn the meaning of parameters (important to adjust the params)
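As a starting point for point 2, the most commonly tuned XGBoost parameters can be collected in one place (the names below are from XGBoost's parameter list; the values are illustrative defaults, not tuned recommendations):

```python
# Commonly tuned XGBoost parameters with their meanings.
# Values are illustrative, not tuned for any particular dataset.
params = {
    "max_depth": 6,          # maximum depth of each tree; deeper = more complex
    "eta": 0.3,              # learning rate: shrinks each tree's contribution
    "subsample": 1.0,        # fraction of samples drawn per boosting round
    "colsample_bytree": 1.0, # fraction of features considered per tree
    "lambda": 1.0,           # L2 regularization on leaf weights
    "gamma": 0.0,            # minimum loss reduction required to make a split
    "objective": "binary:logistic",  # loss function to optimize
}
print(sorted(params))
```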
 
Backup
 
 
Common loss functions for GBDT:
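For reference, loss functions commonly used with GBDT include the following (standard definitions, listed here as examples):

```latex
\begin{aligned}
\text{squared error:} \quad & L\big(y, f(x)\big) = \tfrac{1}{2}\big(y - f(x)\big)^{2} \\
\text{absolute error:} \quad & L\big(y, f(x)\big) = \big|\, y - f(x) \,\big| \\
\text{Huber:} \quad & L_{\delta}\big(y, f(x)\big) =
  \begin{cases}
    \tfrac{1}{2}\big(y - f(x)\big)^{2}, & |y - f(x)| \le \delta \\[2pt]
    \delta\,|y - f(x)| - \tfrac{1}{2}\delta^{2}, & \text{otherwise}
  \end{cases} \\
\text{log loss (classification):} \quad & L\big(y, f(x)\big) = \log\big(1 + e^{-y f(x)}\big), \quad y \in \{-1, +1\}
\end{aligned}
```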
 
Backup
 
 
About XGBoost:
https://indico.cern.ch/event/382895/contributions/910921/attachments/763480/1047450/XGBoost_tianqi.pdf

Uploaded on Aug 14, 2024



