Lessons Learned from Developing Automated Machine Learning on HPC

Romain EGELE
 
romain.egele@universite-paris-saclay.fr
 
2. Selecting the Baseline: Multi-Fidelity Hyperparameter Optimization
"Is One Epoch All You Need for Multi-Fidelity Hyperparameter Optimization?" (arXiv: 2307.15422)
3. Hyperparameter examples: Optimizer, Learning rate, Number of layers, Type of layer; Augmentation, Normalization
4. Multi-Fidelity Example with Successive Halving
From: https://amueller.github.io/aml/04-model-evaluation/parameter_tuning_automl.html#successive-halving-different-example
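The successive-halving scheme in the linked example can be sketched in a few lines. Here `evaluate(config, budget)` is a hypothetical callback returning a validation score after training `config` for `budget` steps, and `eta` is the halving rate; neither name comes from the slides.

```python
def successive_halving(configs, evaluate, min_budget=1, eta=2):
    """Train all candidates at a small budget, keep the best 1/eta
    fraction, multiply the budget by eta, and repeat until a single
    configuration survives."""
    budget = min_budget
    survivors = list(configs)
    while len(survivors) > 1:
        # Score every surviving configuration at the current budget.
        scores = {c: evaluate(c, budget) for c in survivors}
        survivors.sort(key=lambda c: scores[c], reverse=True)
        # Keep the top 1/eta fraction, then grow the budget by eta.
        survivors = survivors[: max(1, len(survivors) // eta)]
        budget *= eta
    return survivors[0]
```

With 16 candidates and `eta=2`, this runs four rounds at budgets 1, 2, 4, and 8 steps, so most compute is spent on the most promising configurations.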
5. Learning Curve Extrapolation (LCE)
[Figure: an observed partial learning curve and the estimated probability of performing worse]
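A minimal sketch of learning-curve extrapolation, assuming a three-parameter power-law model score(t) ≈ c − a·t^(−α) (one common choice; the slide does not name a model) and Gaussian residual noise. The idea is to estimate the probability that a run's extrapolated final score falls below the best score seen so far, so that runs likely to perform worse can be stopped early. All function and parameter names here are illustrative, not from the talk.

```python
import numpy as np

def fit_pow3(steps, scores, alphas=np.linspace(0.1, 2.0, 20)):
    """Fit score(t) = c - a * t**(-alpha) by grid search over alpha;
    for a fixed alpha the model is linear in (c, a)."""
    t = np.asarray(steps, dtype=float)
    y = np.asarray(scores, dtype=float)
    best = None
    for alpha in alphas:
        # Design matrix for the linear-in-(c, a) subproblem.
        X = np.column_stack([np.ones_like(t), -t ** (-alpha)])
        (c, a), *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(((y - X @ np.array([c, a])) ** 2).sum())
        if best is None or sse < best[0]:
            best = (sse, c, a, alpha)
    return best[1], best[2], best[3]

def prob_performing_worse(steps, scores, best_so_far, horizon,
                          n_samples=2000, seed=0):
    """Extrapolate the fitted curve to `horizon` training steps and
    estimate P(final score < best_so_far) under Gaussian noise with
    the residual standard deviation of the fit."""
    c, a, alpha = fit_pow3(steps, scores)
    t = np.asarray(steps, dtype=float)
    sigma = float(np.std(np.asarray(scores) - (c - a * t ** (-alpha)))) + 1e-9
    final = c - a * horizon ** (-alpha)
    rng = np.random.default_rng(seed)
    return float((rng.normal(final, sigma, n_samples) < best_so_far).mean())
```

A candidate whose probability of performing worse is close to 1 is a safe early-stopping target; one close to 0 is worth training to the full budget.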
6. RoBER versus Weighted Prob. Mixture (FAILURE / SUCCESS)
7. Baselines for Budget
Bounds on the budget "Training Steps": Minimum Number of Training Steps (1-Epoch) and Maximum Number of Training Steps (100-Epoch)
 
8. WINNERS
 
9. Low-fidelity evaluations ("1-Epoch") can be accurate predictors for model selection.
Paper | Software
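One way to test the slide's claim — that 1-epoch scores are accurate predictors for model selection — is to measure the rank correlation between each configuration's score after one epoch and after full training. A numpy-only Spearman sketch (it assumes no tied scores; all names are illustrative):

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks.
    Ranking via double argsort, which assumes no tied scores."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))
```

A correlation near 1.0 means the 1-epoch ranking already identifies the same winners as full training, so the low-fidelity signal suffices for model selection.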
 
10. END. BACKUP SLIDES ⚠️
Slide Note

Hello, my name is Romain EGELE. I am a graduate student at Université Paris-Saclay with a joint appointment at Argonne National Laboratory.

Today, I will be talking about some negative results we had when developing automated machine learning algorithms.


This presentation by Romain EGELE explores various aspects of developing automated machine learning on High-Performance Computing (HPC) systems. Topics include multi-fidelity optimization, hyperparameter tuning, model evaluation methods, and learning curve extrapolation.

Uploaded on Sep 21, 2024



11. [Flowchart] Hyperparameter Search Space | Search continue? (True/False) | Suggest Configuration | Training continue? (True/False) | Execute Training Step | Model Selection | Trained Model with Estimated Best Hyperparameters
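The backup-slide flowchart can be sketched as a nested loop. Every callback below (`suggest`, `train_step`, the two continue predicates, `select_best`) is a hypothetical stand-in for whatever the real system plugs in.

```python
def hpo_loop(search_space, suggest, train_step,
             search_continue, training_continue, select_best):
    """Outer loop: while the search continues, suggest a configuration.
    Inner loop: while training continues, execute one training step.
    Finally, model selection returns the trained model with the
    estimated best hyperparameters."""
    results = []
    while search_continue(results):
        config = suggest(search_space, results)
        model = None
        while training_continue(model, results):
            model = train_step(config, model)
        results.append((config, model))
    return select_best(results)
```

The two predicates are where multi-fidelity policies live: `training_continue` can stop unpromising runs early (e.g. after one epoch), and `search_continue` enforces the overall budget.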
