Sampling and Parameter Fitting with Hawkes Processes

HUMAN-CENTERED MACHINE LEARNING
http://courses.mpi-sws.org/hcml-ws18/
Recap: How to fit and why sample?

Raw data: event times drawn from the process.

Inferring parameters (learning):
- Parametrize the intensity
- Derive the log-likelihood
- Maximum likelihood estimation

Sampling (predicting) helps with:
- Prediction
- Model checking
- Sanity checks
- Gaining intuition
- Simulators
- Summary statistics

We will first sample and then fit.
Recap: What are Hawkes processes?

The intensity of a self-exciting (or Hawkes) process depends on the history of events before time t.

Observations:
1. Clustered (or bursty) occurrence of events
2. Intensity is stochastic and history dependent
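The intensity formula on this slide did not survive extraction; a standard parametrization of the self-exciting intensity with an exponential kernel (the kernel assumed in the later fitting slides; the symbol names μ, α, ω are my choice, not necessarily the slide's) is:

```latex
\lambda^*(t) \;=\; \mu \;+\; \alpha \sum_{t_i \in \mathcal{H}(t)} e^{-\omega (t - t_i)}
```

Here μ > 0 is the baseline rate, α ≥ 0 the jump each past event adds to the intensity, ω > 0 the decay rate, and 𝓗(t) the history of events strictly before t.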
Recap: How to fit a Hawkes process?

Fit the parameters by maximum likelihood. The negative log-likelihood is jointly convex in the baseline and excitation parameters, so the maximum likelihood estimate can be found with a convex solver (use CVX!).
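The likelihood expressions on this slide were images; they can be reconstructed from the standard Hawkes log-likelihood over an observation window [0, T] (textbook form, using the symbols from my sketch of the intensity, not necessarily the slide's exact notation):

```latex
\log \mathcal{L}(\mu, \alpha) \;=\; \sum_{i} \log \lambda^*(t_i) \;-\; \int_0^T \lambda^*(s)\, ds
```

For the exponential kernel the integral evaluates in closed form to μT + (α/ω) Σ_i (1 − e^{−ω(T − t_i)}). For fixed ω this log-likelihood is jointly concave in (μ, α), i.e. the negative log-likelihood is jointly convex, which is what makes CVX applicable.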
Recap: How to sample from a Hawkes process

Thinning procedure (similar to rejection sampling):
1. Sample a candidate time from a Poisson process whose constant intensity upper-bounds the Hawkes intensity (generate the inter-event gap by inversion sampling).
2. Generate a uniform random number.
3. Keep the candidate with probability equal to the ratio of the true intensity to the upper bound.
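Step 1 above (drawing the next candidate from a constant-rate Poisson process) uses inversion sampling: if u ~ Uniform(0, 1), then −ln(1 − u)/λ̄ is an Exponential(λ̄) inter-event gap. A minimal sketch (the function and variable names are mine, not the assignment's):

```python
import math
import random

def next_candidate(t, lambda_bar, rng):
    """Propose the next candidate time after t for a homogeneous Poisson
    process with rate lambda_bar. By inversion sampling, the inter-event
    gap -log(1 - u) / lambda_bar is Exponential(lambda_bar)-distributed."""
    u = rng.random()  # u in [0, 1), so 1 - u is in (0, 1] and log is safe
    return t - math.log(1.0 - u) / lambda_bar
```

Using 1 − u instead of u avoids a (rare) log(0) when the generator returns exactly 0.0.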
Coding assignment overview
- Sampler (Assignment 1)
- Parameter fitting (Assignment 2)
- Sanity check (provided)
Sampling: Ogata's algorithm

Ogata's method of thinning:
- Draw one candidate sample from a Poisson process with the bounding intensity.
- Accept it with probability equal to the ratio of the true intensity to the bound.
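The two steps above can be sketched as a complete sampler for the exponential-kernel intensity. This is my own sketch under the parametrization λ*(t) = μ + α Σ e^{−ω(t − t_i)}, not the assignment's reference implementation:

```python
import math
import random

def hawkes_intensity(t, mu, alpha, omega, events):
    """lambda*(t) = mu + alpha * sum_{t_i < t} exp(-omega * (t - t_i)).
    Uses the history strictly before t (see Achtung II)."""
    return mu + alpha * sum(math.exp(-omega * (t - ti)) for ti in events if ti < t)

def ogata_sample(mu, alpha, omega, T, rng, max_events=100_000):
    """Sample event times on (0, T] with Ogata's thinning.

    Between events the exponential kernel only decays, so the intensity
    evaluated just after the current time upper-bounds the intensity
    until the next event.
    """
    events = []
    t = 0.0
    while len(events) < max_events:  # guard against exploding intensities
        # Upper bound: intensity at t, including the jump of an event at t.
        lambda_bar = mu + alpha * sum(math.exp(-omega * (t - ti)) for ti in events)
        t -= math.log(1.0 - rng.random()) / lambda_bar  # inversion sampling
        if t > T:
            break
        # Accept with probability lambda*(t) / lambda_bar. A rejected
        # candidate is NOT a sample (Achtung III): it only advances the clock.
        if rng.random() * lambda_bar <= hawkes_intensity(t, mu, alpha, omega, events):
            events.append(t)
    return events
```

Note the bound is recomputed after every candidate, accepted or not, because the intensity keeps decaying as the clock advances.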
Sampling: Achtung I

Ogata's method of thinning: careful about exploding intensities! Include a check on i (the number of events generated) so the simulation terminates, and ensure the stability condition on the parameters holds.
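The condition to ensure did not survive extraction; for the exponential kernel it is presumably the stability (subcriticality) condition that the branching ratio be below one:

```latex
n^* \;=\; \alpha \int_0^\infty e^{-\omega s}\, ds \;=\; \frac{\alpha}{\omega} \;<\; 1
```

If α/ω ≥ 1, each event spawns on average at least one further event, so the simulation can generate events without bound; hence also the explicit cap on the event count i.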
Sampling: Achtung II

Ogata's method of thinning: the intensity at time t is computed from the history up to, but not including, t.
Sampling: Achtung III

Ogata's method of thinning: a rejected candidate time is not an actual sample! It only advances the simulation clock.
Sampling: Evaluation

python plot_hawkes.py 1.0 0.5 1.0 10 sampled-sequences.txt output-plot.png

The plot juxtaposes events from the sampled sequences and compares the empirical average intensity with its theoretical value.
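The plotted "theoretical value" is presumably the stationary mean intensity of a stable Hawkes process, which for the exponential kernel is:

```latex
\mathbb{E}\!\left[\lambda^*(t)\right] \;=\; \frac{\mu}{1 - \alpha/\omega}
```

If the first three command-line arguments are μ = 1.0, α = 0.5, ω = 1.0 (an assumption about the script's argument order), this gives a mean intensity of 2.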
Sampling: Evaluation (continued)

Running the same command on two incorrect samplers shows how the sanity-check plot exposes them:
- Poisson (incorrect) sampler
- Deterministic (incorrect) sampler
Live coding
- Show baseline + sanity check
- Show Poisson + sanity check
Parameter fitting: Problem setting

The sampler (Assignment 1) writes samples.txt and samples-test.txt; parameter fitting (Assignment 2) takes these files as input.
Parameter fitting: Method

Maximum likelihood estimation. Evaluated naively, the log-likelihood contains a nested sum over pairs of events. Can we do it faster for exponential kernels?
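Yes: for the exponential kernel the nested sum A_i = Σ_{j<i} e^{−ω(t_i − t_j)} obeys a recursion that collapses the O(n²) computation to O(n). A sketch with my own function name:

```python
import math

def excitation_sums(times, omega):
    """A[i] = sum_{j < i} exp(-omega * (times[i] - times[j])).

    The naive double loop is O(n^2); for the exponential kernel it
    collapses to the O(n) recursion
        A[i] = exp(-omega * (t_i - t_{i-1})) * (1 + A[i-1]),
    because every older term decays by the same factor over the gap.
    """
    A = [0.0] * len(times)
    for i in range(1, len(times)):
        A[i] = math.exp(-omega * (times[i] - times[i - 1])) * (1.0 + A[i - 1])
    return A
```

These A_i are exactly the per-event excitation terms that appear inside the log in the log-likelihood, so the whole objective can be assembled in linear time.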
Parameter fitting: Using cvxpy
Parameter fitting: Using cvxpy (continued)

Change the objective in the provided skeleton to maximize the likelihood.
Live coding
- Show baseline + desired output
- Evaluation via sampling
Happy coding and holidays!

Questions?

Drop me an e-mail at utkarshu@mpi-sws.org
Skype: utkarsh.upadhyay