Sampling and Parameter Fitting with Hawkes Processes
An overview of sampling and parameter fitting with Hawkes processes from the Human-Centered Machine Learning course: why we fit parameters and sample event times, how Ogata's thinning algorithm generates samples, and how maximum likelihood recovers the parameters, with coding assignments and sanity checks.
Sampling and parameter fitting with Hawkes Processes
HUMAN-CENTERED MACHINE LEARNING
http://courses.mpi-sws.org/hcml-ws18/
Recap: How to fit and why sample?
Learning (inferring parameters): the raw data are event times drawn from the process; parametrize the intensity, derive the log-likelihood, and fit by maximum likelihood estimation.
Sampling (predicting): helps with prediction, model checking, sanity checks, gaining intuition, building a simulator, and computing summary statistics.
We will first sample and then fit.
Recap: What are Hawkes processes?
The intensity of a self-exciting (or Hawkes) process depends on the history of past events. Observations:
1. Clustered (or bursty) occurrence of events
2. The intensity is stochastic and history dependent
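For reference, the intensity shown as an image on this slide has the standard exponential-kernel form (a reconstruction, not the slide verbatim; μ, α, β denote the usual base rate, excitation weight, and decay rate):

```latex
\lambda^*(t) = \mu + \alpha \sum_{t_i \in \mathcal{H}(t)} e^{-\beta (t - t_i)}
```

Here H(t) is the history of events before time t; each past event adds a bump of height α that decays exponentially, which is what produces the clustered, history-dependent behavior above.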
Recap: How to fit a Hawkes process?
Fitting is done by maximum likelihood. The maximum likelihood problem is jointly convex in the parameters, i.e. the base intensity μ and the excitation weight α.
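The log-likelihood behind this claim has the standard point-process form (a reconstruction consistent with the recap, not the slide's image):

```latex
\log \mathcal{L}(\mu, \alpha) = \sum_{t_i \in \mathcal{H}(T)} \log \lambda^*(t_i) \;-\; \int_0^T \lambda^*(\tau)\,\mathrm{d}\tau
```

Since λ*(t) is affine in (μ, α), each log λ*(t_i) is concave and the integral term is linear, so the log-likelihood is jointly concave; equivalently, minimizing its negation is a convex problem.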
Recap: How to sample from a Hawkes process
Thinning procedure (similar to rejection sampling):
1. Sample a candidate time from a homogeneous Poisson process whose rate upper-bounds the intensity (via inversion sampling).
2. Generate a uniform random number u.
3. Keep the candidate if u is at most the ratio of the true intensity to the upper bound.
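The three steps above can be written compactly (a sketch in standard notation, with λ̄ denoting an upper bound on the intensity):

```latex
\Delta t = -\frac{\ln u_1}{\bar\lambda}, \qquad u_2 \le \frac{\lambda^*(t + \Delta t)}{\bar\lambda} \;\Rightarrow\; \text{accept}, \qquad u_1, u_2 \sim \mathrm{Uniform}(0, 1)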
Coding assignment overview
Sampler (Assignment 1), parameter fitting (Assignment 2), and a sanity check (provided).
Sampling: Ogata's algorithm
Ogata's method of thinning: repeatedly draw one candidate sample with an upper-bound intensity λ̄ and accept it with probability λ*(t)/λ̄.
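As a sketch, Ogata's thinning for an exponential-kernel Hawkes process might look as follows (the function names and the choice of λ̄ = λ*(t⁺) as the bound are illustrative, not the assignment's starter code):

```python
import math
import random

def hawkes_intensity(t, events, mu, alpha, beta):
    """Conditional intensity lambda*(t) = mu + alpha * sum_i exp(-beta (t - t_i)),
    summing over past events t_i < t (history up to, but not including, t)."""
    return mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events if ti < t)

def sample_hawkes(mu, alpha, beta, T, seed=0):
    """Ogata's thinning: sample event times of a Hawkes process on [0, T)."""
    rng = random.Random(seed)
    events = []
    t = 0.0
    while True:
        # Valid upper bound: the intensity only decays until the next event,
        # so its value just after t (including an event at t itself) bounds it.
        lam_bar = mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events)
        # Candidate inter-event gap from a homogeneous Poisson(lam_bar) process
        t += -math.log(rng.random()) / lam_bar
        if t >= T:
            break
        # Accept the candidate with probability lambda*(t) / lam_bar;
        # rejected candidates advance the clock but are NOT added to the history.
        if rng.random() * lam_bar <= hawkes_intensity(t, events, mu, alpha, beta):
            events.append(t)
    return events
```

With alpha = 0 this degenerates to a homogeneous Poisson process with rate mu, which is a handy sanity check.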
Sampling: Achtung I
Ogata's method of thinning: be careful about exploding intensities! Include a check on the number of generated events i, and ensure the process is stable (for the exponential kernel, the branching ratio α/β must stay below 1).
Sampling: Achtung II
Ogata's method of thinning: the intensity at time t is computed from the history up to but not including t.
Sampling: Achtung III
Ogata's method of thinning: a rejected candidate is not an actual sample! It advances the clock but must not be added to the history.
Sampling: Evaluation
python plot_hawkes.py 1.0 0.5 1.0 10 sampled-sequences.txt output-plot.png
The script juxtaposes events from the sampled sequences and overlays the empirical average intensity on its theoretical value. A correct sampler's empirical average tracks the theoretical curve; a Poisson (incorrect) sampler and a deterministic (incorrect) sampler both produce empirical averages that visibly deviate from it.
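The empirical average in such a plot can be estimated by binning events across sequences; a minimal sketch (the function name and binning scheme are assumptions, not the plotting script's actual code):

```python
def empirical_intensity(sequences, T, n_bins=50):
    """Average intensity estimate: count events per time bin across all
    sequences, then divide by (number of sequences * bin width)."""
    width = T / n_bins
    counts = [0] * n_bins
    for seq in sequences:
        for t in seq:
            if 0.0 <= t < T:
                counts[int(t / width)] += 1
    return [c / (len(sequences) * width) for c in counts]
```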
Live coding
Show baseline + sanity check
Show Poisson + sanity check
Parameter fitting: Problem setting
The sampler (Assignment 1) produces samples.txt and samples-test.txt; parameter fitting (Assignment 2) estimates the parameters from these files.
Parameter fitting: Method
Maximum likelihood estimation involves a nested sum over event pairs, which is quadratic in the number of events if computed naively. Can we do it faster for exponential kernels?
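Yes: for the exponential kernel the inner sum obeys a one-step recursion, turning the quadratic nested sum into a single linear-time pass. A sketch (the function name and symbols are illustrative):

```python
import math

def hawkes_loglik(events, mu, alpha, beta, T):
    """O(n) log-likelihood for an exponential-kernel Hawkes process.
    Uses A_1 = 0 and A_i = exp(-beta (t_i - t_{i-1})) * (1 + A_{i-1}),
    so that lambda*(t_i) = mu + alpha * A_i without the nested sum."""
    ll, A, prev = 0.0, 0.0, None
    for t in events:
        if prev is not None:
            A = math.exp(-beta * (t - prev)) * (1.0 + A)
        ll += math.log(mu + alpha * A)
        prev = t
    # Compensator: integral of lambda* over [0, T], available in closed form
    ll -= mu * T + (alpha / beta) * sum(1.0 - math.exp(-beta * (T - t)) for t in events)
    return ll
```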
Parameter fitting: Using cvxpy
Change the provided template so that it maximizes the likelihood.
Live coding
Show baseline + desired output
Evaluation via sampling
Happy coding and holidays!
Questions? Drop me an e-mail at utkarshu@mpi-sws.org or find me on Skype: utkarsh.upadhyay