Sampling and Parameter Fitting with Hawkes Processes


Learn about sampling and parameter fitting with Hawkes processes in the context of human-centered machine learning. Understand why we fit parameters and how to sample event times. Explore the characteristics and fitting methods of Hawkes processes, along with coding assignments and sampling algorithms.


Uploaded on Sep 18, 2024



Presentation Transcript


  1. Sampling and parameter fitting with Hawkes processes
  HUMAN-CENTERED MACHINE LEARNING
  http://courses.mpi-sws.org/hcml-ws18/

  2. Recap: How to fit and why sample?
  Inferring parameters (learning): raw data → parametrize the intensity → derive the log-likelihood → maximum likelihood estimation.
  Sampling (predicting): event times drawn from the model. Sampling helps with: prediction, model checking (sanity check), gaining intuition, building a simulator, and summary statistics.
  We will first sample and then fit.

  3. Recap: What are Hawkes processes?
  The intensity of a self-exciting (or Hawkes) process depends on time and on the history of past events.
  Observations: 1. Clustered (or bursty) occurrence of events. 2. The intensity is stochastic and history-dependent.
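The slide's intensity formula did not survive extraction. A minimal sketch of a history-dependent, self-exciting intensity, assuming the usual exponential-kernel form λ(t) = μ + α Σ_{tᵢ < t} exp(−ω(t − tᵢ)) with μ the baseline rate, α the excitation, and ω the decay (symbols assumed, not taken from the slides):

```python
import numpy as np

def hawkes_intensity(t, history, mu, alpha, omega):
    """Conditional intensity lambda(t) = mu + alpha * sum_i exp(-omega*(t - t_i))
    over the past event times t_i < t (exponential kernel)."""
    past = history[history < t]  # only events strictly before t contribute
    return mu + alpha * np.sum(np.exp(-omega * (t - past)))

# Each past event adds an exponentially decaying bump on top of the baseline.
events = np.array([0.5, 1.2, 1.3])
lam = hawkes_intensity(2.0, events, mu=1.0, alpha=0.5, omega=1.0)
```

Note how the burst at 1.2 and 1.3 keeps the intensity above the baseline μ = 1.0 at t = 2.0, which is exactly the clustered, history-dependent behaviour the slide describes.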

  4. Recap: How to fit a Hawkes process?
  Maximum likelihood estimation: the log-likelihood is jointly concave (equivalently, the negative log-likelihood is jointly convex) in the parameters μ and α.
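The slide's equations did not survive extraction. For an exponential-kernel Hawkes process with intensity λ(t) = μ + α Σ_{tⱼ < t} exp(−ω(t − tⱼ)) (symbols assumed; the slides' own notation is not recoverable), the log-likelihood of events t₁ < … < tₙ observed on [0, T] is the standard expression:

```latex
\log \mathcal{L}(\mu, \alpha)
  = \sum_{i=1}^{n} \log\!\Big(\mu + \alpha \sum_{t_j < t_i} e^{-\omega (t_i - t_j)}\Big)
  \;-\; \mu T
  \;-\; \frac{\alpha}{\omega} \sum_{i=1}^{n} \Big(1 - e^{-\omega (T - t_i)}\Big)
```

The last two terms are the compensator ∫₀ᵀ λ(t) dt; since the log terms are concave in (μ, α) and the compensator is linear, the whole objective is jointly concave in μ and α for fixed ω.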

  5. Recap: How to sample from a Hawkes process
  Thinning procedure (similar to rejection sampling):
  1. Sample a candidate time from a Poisson process whose rate upper-bounds the intensity (via inversion sampling).
  2. Generate a uniform random number u.
  3. Keep the sample if u is below the ratio of the true intensity to the upper bound.

  6. Coding assignment overview
  Sampler (Assignment 1), parameter fitting (Assignment 2), and a sanity check (provided).

  7. Sampling: Ogata's algorithm
  Ogata's method of thinning: draw one candidate sample using a bounding intensity, then accept it with probability equal to the ratio of the true intensity to the bound.
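The slide's pseudocode is not recoverable from the transcript, so here is a hedged sketch of Ogata's thinning for the exponential kernel. It relies on the fact that, between events, an exponential-kernel intensity only decays, so its current value is a valid upper bound until the next event; the function name `sample_hawkes` and the parameter names are my own, not the assignment's:

```python
import numpy as np

def sample_hawkes(mu, alpha, omega, T, rng=None):
    """Ogata's thinning for an exponential-kernel Hawkes process on [0, T]:
    bound the intensity by its current value (it only decays between events),
    draw an exponential candidate gap, and accept with prob lambda(t)/lambda_bar."""
    rng = np.random.default_rng() if rng is None else rng
    events, t = [], 0.0
    while t < T:
        # Upper bound: the intensity evaluated just after time t.
        lam_bar = mu + alpha * np.sum(np.exp(-omega * (t - np.array(events))))
        t = t + rng.exponential(1.0 / lam_bar)  # candidate from rate-lam_bar Poisson
        if t >= T:
            break
        lam_t = mu + alpha * np.sum(np.exp(-omega * (t - np.array(events))))
        if rng.uniform() <= lam_t / lam_bar:  # thinning (rejection) step
            events.append(t)
    return np.array(events)
```

A rejected candidate is discarded but the clock still advances to it, which is consistent with the "Caution" slides below.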

  8. Sampling: Caution I
  Ogata's method of thinning: be careful about exploding intensities! Include a check on the number of generated events, and ensure the process is stable.
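The slide's stability condition is cut off in the transcript. A sketch of the standard check, assuming an exponential kernel: each event spawns on average α/ω direct offspring (the branching ratio), so the process avoids explosion iff α/ω < 1; the event cap `MAX_EVENTS` is a hypothetical safeguard, not a value from the assignment:

```python
def is_stable(alpha, omega):
    """Branching ratio: each event spawns on average
    alpha * (integral of exp(-omega * s) ds over s >= 0) = alpha / omega
    direct offspring; the process is subcritical (no explosion) iff this is < 1."""
    return alpha / omega < 1.0

MAX_EVENTS = 100_000  # hypothetical cap to bail out of a runaway sampling loop

def check_before_sampling(alpha, omega):
    """Refuse to sample from a supercritical parameterization."""
    if not is_stable(alpha, omega):
        raise ValueError("alpha/omega >= 1: intensity may explode")
```

Even with stable parameters, a cap like `MAX_EVENTS` inside the sampling loop is a cheap guard against pathological parameter choices.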

  9. Sampling: Caution II
  Ogata's method of thinning: the intensity at time t is computed from the history up to, but not including, t.

  10. Sampling: Caution III
  Ogata's method of thinning: a rejected candidate time is not an actual sample!

  11. Sampling: Evaluation
  python plot_hawkes.py 1.0 0.5 1.0 10 sampled-sequences.txt output-plot.png
  The plot juxtaposes events from the sampled sequences and compares the theoretical value of the intensity against the empirical average.

  12. Sampling: Evaluation
  python plot_hawkes.py 1.0 0.5 1.0 10 sampled-sequences.txt output-plot.png
  The same plot for a Poisson (incorrect) sampler: theoretical value vs. empirical average, with juxtaposed events from the sequences.

  13. Sampling: Evaluation
  python plot_hawkes.py 1.0 0.5 1.0 10 sampled-sequences.txt output-plot.png
  The same plot for a deterministic (incorrect) sampler: theoretical value vs. empirical average, with juxtaposed events from the sequences.
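The "theoretical value" the evaluation plots compare against is not spelled out in the transcript. Assuming it is the stationary expected intensity of an exponential-kernel Hawkes process, and assuming the command-line arguments 1.0 0.5 1.0 map to μ, α, ω (an assumption about `plot_hawkes.py`, not something the slides confirm), the reference line would be:

```python
def theoretical_mean_intensity(mu, alpha, omega):
    """Stationary expected intensity of an exponential-kernel Hawkes process:
    E[lambda] = mu / (1 - alpha/omega), valid only when alpha/omega < 1."""
    return mu / (1.0 - alpha / omega)

# With the example parameters mu=1.0, alpha=0.5, omega=1.0:
rate = theoretical_mean_intensity(1.0, 0.5, 1.0)  # -> 2.0
```

A correct sampler's empirical average intensity should hover around this value, which is why the Poisson and deterministic samplers fail the sanity check.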

  14. Live coding
  Show the baseline sampler with its sanity check, then the Poisson sampler with its sanity check.

  15. Parameter fitting: Problem setting
  Sampler (Assignment 1) → samples.txt, samples-test.txt → parameter fitting (Assignment 2).

  16. Parameter fitting: Method
  Maximum likelihood estimation: the log-likelihood contains a nested sum over pairs of events, which costs O(n²) to evaluate. Can we do it faster for exponential kernels?
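The answer the slide is hinting at is yes: for exponential kernels the inner sum can be maintained recursively, A_i = exp(−ω(t_i − t_{i−1})) (1 + A_{i−1}), turning the nested sum into a single pass. A sketch of the resulting O(n) log-likelihood, under the same assumed notation as above (the function name and signature are mine, not the assignment's):

```python
import numpy as np

def log_likelihood(events, mu, alpha, omega, T):
    """O(n) Hawkes log-likelihood for an exponential kernel, using the
    recursion A_i = exp(-omega*(t_i - t_{i-1})) * (1 + A_{i-1})
    in place of the O(n^2) nested sum over all earlier events."""
    ll, A, prev = 0.0, 0.0, None
    for t in events:
        if prev is not None:
            A = np.exp(-omega * (t - prev)) * (1.0 + A)
        ll += np.log(mu + alpha * A)
        prev = t
    # Compensator: the integral of the intensity over [0, T].
    ll -= mu * T + (alpha / omega) * np.sum(1.0 - np.exp(-omega * (T - np.asarray(events))))
    return ll
```

The recursion works because every earlier event's contribution decays by the same factor exp(−ω Δt) over the gap to the next event, so the whole inner sum can be carried forward as one number.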

  17. Parameter fitting: Using cvxpy

  18. Parameter fitting: Using cvxpy
  Change the provided objective to maximize the likelihood.

  19. Live coding
  Show the baseline and the desired output; evaluation via sampling.

  20. Happy coding and holidays!
  Questions? Drop me an e-mail at utkarshu@mpi-sws.org or reach me on Skype: utkarsh.upadhyay
