Learning general temporal point processes based on dynamic weight generation

2021 
Most real-world events depend stochastically and chronologically on historical events. Some past events increase the probability of future events, while others do not. Many approaches leverage temporal point processes with explicitly defined intensity functions of time $t$ and history ${\mathscr{H}}$, such as the Poisson process and the Hawkes process, to model this time-varying probability. However, fixed-form intensity functions can limit the performance of temporal point process models, owing to the lack of prior knowledge about the required intensity function and the complexity of real-world data. Neural networks' ability to approximate functions makes the neural temporal point process a promising alternative to traditional temporal point processes. In this paper, we identify several drawbacks of previous works in meeting the mathematical constraints of temporal point processes, and then design a matrix-multiplication-based model to address them. Experimental results show that our model outperforms other neural temporal point processes while offering better mathematical interpretability and extrapolation capability.
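
For reference, the conditional intensity at the core of any temporal point process, and the Hawkes process as one fixed-form example of it, take the standard textbook forms (given here only as background, not as the model proposed in the paper):

$$\lambda^{*}(t) = \lambda(t \mid {\mathscr{H}}_t) = \lim_{\Delta t \to 0} \frac{\Pr\{\text{one event in } [t, t+\Delta t) \mid {\mathscr{H}}_t\}}{\Delta t},$$

and, for the Hawkes process with base rate $\mu > 0$, excitation weight $\alpha \ge 0$, and decay $\beta > 0$,

$$\lambda^{*}(t) = \mu + \sum_{t_i < t} \alpha\, e^{-\beta (t - t_i)},$$

where each past event at time $t_i < t$ temporarily raises the probability of the next event.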