Alternatives To Sahp_repo
| Project | Stars | Last Commit | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|
| Danet | 1,556 | 3 years ago | 33 | mit | Python | Dual Attention Network for Scene Segmentation (CVPR 2019) |
| Deepvoice3_pytorch | 1,449 | 2 years ago | 29 | other | Python | PyTorch implementation of convolutional neural network-based text-to-speech synthesis models |
| Image_segmentation | 880 | 3 years ago | 20 | n/a | Python | PyTorch implementation of U-Net, R2U-Net, Attention U-Net, and Attention R2U-Net |
| Transformer | 644 | 2 months ago | 3 | n/a | Python | PyTorch implementation of "Attention Is All You Need" |
| Vad | 632 | 2 years ago | 32 | n/a | MATLAB | Voice activity detection (VAD) toolkit including DNN, bDNN, LSTM, and ACAM based VAD, with a directly recorded dataset |
| Moran_v2 | 593 | 4 months ago | 24 | mit | Python | MORAN: A Multi-Object Rectified Attention Network for Scene Text Recognition |
| Transformer | 583 | 6 months ago | 7 | gpl-3.0 | Jupyter Notebook | Implementation of the Transformer model (originally from "Attention Is All You Need") applied to time series |
| Attention Networks For Classification | 477 | 3 years ago | 8 | n/a | Jupyter Notebook | Hierarchical Attention Networks for Document Classification in PyTorch |
| Long Range Arena | 456 | 6 months ago | 18 | apache-2.0 | Python | Long Range Arena for benchmarking efficient Transformers |
| Mac Network | 445 | 2 years ago | 9 | apache-2.0 | Python | Implementation of "Compositional Attention Networks for Machine Reasoning" (Hudson and Manning, ICLR 2018) |
SAHP

This is the repository for the Self-Attentive Hawkes Processes paper, in which self-attention is used to adapt the intensity function of a Hawkes process.
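For context, a minimal sketch of the *classical* exponentially decaying Hawkes intensity that SAHP builds on: λ(t) = μ + Σ_{t_i < t} α·exp(−β(t − t_i)). Note this is not the paper's parameterization — SAHP produces history-dependent parameters from self-attention over past events — and the values μ, α, β below are illustrative assumptions:

```python
import math

def hawkes_intensity(t, event_times, mu=0.2, alpha=0.8, beta=1.0):
    """Classical exponentially decaying Hawkes intensity:
        lambda(t) = mu + sum over past events t_i < t of alpha * exp(-beta * (t - t_i))

    SAHP replaces the fixed mu/alpha/beta with values computed by
    self-attention over the event history; this sketch only shows the
    classical form the paper generalizes. Parameter values are made up.
    """
    excitation = sum(alpha * math.exp(-beta * (t - ti))
                     for ti in event_times if ti < t)
    return mu + excitation

# The intensity jumps after each event and decays back toward the base rate mu.
events = [1.0, 2.5, 2.7]
print(hawkes_intensity(3.0, events))
```

Each past event temporarily raises the probability of future events (self-excitation), which is why Hawkes processes suit event sequences such as social-media activity or transaction streams.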

Dataset

The real-world datasets are available on this [Google Drive folder](https://drive.google.com/drive/folders/0BwqmV0EcoUc8UklIR1BKV25YR1U), while the synthetic dataset is at this [link](https://drive.google.com/file/d/1lRUIJx5UIPMx4TMwKy6GiAiP-k2vwvDc/view?usp=sharing). To run the model, download them into a folder named `data` in the parent directory of the source code.

To make the data format consistent, first run the script `convert_realdata_syntheform.py`.

Package

The Python version should be at least 3.5; torch 0.4.1 is known to work.

Scripts

`models` defines the self-attentive Hawkes model, multi-head attention, and related components.

`main_func.py` is the entry point for running the experiments; the hyper-parameters are set there.

`utils` contains utility functions.

To run the model: `python main_func.py`

Citation

@article{zhang2019self,
  title={Self-attentive Hawkes processes},
  author={Zhang, Qiang and Lipani, Aldo and Kirnap, Omer and Yilmaz, Emine},
  journal={arXiv preprint arXiv:1907.07561},
  year={2019}
}