Project Name | Stars | Most Recent Commit | Open Issues | License | Language | Description
---|---|---|---|---|---|---
Draw | 501 | 5 years ago | 9 | apache-2.0 | Python | TensorFlow implementation of "DRAW: A Recurrent Neural Network For Image Generation"
Gran | 363 | a year ago | 7 | mit | C++ | Efficient Graph Generation with Graph Recurrent Attention Networks, Deep Generative Model of Graphs, Graph Neural Networks, NeurIPS 2019
Ccm | 187 | 5 years ago | 6 | apache-2.0 | Python | A TensorFlow implementation of our work, CCM
Draw | 131 | 8 years ago | 4 | | Lua | Torch implementation of DRAW: A Recurrent Neural Network For Image Generation
Neural Question Generation | 103 | 3 years ago | 9 | mit | Python | PyTorch implementation of Paragraph-level Neural Question Generation with Maxout Pointer and Gated Self-attention Networks
Treegen | 70 | a year ago | 23 | mit | Python | A Tree-Based Transformer Architecture for Code Generation (AAAI'20)
Tf Var Attention | 55 | 3 years ago | | mit | Python | TensorFlow implementation of Variational Attention for Sequence to Sequence Models (COLING 2018)
Show_attend_and_tell_pytorch | 51 | 4 years ago | 5 | | Python | PyTorch implementation of Show, Attend and Tell: Neural Image Caption Generation with Visual Attention
Encoder Agnostic Adaptation | 38 | 3 years ago | 8 | mit | Python | Encoder-Agnostic Adaptation for Conditional Language Generation
Recosa | 36 | 3 years ago | | | Python | ReCoSa: Detecting the Relevant Contexts with Self-Attention for Multi-turn Dialogue Generation
This is the official PyTorch implementation of Efficient Graph Generation with Graph Recurrent Attention Networks, as described in the following NeurIPS 2019 paper:
```
@inproceedings{liao2019gran,
  title={Efficient Graph Generation with Graph Recurrent Attention Networks},
  author={Liao, Renjie and Li, Yujia and Song, Yang and Wang, Shenlong and Nash, Charlie and Hamilton, William L. and Duvenaud, David and Urtasun, Raquel and Zemel, Richard},
  booktitle={NeurIPS},
  year={2019}
}
```
Python 3, PyTorch (1.2.0)

Other dependencies can be installed via:

```
pip install -r requirements.txt
```
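Before launching experiments, it can help to sanity-check the environment against the stated requirements. A minimal sketch (the version check and messages are illustrative, not part of the repo):

```python
# Minimal sketch: verify the interpreter and PyTorch install match the
# README's stated requirements (Python 3, PyTorch 1.2.0).
import sys

assert sys.version_info.major == 3, "GRAN requires Python 3"

try:
    import torch  # installed via requirements.txt
    print("PyTorch version:", torch.__version__)
except ImportError:
    print("PyTorch not found; run: pip install -r requirements.txt")
```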
To run the training of experiment X, where X is one of {gran_grid, gran_DD, gran_DB, gran_lobster}:

```
python run_exp.py -c config/X.yaml
```

Note: Please check the folder config for a full list of configuration yaml files.

After training, you can specify the test_model field of the configuration yaml file with the path of your best model snapshot, e.g.,

```
test_model: exp/gran_grid/xxx/model_snapshot_best.pth
```

To run the test of experiment X:

```
python run_exp.py -c config/X.yaml -t
```
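Setting test_model is an ordinary YAML edit, so it can also be done programmatically. A hedged sketch, assuming PyYAML is installed; the config fields and snapshot path below are illustrative placeholders, not files shipped with the repo:

```python
# Sketch: point a GRAN-style config at a trained snapshot before running
# `python run_exp.py -c <config> -t`. Field names besides test_model are
# hypothetical examples.
import yaml

config_text = """
exp_name: gran_grid
test_model: null
"""
config = yaml.safe_load(config_text)

# Fill in the best snapshot produced by training (placeholder path).
config["test_model"] = "exp/gran_grid/run1/model_snapshot_best.pth"

updated = yaml.safe_dump(config)
print(updated)
```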
Note: You could use our trained model for comparisons. Please make sure you are using the same split of the dataset. Running the following script will download the trained model:

```
./download_model.sh
```
Protein graphs from the training set:
Protein graphs sampled from GRAN:
Please cite our paper if you use this code in your research work.
Please submit a GitHub issue or contact [email protected] if you have any questions or find any bugs.