| Project Name | Stars | Most Recent Commit | License | Language | Description |
|---|---|---|---|---|---|
| Spektral | 2,259 | 4 months ago | mit | Python | Graph Neural Networks with Keras and Tensorflow 2. |
| Gat | 2,078 | 2 years ago | mit | Python | Graph Attention Networks (https://arxiv.org/abs/1710.10903) |
| Nlp Journey | 1,528 | 5 months ago | apache-2.0 | Python | Documents, papers and code related to Natural Language Processing, including Topic Models, Word Embeddings, Named Entity Recognition, Text Classification, Text Generation, Text Similarity, Machine Translation, etc. All code is implemented in TensorFlow 2.0. |
| Hopfield Layers | 1,258 | 2 years ago | other | Python | Hopfield Networks is All You Need |
| Rcan | 1,252 | 3 months ago | | Python | PyTorch code for our ECCV 2018 paper "Image Super-Resolution Using Very Deep Residual Channel Attention Networks" |
| Deep_architecture_genealogy | 1,196 | 3 years ago | | Python | Deep Learning Architecture Genealogy Project |
| Attention Gated Networks | 1,099 | 3 years ago | mit | Python | Use of Attention Gates in a Convolutional Neural Network / Medical Image Classification and Segmentation |
| Textclassifier | 1,003 | 2 years ago | apache-2.0 | Python | Text classifier for Hierarchical Attention Networks for Document Classification |
| Nlp Paper | 820 | 10 months ago | apache-2.0 | Python | Papers in the field of natural language processing (with reading notes), reproduced models, data processing, etc. (code available in both TensorFlow and PyTorch versions) |
| Spatial Transformer Network | 661 | 5 years ago | mit | Python | A Tensorflow implementation of Spatial Transformer Networks. |
| DEPRECATED |
|---|
| This implementation of GAT is no longer actively maintained and may not work with modern versions of TensorFlow and Keras. Check out Spektral and its GAT example for a TensorFlow/Keras implementation of GAT. |
This is a Keras implementation of the Graph Attention Network (GAT) model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903).
I have no affiliation with the authors of the paper and I am implementing this code for non-commercial reasons.
The authors published their reference TensorFlow implementation here, so check it out for something that is guaranteed to work as intended. Their implementation differs slightly from mine, which may be worth keeping in mind.
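For reference, the core idea of the layer is the attention mechanism from the paper: each node attends over its neighbours with coefficients computed from a shared linear transformation and a single-layer attention vector. Below is a minimal NumPy sketch of what one attention head computes; it is illustrative only, not the code in graph_attention_layer.py, and the function and variable names are made up for the example.

```python
import numpy as np

def single_head_gat(X, A, W, a_self, a_neigh, alpha=0.2):
    """Illustrative single-head graph attention (Velickovic et al., 2017).

    X: (N, F) node features, A: (N, N) binary adjacency with self-loops,
    W: (F, F') weight matrix, a_self / a_neigh: (F',) halves of the attention vector.
    """
    H = X @ W                                            # (N, F') transformed features
    # e_ij = LeakyReLU(a^T [W h_i || W h_j]), decomposed into two dot products
    e = (H @ a_self)[:, None] + (H @ a_neigh)[None, :]   # (N, N) raw attention scores
    e = np.where(e > 0, e, alpha * e)                    # LeakyReLU
    e = np.where(A > 0, e, -1e9)                         # mask non-edges before softmax
    att = np.exp(e - e.max(axis=1, keepdims=True))
    att /= att.sum(axis=1, keepdims=True)                # softmax over each node's neighbourhood
    return att @ H                                       # (N, F') aggregated node features
```

Multi-head attention simply runs several of these heads in parallel and concatenates (or averages) their outputs.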
You should cite the paper if you use any of this code for your research:
@article{
velickovic2018graph,
title="{Graph Attention Networks}",
author={Veli{\v{c}}kovi{\'{c}}, Petar and Cucurull, Guillem and Casanova, Arantxa and Romero, Adriana and Li{\`{o}}, Pietro and Bengio, Yoshua},
journal={International Conference on Learning Representations},
year={2018},
url={https://openreview.net/forum?id=rJXMpikCZ},
note={Accepted as poster},
}
If you would like to give me credit, feel free to link to my Github profile, blog, or Twitter.
I also copied the code in utils.py almost verbatim from this repo by Thomas Kipf, whom I thank sincerely for sharing his work on GCNs and GAEs, and for giving me a few pointers on how to split the data into train/test/val sets.
Thanks to mawright, matthias-samwald, and vermaMachineLearning for helping me out with bugs, performance improvements, and running experiments.
I do not own any rights to the datasets distributed with this code, but they are publicly available at the following links:
To install as a module:
$ git clone https://github.com/danielegrattarola/keras-gat.git
$ cd keras-gat
$ pip install .
$ python
>>> from keras_gat import GraphAttention
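Once installed, the layer can be dropped into a standard Keras functional model. The sketch below is a hedged example of how a two-layer GAT for Cora-style node classification might be wired up: it assumes the layer is called on [node_features, adjacency] and accepts attn_heads, attn_heads_reduction and dropout_rate keyword arguments, mirroring examples/gat.py; check graph_attention_layer.py for the actual constructor signature.

```python
# Sketch only: a two-layer GAT for transductive node classification.
# The GraphAttention keyword arguments below are assumptions based on
# examples/gat.py and may differ from the current layer implementation.
from keras.layers import Input, Dropout
from keras.models import Model
from keras_gat import GraphAttention

N = 2708          # number of nodes (example value, e.g. Cora)
F = 1433          # input features per node (example value)
n_classes = 7     # output classes (example value)

X_in = Input(shape=(F,))      # node features
A_in = Input(shape=(N,))      # one dense adjacency row per node

h = Dropout(0.6)(X_in)
h = GraphAttention(8, attn_heads=8, attn_heads_reduction='concat',
                   dropout_rate=0.6, activation='elu')([h, A_in])
h = Dropout(0.6)(h)
out = GraphAttention(n_classes, attn_heads=1, attn_heads_reduction='average',
                     dropout_rate=0.6, activation='softmax')([h, A_in])

model = Model(inputs=[X_in, A_in], outputs=out)
model.compile(optimizer='adam', loss='categorical_crossentropy',
              weighted_metrics=['acc'])
```

In the paper's transductive setting the whole graph is processed in a single batch, so both the full feature matrix X (N x F) and the dense adjacency A (N x N) are fed at every training step.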
Or you can just copy and paste graph_attention_layer.py into your project.
To replicate the experimental results of the paper, simply run:
$ python examples/gat.py