PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)

Project Name | Stars | Repos Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language | Description
---|---|---|---|---|---|---|---|---|---
Nlp_paper_study | 3,373 | | a month ago | | | 1 | | C++ | Reading notes on top-conference papers relevant to NLP algorithm engineers
Spektral | 2,259 | 6 | 4 months ago | 34 | June 01, 2023 | 61 | mit | Python | Graph Neural Networks with Keras and TensorFlow 2
Gat | 2,078 | | 2 years ago | | | 27 | mit | Python | Graph Attention Networks (https://arxiv.org/abs/1710.10903)
Pygat | 1,684 | | 2 years ago | | | 32 | mit | Python | PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
Knowledge_graph_attention_network | 434 | | 3 years ago | | | 24 | mit | Python | KGAT: Knowledge Graph Attention Network for Recommendation, KDD 2019
Source Code Notebook | 428 | | a year ago | | | 3 | | Python | Line-by-line Chinese notes on the source code of several classic papers
Awesome Gcn | 377 | | 4 years ago | | | 1 | | | Resources for graph convolutional networks
Gran | 363 | | a year ago | | | 7 | mit | C++ | Efficient Graph Generation with Graph Recurrent Attention Networks, Deep Generative Model of Graphs, Graph Neural Networks, NeurIPS 2019
Appnp | 322 | | a year ago | | | | gpl-3.0 | Python | A PyTorch implementation of "Predict then Propagate: Graph Neural Networks meet Personalized PageRank" (ICLR 2019)
Keras Gat | 301 | | 3 years ago | | | 3 | mit | Python | Keras implementation of the graph attention networks (GAT) by Veličković et al. (2017; https://arxiv.org/abs/1710.10903)


This is a PyTorch implementation of the Graph Attention Network (GAT) model presented by Veličković et al. (2017, https://arxiv.org/abs/1710.10903).

The repo was initially forked from tkipf/pygcn. The official TensorFlow repository for GAT is available at PetarV-/GAT. If you make use of the pyGAT model in your research, please cite the following:

```
@article{
velickovic2018graph,
title="{Graph Attention Networks}",
author={Veli{\v{c}}kovi{\'{c}}, Petar and Cucurull, Guillem and Casanova, Arantxa and Romero, Adriana and Li{\`{o}}, Pietro and Bengio, Yoshua},
journal={International Conference on Learning Representations},
year={2018},
url={https://openreview.net/forum?id=rJXMpikCZ},
note={accepted as poster},
}
```
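For orientation, the attention mechanism the model implements can be sketched in a few lines of NumPy. This is an illustrative toy, not the repo's actual layer: the sizes and random weights are made up, and only the single-head dense case is shown; the 0.2 LeakyReLU slope follows the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (illustrative only, not the repo's defaults)
N, F_in, F_out = 4, 5, 3

H = rng.normal(size=(N, F_in))        # node features
W = rng.normal(size=(F_in, F_out))    # shared linear transform
a = rng.normal(size=(2 * F_out,))     # attention vector
adj = np.array([[1, 1, 0, 0],
                [1, 1, 1, 0],
                [0, 1, 1, 1],
                [0, 0, 1, 1]])        # adjacency with self-loops

Wh = H @ W                            # transformed features, shape (N, F_out)

# e_ij = LeakyReLU(a^T [Wh_i || Wh_j]) for every pair of nodes
e = np.empty((N, N))
for i in range(N):
    for j in range(N):
        z = np.concatenate([Wh[i], Wh[j]]) @ a
        e[i, j] = z if z > 0 else 0.2 * z   # LeakyReLU, slope 0.2 as in the paper

# Mask non-edges, then softmax over each node's neighbourhood
e = np.where(adj > 0, e, -np.inf)
alpha = np.exp(e - e.max(axis=1, keepdims=True))
alpha /= alpha.sum(axis=1, keepdims=True)

h_out = alpha @ Wh                    # aggregated output features, shape (N, F_out)
```

Each row of `alpha` sums to 1 and is nonzero only on that node's neighbours, which is exactly the masked-softmax attention the paper describes.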

The branch **master** contains the implementation from the paper. The branch **similar_impl_tensorflow** contains the implementation from the official TensorFlow repository.

For the branch **master**, transductive training on the Cora task takes ~0.9 sec per epoch on a Titan Xp, and 10-15 minutes for the whole run (~800 epochs). The final accuracy is between 84.2% and 85.3% (obtained over 5 different runs). For the branch **similar_impl_tensorflow**, training takes less than 1 minute and reaches ~83.0%.

A small note: the initial sparse matrix operations from tkipf/pygcn have been removed. As a result, the current model takes ~7 GB of GPU RAM.

We also develop a sparse version of GAT in PyTorch. It is numerically unstable because of the softmax function, so careful initialization is required. To use the sparse version, add the flag `--sparse`. The performance of the sparse version is similar to the TensorFlow implementation; on a Titan Xp an epoch takes 0.08-0.14 sec.
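The instability comes from exponentiating raw attention scores stored per edge; the standard remedy is the max-subtraction trick applied per destination node. A minimal NumPy sketch of that idea (illustrative only; the repo's sparse layer is written in PyTorch and may differ in detail):

```python
import numpy as np

# Edge list (src -> dst) of a toy graph, and one raw attention score per edge.
src = np.array([0, 1, 1, 2, 2])
dst = np.array([0, 0, 1, 1, 2])
scores = np.array([10.0, 12.0, -3.0, 0.5, 100.0])  # large values overflow a naive exp()

n_nodes = 3

# Numerically stable softmax over the incoming edges of each destination node:
# subtract the per-node maximum before exponentiating.
node_max = np.full(n_nodes, -np.inf)
np.maximum.at(node_max, dst, scores)   # scatter-max of scores onto nodes

exp_s = np.exp(scores - node_max[dst])
node_sum = np.zeros(n_nodes)
np.add.at(node_sum, dst, exp_s)        # scatter-add of exponentials onto nodes

alpha = exp_s / node_sum[dst]          # attention coefficient, one per edge
```

Without the subtraction, `np.exp(100.0)` already overflows float32, which is the kind of failure careful initialization is guarding against.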

pyGAT relies on Python 3.5 and PyTorch 0.4.1 (due to `torch.sparse_coo_tensor`).

Don't hesitate to reach out with any feedback, or to create issues/pull requests.
