Project Name | Stars | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language | Description
---|---|---|---|---|---|---|---|---
Nlp_paper_study | 3,210 | 21 days ago | | | 3 | | C++ | Reading notes on top-conference papers for NLP algorithm engineers
Spektral | 2,259 | 8 days ago | 33 | April 09, 2022 | 61 | mit | Python | Graph Neural Networks with Keras and TensorFlow 2
Gat | 2,078 | 2 years ago | | | 27 | mit | Python | Graph Attention Networks (https://arxiv.org/abs/1710.10903)
Pygat | 1,684 | 2 years ago | | | 32 | mit | Python | PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
Knowledge_graph_attention_network | 434 | 3 years ago | | | 24 | mit | Python | KGAT: Knowledge Graph Attention Network for Recommendation, KDD 2019
Source Code Notebook | 428 | 8 months ago | | | 3 | | Python | Line-by-line Chinese notes on the source code of some classic papers
Awesome Gcn | 377 | 4 years ago | | | 1 | | | Resources for graph convolutional networks
Gran | 363 | a year ago | | | 7 | mit | C++ | Efficient Graph Generation with Graph Recurrent Attention Networks, Deep Generative Model of Graphs, Graph Neural Networks, NeurIPS 2019
Appnp | 322 | 7 months ago | | | | gpl-3.0 | Python | A PyTorch implementation of "Predict then Propagate: Graph Neural Networks meet Personalized PageRank" (ICLR 2019)
Keras Gat | 301 | 3 years ago | | | 3 | mit | Python | Keras implementation of the graph attention networks (GAT) by Veličković et al. (2017; https://arxiv.org/abs/1710.10903)

This is our Tensorflow implementation for the paper:
Xiang Wang, Xiangnan He, Yixin Cao, Meng Liu and Tat-Seng Chua (2019). KGAT: Knowledge Graph Attention Network for Recommendation. Paper in ACM DL or Paper in arXiv. In KDD'19, Anchorage, Alaska, USA, August 4-8, 2019.
Author: Dr. Xiang Wang (xiangwang at u.nus.edu)
Knowledge Graph Attention Network (KGAT) is a new recommendation framework tailored to knowledge-aware personalized recommendation. Built upon the graph neural network framework, KGAT explicitly models the high-order relations in the collaborative knowledge graph to provide better recommendations with item side information.
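At the core of this framework is an attentive embedding-propagation layer that weights each knowledge-graph neighbor of a node with a relation-aware attention score; the paper scores a triple (h, r, t) roughly as pi(h, r, t) = (W_r e_t)^T tanh(W_r e_h + e_r) and softmax-normalizes the scores over the neighbors of h. The NumPy sketch below only illustrates that idea; the function name, shapes, and normalization are assumptions for illustration, not the repository's actual code.

```python
import numpy as np

def kg_attention(e_h, e_r, neighbor_e_t, W_r):
    """Relation-aware attention over the knowledge-graph neighbors of a head entity.

    e_h:          (d,)   embedding of the head entity h
    e_r:          (d,)   embedding of the relation r
    neighbor_e_t: (n, d) embeddings of the n tail entities reachable from h via r
    W_r:          (d, d) relation-specific projection matrix
    Returns softmax-normalized attention weights of shape (n,).
    """
    proj_h = W_r @ e_h                       # project the head into the relation space
    proj_t = neighbor_e_t @ W_r.T            # project each tail into the relation space
    scores = proj_t @ np.tanh(proj_h + e_r)  # one score per (h, r, t) triple
    scores -= scores.max()                   # stabilize the softmax
    weights = np.exp(scores)
    return weights / weights.sum()

# toy usage with random embeddings
rng = np.random.default_rng(0)
d, n = 8, 5
w = kg_attention(rng.normal(size=d), rng.normal(size=d),
                 rng.normal(size=(n, d)), rng.normal(size=(d, d)))
print(w, w.sum())  # attention over the 5 neighbors, sums to 1
```

These attention weights then steer how neighbor embeddings are aggregated during propagation, which is what lets KGAT exploit high-order connectivity in the collaborative knowledge graph.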
If you want to use our code and datasets in your research, please cite:
@inproceedings{KGAT19,
author = {Xiang Wang and
Xiangnan He and
Yixin Cao and
Meng Liu and
Tat{-}Seng Chua},
title = {{KGAT:} Knowledge Graph Attention Network for Recommendation},
booktitle = {{KDD}},
pages = {950--958},
year = {2019}
}
The code has been tested running under Python 3.6.5. The required packages are as follows:

* tensorflow == 1.8.0
* numpy == 1.14.3
* scipy == 1.1.0
* sklearn == 0.19.1
To demonstrate the reproducibility of the best performance reported in our paper and to help researchers check whether their model status is consistent with ours, we provide the best parameter settings (which might differ for customized datasets) in the scripts, as well as the logs of our training runs.
The instructions for the command-line arguments are stated in the code (see the parser function in Model/utility/parser.py). Example commands for the three datasets:
* Yelp2018 dataset:
  python Main.py --model_type kgat --alg_type bi --dataset yelp2018 --regs [1e-5,1e-5] --layer_size [64,32,16] --embed_size 64 --lr 0.0001 --epoch 1000 --verbose 50 --save_flag 1 --pretrain -1 --batch_size 1024 --node_dropout [0.1] --mess_dropout [0.1,0.1,0.1] --use_att True --use_kge True
* Amazon-book dataset:
  python Main.py --model_type kgat --alg_type bi --dataset amazon-book --regs [1e-5,1e-5] --layer_size [64,32,16] --embed_size 64 --lr 0.0001 --epoch 1000 --verbose 50 --save_flag 1 --pretrain -1 --batch_size 1024 --node_dropout [0.1] --mess_dropout [0.1,0.1,0.1] --use_att True --use_kge True
* Last-FM dataset:
  python Main.py --model_type kgat --alg_type bi --dataset last-fm --regs [1e-5,1e-5] --layer_size [64,32,16] --embed_size 64 --lr 0.0001 --epoch 1000 --verbose 50 --save_flag 1 --pretrain -1 --batch_size 1024 --node_dropout [0.1] --mess_dropout [0.1,0.1,0.1] --use_att True --use_kge True
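Note that several hyperparameters (regs, layer_size, node_dropout, mess_dropout) are written as bracketed lists on the command line. The repository's own parsing lives in Model/utility/parser.py; the snippet below is only an illustrative sketch of how such string-valued arguments can be turned into Python lists with argparse and ast.literal_eval. The argument names mirror the commands above, but the defaults and the parsing code itself are assumptions, not the repository's exact implementation.

```python
import argparse
import ast

def parse_args():
    parser = argparse.ArgumentParser(description="Illustrative KGAT-style argument parsing.")
    parser.add_argument('--dataset', default='yelp2018')
    parser.add_argument('--regs', default='[1e-5,1e-5]',
                        help='Regularization weights, written as a bracketed list.')
    parser.add_argument('--layer_size', default='[64,32,16]',
                        help='Output size of each propagation layer.')
    parser.add_argument('--mess_dropout', default='[0.1,0.1,0.1]',
                        help='Message dropout ratio per layer.')
    parser.add_argument('--lr', type=float, default=0.0001)
    return parser.parse_args()

if __name__ == '__main__':
    args = parse_args()
    # Bracketed strings such as "[1e-5,1e-5]" become real Python lists here.
    regs = ast.literal_eval(args.regs)
    layer_size = ast.literal_eval(args.layer_size)
    mess_dropout = ast.literal_eval(args.mess_dropout)
    print(regs, layer_size, mess_dropout, args.lr)
```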
Some important arguments:

* model_type
  * kgat (by default), proposed in KGAT: Knowledge Graph Attention Network for Recommendation, KDD2019. Usage: --model_type kgat.
  * bprmf, proposed in BPR: Bayesian Personalized Ranking from Implicit Feedback, UAI2009. This model uses only the user-item interactions. Usage: --model_type bprmf.
  * fm, proposed in Fast Context-aware Recommendations with Factorization Machines, SIGIR2011. Usage: --model_type fm.
  * nfm, proposed in Neural Factorization Machines for Sparse Predictive Analytics, SIGIR2017. Usage: --model_type nfm.
  * cke, proposed in Collaborative Knowledge Base Embedding for Recommender Systems, KDD2016. Usage: --model_type cke.
  * cfkg, proposed in Learning Heterogeneous Knowledge Base Embeddings for Explainable Recommendation, Algorithms 2018. Usage: --model_type cfkg.
* alg_type
  * kgat (by default), proposed in KGAT: Knowledge Graph Attention Network for Recommendation, KDD2019. Usage: --alg_type kgat.
  * gcn, proposed in Semi-Supervised Classification with Graph Convolutional Networks, ICLR2017. Usage: --alg_type gcn.
  * graphsage, proposed in Inductive Representation Learning on Large Graphs, NeurIPS2017. Usage: --alg_type graphsage.
* adj_type
  * si (by default), where the decay factor between two connected nodes (say, x->y) is set to 1/(out-degree of x), and each node is additionally assigned a weight of 1 for its self-connection. Usage: --adj_type si.
  * bi, where the decay factor between two connected nodes (say, x->y) is set to 1/sqrt((out-degree of x) * (out-degree of y)). Usage: --adj_type bi.
  * A small illustration of both normalizations appears right after this list.
* mess_dropout
  * The message dropout ratio, one value per propagation layer. Usage: --mess_dropout [0.1,0.1,0.1].
* pretrain
  * Set pretrain as -1 (as in the commands above) to use the pretrained embeddings, or set pretrain as 0 to train from scratch. In the latter case, please set the number of epochs and the early-stopping criteria larger.
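To make the two adj_type options concrete, here is a small scipy.sparse sketch that follows the descriptions above literally. It is an illustrative re-implementation with assumed function and variable names, not the repository's loader, which may differ in details such as where self-connections are added.

```python
import numpy as np
import scipy.sparse as sp

def normalize_adj(adj, adj_type='si'):
    """Normalize an adjacency matrix as described for --adj_type.

    'si': every edge x->y is scaled by 1/(out-degree of x), and every node
          additionally gets a self-connection with weight 1.
    'bi': every edge x->y is scaled by 1/sqrt(out-degree(x) * out-degree(y)).
    """
    adj = sp.csr_matrix(adj, dtype=np.float64)
    out_deg = np.asarray(adj.sum(axis=1)).ravel()
    if adj_type == 'si':
        with np.errstate(divide='ignore'):
            d_inv = np.power(out_deg, -1.0)
        d_inv[np.isinf(d_inv)] = 0.0
        return sp.diags(d_inv) @ adj + sp.eye(adj.shape[0])
    if adj_type == 'bi':
        with np.errstate(divide='ignore'):
            d_inv_sqrt = np.power(out_deg, -0.5)
        d_inv_sqrt[np.isinf(d_inv_sqrt)] = 0.0
        d_mat = sp.diags(d_inv_sqrt)
        return d_mat @ adj @ d_mat
    raise ValueError(f'unknown adj_type: {adj_type}')

# toy 3-node directed graph: edges 0->1, 0->2, 1->0
A = sp.csr_matrix(np.array([[0., 1., 1.],
                            [1., 0., 0.],
                            [0., 0., 0.]]))
print(normalize_adj(A, 'si').toarray())
print(normalize_adj(A, 'bi').toarray())
```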
We provide three processed datasets: Amazon-book, Last-FM, and Yelp2018.
| | | Amazon-book | Last-FM | Yelp2018 |
|---|---|---|---|---|
| User-Item Interaction | #Users | 70,679 | 23,566 | 45,919 |
| | #Items | 24,915 | 48,123 | 45,538 |
| | #Interactions | 847,733 | 3,034,796 | 1,185,068 |
| Knowledge Graph | #Entities | 88,572 | 58,266 | 90,961 |
| | #Relations | 39 | 9 | 42 |
| | #Triplets | 2,557,746 | 464,567 | 1,853,704 |
Each dataset contains the following files:

* train.txt: each line contains a userID and a list of itemID (that user's interacted items) for training; a minimal loading sketch appears after this list.
* test.txt: each line contains a userID and a list of itemID (that user's interacted items) for testing.
* user_list.txt: each line is a pair (org_id, remap_id) for one user, where org_id and remap_id represent the ID of such user in the original and our datasets, respectively.
* item_list.txt: each line is a triplet (org_id, remap_id, freebase_id) for one item, where org_id, remap_id, and freebase_id represent the ID of such item in the original dataset, our datasets, and Freebase, respectively.
* entity_list.txt: each line is a pair (freebase_id, remap_id) for one entity in the knowledge graph, where freebase_id and remap_id represent the ID of such entity in Freebase and our datasets, respectively.
* relation_list.txt: each line is a pair (freebase_id, remap_id) for one relation in the knowledge graph, where freebase_id and remap_id represent the ID of such relation in Freebase and our datasets, respectively.
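For reference, here is a minimal sketch of reading a train.txt/test.txt-style file into a userID-to-itemIDs mapping. The whitespace delimiter and the Data/yelp2018 path in the usage comment are assumptions for illustration; the repository's own data loader is the authoritative version.

```python
def load_interactions(path):
    """Read a train.txt/test.txt-style file: each non-empty line starts with a
    userID followed by the itemIDs that user interacted with.
    (Whitespace separation is assumed here.)"""
    user_items = {}
    with open(path) as f:
        for line in f:
            ids = line.split()
            if not ids:
                continue
            user_items[int(ids[0])] = [int(i) for i in ids[1:]]
    return user_items

# hypothetical usage, assuming the Yelp2018 data lives under Data/yelp2018/
# train = load_interactions('Data/yelp2018/train.txt')
# print(len(train))  # number of users with training interactions
```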
Any scientific publications that use our datasets should cite the following paper as the reference:
@inproceedings{KGAT19,
author = {Xiang Wang and
Xiangnan He and
Yixin Cao and
Meng Liu and
Tat-Seng Chua},
title = {KGAT: Knowledge Graph Attention Network for Recommendation},
booktitle = {{KDD}},
year = {2019}
}
Nobody guarantees the correctness of the data, its suitability for any particular purpose, or the validity of results based on the use of the data set. The data set may be used for any research purposes under the following conditions:
This research is supported by the National Research Foundation, Singapore under its International Research Centres in Singapore Funding Initiative. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not reflect the views of the National Research Foundation, Singapore.