# Two Are Better Than One

Code associated with the paper **Two are Better Than One: Joint Entity and Relation Extraction with Table-Sequence Encoders**, published at EMNLP 2020.
## Alternatives To Two Are Better Than One
| Project Name | Stars | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|---|---|
| Gpt Neox | 6,366 | 5 months ago | | | 81 | apache-2.0 | Python | An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library. |
| Deberta | 1,673 | 9 months ago | 13 | February 09, 2021 | 63 | mit | Python | The implementation of DeBERTa. |
| Nlp Paper | 579 | 5 months ago | | | 1 | | | NLP Paper. |
| Neural_sp | 466 | 3 years ago | | | 43 | apache-2.0 | Python | End-to-end ASR/LM implementation with PyTorch. |
| Two Are Better Than One | 168 | 2 years ago | | | 4 | | Python | Code associated with the paper **Two are Better Than One: Joint Entity and Relation Extraction with Table-Sequence Encoders**, at EMNLP 2020. |
| Xlnet Gen | 166 | 3 years ago | | | 11 | mit | Python | XLNet for generating language. |
| Tldr Transformers | 163 | 2 years ago | | | 1 | mit | | The "tl;dr" on a few notable transformer papers. |
| Awesome Nlp Resources | 157 | 3 years ago | | | | mit | | This repository contains landmark research papers in Natural Language Processing that came out in this century. |
| Prime | 80 | a year ago | | | 1 | other | Python | A simple module that consistently outperforms self-attention and the Transformer model on major NMT datasets, with SoTA performance. |
| Languagemodel Using Attention | 25 | 6 years ago | | | | | Python | PyTorch implementation of a basic language model using attention in an LSTM network. |