Project Name | Stars | Downloads | Repos Using This | Packages Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language |
---|---|---|---|---|---|---|---|---|---|---|
Pytorch Cyclegan And Pix2pix | 20,036 | | | | 4 months ago | | | 493 | other | Python |
Image-to-Image Translation in PyTorch ||||||||||
Deeplearningexamples | 11,463 | | | | 14 days ago | | | 279 | | Jupyter Notebook |
State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enterprise-grade infrastructure. ||||||||||
Attention Is All You Need Pytorch | 7,813 | | | | a month ago | | | 72 | mit | Python |
A PyTorch implementation of the Transformer model in "Attention is All You Need". ||||||||||
Opennmt Py | 6,267 | | 2 | 8 | 2 days ago | 31 | June 22, 2023 | 28 | mit | Python |
Open Source Neural Machine Translation and (Large) Language Models in PyTorch ||||||||||
Practical Pytorch | 4,272 | | | | 2 years ago | | | 91 | mit | Jupyter Notebook |
Go to https://github.com/pytorch/tutorials - this repo is deprecated and no longer maintained ||||||||||
Photo2cartoon | 2,819 | | | | 2 years ago | | | 6 | mit | Python |
Photo-to-cartoon translation exploration project ||||||||||
Contrastive Unpaired Translation | 1,955 | | | | 25 days ago | | | 94 | other | Python |
Contrastive unpaired image-to-image translation, faster and lighter training than CycleGAN (ECCV 2020, in PyTorch) ||||||||||
Sockeye | 1,189 | | 2 | | 23 days ago | 85 | June 12, 2017 | 2 | apache-2.0 | Python |
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch ||||||||||
Nlp Tutorial | 836 | | | | 3 years ago | | | 6 | mit | Jupyter Notebook |
A list of NLP (Natural Language Processing) tutorials ||||||||||
Attentiongan | 564 | | | | 3 months ago | | | 16 | other | Python |
AttentionGAN for Unpaired Image-to-Image Translation & Multi-Domain Image-to-Image Translation ||||||||||
OpenNMT-py is the PyTorch version of the OpenNMT project, an open-source (MIT) neural machine translation (and beyond!) framework. It is designed to be research-friendly for trying out new ideas in translation, language modeling, summarization, and many other NLP tasks. Some companies have proven the code to be production-ready.
We love contributions! Please look at issues marked with the contributions welcome tag.
Before raising an issue, make sure you read the requirements and the Full Documentation examples.
Unless there is a bug, please use the Forum or Gitter to ask questions.
There is a step-by-step, fully explained tutorial (thanks to Yasmin Moslem): Tutorial
Please read and/or follow it before raising beginner issues.
Otherwise, you can have a look at the Quickstart steps.
For all use cases, including NMT, you can now use multi-query attention instead of multi-head attention (faster at training and inference) and remove the biases from all Linear layers (the QKV projections as well as the FeedForward modules); a sketch of the idea follows below.
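As a rough illustration of why this helps (this is not OpenNMT-py's actual implementation; the class and all shapes below are made up for the example), multi-query attention keeps many query heads but shares a single key/value head across them, so the K/V projections and the K/V cache shrink by a factor of the head count:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiQueryAttention(nn.Module):
    """Toy multi-query attention: n_heads query heads share one K/V head."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        # Bias-free projections, mirroring the bias-removal option above.
        self.q_proj = nn.Linear(d_model, d_model, bias=False)
        # K and V are projected to a single head instead of n_heads heads,
        # shrinking these weights and the K/V cache by a factor of n_heads.
        self.k_proj = nn.Linear(d_model, self.d_head, bias=False)
        self.v_proj = nn.Linear(d_model, self.d_head, bias=False)
        self.out_proj = nn.Linear(d_model, d_model, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, d = x.shape
        # Queries: (batch, n_heads, seq, d_head)
        q = self.q_proj(x).view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        # Single K/V head, expanded (a view, no copy) across all query heads.
        k = self.k_proj(x).unsqueeze(1).expand(-1, self.n_heads, -1, -1)
        v = self.v_proj(x).unsqueeze(1).expand(-1, self.n_heads, -1, -1)
        out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
        return self.out_proj(out.transpose(1, 2).reshape(b, t, d))

x = torch.randn(2, 16, 512)
print(MultiQueryAttention(512, 8)(x).shape)  # torch.Size([2, 16, 512])
```

Because the key/value tensors are `n_heads` times smaller, each decoding step moves far less memory, which is where most of the inference speedup comes from.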
If you used previous versions of OpenNMT-py, you can check the Changelog or the Breaking Changes.
OpenNMT-py requires:
Install OpenNMT-py from `pip`:

```bash
pip install OpenNMT-py
```
or from source:

```bash
git clone https://github.com/OpenNMT/OpenNMT-py.git
cd OpenNMT-py
pip install -e .
```
Note: if you encounter a `MemoryError` during installation, try to use `pip` with `--no-cache-dir`.
(Optional) Some advanced features (e.g. working pretrained models or specific transforms) require extra packages; you can install them with:

```bash
pip install -r requirements.opt.txt
```
Special note on flash attention support:

When using regular `position_encoding=True` or rotary embeddings with `max_relative_positions=-1`, OpenNMT-py will try to use an optimized dot-product path.

If you want to use flash attention 2, you need to install it manually first:

```bash
pip install flash-attn --no-build-isolation
```

If flash attention 2 is not installed, we will use `F.scaled_dot_product_attention` from PyTorch 2.x instead.

When using `max_relative_positions > 0` or ALiBi with `max_relative_positions=-2`, OpenNMT-py will use its legacy code for the matrix multiplications.

Flash attention is a bit faster and saves some GPU memory; see the sketch below.
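For reference, the PyTorch 2.x fallback mentioned above is the public `torch.nn.functional.scaled_dot_product_attention` API. A minimal standalone sketch (the shapes are picked arbitrarily for the example):

```python
import torch
import torch.nn.functional as F

# (batch, heads, sequence length, head dim) -- arbitrary example shapes.
q = torch.randn(2, 8, 128, 64)
k = torch.randn(2, 8, 128, 64)
v = torch.randn(2, 8, 128, 64)

# PyTorch 2.x dispatches this call to the best available backend
# (flash attention, memory-efficient attention, or a plain math path)
# depending on hardware, dtype, and arguments.
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
print(out.shape)  # torch.Size([2, 8, 128, 64])

# An additive bias (e.g. a relative-position or ALiBi-style term) can be
# passed as attn_mask, but an arbitrary mask typically rules out the fused
# flash kernel, consistent with the legacy path described above.
bias = torch.randn(128, 128)
out_biased = F.scaled_dot_product_attention(q, k, v, attn_mask=bias)
```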
OpenNMT-py is run as a collaborative open-source project. The project was incubated by Systran and Harvard NLP in 2016, originally written in Lua, and ported to PyTorch in 2017.
Current maintainers (since 2018): François Hernandez and the Ubiqus team; Vincent Nguyen (Seedfall).
If you are using OpenNMT-py for academic work, please cite the initial system demonstration paper published in ACL 2017:
```bibtex
@inproceedings{klein-etal-2017-opennmt,
    title = "{O}pen{NMT}: Open-Source Toolkit for Neural Machine Translation",
    author = "Klein, Guillaume and
      Kim, Yoon and
      Deng, Yuntian and
      Senellart, Jean and
      Rush, Alexander",
    booktitle = "Proceedings of {ACL} 2017, System Demonstrations",
    month = jul,
    year = "2017",
    address = "Vancouver, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/P17-4012",
    pages = "67--72",
}
```