Awesome Open Source
Search results for distributed training data parallelism
Active filters: data-parallelism, distributed-training
5 search results found
Libai (⭐ 371)
LiBai (李白): A Toolbox for Large-Scale Distributed Parallel Training
Easyparallellibrary (⭐ 201)
Easy Parallel Library (EPL) is a general and efficient deep learning framework for distributed model training.
Terngrad (⭐ 152)
Ternary Gradients to Reduce Communication in Distributed Deep Learning (TensorFlow)
Sm Distributed Training Step By Step (⭐ 9)
Hands-on labs for PyTorch-based distributed training and SageMaker distributed training. Written so beginners can get started easily, it walks through step-by-step code modifications built on a basic BERT use case.
Pytorch Transformer Distributed (⭐ 5)
Distributed training (multi-node) of a Transformer model
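The projects above all build on the same core idea of data parallelism: each worker holds a full copy of the model, computes gradients on its own shard of the data, and the gradients are averaged across workers (an all-reduce) before every worker applies the identical update. The sketch below illustrates that idea with plain, sequential standard-library Python; it is not the API of any of the libraries listed here, and all function names are illustrative. Real frameworks run the per-worker step concurrently on separate GPUs and use a collective communication primitive for the averaging.

```python
# Conceptual sketch of data-parallel SGD (illustrative, stdlib-only):
# split the data into shards, compute a gradient per worker, average the
# gradients as an all-reduce would, and apply the same update everywhere.

def shard(data, num_workers):
    """Split data into roughly equal contiguous shards, one per worker."""
    k, r = divmod(len(data), num_workers)
    shards, start = [], 0
    for i in range(num_workers):
        end = start + k + (1 if i < r else 0)
        shards.append(data[start:end])
        start = end
    return shards

def local_gradient(w, shard_data):
    """Mean-squared-error gradient for the model y = w * x on one shard."""
    return sum(2 * (w * x - y) * x for x, y in shard_data) / len(shard_data)

def all_reduce_mean(grads):
    """Average the per-worker gradients (the all-reduce step)."""
    return sum(grads) / len(grads)

def train(data, num_workers=4, lr=0.02, steps=100):
    w = 0.0  # every worker starts from the same replica
    shards = shard(data, num_workers)
    for _ in range(steps):
        # In a real framework these run in parallel, one per GPU/worker.
        grads = [local_gradient(w, s) for s in shards]
        w -= lr * all_reduce_mean(grads)  # identical update on all replicas
    return w

if __name__ == "__main__":
    # Fit y = 3x from synthetic data split across 4 workers.
    data = [(x, 3.0 * x) for x in range(1, 9)]
    print(train(data))  # converges toward w ≈ 3.0
```

With equally sized shards, the mean of the per-shard gradients equals the gradient over the full dataset, which is why every replica stays in sync: they all see the same averaged gradient and apply the same step.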
Related Searches
Python Distributed Training (53)
Deep Learning Distributed Training (34)
Pytorch Distributed Training (22)
Tensorflow Distributed Training (18)
Machine Learning Distributed Training (16)
Python Data Parallelism (14)
Jupyter Notebook Distributed Training (12)
Deep Learning Data Parallelism (9)
Gpu Distributed Training (8)
Pytorch Data Parallelism (6)
Copyright 2018-2024 Awesome Open Source. All rights reserved.