Project Name | Stars | Last Commit | Open Issues | License | Language | Description
---|---|---|---|---|---|---
Powersgd | 112 | a year ago | | mit | Python | PowerSGD: practical low-rank gradient compression for distributed optimization (https://arxiv.org/abs/1905.13727)
Deep Gradient Compression | 106 | 3 years ago | 2 | other | Python | [ICLR 2018] Deep Gradient Compression: reducing the communication bandwidth for distributed training
Grace | 98 | 2 years ago | 2 | bsd-2-clause | Python | GRACE: GRAdient ComprEssion for distributed deep learning
Microexr | 17 | 10 years ago | 1 | other | C++ | A lightweight subset of the OpenEXR library
Atomo | 12 | 6 years ago | 2 | | Python | ATOMO: communication-efficient learning via atomic sparsification
Hdr_tonemapping_fattal02 | 12 | 7 years ago | | | C++ | Fattal02 HDR tone-mapping operator: gradient-domain high dynamic range compression
Deepgradientcompression | 11 | 5 years ago | 1 | | Python | Implementation of the paper "Deep Gradient Compression: Reducing the Communication Bandwidth for Distributed Training". Gradients are compressed before being sent, which greatly reduces communication bandwidth and speeds up multi-node training.
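Several of the projects above center on sparsifying gradients before communication. As a rough illustration of the idea (a minimal sketch, not code from any of these repositories), the following NumPy snippet shows top-k gradient sparsification with error feedback, the core mechanism behind Deep Gradient Compression: only the largest-magnitude entries are transmitted, and the dropped remainder is accumulated locally and added back on the next step. Function names and the `ratio` parameter are illustrative choices, not part of any listed project's API.

```python
import numpy as np

def topk_compress(grad, ratio=0.01):
    """Keep only the largest-magnitude `ratio` fraction of gradient entries.

    Returns the kept values and their flat indices; everything else is
    treated as zero and need not be communicated.
    """
    flat = grad.ravel()
    k = max(1, int(flat.size * ratio))
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of the top-k magnitudes
    return flat[idx], idx

def topk_decompress(values, idx, shape):
    """Rebuild a dense gradient from the sparse (values, indices) pair."""
    flat = np.zeros(int(np.prod(shape)), dtype=values.dtype)
    flat[idx] = values
    return flat.reshape(shape)

# Error feedback: accumulate what was dropped and add it back next step,
# so small gradients are delayed rather than lost.
grad = np.random.randn(1000)
residual = np.zeros_like(grad)

corrected = grad + residual                      # add back previously dropped mass
values, idx = topk_compress(corrected, ratio=0.01)
sent = topk_decompress(values, idx, corrected.shape)
residual = corrected - sent                      # carry the dropped part forward
```

In a real distributed setting only `values` and `idx` cross the network (1% of the entries here), and each worker keeps its own `residual`; PowerSGD and ATOMO replace the top-k step with low-rank and atomic decompositions, respectively.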