Awesome Open Source
Search results for pytorch mixture of experts
10 search results found
Deepspeed (⭐ 32,358): DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Mixtral Offloading (⭐ 1,943): Run Mixtral-8x7B models in Colab or on consumer desktops.
Hivemind (⭐ 1,716): Decentralized deep learning in PyTorch, built to train models on thousands of volunteer machines across the world.
Mixture Of Experts (⭐ 656): PyTorch re-implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. (https://arxiv.org/abs/1701.06538).
Tutel (⭐ 599): Tutel MoE, an optimized Mixture-of-Experts implementation.
Mixture Of Experts (⭐ 347): A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models (a minimal top-k routing sketch follows this list).
Awesome Adaptive Computation (⭐ 79): A curated reading list of research in Adaptive Computation (AC) and Mixture of Experts (MoE).
Generalizable Mixture Of Experts (⭐ 75): GMoE could be the next backbone model for many kinds of generalization tasks.
Soft Mixture Of Experts (⭐ 30): PyTorch implementation of Soft MoE by Google Brain, from "From Sparse to Soft Mixtures of Experts" (https://arxiv.org/pdf/2308.00951.pdf); see the second sketch after this list.
Learning At Home (⭐ 13): Original PyTorch implementation of "Towards Crowdsourced Training of Large Neural Networks using Decentralized Mixture-of-Experts" (NeurIPS 2020).
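Several of the results above implement the sparsely-gated (top-k) routing idea from Shazeer et al. (arXiv:1701.06538): a learned gate scores all experts per token, only the top-k experts run, and their outputs are combined with the renormalized gate weights. The following is a minimal sketch of such a layer in PyTorch; the module and parameter names (TopKMoE, gate, experts) are illustrative assumptions, not code from any listed repository, and it omits the load-balancing losses real implementations add.

    # Minimal sparsely-gated (top-k) MoE sketch; illustrative names, not
    # taken from any repository listed above.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TopKMoE(nn.Module):
        def __init__(self, dim: int, num_experts: int = 8, k: int = 2):
            super().__init__()
            self.k = k
            # Each expert is a small feed-forward network.
            self.experts = nn.ModuleList(
                nn.Sequential(nn.Linear(dim, 4 * dim), nn.ReLU(), nn.Linear(4 * dim, dim))
                for _ in range(num_experts)
            )
            # The gate scores every expert for every token.
            self.gate = nn.Linear(dim, num_experts, bias=False)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, seq, dim) -> flatten tokens for routing.
            tokens = x.reshape(-1, x.shape[-1])
            logits = self.gate(tokens)                      # (tokens, experts)
            weights, indices = logits.topk(self.k, dim=-1)  # keep only top-k experts
            weights = F.softmax(weights, dim=-1)            # renormalize over the top-k
            out = torch.zeros_like(tokens)
            for e, expert in enumerate(self.experts):
                # Run each expert only on the tokens that selected it.
                token_idx, slot = (indices == e).nonzero(as_tuple=True)
                if token_idx.numel() == 0:
                    continue
                out[token_idx] += weights[token_idx, slot, None] * expert(tokens[token_idx])
            return out.reshape_as(x)

Because only k of the experts run per token, parameter count grows with the number of experts while per-token compute stays roughly constant, which is the "massively increasing the parameter count" property the listings refer to.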
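The Soft Mixture Of Experts entry follows a different scheme: instead of discrete top-k routing, each expert processes "slots" that are convex combinations of all input tokens, and each output token is a convex combination of all slot outputs, so the layer stays fully differentiable. A minimal sketch, assuming the formulation in arXiv:2308.00951; the names (SoftMoE, phi) are illustrative, not code from the listed repository.

    # Minimal Soft MoE sketch per "From Sparse to Soft Mixtures of Experts";
    # illustrative names, not taken from the repository listed above.
    import torch
    import torch.nn as nn

    class SoftMoE(nn.Module):
        def __init__(self, dim: int, num_experts: int = 4, slots_per_expert: int = 1):
            super().__init__()
            self.num_experts, self.slots = num_experts, slots_per_expert
            # One learnable query vector per slot; slots are grouped by expert.
            self.phi = nn.Parameter(torch.randn(dim, num_experts * slots_per_expert))
            self.experts = nn.ModuleList(
                nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
                for _ in range(num_experts)
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, tokens, dim)
            logits = x @ self.phi              # (batch, tokens, total_slots)
            dispatch = logits.softmax(dim=1)   # convex mix over tokens, per slot
            combine = logits.softmax(dim=2)    # convex mix over slots, per token
            # Build each slot's input as a weighted average of all tokens.
            slots = torch.einsum('btd,bts->bsd', x, dispatch)
            slots = slots.view(x.size(0), self.num_experts, self.slots, -1)
            # Each expert processes its own group of slots.
            outs = torch.stack([exp(slots[:, i]) for i, exp in enumerate(self.experts)], dim=1)
            outs = outs.reshape(x.size(0), -1, x.size(-1))  # (batch, total_slots, dim)
            # Mix slot outputs back into per-token outputs.
            return torch.einsum('bsd,bts->btd', outs, combine)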
Related Searches
Python Pytorch (15,943)
Deep Learning Pytorch (7,533)
Jupyter Notebook Pytorch (4,892)
Machine Learning Pytorch (2,934)
Dataset Pytorch (1,848)
Pytorch Convolutional Neural Networks (1,777)
Pytorch Neural Network (1,631)
Pytorch Natural Language Processing (1,408)
Pytorch Computer Vision (1,230)
Pytorch Neural (1,217)