Awesome Open Source
Search results for "mixture of experts moe"
Filters: mixture-of-experts, moe
4 search results found
Mixture Of Experts (⭐ 656)
PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538

Tutel (⭐ 599)
Tutel MoE: An Optimized Mixture-of-Experts Implementation

Llama Moe (⭐ 497)
⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training

Pipegoose (⭐ 58)
Large-scale 4D parallelism pre-training for 🤗 transformers in Mixture of Experts *(still work in progress)*
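The first result above re-implements the sparsely-gated MoE layer from Shazeer et al. (2017). For orientation, here is a minimal PyTorch sketch of the core routing idea: a learned gate scores each token, only the top-k experts run on it, and their outputs are combined with renormalized gate weights. This sketch is not taken from any of the listed repositories; the names (SparseMoE, num_experts, top_k) and the dense per-expert loop are illustrative assumptions.

```python
# Minimal sketch of a sparsely-gated MoE layer in the spirit of
# Shazeer et al. (2017), https://arxiv.org/abs/1701.06538.
# Illustrative only; not code from any repository listed above.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # One small feed-forward network per expert.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )
        # The gate scores each token against every expert.
        self.gate = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, d_model)
        logits = self.gate(x)                                # (tokens, experts)
        weights, indices = logits.topk(self.top_k, dim=-1)   # keep only the top-k experts per token
        weights = F.softmax(weights, dim=-1)                 # renormalize over the selected experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e                 # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

# Quick smoke test: 8 tokens of width 16 through 4 experts with top-2 routing.
moe = SparseMoE(d_model=16, d_hidden=32)
print(moe(torch.randn(8, 16)).shape)  # torch.Size([8, 16])
```

Optimized implementations such as Tutel replace the per-expert Python loop with batched dispatch/combine operations, and training setups typically add an auxiliary load-balancing loss so tokens spread evenly across experts.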
Related Searches
Python Mixture Of Experts (28)
Python Moe (14)
Deep Learning Mixture Of Experts (13)
Pytorch Mixture Of Experts (10)
Machine Learning Mixture Of Experts (10)
Artificial Intelligence Mixture Of Experts (7)
Jupyter Notebook Mixture Of Experts (6)