Project Name | Stars | Downloads | Repos Using This | Packages Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language |
---|---|---|---|---|---|---|---|---|---|---|
Data Science Ipython Notebooks | 25,668 | | | | 6 months ago | | | 34 | other | Python |
Data science Python notebooks: Deep learning (TensorFlow, Theano, Caffe, Keras), scikit-learn, Kaggle, big data (Spark, Hadoop MapReduce, HDFS), matplotlib, pandas, NumPy, SciPy, Python essentials, AWS, and various command lines. | ||||||||||
D2l En | 20,613 | | | | 3 months ago | 2 | November 13, 2022 | 115 | other | Python |
Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities in 70 countries, including Stanford, MIT, Harvard, and Cambridge. | | | | | | | | | | |
Best Of Ml Python | 14,990 | | | | 3 months ago | | | 21 | cc-by-sa-4.0 | |
🏆 A ranked list of awesome machine learning Python libraries. Updated weekly. | ||||||||||
Cheatsheets Ai | 13,281 | | | | 5 years ago | | | 6 | mit | |
Essential Cheat Sheets for deep learning and machine learning researchers https://medium.com/@kailashahirwar/essential-cheat-sheets-for-machine-learning-and-deep-learning-researchers-efb6a8ebd2e5 | ||||||||||
Eat_tensorflow2_in_30_days | 9,591 | | | | 2 years ago | | | 25 | apache-2.0 | Python |
Tensorflow2.0 🍎🍊 is delicious, just eat it! 😋😋 | ||||||||||
T81_558_deep_learning | 5,590 | | | | 5 months ago | | | 3 | other | Jupyter Notebook |
T81-558: Keras - Applications of Deep Neural Networks @Washington University in St. Louis | ||||||||||
Practical_rl | 5,572 | | | | 4 months ago | | | 40 | unlicense | Jupyter Notebook |
A course in reinforcement learning in the wild | ||||||||||
Keras Rl | 5,348 | | 51 | 3 | a year ago | 8 | June 01, 2018 | 43 | mit | Python |
Deep Reinforcement Learning for Keras. | ||||||||||
Machinelearning | 4,895 | | | | 3 months ago | | | 39 | | Python |
Basic machine learning and deep learning examples. | | | | | | | | | | |
Bigdl | 4,728 | | 10 | | 3 months ago | 16 | April 19, 2021 | 958 | apache-2.0 | Jupyter Notebook |
Accelerate LLMs with low-bit (FP4 / INT4 / FP8 / INT8) optimizations using bigdl-llm. | | | | | | | | | | |