| Project Name | Description | Stars | Downloads | Repos Using This | Packages Using This | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Transformers | 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. | 112,535 | 64 | 1,869 | | 5 hours ago | 114 | July 18, 2023 | 844 | apache-2.0 | Python |
| Stable Diffusion Webui | Stable Diffusion web UI | 103,800 | | | | 18 hours ago | 2 | January 17, 2022 | 1,539 | agpl-3.0 | Python |
| Pytorch | Tensors and Dynamic neural networks in Python with strong GPU acceleration | 71,175 | 3,341 | 6,728 | | 5 hours ago | 37 | May 08, 2023 | 12,795 | other | Python |
| Keras | Deep Learning for humans | 59,445 | 578 | | | 9 hours ago | 80 | June 27, 2023 | 98 | apache-2.0 | Python |
| Real Time Voice Cloning | Clone a voice in 5 seconds to generate arbitrary speech in real-time | 47,152 | | | | 4 days ago | | | 168 | other | Python |
| Yolov5 | YOLOv5 🚀 in PyTorch > ONNX > CoreML > TFLite | 41,907 | | | | a day ago | 8 | September 21, 2021 | 222 | agpl-3.0 | Python |
| Annotated_deep_learning_paper_implementations | 🧑‍🏫 60 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans (cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠 | 36,223 | 1 | | | 8 days ago | 78 | September 24, 2022 | 27 | mit | Jupyter Notebook |
| Made With Ml | Learn how to design, develop, deploy and iterate on production-grade ML applications. | 34,217 | | | | a day ago | 5 | May 15, 2019 | 4 | mit | Jupyter Notebook |
| Gfpgan | GFPGAN aims at developing Practical Algorithms for Real-world Face Restoration. | 32,185 | 9 | | | 16 days ago | 11 | September 20, 2022 | 271 | other | Python |
| Mockingbird | 🚀 AI voice cloning: Clone a voice in 5 seconds to generate arbitrary speech in real-time | 30,784 | | | | 24 days ago | 2 | February 28, 2022 | 446 | other | Python |
Distributed Deep Learning Library for Apache Spark
The AI for Big Data community includes the following projects:
BigDL is a distributed deep learning library for Apache Spark; with BigDL, users can write their deep learning applications as standard Spark programs, which can directly run on top of existing Spark or Hadoop clusters.
- **Rich deep learning support.** Modeled after Torch, BigDL provides comprehensive support for deep learning, including numeric computing (via Tensor) and high-level neural networks; in addition, users can load pre-trained Caffe or Torch models into Spark programs using BigDL.
- **Extremely high performance.** To achieve high performance, BigDL uses Intel oneMKL, oneDNN, and multi-threaded programming in each Spark task. Consequently, it is orders of magnitude faster than out-of-the-box open-source Caffe or Torch on a single-node Xeon (i.e., comparable with a mainstream GPU).
- **Efficient scale-out.** BigDL can efficiently scale out to perform data analytics at "Big Data scale" by leveraging Apache Spark (a lightning-fast distributed data processing framework), as well as efficient implementations of synchronous SGD and all-reduce communications on Spark.
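BigDL's actual training loop runs inside Spark tasks; the following stdlib-only Python sketch just illustrates the synchronous SGD + all-reduce pattern described above. All names (`local_gradient`, `all_reduce`, `train`) are illustrative, not BigDL API.

```python
# Sketch of synchronous data-parallel SGD: each "worker" computes a
# gradient on its own data shard, gradients are averaged (all-reduce),
# and every worker applies the same update in lockstep.

def local_gradient(w, shard):
    """Gradient of mean squared error for the model y = w*x on one shard."""
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def all_reduce(grads):
    """Average the per-worker gradients (the value all-reduce produces)."""
    return sum(grads) / len(grads)

def train(shards, w=0.0, lr=0.01, epochs=100):
    for _ in range(epochs):
        grads = [local_gradient(w, s) for s in shards]  # runs in parallel on a cluster
        w -= lr * all_reduce(grads)                     # identical synchronous update
    return w

# Data generated from the true relation y = 3x, split across 4 "workers".
data = [(x, 3.0 * x) for x in range(1, 9)]
shards = [data[i::4] for i in range(4)]
w = train(shards)
print(round(w, 3))  # → 3.0
```

Because every worker sees the same averaged gradient, the result is identical to single-machine SGD over the full batch; the cluster only changes where the per-shard gradients are computed.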
You may want to write your deep learning programs using BigDL if:
- You want to analyze a large amount of data on the same Big Data (Hadoop/Spark) cluster where the data are stored (in, say, HDFS, HBase, Hive, or Parquet).
- You want to add deep learning functionality (either training or prediction) to your Big Data (Spark) programs and/or workflows.
- You want to leverage existing Hadoop/Spark clusters to run your deep learning applications, which can then be dynamically shared with other workloads (e.g., ETL, data warehousing, feature engineering, classical machine learning, graph analytics).
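Embedding prediction in an existing data pipeline typically means mapping a model over each data partition, in the style of Spark's `RDD.mapPartitions`. The stdlib-only sketch below simulates that pattern in plain Python; the partition layout and the toy `model` are illustrative stand-ins, not BigDL or Spark API.

```python
# Simulate partition-parallel model inference: the model is applied
# once per partition, the way a Spark task would process its split.

def predict_partition(model, records):
    """Run the model over every record in one partition."""
    return [model(r) for r in records]

def map_partitions(partitions, fn):
    """Plain-Python stand-in for RDD.mapPartitions."""
    return [fn(part) for part in partitions]

# A toy "model": classify a number as positive (1) or non-positive (0).
model = lambda x: 1 if x > 0 else 0

# Eight records split into two partitions, as a cluster would hold them.
partitions = [[-2, -1, 0, 1], [2, 3, -4, 5]]
preds = map_partitions(partitions, lambda part: predict_partition(model, part))
flat = [p for part in preds for p in part]
print(flat)  # → [0, 0, 0, 1, 1, 1, 0, 1]
```

The point of the pattern is data locality: the model moves to each partition rather than the data moving to the model, so prediction composes with the ETL and feature-engineering stages already running on the cluster.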
It is highly recommended to use the high-level APIs provided by Analytics Zoo.
For additional information, refer to the project documentation.
If you've found BigDL useful for your project, you may cite the paper as follows:
@inproceedings{SOCC2019_BIGDL,
title={BigDL: A Distributed Deep Learning Framework for Big Data},
author={Dai, Jason (Jinquan) and Wang, Yiheng and Qiu, Xin and Ding, Ding and Zhang, Yao and Wang, Yanzhang and Jia, Xianyan and Zhang, Li (Cherry) and Wan, Yan and Li, Zhichao and Wang, Jiao and Huang, Shengsheng and Wu, Zhongyuan and Wang, Yang and Yang, Yuhao and She, Bowen and Shi, Dongjie and Lu, Qi and Huang, Kai and Song, Guoqiong},
booktitle={Proceedings of the ACM Symposium on Cloud Computing},
publisher={Association for Computing Machinery},
pages={50--60},
year={2019},
series={SoCC'19},
doi={10.1145/3357223.3362707},
url={https://arxiv.org/pdf/1804.05839.pdf}
}