Waiting hours for a prediction is unacceptable. Hyperlearn makes AI and ML algorithms run 50% faster and use 90% less memory, with no new hardware required! ML algorithms like PCA, Linear Regression and NMF are all faster!
We're building a full Earth simulation to predict the future of everything and make JARVIS a reality.
Ask complex questions you never thought possible, like:

- How long will I live?
- When will the market next crash?
- How will climate change affect me?
Hyperlearn
Would you wait a second for a Google search? How about 5 hours for a prediction?
Hyperlearn makes Moonshot run fast, and makes ML algorithms faster and more memory-efficient.
Note: Hyperlearn is under construction! A stable package will be re-uploaded mid-2022. Stay tuned!
In 2018-2020, I was at NVIDIA helping make GPU ML algorithms faster! I incorporated Hyperlearn's methods to make TSNE 2000x faster, among other speedups. Since then I have built 50+ fast algorithms, but haven't had time to update Hyperlearn since Moonshot was priority one! I'll be updating Hyperlearn mid-2022!
Hyperlearn's algorithms, methods and repo have been featured or mentioned in 5 research papers!
- University of Washington: Ariel Rokem, Kendrick Kay. Fractional ridge regression: a fast, interpretable reparameterization of ridge regression. https://arxiv.org/abs/2005.03220
- National Center for Scientific Research 'Demokritos', Greece: Christos Platias, Georgios Petasis. A Comparison of Machine Learning Methods for Data Imputation. https://dl.acm.org/doi/10.1145/3411408.3411465
- NVIDIA: Raschka et al. Machine Learning in Python: Main developments and technology trends in data science, machine learning, and artificial intelligence. https://arxiv.org/abs/2002.04803 (incorporated Hyperlearn methods into NVIDIA RAPIDS TSNE)
Hyperlearn's methods and algorithms have been incorporated into 5 organizations and repositories!
HyperLearn is written completely in PyTorch, NoGil Numba, Numpy, Pandas, Scipy & LAPACK, C++, C, Python, Cython and Assembly, and mirrors (mostly) Scikit Learn.
HyperLearn also has statistical inference measures embedded, and can be called with the same syntax as Scikit-Learn.
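For reference, here is the fit/predict pattern that HyperLearn mirrors, shown with Scikit-Learn itself (HyperLearn's exact class names may differ; this is only the call-style it follows):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data: y = X @ w + small noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=100)

# The fit/predict call pattern that HyperLearn-style estimators follow
model = LinearRegression().fit(X, y)
pred = model.predict(X)
```

Estimators that follow this convention can be dropped into existing sklearn pipelines with no code changes.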
Some key current achievements of HyperLearn:

- 70% less time to fit Least Squares / Linear Regression than sklearn + 50% less memory usage
- 50% less time to fit Non-Negative Matrix Factorization than sklearn due to a new parallelized algorithm
- 40% faster full Euclidean / Cosine distance algorithms
- 50% less time for LSMR iterative least squares
- New Reconstruction SVD: use SVD to impute missing data! Has .fit AND .transform. Approximately 30% better than mean imputation
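The Reconstruction SVD idea can be sketched in plain NumPy. This is a minimal illustration under the usual low-rank assumption, not HyperLearn's actual implementation: initialize missing cells with column means, then iteratively overwrite them with a truncated-SVD reconstruction.

```python
import numpy as np

def svd_impute(X, rank, n_iter=30):
    """Impute NaNs by iterating a rank-`rank` SVD reconstruction.

    Minimal sketch only; HyperLearn's actual algorithm may differ.
    """
    missing = np.isnan(X)
    # Start from column-mean imputation.
    X = np.where(missing, np.nanmean(X, axis=0), X)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        # Truncated (rank-k) reconstruction of the current estimate.
        Xk = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Only overwrite the originally missing cells.
        X = np.where(missing, Xk, X)
    return X
```

On data that is genuinely low-rank this typically recovers missing entries almost exactly, which is why it can beat plain mean imputation; observed entries are never modified.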
Hyperlearn will be revamped in the following months to become Moonshot GreenAI, with over 150 additional optimized algorithms! Stay tuned!!
You made it this far! If you want to join Moonshot, complete the secretive quiz!
Hyperlearn is intended for academic, research and personal purposes only. Any explicit commercialisation of the algorithms, methods or anything inside Hyperlearn is strictly prohibited unless explicitly approved by Daniel Han-Chen.
Hyperlearn now uses the BSD 3-Clause License (previously GPLv3). However, as stated, commercialisation on top of Hyperlearn must first be approved by Daniel Han-Chen.
"Hyperlearn" and other potentially trademarked words, copyrighted images and copyrighted readme contents likely belong to the legal entity who owns the "Danielhanchen" organization. Awesome Open Source is not affiliated with the legal entity who owns the "Danielhanchen" organization.