| Project | Stars | Most Recent Commit | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|
| Deeplearningproject | 4,043 | 3 years ago | 3 | mit | HTML | An in-depth machine learning tutorial introducing readers to a whole machine learning pipeline from scratch. |
| Pipeline | 732 | 4 years ago | 7 | | | A step-by-step guide on creating build and deployment pipelines for Kubernetes. |
| Littlevulkanengine | 558 | 9 days ago | 11 | mit | C++ | Code repo for a video tutorial series teaching Vulkan and computer graphics. |
| Tutorials | 321 | 2 months ago | 30 | apache-2.0 | Java | |
| Pipelines Tutorial | 244 | 2 months ago | 1 | apache-2.0 | Shell | A step-by-step tutorial showing OpenShift Pipelines. |
| Prism | 209 | a year ago | 16 | gpl-3.0 | Python | |
| Unity_resources | 177 | a year ago | | gpl-3.0 | | A list of resources and tutorials for those doing programming in Unity. |
| Bash Streams Handbook | 148 | a year ago | | mit | | 💻 Learn Bash streams, pipelines and redirection, from beginner to advanced. |
| Metal Tutorial | 118 | 5 years ago | | | Swift | |
| Building A Multibranch Pipeline Project | 95 | 2 months ago | 1 | | Shell | An advanced tutorial on how to use Jenkins to build a multibranch Pipeline project with selectively executed stages. |
This tutorial tries to do what most machine learning tutorials available online do not. It is not a 30-minute tutorial which teaches you how to "Train your own neural network" or "Learn deep learning in under 30 minutes". It walks through the full pipeline you would need if you actually work with machine learning - introducing you to all the parts, and all the implementation decisions and details that need to be made. The dataset is not one of the standard sets like MNIST or CIFAR; you will make your very own dataset. Then you will go through a couple of conventional machine learning algorithms, before finally getting to deep learning!
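As a rough illustration of those stages, the shape of such a pipeline might look like the toy sketch below. This is not code from the tutorial - the hard-coded data and the nearest-centroid baseline are illustrative stand-ins for the real dataset-building and conventional-ML steps:

```python
# Toy sketch of the pipeline stages described above (illustrative only;
# the tutorial builds a real dataset and uses real models).
from collections import Counter

def build_dataset():
    # Stage 1: make your own dataset (here: hard-coded text/label pairs).
    return [
        ("an action packed thriller with explosions", "action"),
        ("a heartfelt romance about two strangers", "romance"),
        ("explosions car chases and a daring heist", "action"),
        ("love letters and a long distance romance", "romance"),
    ]

def featurize(text):
    # Stage 2: turn raw data into features (here: bag of words).
    return Counter(text.split())

def train_centroids(data):
    # Stage 3: a conventional ML baseline -- nearest class centroid.
    centroids = {}
    for text, label in data:
        centroids.setdefault(label, Counter()).update(featurize(text))
    return centroids

def predict(centroids, text):
    # Stage 4: predict by word overlap with each class centroid.
    feats = featurize(text)
    def overlap(c):
        return sum(min(feats[w], c[w]) for w in feats)
    return max(centroids, key=lambda label: overlap(centroids[label]))

centroids = train_centroids(build_dataset())
print(predict(centroids, "a thriller full of explosions"))  # -> action
```

The tutorial replaces each of these toy stand-ins with a real implementation, and then swaps the baseline for deep learning models.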
In the fall of 2016, I was a Teaching Fellow (Harvard's version of a TA) for the graduate class "Advanced Topics in Data Science (CS209/109)" at Harvard University. I was in charge of designing the class project given to the students, and this tutorial has been built on top of that project.
The tutorial has now been re-written in PyTorch, thanks to Anshul Basia (https://github.com/AnshulBasia).
You can access the HTML version here: https://spandan-madan.github.io/DeepLearningProject/PyTorch_version/Deep_Learning_Project-Pytorch.html and the IPython Notebook with the PyTorch code here: https://github.com/Spandan-Madan/DeepLearningProject/blob/master/PyTorch_version/Deep_Learning_Project-Pytorch.ipynb
If you would like to use this work, please cite the work using the doi -
To view the project as an HTML file, visit - https://spandan-madan.github.io/DeepLearningProject/
If you would like to access the code, please go through the IPython notebook.
To make setup easy, we are going to use conda:

```bash
conda env create -f deeplearningproject_environment.yml
source activate deeplearningproject
jupyter notebook
```

If all the installations go through, you are good to go! If not, here is a list of packages that need to be installed:
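In case you are curious what such an environment file contains, a minimal sketch is shown below. This is only an illustration - the Python version and package pins here are assumptions, and the repository's actual deeplearningproject_environment.yml is the authoritative version:

```yaml
# Hypothetical sketch of a conda environment file; the repository's
# deeplearningproject_environment.yml is the authoritative version.
name: deeplearningproject
channels:
  - defaults
dependencies:
  - python=3.6      # assumption: the tutorial's actual version may differ
  - pip
  - pip:
      - requests
      - imdbpy==6.6  # earlier versions are broken (see note below)
      - wget
      - tmdbsimple
      - seaborn
      - scikit-learn
      - Pillow
      - keras
      - tensorflow
      - h5py
      - gensim
      - nltk
      - stop-words
```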
```
requests imdbpy wget tmdbsimple seaborn sklearn Pillow keras tensorflow h5py gensim nltk stop_words
```
Please install imdbpy using `pip install imdbpy==6.6`, since earlier versions are broken.
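If you want to verify your environment quickly, a small check like the one below reports which of the packages above are missing. Note that some import names differ from the pip package names (for example, IMDbPY imports as `imdb` and Pillow as `PIL`):

```python
# Quick check that the tutorial's dependencies are importable.
import importlib.util

REQUIRED = [
    "requests",
    "imdb",         # import name for the imdbpy package
    "wget", "tmdbsimple", "seaborn",
    "sklearn",
    "PIL",          # import name for Pillow
    "keras", "tensorflow", "h5py",
    "gensim", "nltk", "stop_words",
]

def missing_packages(names):
    """Return the subset of names that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

missing = missing_packages(REQUIRED)
if missing:
    print("Missing packages:", ", ".join(missing))
else:
    print("All packages found -- you are good to go!")
```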
To be able to run the environment you just created in a Jupyter notebook, first check that you have the Python package ipykernel installed. If you don't, simply install it using:

```bash
pip install ipykernel
```
Now, add this environment to your Jupyter notebook using the command:

```bash
python -m ipykernel install --user --name deeplearningproject --display-name "deeplearningproject"
```
Needless to say, remove all single quotes before running commands.
Go to the directory and run `jupyter notebook`, then open the respective notebook in your browser. To install TMDB: `pip install tmdbsimple`, then use `import tmdbsimple as tmdb`.
To work with an isolated environment and be able to run it on many systems without trouble, you can run this docker-compose command:
It will build the deeplearningproject image according to the Dockerfile, and then run the docker container via docker-compose. See the Docker and docker-compose docs for more information:
Then access the notebooks through your web browser at http://localhost:8888
You should notice that the notebooks have been copied from the root into the notebooks folder, so that they can be mounted into the container via a bind volume. Any changes you make will be saved on the host (in the notebooks dir).
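The bind-volume setup described above could look roughly like the following docker-compose service. This is a hypothetical sketch - the service name, image name, and mount paths are assumptions, and the repository's actual docker-compose.yml is authoritative:

```yaml
# Hypothetical sketch of the relevant docker-compose service.
version: "3"
services:
  notebook:
    build: .
    image: deeplearningproject
    ports:
      - "8888:8888"              # Jupyter reachable at http://localhost:8888
    volumes:
      - ./notebooks:/notebooks   # bind mount: edits persist on the host
```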
You can add conda or pip packages to the image (and thus the container) by updating the deeplearningproject_environment.yml file and re-running the build. It will build a new deeplearningproject image with the new conda/pip packages installed. Stop your running container (CTRL-C) and then run `docker-compose up` to start a fresh container.
I will keep updating this as issues pop up on this repository.
- VGG16: Keras may fail when loading the pretrained VGG16 model. If so, just update Keras using the following command:

```bash
sudo pip install git+git://github.com/fchollet/keras.git --upgrade
```
- OS Error: Too Many Open Files. Refer to https://stackoverflow.com/questions/16526783/python-subprocess-too-many-open-files or shut down the notebook and execute the following in the same terminal:

```bash
ulimit -Sn 10000
```
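If you prefer to inspect or raise the open-file limit from inside Python rather than the shell, the standard-library `resource` module (Unix-only) offers an equivalent. A small sketch:

```python
# Inspect (and, within the hard limit, raise) the open-file limit on
# Unix-like systems -- roughly equivalent to `ulimit -Sn` in the shell.
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft limit: {soft}, hard limit: {hard}")

# Raise the soft limit toward 10000, but never above the hard limit
# (raising the soft limit up to the hard limit needs no privileges).
target = 10000 if hard == resource.RLIM_INFINITY else min(10000, hard)
resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))
print("new soft limit:", resource.getrlimit(resource.RLIMIT_NOFILE)[0])
```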
Then restart the Jupyter notebook.

Hope this repo helps introduce you to a full machine learning pipeline! If you spot an error, please create an issue to help out others using this resource!

To prevent problems with installation and setup, this repository comes with a conda environment profile. The only thing you will need is to install the newest version of conda and use this profile to create a new environment; it will come set up with all the libraries you will need for the tutorial.