| Project Name | Stars | Most Recent Commit | Total Releases | Latest Release | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|---|---|
| Data Science Ipython Notebooks | 25,668 | 2 months ago | | | 34 | other | Python | Data science Python notebooks: Deep learning (TensorFlow, Theano, Caffe, Keras), scikit-learn, Kaggle, big data (Spark, Hadoop MapReduce, HDFS), matplotlib, pandas, NumPy, SciPy, Python essentials, AWS, and various command lines. |
| Awesome Pytorch List | 14,103 | 6 months ago | | | 4 | | | A comprehensive list of PyTorch-related content on GitHub, such as different models, implementations, helper libraries, tutorials, etc. |
| Stanford Cs 230 Deep Learning | 4,248 | 4 years ago | | | 5 | mit | | VIP cheatsheets for Stanford's CS 230 Deep Learning. |
| D2l Pytorch | 3,933 | a year ago | | | 13 | apache-2.0 | Jupyter Notebook | This project reproduces the book Dive Into Deep Learning (https://d2l.ai/), adapting the code from MXNet into PyTorch. |
| Deep Learning Book | 2,650 | 5 years ago | 1 | December 24, 2016 | 2 | other | Jupyter Notebook | Repository for "Introduction to Artificial Neural Networks and Deep Learning: A Practical Guide with Applications in Python". |
| Datasciencer | 1,497 | 6 years ago | | | 3 | mit | R | A curated list of R tutorials for Data Science, NLP and Machine Learning. |
| Andrew Ng Notes | 1,367 | a year ago | | | 2 | | Jupyter Notebook | Andrew Ng's handwritten Coursera notes. |
| Cracking The Data Science Interview | 1,291 | 2 years ago | | | 1 | | Jupyter Notebook | A collection of cheatsheets, books, questions, and portfolio material for DS/ML interview prep. |
| Tutorials | 847 | a year ago | | | 3 | other | Jupyter Notebook | AI-related tutorials. Access any of them for free at https://towardsai.net/editorial |
| Numpycnn | 531 | 6 months ago | 3 | May 24, 2018 | 1 | | Python | Building Convolutional Neural Networks from scratch using NumPy. |
Going beyond BEDMAP2 using a super resolution deep neural network. Also a convenient flat file data repository for high resolution bed elevation datasets around Antarctica.
```
deepbedmap/
├── features/ (files describing the high level behaviour of various features)
│   ├── *.feature... (easily understandable specifications written using the Given-When-Then gherkin language)
│   └── README.md (markdown information on the feature files)
├── highres/ (contains high resolution localized DEMs)
│   ├── *.txt/csv/grd/xyz... (input vector files containing the point-based bed elevation data)
│   ├── *.json (the pipeline file used to process the xyz point data)
│   ├── *.nc (output raster netcdf files)
│   └── README.md (markdown information on highres data sources)
├── lowres/ (contains low resolution whole-continent DEMs)
│   ├── bedmap2_bed.tif (the low resolution DEM!)
│   └── README.md (markdown information on lowres data sources)
├── misc/ (miscellaneous raster datasets)
│   ├── *.tif (Surface DEMs, Ice Flow Velocity, etc. See list in Issue #9)
│   └── README.md (markdown information on miscellaneous data sources)
├── model/ (*hidden in git, neural network model related files)
│   ├── train/ (a place to store the raster tile bounds and model training data)
│   └── weights/ (contains the neural network model's architecture and weights)
├── .env (environment variable config file used by pipenv)
├── .<something>ignore (files ignored by a particular piece of software)
├── .<something else> (stuff to make the code in this repo look and run nicely, e.g. linters, CI/CD config files, etc.)
├── Dockerfile (set of commands to fully reproduce the software stack here into a docker image, used by binder)
├── LICENSE.md (the license covering this repository)
├── Pipfile (what you want, the summary list of core python dependencies)
├── Pipfile.lock (what you need, all the pinned python dependencies for full reproducibility)
├── README.md (the markdown file you're reading now)
├── data_list.yml (human and machine readable list of the datasets and their metadata)
├── data_prep.ipynb/py (paired jupyter notebook/python script that prepares the data)
├── deepbedmap.ipynb/py (paired jupyter notebook/python script that predicts an Antarctic bed digital elevation model)
├── environment.yml (conda binary packages to install)
├── paper_figures.ipynb/py (paired jupyter notebook/python script to produce figures for the DeepBedMap paper)
├── srgan_train.ipynb/py (paired jupyter notebook/python script that trains the ESRGAN neural network model)
└── test_ipynb.ipynb/py (paired jupyter notebook/python script that runs doctests in the other jupyter notebooks!)
```
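The `*.ipynb/py` entries above are paired notebook/script versions of the same content. Assuming the pairing is managed with jupytext (an assumption based on the naming, not stated here), edits in one representation could be synced to the other like so:

```
# hypothetical example, assuming jupytext handles the .ipynb/.py pairing
pipenv run jupytext --sync data_prep.ipynb  # brings data_prep.py up to date (or vice versa)
```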
Launch in Binder (interactive jupyter notebook/lab environment in the cloud).

To install locally, start by cloning this repository:

```
git clone <repo-url>
```
Then I recommend using conda to install the non-python binaries (e.g. GMT, CUDA, etc.). The conda virtual environment will also be created with Python and pipenv installed.

```
cd deepbedmap
conda env create -f environment.yml
```
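As an optional sanity check (not part of the original instructions), the newly created environment should show up when listing conda environments:

```
# `deepbedmap` should appear in the output
conda env list
```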
Activate the conda environment first:

```
conda activate deepbedmap
```
Then set some environment variables before using pipenv to install the necessary python libraries, otherwise you may encounter some problems (see Common problems below). You may want to ensure that `which pipenv` returns something similar to `~/.conda/envs/deepbedmap/bin/pipenv`.

```
export HDF5_DIR=$CONDA_PREFIX/
export LD_LIBRARY_PATH=$CONDA_PREFIX/lib/
pipenv install --python $CONDA_PREFIX/bin/python --dev

# or just
HDF5_DIR=$CONDA_PREFIX/ LD_LIBRARY_PATH=$CONDA_PREFIX/lib/ pipenv install --python $CONDA_PREFIX/bin/python --dev
```
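If unsure which virtualenv and interpreter pipenv ended up using, a couple of illustrative checks:

```
# print the location of the virtualenv that pipenv created/uses
pipenv --venv
# confirm the interpreter inside it is the conda-provided python
pipenv run python -c "import sys; print(sys.executable)"
```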
Finally, double-check that the libraries have been installed:

```
pipenv graph
```
Later on, to sync or update to new dependencies, run:

```
conda env update -f environment.yml
pipenv sync --dev
```
Note that the .env file stores some environment variables. So if you run `conda activate deepbedmap` followed by some other command and get an `...error while loading shared libraries: libpython3.7m.so.1.0...` error, you may need to run `pipenv shell` or `pipenv run <cmd>` to have those environment variables registered properly. Or just run this first:

```
export LD_LIBRARY_PATH=$CONDA_PREFIX/lib/
```
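For reference, a minimal sketch of what such a .env file might contain; the actual file in this repository is authoritative, and whether `${CONDA_PREFIX}` gets expanded depends on pipenv's dotenv handling:

```
# illustrative .env contents only -- check the repository's actual .env file
HDF5_DIR=${CONDA_PREFIX}/
LD_LIBRARY_PATH=${CONDA_PREFIX}/lib/
```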
Also, if you run into a problem when using pipenv to install netcdf4, make sure you have done:

```
export HDF5_DIR=$CONDA_PREFIX/
```

and then try running `pipenv install` or `pipenv sync` again. See also this issue for more information.
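A quick, illustrative way to confirm afterwards that netcdf4 was built and linked correctly:

```
# should print the netCDF4 version without any shared library errors
pipenv run python -c "import netCDF4; print(netCDF4.__version__)"
```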
To start jupyter lab, run:

```
conda activate deepbedmap
pipenv shell

python -m ipykernel install --user --name deepbedmap  # to install the conda env as a kernel properly
jupyter kernelspec list --json  # see if the kernel is installed
jupyter lab &
```
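From jupyter lab you can then open and run the notebooks listed in the repository tree above (e.g. data_prep.ipynb). As a sketch, a notebook could also be executed non-interactively, assuming jupyter nbconvert is available in the environment:

```
# illustrative only: execute data_prep.ipynb from the command line, saving the run output in place
pipenv run jupyter nbconvert --to notebook --execute --inplace data_prep.ipynb
```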
The paper is published in The Cryosphere and can be cited using the following BibTeX entry:

```
@Article{tc-14-3687-2020,
  AUTHOR = {Leong, W. J. and Horgan, H. J.},
  TITLE = {DeepBedMap: a deep neural network for resolving the bed topography of Antarctica},
  JOURNAL = {The Cryosphere},
  VOLUME = {14},
  YEAR = {2020},
  NUMBER = {11},
  PAGES = {3687--3705},
  URL = {https://tc.copernicus.org/articles/14/3687/2020/},
  DOI = {10.5194/tc-14-3687-2020}
}
```
The DeepBedMap_DEM v1.1.0 dataset is available from Zenodo at https://doi.org/10.5281/zenodo.4054246. Neural network model training experiment runs are also recorded at https://www.comet.ml/weiji14/deepbedmap. The Python code for the DeepBedMap model here on GitHub is also mirrored on Zenodo at https://doi.org/10.5281/zenodo.3752613.