|Project Name|Stars|Most Recent Commit|Open Issues|License|Language|Description|
|---|---|---|---|---|---|---|
|Awesome Deep Learning|21,009|5 days ago|27|||A curated list of awesome Deep Learning tutorials, projects and communities.|
|Lectures|14,554|6 years ago|10|||Oxford Deep NLP 2017 course|
|Numerical Linear Algebra|9,325|2 months ago|11||Jupyter Notebook|Free online textbook of Jupyter notebooks for fast.ai Computational Linear Algebra course|
|Mlcourse.ai|8,803|17 days ago|4|other|Python|Open Machine Learning Course|
|Awesome Artificial Intelligence|7,727|a month ago|43|||A curated list of Artificial Intelligence (AI) courses, books, video lectures and papers.|
|T81_558_deep_learning|5,408|10 days ago|4|other|Jupyter Notebook|Washington University (in St. Louis) Course T81-558: Applications of Deep Neural Networks|
|Start Machine Learning|3,543|2 days ago|4|mit||A complete guide to start and improve in machine learning (ML) and artificial intelligence (AI) in 2023 without ANY background in the field, and stay up-to-date with the latest news and state-of-the-art techniques!|
|Course Nlp|3,271|3 months ago|55||Jupyter Notebook|A Code-First Introduction to NLP course|
|Awesome Ml Courses|2,324|2 months ago|3|||Awesome free machine learning and AI courses with video lectures.|
|Spacy Course|2,156|a year ago|10|mit|Python|👩‍🏫 Advanced NLP with spaCy: A free online course|
You can find out about the course in this blog post and all lecture videos are available here.
This course was originally taught in the University of San Francisco's Master of Science in Data Science program, summer 2019. The course is taught in Python with Jupyter Notebooks, using libraries such as sklearn, nltk, pytorch, and fastai.
The following topics will be covered:
1. What is NLP?
2. Topic Modeling with NMF and SVD
3. Sentiment classification with Naive Bayes, Logistic regression, and ngrams
4. Regex (and re-visiting tokenization)
5. Language modeling & sentiment classification with deep learning
6. Translation with RNNs
7. Translation with the Transformer architecture
8. Bias & ethics in NLP
This course is structured with a top-down teaching method, which is different from how most math courses operate. In the typical bottom-up approach, you first learn each separate component you will be using, and then gradually build them up into more complex structures. The problem with this is that students often lose motivation, lack a sense of the "big picture", and don't know what they'll actually need.
Harvard Professor David Perkins has a book, Making Learning Whole, in which he uses baseball as an analogy. We don't require kids to memorize all the rules of baseball and understand all the technical details before we let them play the game. Rather, they start playing with just a general sense of it, and then gradually learn more rules and details as time goes on.
If you took the fast.ai deep learning course, you will recognize this approach: it is the same one we used there. You can hear more about my teaching philosophy in this blog post or in this talk I gave at the San Francisco Machine Learning meetup.
All that to say, don't worry if you don't understand everything at first! You're not supposed to. We will start using some "black boxes" and then we'll dig into the lower level details later.
To start, focus on what things DO, not what they ARE.