Learning About Deep Learning
I’m thrilled to be a part of the Scholars Program this summer at OpenAI (update: I’m continuing on this fall as an OpenAI Fellow!). I was a physics major and I’ve always loved math, but a year ago I didn’t have any deep learning or AI knowledge. This is how I got up to speed. I’d love to hear what everyone else has found useful – please add comments about ideas, courses, competitions, scholarships, etc. that I’ve missed.
The amount you can learn online at this point is both thrilling and completely overwhelming. Here are the main challenges I found with learning independently:
Choosing what to work on
These are the courses I’ve found to be very high-yield and worth the time. I do browse other courses, particularly when there’s a specific topic I need to learn, but these are the ones I recommend start to finish.
- Jeremy Howard & Rachel Thomas’ FastAI sequence. The 2017–2018 course uses PyTorch. There’s also an older version of the course with mostly similar material in Keras and TensorFlow.
- Andrew Ng’s Deep Learning Specialization on Coursera. This dives more into the math behind deep learning and is a fantastic overall introduction. Sometimes the techniques taught are less cutting-edge than the FastAI ones.
- Jose Portilla’s Python courses on Udemy, particularly Python for Data Science. I came in not knowing Python at all, so I appreciated having this great introduction to Python, NumPy, SciPy, and pandas.
- 3Blue1Brown’s Essence of Linear Algebra series. I thought I already knew linear algebra well from my college courses, but these videos are fantastic.
- Similarly, StatQuest is my go-to for statistics questions.
- Kaggle (https://www.kaggle.com/) – the place to test your skills and learn the current cutting edge.
- HackerRank – a great place to prepare for interviews and programming tests (it’s not deep-learning specific).
Staying on Schedule
Here I think the most important thing is to know your own personality and to play to it. I’m very project- and goal-oriented. Once I’m working on a specific task, I have no trouble staying with it for hours, so I’ve tended to binge-watch courses (particularly Andrew Ng’s series). On the other hand, I know I’m not good at juggling several different projects, or at working without a specific goal. I try to keep this in mind when planning my schedule for the week.
I also like Jeremy Howard’s advice for working on Kaggle competitions: he suggests working on the project for a half-hour *every* day, without fail. Each day you make some slight incremental progress.
With that in mind, I try to learn one new topic every day (even if it’s a relatively small detail), whether by reading a paper, watching a course video, or reading a blog post. Recently I met some of the Google Brain team, and when the topic turned to variational autoencoders, by chance I knew all about them, since they had happened to be my chosen topic one day. I keep a small journal with half a page of notes on whatever I learn each day.
Learning actively (not passively!)
The big danger of online courses is that it’s far too easy to watch a bunch of videos (at 1.5x speed) and then, a week later, not remember any of it.
Both FastAI and the Deep Learning Specialization have very active forums. It’s definitely worth participating in them – both asking questions and trying to answer others’. After taking the Coursera sequence, I was invited to become a mentor for the CNN and RNN courses, and I’m sure I’ve learned far more from trying to teach others than I did taking the course on my own.
This is also where Kaggle competitions are extremely valuable. It’s one thing to be able to follow along with a Jupyter notebook; it’s something totally different to start with just the data and create all the structure from scratch, as in the sketch below.
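To make that concrete, here’s a minimal sketch (in Python, using only pandas and NumPy) of the kind of skeleton you end up writing yourself when you start from raw data: load the file, hold out a validation set, and compute a trivial baseline. The file name train.csv and the target column are hypothetical placeholders for whatever a given competition actually provides.

```python
# Minimal "start from the raw data" skeleton for a Kaggle-style competition.
# "train.csv" and the "target" column are hypothetical placeholders.
import numpy as np
import pandas as pd

df = pd.read_csv("train.csv")
print(df.shape)
print(df.isna().sum())  # check for missing values before anything else

# Shuffle the rows and hold out 20% for validation.
np.random.seed(0)
idx = np.random.permutation(len(df))
n_valid = int(0.2 * len(df))
valid_df = df.iloc[idx[:n_valid]]
train_df = df.iloc[idx[n_valid:]]

# Trivial baseline: always predict the training-set mean of the target.
# Any real model should beat this number.
baseline = train_df["target"].mean()
rmse = np.sqrt(((valid_df["target"] - baseline) ** 2).mean())
print(f"Baseline RMSE: {rmse:.4f}")
```

Even a baseline this simple forces you to deal with the parts a tutorial notebook hands you for free: the file format, missing values, and a fair validation split.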
Proving how much you’ve learned
After all the online courses, it’s helpful to create some tangible proof of what you know. My suggestions are a github project, a Kaggle competition, and some blog posts.
Jeremy Howard gave me the advice to focus on one great project rather than several half-baked ones. He says to really polish it up, even to the point of making a nice web interface.
Along the same lines, it’s great practice to try out several different Kaggle competitions, but it’s important to pick one, focus on it, and score highly.
I’ve written a lot as a mentor for Andrew Ng’s courses, and I’ve always been impressed by how much more deeply I have to understand something in order to explain it well in writing. This is my first foray into blog writing – I’m naturally a fairly quiet and reserved person, so I’m having to consciously push myself to do this, but it’s also an exciting way to connect with the data science world.
Update (8/17/2018):
Adding two more courses I found this summer and really enjoyed:
-
Comments
Really enjoying the posts — not to mention, the inspiration! Thank you.
Hey Christine! Thank you for writing this blog – I really enjoy reading it.
It’s also giving me the motivation to focus on my work and join Kaggle as soon as possible.
Thanks! So glad you liked it, and good luck with everything!
Very nicely written.
Thank you so much.
If you don’t have any machine learning background, I would start with a couple of Jay Alammar’s blog posts, available here: http://jalammar.github.io/visual-interactive-guide-basics-neural-networks/ . I think the animations are great, and the approach he takes works very well.
After that, I would watch the 4 neural network videos from 3Blue1Brown. They have the best animations I have seen. https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi
If you want something a little more advanced, but with nice explanations and graphs, check out Christopher Olah’s blog: http://colah.github.io/
Thanks for the ideas – these are great resources! Chris is now with us here at OpenAI; he’s amazing!
Susan Zhang wrote a nice blog post with links to the original papers on deep learning: https://lilianweng.github.io/lil-log/2017/06/21/an-overview-of-deep-learning.html
Sorry, I meant Lilian Weng.
I created just another Deep Learning Tutorial. I hope it helps someone wanting to get started. See https://bit.ly/2CfcjSX
Hey Christine, I enjoyed reading your post – especially the *Staying on Schedule* part – and I’m inspired by your work.