Today, I'm launching my first course, all about Autograd.
Here is a brief content description:
The course comprises four video lectures (three hours total) with Jupyter notebooks and exercises. I have set up a private Zulip instance (see below), so you can also ask questions on the forums, and I will hold regular Q&A sessions (the next one is on Friday, August 6; after that, usually every 2-3 weeks depending on demand).
You are an intermediate to advanced PyTorch user and want to know everything about autograd in PyTorch. This course tries to cover autograd in full detail, so
- if you have ever spent a day or two debugging why PyTorch said `Trying to backward through the graph a second time, but the buffers have already been freed.`,
- have built a custom `autograd.Function` and wondered why `gradcheck` passed but `gradgradcheck` did not, or
- want to know about these things to make your work more efficient,

this course is for you.
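The first error above is easy to reproduce. A minimal sketch: by default, autograd frees a graph's intermediate buffers after the first backward pass, so a second `backward()` on the same graph raises the quoted `RuntimeError` unless you pass `retain_graph=True`:

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
y = x ** 2

# The first backward pass frees the graph's buffers by default.
y.backward()
print(x.grad)  # tensor(4.)

# A second backward on the same graph raises the RuntimeError quoted above.
try:
    y.backward()
except RuntimeError as e:
    print("RuntimeError:", e)

# Asking autograd to keep the graph around avoids the error;
# note that gradients accumulate into x.grad across the two calls.
x.grad = None
z = x ** 2
z.backward(retain_graph=True)
z.backward()
print(x.grad)  # tensor(8.)
```

Whether `retain_graph=True` is the right fix (rather than restructuring the computation) is exactly the kind of question the course digs into.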
You will learn:
- how autograd keeps track of your computation for automatic differentiation,
- common error messages, their causes, and how to debug them,
- all about defining your own `autograd.Function`,
- how higher-order derivatives work in PyTorch, and
- other APIs (functional and forward-mode autograd).
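As a taste of the `autograd.Function` material, here is a minimal sketch of a custom function (the `Square` class is my own illustrative example, not from the course): because its `backward` is written entirely with differentiable tensor operations, both `gradcheck` and `gradgradcheck` pass. Writing a backward that autograd cannot differentiate again is one way the second check fails while the first succeeds.

```python
import torch
from torch.autograd import Function, gradcheck, gradgradcheck

class Square(Function):
    @staticmethod
    def forward(ctx, x):
        # Save the input; it is needed to compute the gradient.
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # Built from differentiable ops, so double backward
        # (and hence gradgradcheck) works as well.
        return grad_out * 2 * x

# gradcheck compares analytical and numerical Jacobians;
# double precision inputs are required for the tolerances.
x = torch.randn(5, dtype=torch.double, requires_grad=True)
print(gradcheck(Square.apply, (x,)))
print(gradgradcheck(Square.apply, (x,)))
```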
Edit (August 5): Registration was open for one week and is now closed, thank you for your understanding.
The next batch of this course will open for registration Sep 24 - Sep 30.