Lernapparat

Launching my first online course: All about Autograd

July 30, 2021

Today, I'm launching my first course, all about Autograd.

Here is a brief content description:

The course comprises five video lectures (about 4.5 hours total) with Jupyter notebooks and exercises. I have set up a private Zulip instance, so you can also ask questions there, and I will hold regular Q&A sessions (approximately every three weeks, depending on demand).

You are an intermediate to advanced PyTorch user and want to know everything about autograd in PyTorch. This course tries to cover autograd in full detail, so

  • if you have ever spent a day or two debugging why PyTorch said Trying to backward through the graph a second time, but the buffers have already been freed. (a minimal reproduction follows below),
  • have built a custom autograd.Function and wondered why gradcheck passed but gradgradcheck did not, or
  • want to understand these mechanisms to make your work more efficient,

this course is for you.
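
As a taste of the first item above, here is a minimal sketch (assuming a reasonably recent PyTorch; the variable names are made up) that triggers exactly that error:

    import torch

    x = torch.randn(3, requires_grad=True)
    y = (x * x).sum()    # mul saves its inputs ("buffers") for the backward pass

    y.backward()         # first backward pass: works, then frees the saved tensors
    y.backward()         # second backward pass: raises the RuntimeError quoted above

Passing retain_graph=True to the first backward() keeps the saved tensors alive and makes a second backward pass possible.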

You will learn:

  • how autograd keeps track of your computation for automatic differentiation,
  • common error messages, their causes, and how to debug them,
  • all about defining your own autograd.Functions (a small sketch follows after this list),
  • how higher order derivatives work in PyTorch,
  • other APIs (functional, forward-mode autograd),
  • autograd graph debugging with multiple levels of detail,
  • finding out how exactly inference mode works by using a debugger and the PyTorch source code.
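
To give an idea of the level of detail, here is a minimal sketch of a custom autograd.Function together with the checks mentioned above (the class MySquare is a made-up toy example, not course material):

    import torch

    class MySquare(torch.autograd.Function):
        # toy example: computes x**2 with a hand-written backward
        @staticmethod
        def forward(ctx, x):
            ctx.save_for_backward(x)
            return x * x

        @staticmethod
        def backward(ctx, grad_output):
            (x,) = ctx.saved_tensors
            # written with differentiable ops, so double backward works, too
            return 2 * x * grad_output

    x = torch.randn(4, dtype=torch.double, requires_grad=True)
    # gradcheck compares the analytic gradient against a numerical one,
    # gradgradcheck does the same for second derivatives
    print(torch.autograd.gradcheck(MySquare.apply, (x,)))
    print(torch.autograd.gradgradcheck(MySquare.apply, (x,)))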

Price: EUR 60.

What former participants say:

I have been working with PyTorch for the last two years, but this course made things so much more interesting and clearer for me. Kudos to Thomas for giving me this opportunity to learn.

Usama Hasan, Computer Vision Engineer

Edit: Registration is currently closed. Thank you for your interest!

See you!