PyTorch, JIT, Android

Dec. 14, 2018

This week, we had a PyTorch Meetup in Munich at Microsoft. It was great to see more than 90 people visit for the two talks and the PyTorch chat over pizza and drinks afterwards! Piotr Bialecki gave a talk on semantic search on the PyTorch forums, and I had the honor of talking about PyTorch, the JIT, and Android. Thank you, Piotr and Microsoft!

Here is a short synopsis of my talk:

To me, the two main use cases for the PyTorch JIT are

  • Automatic optimization and
  • Export of models to C++.

For a simple loss function - Intersection over Union (IoU), used to measure how well predicted bounding boxes match the ground truth in detection models - I first investigated two "classic" optimizations:

  • Moving from Python to C++ only saved 10% - less than the audience expected (the median of my ad-hoc poll seemed to be a 2-10 fold speedup).
  • Writing custom CUDA kernels yields fast functions - but is tedious.
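To make the example concrete, here is a sketch of how IoU for axis-aligned boxes in (x1, y1, x2, y2) format can be computed in plain PyTorch. This is my own minimal version for illustration, not the exact function benchmarked in the talk; the name `box_iou` is a placeholder.

```python
import torch

def box_iou(boxes1, boxes2):
    # boxes1, boxes2: tensors of shape (N, 4) in (x1, y1, x2, y2) format,
    # paired row by row; returns a tensor of N IoU values.
    area1 = (boxes1[:, 2] - boxes1[:, 0]) * (boxes1[:, 3] - boxes1[:, 1])
    area2 = (boxes2[:, 2] - boxes2[:, 0]) * (boxes2[:, 3] - boxes2[:, 1])
    # top-left and bottom-right corners of the intersection rectangle
    lt = torch.max(boxes1[:, :2], boxes2[:, :2])
    rb = torch.min(boxes1[:, 2:], boxes2[:, 2:])
    # clamp to zero so non-overlapping boxes get zero intersection
    wh = (rb - lt).clamp(min=0)
    inter = wh[:, 0] * wh[:, 1]
    return inter / (area1 + area2 - inter)
```

Note that everything after computing the corners is a chain of pointwise operations - exactly the kind of code the JIT fuser is good at.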

The good news is that for many functions - namely those involving pointwise calculations - the JIT can give you considerable speedups more or less for free.
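Getting that speedup can be as simple as decorating the function. The sketch below (function name and the particular chain of ops are illustrative) shows a pointwise computation compiled with `torch.jit.script`; on GPU, the fuser can combine such chains into a single kernel, avoiding intermediate memory traffic.

```python
import torch

@torch.jit.script
def fused_pointwise(x, y):
    # a chain of elementwise ops; the JIT fuser can merge these
    # into one kernel instead of launching one per operation
    return torch.sigmoid(x) * torch.tanh(y) + x
```

The scripted function computes exactly the same result as the eager version; the difference is only in how the work is scheduled.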

The second part of my talk was about exporting to C++ and getting those exported models to run on Android. Easy deployment!
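The export step itself is short. A minimal sketch (the model, input shape, and file name are placeholders): trace a module with an example input and save the result as a self-contained archive that C++ (`torch::jit::load`) - and from there an Android app - can load without Python.

```python
import torch

# placeholder model; any traceable nn.Module works the same way
model = torch.nn.Sequential(torch.nn.Linear(4, 2), torch.nn.ReLU())
model.eval()

# trace with a representative example input, then serialize
example = torch.rand(1, 4)
traced = torch.jit.trace(model, example)
traced.save("model.pt")
```

The saved `model.pt` bundles the weights together with the traced graph, so the C++ side needs no Python or model definition.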

Here are my slides and a detailed Jupyter notebook writeup of the IoU optimization. I'd love to hear from you if you find these interesting!

Through my company, MathInf, I'm offering PyTorch consulting and training.

Most of the time, training is in-house for clients, but I'm currently considering running a small (8-10 participants) one-day training in February. My trainings don't start with MNIST, so we'll have something that is beginner-to-intermediate but not watered down. Let me know if you're in and what kind of things you'd like to see covered!