Depthwise-Separable Convolutions in PyTorch

Diego Velez
FAUN — Developer Community 🐾
2 min read · Jun 17, 2022


In the context of machine learning, convolution is an operation on two matrices A and B, where A is the input and B is the filter (also called the kernel). Sliding B over A produces a new matrix C, called the feature map.
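To make the A/B/C naming concrete, here is a minimal sketch using `torch.nn.functional.conv2d` (the input and kernel values are illustrative, not from the article):

```python
import torch
import torch.nn.functional as F

# Input matrix A (batch=1, channels=1, 4x4) and kernel B (2x2)
A = torch.arange(16.0).reshape(1, 1, 4, 4)
B = torch.ones(1, 1, 2, 2)  # a simple summing kernel

# C is the feature map produced by sliding B over A
C = F.conv2d(A, B)
print(C.shape)  # torch.Size([1, 1, 3, 3])
```

With no padding and stride 1, a 2x2 kernel over a 4x4 input yields a 3x3 feature map.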

There are different types of convolutions, each with its own pros and cons; here we will take a look at depthwise-separable convolutions and how to implement them in PyTorch.

Depthwise-separable convolutions were first utilized in Rigid-Motion Scattering and then popularized by Xception networks.

Depthwise convolution is a type of convolution where we apply a single convolutional filter to each input channel.

source: https://paperswithcode.com/method/depthwise-convolution
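In PyTorch this per-channel filtering is expressed with the `groups` argument of `nn.Conv2d`: setting `groups` equal to the number of input channels gives each channel its own independent filter. The sizes below are illustrative assumptions:

```python
import torch
import torch.nn as nn

# One 3x3 filter per input channel; channel count is unchanged
depthwise = nn.Conv2d(in_channels=3, out_channels=3, kernel_size=3,
                      groups=3, padding=1)

x = torch.randn(1, 3, 32, 32)
print(depthwise(x).shape)  # torch.Size([1, 3, 32, 32])
```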

Pointwise convolution is a type of convolution that uses a 1x1 kernel.

source: https://paperswithcode.com/method/pointwise-convolution
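A 1x1 kernel mixes information across channels without touching the spatial dimensions. A small sketch, with assumed channel sizes:

```python
import torch
import torch.nn as nn

# 1x1 kernel: maps 3 input channels to 8 output channels,
# leaving height and width unchanged
pointwise = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=1)

x = torch.randn(1, 3, 32, 32)
print(pointwise(x).shape)  # torch.Size([1, 8, 32, 32])
```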

A depthwise-separable convolution is the combination of the two: a depthwise convolution followed by a pointwise convolution.

Main advantages of depthwise-separable convolutions

  • They have fewer parameters to adjust than standard convolutions, which reduces overfitting
  • They are computationally cheaper because they require fewer multiply-adds, which makes them a great option for low-end hardware (see MobileNet)

PyTorch implementation

Depthwise-separable convolution implementation in PyTorch, and its output:
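The embedded code and its output screenshot did not survive here; below is a minimal sketch of the comparison the article describes. The channel and kernel sizes are illustrative assumptions, not the article's original values:

```python
import torch
import torch.nn as nn

in_channels, out_channels, kernel_size = 3, 8, 3

# Standard convolution: every filter spans all input channels
standard = nn.Conv2d(in_channels, out_channels, kernel_size)

# Depthwise-separable: per-channel (grouped) convolution,
# then a 1x1 pointwise convolution that mixes channels
depthwise_separable = nn.Sequential(
    nn.Conv2d(in_channels, in_channels, kernel_size, groups=in_channels),
    nn.Conv2d(in_channels, out_channels, kernel_size=1),
)

def count_params(module):
    return sum(p.numel() for p in module.parameters())

print("standard:", count_params(standard))                        # 224
print("depthwise-separable:", count_params(depthwise_separable))  # 62
```

With these sizes the depthwise-separable version uses 62 parameters versus 224 for the standard convolution, roughly a quarter as many.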

As you can see, the depthwise-separable convolution has far fewer parameters (roughly a quarter as many) and performs correspondingly fewer computations, which is why it runs much faster than a standard convolution.

If you want to dig deeper into depthwise-separable convolutions and their performance, I recommend reading the original Xception paper, which explains all of this: https://arxiv.org/pdf/1610.02357v3.pdf

