Supervised Contrastive Learning

In recent years, the standard transfer-learning recipe has been to directly fine-tune ImageNet weights on the new domain, where (usually) few images are available for training. Recently, however, a new trend has arisen: before fine-tuning, there is a preliminary step called pre-training, in which the neural network trains on the target images to learn to place images of the same class as close as possible in the latent space and, conversely, to place images of different classes as far apart as possible. Depending on whether this pre-training phase uses labels or not, the stage is supervised or unsupervised.

Finding the pipeline’s hyper-parameters that lead to the best performance can be an endless road.

In this context, Supervised Contrastive Learning [1] is a training approach that may outperform supervised training with the traditional cross-entropy loss function on classification tasks. Essentially, training an image classifier under this approach has two phases:

  1. (Pre-)Training an encoder (e.g., ResNet or EfficientNet) to produce vector representations of input images, such that representations of images in the same class are more similar than representations of images in different classes. Specifically, during this phase, the supervised contrastive loss is used as an alternative to cross-entropy.
  2. Training a classifier on top of the frozen encoder. The classifier takes advantage of the clusters of points belonging to the same class, which were pulled together in the embedding space during pre-training.
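To make phase 1 concrete, the sketch below implements the supervised contrastive loss from [1] (the "L_out" variant) in plain NumPy. This is an illustrative re-implementation under my own assumptions about shapes and naming, not the notebook's actual code: for each anchor, every other sample sharing its label is a positive, and the loss is the negative mean log-probability of those positives under a temperature-scaled softmax over all other samples.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """SupCon loss (Khosla et al., 2020), L_out variant.

    embeddings: (N, D) array of encoder outputs (L2-normalised inside).
    labels:     (N,) integer class labels.
    """
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature                    # pairwise cosine similarities
    n = len(labels)
    logits_mask = 1.0 - np.eye(n)                  # exclude self-comparisons
    sim = sim - sim.max(axis=1, keepdims=True)     # numerical stability
    exp_sim = np.exp(sim) * logits_mask
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))
    # Positives: samples with the same label, excluding the anchor itself.
    pos_mask = (labels[:, None] == labels[None, :]).astype(float) * logits_mask
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0                         # skip anchors with no positive
    mean_log_prob_pos = (pos_mask * log_prob).sum(axis=1)[valid] / pos_counts[valid]
    return float(-mean_log_prob_pos.mean())

# The loss is lower when same-class embeddings are already close together:
labels = np.array([0, 0, 1, 1])
clustered = np.array([[1.0, 0.0], [1.0, 0.01], [-1.0, 0.0], [-1.0, 0.01]])
scattered = np.array([[1.0, 0.0], [-1.0, 0.0], [1.0, 0.01], [-1.0, 0.01]])
loss_clustered = supervised_contrastive_loss(clustered, labels)
loss_scattered = supervised_contrastive_loss(scattered, labels)
```

In a real pipeline, `embeddings` would come from the encoder (plus a small projection head, as in [1]), and the gradient of this loss would be backpropagated through it during pre-training.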

In this notebook, we are going to evaluate the Supervised Contrastive Learning approach for training a plant-species classifier. Additionally, a classical transfer-learning approach without pre-training is also evaluated, to compare the differences in implementation and performance. More information can also be found in [2].
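Phase 2 can be sketched in a few lines as well. The toy example below is an assumption-laden illustration, not the notebook's pipeline: a fixed function stands in for the pre-trained, frozen encoder, and only a small softmax classifier on top of its embeddings is trained (a real run would freeze a deep encoder such as ResNet inside a framework like Keras or PyTorch).

```python
import numpy as np

rng = np.random.default_rng(42)

def frozen_encoder(x):
    # Stand-in for a pre-trained encoder: a fixed, untrainable projection.
    w = np.array([[1.0, -1.0], [0.5, 2.0]])
    return np.tanh(x @ w)

# Two well-separated toy classes of 2-D points.
x = np.vstack([rng.normal(-2, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

z = frozen_encoder(x)                        # embeddings computed once, never updated
w, b = np.zeros((2, 2)), np.zeros(2)         # only the classifier head is trained
for _ in range(200):
    logits = z @ w + b
    p = np.exp(logits - logits.max(1, keepdims=True))
    p /= p.sum(1, keepdims=True)
    p[np.arange(len(y)), y] -= 1             # softmax cross-entropy gradient
    w -= 0.1 * z.T @ p / len(y)
    b -= 0.1 * p.mean(0)

accuracy = ((z @ w + b).argmax(1) == y).mean()
```

Because the encoder already separates the classes in embedding space, the linear head converges quickly; this is exactly the advantage phase 2 exploits after contrastive pre-training.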

Besides this notebook, you can find other working examples in the Eden Library. They are all ready to execute and to test your modifications. Check them out by clicking the link below, and if you find this repo interesting, help us with a star and share it with your colleagues.

[1] Khosla, P., Teterwak, P., Wang, C., Sarna, A., Tian, Y., Isola, P., Maschinot, A., Liu, C., & Krishnan, D. (2020). Supervised Contrastive Learning. ArXiv, abs/2004.11362.