Argmax
A show where three machine learning enthusiasts talk about recent papers and developments in machine learning. Watch our video on YouTube https://www.youtube.com/@argmaxfm
3: VICReg
Vahe Hagopian, Taka Hasegawa, Farrukh Rahman • Season 1 • Episode 3
Today's paper: VICReg (https://arxiv.org/abs/2105.04906)
Summary of the paper
VICReg prevents representation collapse by combining variance, invariance, and covariance terms in its loss. It does not require negative samples and achieves strong performance on downstream tasks.
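For reference, here is a minimal sketch (not the authors' official code) of how the three terms might be combined in PyTorch; the coefficients and the hinge threshold follow the defaults reported in the paper, and the function name and signature are our own.

```python
import torch
import torch.nn.functional as F

def vicreg_loss(z_a, z_b, lambda_=25.0, mu=25.0, nu=1.0, gamma=1.0, eps=1e-4):
    """Sketch of the VICReg loss for two batches of embeddings of shape (N, D)."""
    N, D = z_a.shape

    # Invariance: mean-squared error between the two views' embeddings.
    invariance = F.mse_loss(z_a, z_b)

    # Variance: hinge loss keeping each embedding dimension's std above gamma.
    std_a = torch.sqrt(z_a.var(dim=0) + eps)
    std_b = torch.sqrt(z_b.var(dim=0) + eps)
    variance = torch.mean(F.relu(gamma - std_a)) + torch.mean(F.relu(gamma - std_b))

    # Covariance: penalize off-diagonal entries of each view's covariance matrix.
    z_a_c = z_a - z_a.mean(dim=0)
    z_b_c = z_b - z_b.mean(dim=0)
    cov_a = (z_a_c.T @ z_a_c) / (N - 1)
    cov_b = (z_b_c.T @ z_b_c) / (N - 1)

    def off_diagonal(m):
        # Return all off-diagonal elements of a square matrix.
        n = m.shape[0]
        return m.flatten()[:-1].view(n - 1, n + 1)[:, 1:].flatten()

    covariance = (off_diagonal(cov_a).pow(2).sum() / D
                  + off_diagonal(cov_b).pow(2).sum() / D)

    return lambda_ * invariance + mu * variance + nu * covariance
```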
Highlights of discussion
- The VICReg architecture (Figure 1)
- Sensitivity to hyperparameters (Table 7)
- Usefulness of the top-5 accuracy metric