Abstract
Contrastive unsupervised learning has recently shown encouraging progress,
e.g., in Momentum Contrast (MoCo) and SimCLR. In this note, we verify the
effectiveness of two of SimCLR's design improvements by implementing them in
the MoCo framework. With simple modifications to MoCo---namely, using an MLP
projection head and more data augmentation---we establish stronger baselines
that outperform SimCLR and do not require large training batches. We hope this
will make state-of-the-art unsupervised learning research more accessible. Code
will be made public.
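
To make the two modifications concrete, below is a minimal PyTorch-style sketch (not the authors' released code) of replacing an encoder's final fully connected layer with a 2-layer MLP projection head, plus an augmentation pipeline extended in the spirit of SimCLR. The `build_encoder` helper, the layer sizes, the use of torchvision's ResNet-50, and the blur parameters are illustrative assumptions.

```python
import torch.nn as nn
import torchvision.models as models
from torchvision import transforms

def build_encoder(feature_dim=128, mlp=True):
    # Hypothetical helper: a ResNet-50 whose last fc maps to the
    # contrastive feature dimension.
    encoder = models.resnet50(num_classes=feature_dim)
    if mlp:
        # MLP head: replace the single fc layer with fc -> ReLU -> fc.
        hidden_dim = encoder.fc.weight.shape[1]  # 2048 for ResNet-50
        encoder.fc = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, feature_dim),
        )
    return encoder

# Stronger augmentation; the exact transforms and strengths here are
# assumptions in the spirit of SimCLR, not the paper's exact recipe.
augmentation = transforms.Compose([
    transforms.RandomResizedCrop(224, scale=(0.2, 1.0)),
    transforms.ColorJitter(0.4, 0.4, 0.4, 0.1),
    transforms.RandomGrayscale(p=0.2),
    transforms.GaussianBlur(kernel_size=23, sigma=(0.1, 2.0)),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])
```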