Constrained Mean Shift Using Distant yet Related Neighbors for Representation Learning


Abstract

We are interested in representation learning in self-supervised, supervised, and semi-supervised settings. Some recent self-supervised learning (SSL) methods, such as mean-shift (MSF), cluster images by pulling the embedding of a query image closer to its nearest neighbors (NNs). Since most NNs are close to the query by design, averaging them may not change the query's embedding much. On the other hand, far-away NNs may not be semantically related to the query. We generalize the mean-shift idea by constraining the NN search space with another source of knowledge, so that the NNs are far from the query while remaining semantically related. We show that our method (1) outperforms MSF in the SSL setting when the constraint utilizes a different augmentation of an image from the previous epoch, and (2) outperforms PAWS in the semi-supervised setting with fewer training resources when the constraint ensures that the NNs have the same pseudo-label as the query. Our code is available at https://github.com/UCDvision/CMSF.
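To make the core idea concrete, below is a minimal NumPy sketch of constrained nearest-neighbor selection as the abstract describes it: the NN search is restricted to a candidate subset chosen by an external constraint (e.g. memory-bank entries sharing the query's pseudo-label), and the query is pulled toward the mean of those constrained NNs. This is an illustrative sketch, not the authors' implementation; the function name, the memory-bank layout, and the use of cosine similarity on L2-normalized embeddings are assumptions.

```python
import numpy as np

def constrained_mean_shift_target(query, memory_bank, constraint_mask, k=5):
    """Return the target embedding for `query`: the normalized mean of its
    k nearest neighbors, searched only within the constrained subset
    (e.g. entries with the same pseudo-label as the query).

    Illustrative sketch only; names and layout are assumptions, not the
    paper's official implementation.
    """
    # Restrict the NN search space using the external constraint.
    candidates = memory_bank[constraint_mask]
    # Cosine similarity, assuming rows are L2-normalized.
    sims = candidates @ query
    # Indices of the k most similar constrained candidates.
    nn_idx = np.argsort(-sims)[:k]
    # The query is pulled toward the mean of these constrained NNs.
    target = candidates[nn_idx].mean(axis=0)
    return target / np.linalg.norm(target)

# Toy usage: 100 memory-bank embeddings with pseudo-labels in {0, 1, 2}.
rng = np.random.default_rng(0)
bank = rng.normal(size=(100, 16))
bank /= np.linalg.norm(bank, axis=1, keepdims=True)
labels = rng.integers(0, 3, size=100)
query = bank[0]
target = constrained_mean_shift_target(query, bank, labels == labels[0], k=5)
norm_ok = abs(float(np.linalg.norm(target)) - 1.0) < 1e-6
```

In the semi-supervised variant the mask would come from pseudo-labels; in the SSL variant the constraint instead ties candidates to a different augmentation of the image from the previous epoch.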

Citation (APA)

Navaneet, K. L., Abbasi Koohpayegani, S., Tejankar, A., Pourahmadi, K., Subramanya, A., & Pirsiavash, H. (2022). Constrained Mean Shift Using Distant yet Related Neighbors for Representation Learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13691 LNCS, pp. 23–41). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-19821-2_2
