TY  - GEN
Y1  - 2021/12/18/
TI  - You Never Cluster Alone
PB  - NeurIPS
N1  - This version is the version of record. For information on re-use, please refer to the publisher's terms and conditions.
CY  - Virtual
ID  - discovery10148476
AV  - public
N2  - Recent advances in self-supervised learning with instance-level contrastive objectives facilitate unsupervised clustering. However, a standalone datum does not perceive the context of its holistic cluster and may thus undergo sub-optimal assignment. In this paper, we extend the mainstream contrastive learning paradigm to a cluster-level scheme, where all the data assigned to the same cluster contribute to a unified representation that encodes the context of each data group. Contrastive learning with this representation then rewards the assignment of each datum. To implement this vision, we propose twin-contrast clustering (TCC). We define a set of categorical variables as clustering assignment confidence, which links the instance-level learning track with the cluster-level one. On one hand, with the corresponding assignment variables serving as weights, a weighted aggregation over the data points implements the set representation of a cluster. We further propose heuristic cluster augmentation equivalents to enable cluster-level contrastive learning. On the other hand, we derive the evidence lower bound of the instance-level contrastive objective with the assignments. By reparametrizing the assignment variables, TCC is trained end-to-end, requiring no alternating steps. Extensive experiments show that TCC outperforms the state-of-the-art on benchmark datasets.
UR  - https://proceedings.neurips.cc/paper/2021/hash/e96ed478dab8595a7dbda4cbcbee168f-Abstract.html
A1  - Shen, Yuming
A1  - Shen, Ziyi
A1  - Wang, Menghan
A1  - Qin, Jie
A1  - Torr, Philip
A1  - Shao, Ling
ER  -