Multi-Similarity Loss with General Pair Weighting for Deep Metric Learning

Original paper: Multi-Similarity Loss with General Pair Weighting for Deep Metric Learning

Official implementation: msight-tech/research-ms-loss

Abstract

A family of loss functions built on pair-based computation have been proposed in the literature which provide a myriad of solutions for deep metric learning. In this paper, we provide a general weighting framework for understanding recent pair-based loss functions. Our contributions are three-fold: (1) we establish a General Pair Weighting (GPW) framework, which casts the sampling problem of deep metric learning into a unified view of pair weighting through gradient analysis, providing a powerful tool for understanding recent pair-based loss functions; (2) we show that with GPW, various existing pair-based methods can be compared and discussed comprehensively, with clear differences and key limitations identified; (3) we propose a new loss called multi-similarity loss (MS loss) under the GPW, which is implemented in two iterative steps (i.e., mining and weighting). This allows it to fully consider three similarities for pair weighting, providing a more principled approach for collecting and weighting informative pairs. Finally, the proposed MS loss obtains new state-of-the-art performance on four image retrieval benchmarks, where it outperforms the most recent approaches, such as ABE and HTL, by a large margin: 60.6% to 65.7% on CUB200, and 80.9% to 88.0% on the In-Shop Clothes Retrieval dataset at Recall@1. Code is available at https://github.com/MalongTech/research-ms-loss.

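To make the abstract's "two iterative steps (mining and weighting)" concrete, here is a minimal NumPy sketch of the MS loss. It is an illustrative re-implementation, not the official PyTorch code: for each anchor it first mines informative pairs (negatives more similar than the hardest positive minus a margin, positives less similar than the hardest negative plus a margin), then applies soft log-sum-exp weighting to the mined pairs. The hyper-parameter defaults (`alpha`, `beta`, `lam`, `eps`) are placeholders chosen for illustration; consult the paper or official repo for the actual settings.

```python
import numpy as np

def ms_loss(embeddings, labels, alpha=2.0, beta=50.0, lam=1.0, eps=0.1):
    """Illustrative Multi-Similarity loss: pair mining + soft pair weighting.

    embeddings: (n, d) array; labels: (n,) integer class labels.
    Hyper-parameter defaults are illustrative, not the paper's exact values.
    """
    # Cosine similarity matrix of L2-normalised embeddings.
    x = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = x @ x.T
    n = len(labels)
    total = 0.0
    for i in range(n):
        pos = (labels == labels[i]) & (np.arange(n) != i)
        neg = labels != labels[i]
        if not pos.any() or not neg.any():
            continue
        pos_sim = sim[i][pos]
        neg_sim = sim[i][neg]
        # Step 1 (mining): keep only informative pairs, judged relative
        # to the hardest positive / hardest negative for this anchor.
        mined_neg = neg_sim[neg_sim + eps > pos_sim.min()]
        mined_pos = pos_sim[pos_sim - eps < neg_sim.max()]
        # Step 2 (weighting): log-sum-exp softly up-weights harder pairs.
        loss = 0.0
        if mined_pos.size:
            loss += np.log1p(np.exp(-alpha * (mined_pos - lam)).sum()) / alpha
        if mined_neg.size:
            loss += np.log1p(np.exp(beta * (mined_neg - lam)).sum()) / beta
        total += loss
    return total / n
```

Note that when classes are already well separated (all positive similarities exceed all negative similarities by more than `eps`), the mining step keeps no pairs and the loss is zero, which is the intended behaviour: gradient effort concentrates on informative pairs only.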

Related reading