On Sensitive Minima in Margin-Based Deep Distance Learning

Reza Serajeh, Seyran Khademi, Amir Mousavinia, Jan C. van Gemert

Research output: Contribution to journal › Article › Scientific › peer-review



This paper investigates sensitive minima in popular deep distance learning techniques such as Siamese and Triplet networks. We demonstrate that standard formulations may find solutions that are sensitive to small changes and thus do not generalize well. To alleviate sensitive minima, we propose a new approach to regularizing margin-based deep distance learning by introducing stochasticity into the loss that encourages robust solutions. Our experimental results on HPatches show promise compared to common regularization techniques, including weight decay and dropout, especially for small sample sizes.
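The abstract does not spell out the exact formulation, but the general idea of regularizing a margin-based loss through stochasticity can be illustrated with a hypothetical triplet loss whose margin is randomly perturbed at each evaluation (the function name, perturbation scheme, and parameters below are assumptions for illustration, not the paper's method):

```python
import math
import random

def euclidean(u, v):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def stochastic_margin_triplet_loss(anchor, positive, negative,
                                   base_margin=1.0, noise_std=0.1,
                                   rng=None):
    """Standard triplet loss max(0, d(a,p) - d(a,n) + m), except the
    margin m is sampled around base_margin on every call.

    Averaged over training, the random margin penalizes solutions that
    sit exactly at the margin boundary, favoring flatter, more robust
    minima -- a hypothetical reading of the stochastic-loss idea.
    """
    rng = rng or random.Random()
    margin = base_margin + rng.gauss(0.0, noise_std)
    d_pos = euclidean(anchor, positive)
    d_neg = euclidean(anchor, negative)
    return max(0.0, d_pos - d_neg + margin)
```

With `noise_std=0.0` this reduces to the ordinary triplet loss, so the stochastic variant can be toggled against the deterministic baseline in an ablation.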

Original language: English
Article number: 9154359
Pages (from-to): 145067-145076
Number of pages: 10
Journal: IEEE Access
Publication status: Published - 2020


Keywords
  • contrastive loss
  • deep metric learning
  • feature point matching
  • generalization
  • regularization
  • triplet loss


