Supervised Contrastive Learning for Product Matching

by Ralph Peeters et al.

Contrastive learning has seen increasing success in the fields of computer vision and information retrieval in recent years. This poster is the first work that applies contrastive learning to the task of product matching in e-commerce using product offers from different e-shops. More specifically, we employ a supervised contrastive learning technique to pre-train a Transformer encoder which is afterwards fine-tuned for the matching problem using pair-wise training data. We further propose a source-aware sampling strategy which enables contrastive learning to be applied for use cases in which the training data does not contain product identifiers. We show that applying supervised contrastive pre-training in combination with source-aware sampling significantly improves the state-of-the-art performance on several widely used benchmark datasets: For Abt-Buy, we reach an F1 of 94.29 (+3.24 compared to the previous state-of-the-art), for Amazon-Google 79.28 (+3.7). For the WDC Computers datasets, we reach improvements between +0.8 and +8.84 F1 depending on the training set size. Further experiments with data augmentation and self-supervised contrastive pre-training show that the former can be helpful for smaller training sets while the latter leads to a significant decline in performance due to inherent label noise. We thus conclude that contrastive pre-training has high potential for product matching use cases in which explicit supervision is available.
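The supervised contrastive objective referenced above pulls together embeddings of offers that share a label (e.g. a product identifier) and pushes apart all others in the batch. Below is a minimal NumPy sketch of the standard supervised contrastive (SupCon) loss this family of methods builds on; the function and variable names are illustrative and not taken from the paper's code:

```python
import numpy as np

def supcon_loss(embeddings, labels, temperature=0.07):
    """Supervised contrastive loss over a batch of embeddings.

    Offers with the same label (e.g. product identifier) are treated as
    positives; all other offers in the batch act as negatives.
    """
    # L2-normalize so dot products are cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    n = len(labels)
    mask_self = ~np.eye(n, dtype=bool)          # never contrast an offer with itself
    # log-softmax over all other offers in the batch (numerically stabilized)
    sim_max = sim.max(axis=1, keepdims=True)
    exp_sim = np.exp(sim - sim_max) * mask_self
    log_prob = sim - sim_max - np.log(exp_sim.sum(axis=1, keepdims=True))
    loss, anchors = 0.0, 0
    for i in range(n):
        positives = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if positives:                            # an anchor needs at least one positive
            loss += -np.mean([log_prob[i, j] for j in positives])
            anchors += 1
    return loss / anchors
```

Under source-aware sampling as described, each batch would be built so that positives for an anchor come from different e-shops, which is what makes the in-batch positive/negative structure meaningful without explicit product identifiers.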


Recovering Petaflops in Contrastive Semi-Supervised Learning of Visual Representations

We investigate a strategy for improving the computational efficiency of ...

Block-SCL: Blocking Matters for Supervised Contrastive Learning in Product Matching

Product matching is a fundamental step for the global understanding of c...

CCC-wav2vec 2.0: Clustering aided Cross Contrastive Self-supervised learning of speech representations

While Self-Supervised Learning has helped reap the benefit of the scale ...

Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision

Contrastive pre-training on distant supervision has shown remarkable eff...

Robust Fraud Detection via Supervised Contrastive Learning

Deep learning models have recently become popular for detecting maliciou...

Towards noise robust trigger-word detection with contrastive learning pre-task for fast on-boarding of new trigger-words

Trigger-word detection plays an important role as the entry point of use...

Adversarial Momentum-Contrastive Pre-Training

Deep neural networks are vulnerable to semantic invariant corruptions an...