Semi-Siamese Bi-encoder Neural Ranking Model Using Lightweight Fine-Tuning

10/28/2021
by   Euna Jung, et al.

A BERT-based Neural Ranking Model (NRM) can be either a cross-encoder or a bi-encoder. Of the two, the bi-encoder is highly efficient because all documents can be pre-processed before the actual query time. Although the query and the document are encoded independently, existing bi-encoder NRMs are Siamese models: a single language model is used to encode both the query and the document. In this work, we present two approaches for improving the performance of BERT-based bi-encoders. The first is to replace the full fine-tuning step with lightweight fine-tuning; we examine lightweight fine-tuning methods that are adapter-based, prompt-based, and a hybrid of the two. The second is to develop semi-Siamese models in which queries and documents are handled with a limited amount of difference. The limited difference is realized by learning two lightweight fine-tuning modules, while the main BERT language model is kept common to both query and document. We provide extensive experimental results for monoBERT, TwinBERT, and ColBERT, evaluating three performance metrics over the Robust04, ClueWeb09b, and MS-MARCO datasets. The results confirm that both lightweight fine-tuning and the semi-Siamese design are considerably helpful for improving BERT-based bi-encoders. In fact, lightweight fine-tuning is helpful for cross-encoders, too.
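The semi-Siamese idea described above can be sketched in a few lines: a single frozen backbone is shared by the query and document towers, and the only tower-specific parameters are two small lightweight modules. The sketch below is a toy numpy illustration, not the paper's implementation; the random projection `W_shared` stands in for the frozen BERT backbone, and the additive vectors `prefix_q` and `prefix_d` stand in for the two lightweight fine-tuning modules (all names are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 16

# Frozen "language model": a fixed random projection standing in for BERT.
# In the semi-Siamese design, this backbone is shared by both towers.
W_shared = rng.standard_normal((DIM, DIM))

# Two small lightweight-fine-tuning modules, one per tower. These are the
# only tower-specific (trainable) parameters: the "limited difference".
prefix_q = 0.1 * rng.standard_normal(DIM)
prefix_d = 0.1 * rng.standard_normal(DIM)

def encode(x, prefix):
    """Add the tower-specific lightweight module, then run the shared
    frozen backbone; return an L2-normalized embedding."""
    h = np.tanh((x + prefix) @ W_shared)
    return h / np.linalg.norm(h)

def score(query_vec, doc_vec):
    """Bi-encoder relevance: dot product of independently encoded vectors,
    so document embeddings can be computed offline before query time."""
    return float(encode(query_vec, prefix_q) @ encode(doc_vec, prefix_d))

q = rng.standard_normal(DIM)  # toy query representation
d = rng.standard_normal(DIM)  # toy document representation
print(score(q, d))
```

Setting `prefix_q = prefix_d` recovers the fully Siamese bi-encoder, which makes the semi-Siamese model a strict generalization: the two modules can drift apart only as far as training pushes them, while the shared backbone keeps query and document representations in a common space.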

