Scaling Sentence Embeddings with Large Language Models

07/31/2023
by Ting Jiang, et al.

Large language models (LLMs) have recently garnered significant interest. With in-context learning, LLMs achieve impressive results on various natural language tasks. However, the application of LLMs to sentence embeddings remains an area of ongoing research. In this work, we propose an in-context learning-based method to improve sentence embedding performance. Our approach involves adapting the previous prompt-based representation method to autoregressive models, constructing a demonstration set that enables LLMs to perform in-context learning, and scaling LLMs to different model sizes. Extensive experiments show that in-context learning enables LLMs to generate high-quality sentence embeddings without any fine-tuning, reaching performance comparable to current contrastive learning methods. When scaling model size, we find that scaling beyond tens of billions of parameters harms performance on semantic textual similarity (STS) tasks, although the largest model outperforms its smaller counterparts and achieves a new state-of-the-art result on transfer tasks. We also fine-tune LLMs with the current contrastive learning approach: the 2.7B OPT model, combined with our prompt-based method, surpasses the 4.8B ST5 and achieves new state-of-the-art results on STS tasks. Our code is available at https://github.com/kongds/scaling_sentemb.
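To make the prompt-based representation concrete, below is a minimal sketch of extracting sentence embeddings from an autoregressive LLM with a Hugging Face transformers setup. The model name (a small OPT checkpoint so the snippet runs anywhere), the exact template wording, and the one-shot demonstration are illustrative stand-ins rather than the authors' exact configuration; see the linked repository for that. The idea is that a template ending in "means in one word:" pushes the model to compress the sentence's meaning into the next-token position, so the hidden state at the final prompt token can serve as the embedding, and a prepended demonstration supplies the in-context learning signal.

```python
# Minimal sketch: prompt-based sentence embeddings from an autoregressive LLM.
# Assumptions (not the authors' exact setup): the model name, template wording,
# and the one-shot demonstration below are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "facebook/opt-125m"  # small OPT so the sketch runs on CPU
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()

# One in-context demonstration: a sentence paired with a one-word summary,
# prepended so the model sees how the template should be completed.
DEMO = 'This sentence : "A gathering of people for a meal." means in one word:"Feast". '

def embed(sentence: str, use_demo: bool = True) -> torch.Tensor:
    """Return the last-layer hidden state at the final prompt token."""
    prompt = f'This sentence : "{sentence}" means in one word:"'
    if use_demo:
        prompt = DEMO + prompt
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs, output_hidden_states=True)
    # Hidden state at the final position: the template asks the model to
    # compress the sentence into one word, so this state summarizes it.
    return out.hidden_states[-1][0, -1]

a = embed("A man is playing a guitar.")
b = embed("Someone is strumming a guitar.")
print(float(torch.nn.functional.cosine_similarity(a, b, dim=0)))
```

In this sketch, no gradient updates are needed: the demonstration alone steers the frozen model, which is the fine-tuning-free regime the abstract describes.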

