Multilevel Sentence Embeddings for Personality Prediction

05/09/2023
by Paolo Tirotta, et al.

Representing text in a multidimensional space can be achieved with sentence embedding models such as Sentence-BERT (SBERT). However, training such models on data with a complex multilevel structure typically requires individually trained, class-specific models, which increases time and computing costs. We propose a two-step approach that maps sentences according to their hierarchical memberships and polarity. First, we learn the upper-level sentence space with an AdaCos loss function; we then fine-tune with a novel loss function based mainly on the cosine similarity of intra-level pairs. We apply this method to three datasets: two weakly supervised Big Five personality datasets built from English and Japanese Twitter data, and the benchmark MNLI dataset. We show that our single-model approach outperforms multiple class-specific classification models.
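The two-step recipe can be made concrete with a short sketch. The AdaCos head below follows the adaptive-scale cosine-softmax loss of Zhang et al. (2019), which the abstract names for step one; the pair loss for step two is only an illustrative cosine-similarity objective standing in for the paper's novel intra-level loss, whose exact form the abstract does not give. The SBERT checkpoint name and all variable names are assumptions, not the authors' released code.

```python
# Minimal sketch of the two-step training described in the abstract.
# Step 1: learn the upper-level space with AdaCos (Zhang et al., 2019).
# Step 2: fine-tune on intra-level pairs with a cosine-similarity loss
# (illustrative stand-in for the paper's novel objective).
import math
import torch
import torch.nn as nn
import torch.nn.functional as F
from sentence_transformers import SentenceTransformer

class AdaCos(nn.Module):
    """Cosine-softmax classification head with an adaptive scale."""
    def __init__(self, embed_dim: int, num_classes: int):
        super().__init__()
        self.W = nn.Parameter(torch.empty(num_classes, embed_dim))
        nn.init.xavier_uniform_(self.W)
        self.num_classes = num_classes
        # Initial scale sqrt(2) * log(C - 1), updated adaptively per batch.
        self.s = math.sqrt(2.0) * math.log(num_classes - 1)

    def forward(self, embeddings: torch.Tensor, labels: torch.Tensor):
        # Cosine similarity between L2-normalised embeddings and class centres.
        logits = F.normalize(embeddings) @ F.normalize(self.W).t()
        with torch.no_grad():
            theta = torch.acos(logits.clamp(-1 + 1e-7, 1 - 1e-7))
            one_hot = F.one_hot(labels, self.num_classes).bool()
            # Average summed non-target logit mass across the batch.
            B_avg = torch.where(one_hot, torch.zeros_like(logits),
                                torch.exp(self.s * logits)).sum(dim=1).mean()
            # Median angle to the target class drives the scale update.
            theta_med = theta[one_hot].median()
            self.s = (torch.log(B_avg) /
                      torch.cos(theta_med.clamp(max=math.pi / 4))).item()
        return F.cross_entropy(self.s * logits, labels)

def intra_level_cosine_loss(emb_a, emb_b, polarity):
    """Step-two sketch: push the cosine similarity of an intra-level pair
    towards its polarity target in [-1, 1]."""
    return F.mse_loss(F.cosine_similarity(emb_a, emb_b), polarity)

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # any SBERT checkpoint

# Usage sketch (hypothetical tensors; embeddings must come from the
# encoder's forward pass, since .encode() disables gradients):
# step1_loss = adacos_head(embeddings, class_labels)
# step2_loss = intra_level_cosine_loss(emb_a, emb_b, polarity_targets)
```

Training the classification head on hierarchy labels first, then fine-tuning the same encoder on pairs, is what lets one model replace a set of class-specific models.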
