Style transfer and classification in Hebrew news items

12/06/2022
by Nir Weingarten, et al.

Hebrew is a morphologically rich language (MRL), which makes it harder to model than morphologically simpler languages. Recent developments, Transformers in general and BERT in particular, have opened a path to Hebrew models that reach SOTA results, not falling short of models for non-MRL languages. We explore the cutting edge of this field by performing style transfer, text generation, and classification over news articles collected from online archives. Furthermore, the news portals that feed our collective consciousness are an interesting corpus to study in their own right, as analyzing and tracing them may reveal insights about our society and discourse.
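As a rough illustration of the classification setting described above, the sketch below loads a publicly available Hebrew BERT encoder and attaches a sequence-classification head. This is a minimal sketch, not the paper's method: the checkpoint name (onlplab/alephbert-base) and the label set are assumptions chosen for illustration, and the classification head is untrained until fine-tuned on a labeled news corpus.

```python
# Minimal sketch: topic classification of Hebrew news snippets with a
# pre-trained Hebrew BERT encoder. Checkpoint and labels are illustrative
# assumptions, not the paper's actual models or categories.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "onlplab/alephbert-base"   # publicly available Hebrew BERT
LABELS = ["news", "opinion", "sports"]  # hypothetical label set

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=len(LABELS)
)

def classify(text: str) -> str:
    """Return the most probable label for a Hebrew news snippet."""
    inputs = tokenizer(text, truncation=True, max_length=512,
                       return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return LABELS[int(logits.argmax(dim=-1))]

# Example call; predictions are meaningless until the head is fine-tuned.
print(classify("כותרת חדשותית לדוגמה"))
```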

research · 09/18/2021
Text Detoxification using Large Pre-trained Neural Models
We present two novel unsupervised methods for eliminating toxicity in te...

research · 05/07/2020
Learning Implicit Text Generation via Feature Matching
Generative feature matching network (GFMN) is an approach for training i...

research · 09/19/2023
Specializing Small Language Models towards Complex Style Transfer via Latent Attribute Pre-Training
In this work, we introduce the concept of complex text style transfer ta...

research · 03/17/2018
Dear Sir or Madam, May I introduce the YAFC Corpus: Corpus, Benchmarks and Metrics for Formality Style Transfer
Style transfer is the task of automatically transforming a piece of text...

research · 10/27/2022
He Said, She Said: Style Transfer for Shifting the Perspective of Dialogues
In this work, we define a new style transfer task: perspective shift, wh...

research · 01/16/2020
Universal patterns of online news impact
Online news can quickly reach and affect millions of people, yet little ...
