TourBERT: A pretrained language model for the tourism industry

01/19/2022
by   Veronika Arefieva, et al.

Bidirectional Encoder Representations from Transformers (BERT) is currently one of the most important state-of-the-art models for natural language processing. However, it has also been shown that pretraining BERT on a domain-specific corpus is helpful for domain-specific tasks. In this paper, we present TourBERT, a pretrained language model for tourism. We describe how TourBERT was developed and evaluated. The evaluations show that TourBERT outperforms BERT on all tourism-specific tasks.
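Domain-specific pretraining of the kind described here typically continues BERT's masked-language-model (MLM) objective on in-domain text, in this case tourism reviews and descriptions. Below is a minimal sketch of the MLM masking step; the function name, masking rate handling, and example sentence are illustrative and not taken from the paper (BERT's standard rate is 15%):

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mask_prob=0.15, seed=0):
    """Illustrative MLM corruption: randomly replace ~15% of tokens
    with [MASK]; during pretraining the model learns to predict them."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)
            labels.append(tok)    # the model must recover this token
        else:
            masked.append(tok)
            labels.append(None)   # ignored by the pretraining loss
    return masked, labels

# Hypothetical in-domain (tourism) sentence:
tokens = "the hotel offers a stunning view of the lake".split()
masked, labels = mask_tokens(tokens)
```

Pretraining on tourism text this way lets the model's representations adapt to domain vocabulary and usage, which is the effect the evaluations above measure.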
