Word Embeddings for Banking Industry

06/02/2023
by Avnish Patel, et al.

Applications of Natural Language Processing (NLP) are plentiful, from sentiment analysis to text classification. Practitioners rely on static word embeddings (e.g., Word2Vec or GloVe) or static word representations extracted from contextual models (e.g., BERT or ELMo) to perform many of these NLP tasks. These widely available word embeddings are built from large amounts of text, so they are likely to have captured most of the vocabulary in a variety of contexts. However, how well do they capture domain-specific semantics and word relatedness? This paper explores that question by creating bank-specific word embeddings and evaluating them against other sources of word embeddings such as GloVe and BERT. Not surprisingly, embeddings built from bank-specific corpora do a better job of capturing bank-specific semantics and word relatedness. This finding suggests that bank-specific word embeddings could serve as a stand-alone source or as a complement to other widely available embeddings when performing NLP tasks specific to the banking industry.
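To make the setup concrete, the sketch below shows one common way to build domain-specific embeddings and compare word relatedness against a general-purpose model, using gensim's Word2Vec and pretrained GloVe vectors. The corpus file name (bank_corpus.txt), the probe word pairs, and all hyperparameters are illustrative assumptions, not the paper's actual configuration.

```python
# Sketch: train bank-specific Word2Vec embeddings and compare word
# relatedness against pretrained general-purpose GloVe vectors.
# The corpus path, probe pairs, and hyperparameters are assumptions.
import gensim.downloader as api
from gensim.models import Word2Vec
from gensim.utils import simple_preprocess

# Assumed corpus: one bank-related document per line (e.g., filings,
# policy documents, customer communications).
with open("bank_corpus.txt", encoding="utf-8") as f:
    sentences = [simple_preprocess(line) for line in f]

# Train a skip-gram Word2Vec model on the domain corpus.
bank_model = Word2Vec(
    sentences=sentences,
    vector_size=100,   # embedding dimension
    window=5,          # context window size
    min_count=5,       # drop rare tokens (lower this for small corpora)
    sg=1,              # 1 = skip-gram, 0 = CBOW
    epochs=10,
)

# Pretrained general-purpose GloVe vectors for comparison.
glove = api.load("glove-wiki-gigaword-100")

# Probe pairs where banking usage may diverge from everyday usage.
pairs = [("deposit", "account"), ("draft", "check"), ("interest", "rate")]
for w1, w2 in pairs:
    bank_sim = bank_model.wv.similarity(w1, w2)
    glove_sim = glove.similarity(w1, w2)
    print(f"{w1}/{w2}: bank={bank_sim:.3f}  glove={glove_sim:.3f}")
```

If the domain corpus truly encodes banking usage, the domain model's cosine similarities should separate bank-specific pairs more sharply than the general-purpose GloVe vectors do.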
