Symbolic, Distributed and Distributional Representations for Natural Language Processing in the Era of Deep Learning: a Survey

02/02/2017
by   Lorenzo Ferrone, et al.
Natural language and symbols are intimately correlated. Recent advances in machine learning (ML) and natural language processing (NLP) seem to contradict this intuition: symbols are fading away, replaced by vectors and tensors called distributed and distributional representations. However, there is a strict link between distributed/distributional representations and symbols, the former being an approximation of the latter. A clearer understanding of this link will certainly lead to radically new deep learning networks. In this paper we present a survey that aims to draw the link between symbolic representations and distributed/distributional representations. This is the right time to revitalize the study of how symbols are represented inside neural networks.
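The claim that distributed representations approximate symbols can be illustrated with a minimal sketch in the spirit of vector symbolic architectures (one family of approaches such a survey covers); the vocabulary, dimensionality, and random-vector encoding here are illustrative assumptions, not the paper's specific construction. Each symbol is assigned a random high-dimensional vector, a set of symbols is represented by the sum of their vectors, and approximate membership is recovered by dot products:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 1000  # illustrative dimensionality; higher dims give cleaner recovery

# Hypothetical vocabulary: each symbol mapped to a random unit-norm-scaled vector.
symbols = ["cat", "dog", "tree"]
vecs = {s: rng.standard_normal(dim) / np.sqrt(dim) for s in symbols}

# Distributed representation of the symbolic set {cat, dog}: the vector sum.
bundle = vecs["cat"] + vecs["dog"]

# Membership is recovered only approximately: dot products with member
# vectors are close to 1, with non-members close to 0.
for s in symbols:
    print(s, round(float(bundle @ vecs[s]), 2))
```

The recovery is approximate rather than exact because random vectors in high dimensions are only nearly orthogonal, which is precisely the sense in which such representations approximate discrete symbols.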
