Interactive Natural Language Acquisition in a Multi-modal Recurrent Neural Architecture

by Stefan Heinrich, et al.

The human brain is one of the most complex dynamic systems, and it enables us to communicate in natural language. We have a good understanding of some principles underlying natural languages and language processing, some knowledge about the socio-cultural conditions framing acquisition, and some insights about where activity occurs in the brain. However, we do not yet understand the behavioural and mechanistic characteristics of natural language, nor how mechanisms in the brain allow it to acquire and process language. In an effort to bridge the gap between insights from behavioural psychology and neuroscience, the goal of this paper is to contribute a computational understanding of the characteristics that favour language acquisition, in a brain-inspired neural architecture. Accordingly, we provide concepts and refinements in cognitive modelling regarding principles and mechanisms in the brain - such as the hierarchical abstraction of context - in a plausible recurrent architecture. On this basis, we propose a neurocognitively plausible model for embodied language acquisition from real-world interaction of a humanoid robot with its environment. The model is capable of learning language production grounded in both temporally dynamic somatosensation and vision. In particular, the architecture consists of a continuous-time recurrent neural network in which different parts have different leakage characteristics and thus operate on multiple timescales for every modality, and in which the higher-level nodes of all modalities are associated into cell assemblies. Thus, this model features hierarchical concept abstraction in sensation as well as concept decomposition in production, multi-modal integration, and self-organisation of latent representations.
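The multiple-timescale mechanism in the abstract rests on leaky-integrator units whose time constant controls how quickly their internal state follows new input. The following is a minimal sketch of such a continuous-time recurrent update in NumPy, not the paper's actual implementation; the network size, weights, and the particular time constants (fast tau = 2, slow tau = 16) are illustrative assumptions.

```python
import numpy as np

def ctrnn_step(u, y, W, tau):
    """One leaky-integrator update.

    u   : internal (membrane) state of each unit
    y   : current activations feeding back into the network
    W   : recurrent weight matrix (illustrative, random here)
    tau : per-unit time constant; larger tau -> slower, more
          abstract dynamics, as in the multiple-timescale idea.
    """
    u_new = (1.0 - 1.0 / tau) * u + (1.0 / tau) * (W @ y)
    return u_new, np.tanh(u_new)

rng = np.random.default_rng(0)
n = 6
W = rng.normal(scale=0.5, size=(n, n))
# First three units are "fast", last three are "slow" context units.
tau = np.array([2.0, 2.0, 2.0, 16.0, 16.0, 16.0])

u = rng.normal(size=n)
y = np.tanh(u)
for _ in range(20):
    u, y = ctrnn_step(u, y, W, tau)
```

Because the slow units average their input over many steps, their activations change gradually and can carry sequence-level context, while the fast units track moment-to-moment sensation; coupling such layers per modality is the kind of hierarchical abstraction the architecture exploits.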

