Basic syntax from speech: Spontaneous concatenation in unsupervised deep neural networks

05/02/2023
by Gašper Beguš, et al.

Computational models of syntax are predominantly text-based. Here we propose that basic syntax can be modeled directly from raw speech in a fully unsupervised way. We focus on one of the most ubiquitous and basic properties of syntax – concatenation. We introduce spontaneous concatenation: a phenomenon in which convolutional neural networks (CNNs) trained on acoustic recordings of individual words begin generating outputs with two or even three words concatenated, without ever accessing data containing multiple words in the input. Additionally, networks trained on two words learn to embed words into novel, unobserved word combinations. To our knowledge, this is a previously unreported property of CNNs trained on raw speech in the Generative Adversarial Network setting, and it has implications both for our understanding of how these architectures learn and for modeling syntax and its evolution from raw acoustic inputs.
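The GAN setting referenced here (as in the authors' ciwGAN/fiwGAN line of work) typically uses a generator built from stacked transposed 1-D convolutions that upsample a latent vector into a raw waveform. The following is a minimal, illustrative numpy sketch of that upsampling path only; the layer sizes, kernel width, and stride are hypothetical choices, the weights are random rather than adversarially trained, and none of the names come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)


def conv_transpose1d(x, w, stride):
    """Transposed 1-D convolution.

    x: input, shape (in_ch, length)
    w: weights, shape (in_ch, out_ch, kernel)
    Returns output of shape (out_ch, (length - 1) * stride + kernel).
    """
    in_ch, length = x.shape
    _, out_ch, kernel = w.shape
    out = np.zeros((out_ch, (length - 1) * stride + kernel))
    for i in range(length):
        for c in range(in_ch):
            # each input sample spreads a scaled kernel into the output
            out[:, i * stride:i * stride + kernel] += x[c, i] * w[c]
    return out


def make_generator(latent_dim=32, channels=(16, 8, 1), kernel=8):
    """Random (untrained) weights for a stack of transposed-conv layers."""
    dims = (latent_dim,) + channels
    return [rng.standard_normal((dims[i], dims[i + 1], kernel)) * 0.1
            for i in range(len(channels))]


def generate(weights, z, stride=4):
    """Map a latent vector z to a waveform: ReLU between layers, tanh at the end."""
    x = z[:, None]  # (latent_dim, 1): one latent 'time step' to be upsampled
    for i, w in enumerate(weights):
        x = conv_transpose1d(x, w, stride)
        x = np.tanh(x) if i == len(weights) - 1 else np.maximum(x, 0.0)
    return x[0]  # mono waveform in [-1, 1]


waveform = generate(make_generator(), rng.standard_normal(32))
```

With these toy sizes, three stride-4 layers upsample a single latent step to a 148-sample signal; a real speech GAN uses many more layers to reach audio-rate lengths.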

Related research

- 06/04/2020 · CiwGAN and fiwGAN: Encoding information in acoustic data to model lexical learning with Generative Adversarial Networks
- 10/27/2022 · Articulation GAN: Unsupervised modeling of articulatory learning
- 06/06/2020 · Generative Adversarial Phonology: Modeling unsupervised phonetic and phonological learning with neural networks
- 04/03/2013 · Estimating Phoneme Class Conditional Probabilities from Raw Speech Signal using Convolutional Neural Networks
- 11/10/2020 · Artificial sound change: Language change and deep convolutional neural networks in iterative learning
- 08/31/2019 · Quantity doesn't buy quality syntax with neural language models
