Extracting and Learning a Dependency-Enhanced Type Lexicon for Dutch
This thesis is concerned with type-logical grammars and their practical applicability as tools for reasoning about sentence syntax and semantics. The focal point is narrowed to Dutch, a language exhibiting a large degree of word order variability. In order to overcome difficulties arising from that variability, the thesis explores and expands upon a type grammar based on Multiplicative Intuitionistic Linear Logic, agnostic to word order but enriched with decorations that aim to reduce its proof-theoretic complexity. An algorithm for the conversion of dependency-annotated sentences into type sequences is then implemented, populating the type logic with concrete, data-driven lexical types. Two experiments are run on the resulting grammar instantiation. The first pertains to the learnability of the type-assignment process by a neural architecture. A novel application of a self-attentive sequence transduction model is proposed; contrary to established practices, it constructs types inductively by internalizing the type-formation syntax, thus exhibiting generalizability beyond a pre-specified type vocabulary. The second revolves around a deductive parsing system that can resolve structural ambiguities by consulting both word and type information; preliminary results suggest excellent computational efficiency and strong performance.
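To illustrate the idea of constructing types inductively rather than classifying over a closed type inventory, the following minimal Python sketch (not the thesis implementation; the atom names, the '-o' connective symbol, and prefix serialization are illustrative assumptions) shows how types built from atoms and linear implication can be flattened into, and rebuilt from, a sequence of type-formation symbols. A decoder that emits such symbol sequences is not restricted to types seen during training.

    # Minimal sketch of type-formation syntax as a flat symbol sequence.
    # Assumptions: atoms like 'np', 's'; a binary linear implication written '-o';
    # prefix (Polish) notation for serialization. Not the thesis's actual code.
    from dataclasses import dataclass
    from typing import Union, List

    @dataclass(frozen=True)
    class Atom:
        name: str                  # e.g. 'np', 's'

    @dataclass(frozen=True)
    class Arrow:
        argument: 'Type'           # argument type of the linear implication
        result: 'Type'             # result type of the linear implication

    Type = Union[Atom, Arrow]

    def serialize(t: Type) -> List[str]:
        """Flatten a type into a prefix-notation symbol sequence."""
        if isinstance(t, Atom):
            return [t.name]
        return ['-o'] + serialize(t.argument) + serialize(t.result)

    def deserialize(symbols: List[str]) -> Type:
        """Rebuild a type from its prefix-notation symbol sequence."""
        def parse(i: int):
            if symbols[i] == '-o':
                arg, i = parse(i + 1)
                res, i = parse(i)
                return Arrow(arg, res), i
            return Atom(symbols[i]), i
        t, _ = parse(0)
        return t

    # Example: a transitive-verb-like type np -o (np -o s) round-trips through
    # the symbol sequence ['-o', 'np', '-o', 'np', 's'].
    tv = Arrow(Atom('np'), Arrow(Atom('np'), Atom('s')))
    assert serialize(tv) == ['-o', 'np', '-o', 'np', 's']
    assert deserialize(serialize(tv)) == tv

Because the output space is the small vocabulary of atoms and connectives rather than whole types, a sequence transduction model over these symbols can, in principle, produce well-formed types it has never encountered as complete units.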