Dynamic Context Generation for Natural Language Understanding: A Multifaceted Knowledge Approach
Abstract
We describe a comprehensive framework for text understanding based on the representation of context. It is designed to serve as a semantic representation for the full range of interpretive and inferential needs of general natural language processing. Its most distinctive feature is its uniform representation of the varied, largely independent linguistic sources that play a role in determining meaning: lexical associations, syntactic restrictions, case-role expectations, and, most importantly, contextual effects. Compositional syntactic structure from a shallow parse is represented in a neural net-based associative memory, where it then interacts through a Bayesian network with semantic associations and with the context, or "gist," of the passage carried forward from preceding sentences. Experiments with more than 2,000 sentences in several languages are included.
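The core idea of combining evidence from independent knowledge sources through a probabilistic model can be illustrated with a deliberately simplified sketch. The paper describes a full Bayesian network; the fragment below instead uses a naive-Bayes combination (sources assumed conditionally independent given the interpretation), and the sense names and probability values are invented for illustration, not taken from the paper.

```python
import math

# Hypothetical likelihoods P(evidence | sense) from each independent
# knowledge source, for two candidate senses of an ambiguous word ("bank").
# All numbers are made up for illustration.
EVIDENCE = {
    "riverbank": {"lexical": 0.2, "syntactic": 0.5, "case_role": 0.3, "context": 0.7},
    "financial": {"lexical": 0.8, "syntactic": 0.5, "case_role": 0.7, "context": 0.1},
}

def combine(evidence, prior=0.5):
    """Naive-Bayes combination: independent sources multiply (sum in log space)."""
    log_scores = {}
    for sense, sources in evidence.items():
        log_scores[sense] = math.log(prior) + sum(math.log(p) for p in sources.values())
    # Renormalize to posterior probabilities over the candidate senses.
    z = sum(math.exp(s) for s in log_scores.values())
    return {sense: math.exp(s) / z for sense, s in log_scores.items()}

posterior = combine(EVIDENCE)
best = max(posterior, key=posterior.get)
print(best, round(posterior[best], 3))  # → financial 0.571
```

Note how a single strong contextual cue (0.7 vs. 0.1) nearly offsets the lexical preference for the financial sense; a genuine Bayesian network would additionally model dependencies among the sources rather than treating them as independent.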