Incremental Nonmonotonic Sentence Interpretation through Semantic Self-Organization (2008)
Subsymbolic systems have been successfully used to model several aspects of human language processing. Yet, it has proven difficult to scale them up to realistic language: they have limited memory capacity, long training times, and difficulty representing the wealth of linguistic structure. In this paper, a new connectionist model, InSomNet, is presented that scales up by utilizing semantic self-organization. InSomNet was trained on semantic dependency graph representations from the Redwoods Treebank of sentences from the VerbMobil project. The results show that InSomNet learns to represent these semantic dependencies accurately and generalizes to novel structures. Further evaluation of InSomNet on the original spoken language transcripts shows that it can also process noisy input robustly, and that its performance degrades gracefully when noise is added to the network weights, demonstrating tolerance to damage. It interprets sentences nonmonotonically, i.e., it generates expectations and revises them, primes future inputs based on semantics, and coactivates multiple interpretations in the output. In other words, while scaling up, it still retains the cognitively valid behavior typical of subsymbolic systems.
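The abstract names semantic self-organization as the key scaling mechanism. As a rough illustration of the general idea (a sketch only; this is a generic self-organizing map on toy vectors, not the InSomNet architecture or its training data), map units compete for each input, and the winner and its neighbors move toward that input, so the map gradually organizes itself around the structure of the data:

```python
# Minimal sketch of self-organizing map (SOM) training on toy 2-D vectors.
# Hypothetical illustration only; InSomNet itself organizes semantic
# representations, which this toy example does not attempt to model.
import math
import random

def train_som(data, n_units=8, dim=2, epochs=50, lr=0.5, radius=2.0, seed=0):
    """Train a 1-D SOM; return the list of unit weight vectors."""
    rng = random.Random(seed)
    units = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for epoch in range(epochs):
        # Decay the learning rate and neighborhood radius over time.
        a = lr * (1 - epoch / epochs)
        r = max(radius * (1 - epoch / epochs), 0.5)
        for x in data:
            # Winner: the unit closest to the input (squared Euclidean distance).
            win = min(range(n_units),
                      key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))
            for i in range(n_units):
                # Gaussian neighborhood centered on the winning unit.
                h = math.exp(-((i - win) ** 2) / (2 * r * r))
                units[i] = [u + a * h * (v - u) for u, v in zip(units[i], x)]
    return units

# Toy data: two clusters; after training, units spread out toward them.
data = [[0.1, 0.1], [0.15, 0.05], [0.9, 0.9], [0.85, 0.95]]
units = train_som(data)
```

Since both the random initial weights and the data lie in the unit square, every update is a convex combination and the trained units stay within it, drifting toward the two clusters.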
Technical Report AI08-12, Department of Computer Sciences, University of Texas at Austin, Austin, TX, 2008.

Marshall R. Mayberry III (Ph.D. Alumni), marty mayberry [at] gmail com
Risto Miikkulainen (Faculty), risto [at] cs utexas edu
MIR Sentence Processing Package (1998): contains the C source code for the MIR system, as well as a selection of scripts wi...