Incremental Nonmonotonic Parsing through Semantic Self-Organization (2003)
Subsymbolic systems have been used successfully to model several aspects of human language processing. Subsymbolic parsers are appealing because they combine syntactic, semantic, and thematic constraints in sentence interpretation and can nonmonotonically revise that interpretation as the sentence is processed incrementally. Such parsers are also cognitively plausible: processing is robust, and multiple interpretations are simultaneously activated when the input is ambiguous. Yet it has proven very difficult to scale them up to realistic language: they have limited memory capacity, training takes a long time, and it is difficult to represent linguistic structure. A new connectionist model, INSOMNet, scales up the subsymbolic approach through semantic self-organization. INSOMNet was trained on semantic dependency graph representations from the recently released LinGO Redwoods HPSG Treebank of sentences from the VerbMobil project. The results show that INSOMNet learns to represent these semantic dependencies accurately and generalizes to novel structures. Further evaluation on the original VerbMobil sentences, transcribed with annotations for spoken language, demonstrates robust parsing of noisy input, and the graceful degradation in performance when noise is added to the network weights underscores INSOMNet's tolerance to damage. Finally, the cognitive plausibility of the model is demonstrated on a standard psycholinguistic benchmark, on which INSOMNet exhibits expectations and defaults, coactivation of multiple interpretations, nonmonotonicity, and semantic priming.
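Two technical ideas from the abstract, self-organization of representations and graceful degradation under weight noise, can be illustrated with a minimal self-organizing map sketch in Python. This is not INSOMNet's architecture or training procedure; the grid size, vector dimensions, learning schedule, noise scale, and the quantization-error proxy below are all illustrative assumptions.

# A minimal sketch of self-organization: a basic self-organizing map (SOM).
# NOT the INSOMNet architecture; all sizes and schedules are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 64 map units on an 8x8 grid, each holding a weight
# vector in a toy 12-dimensional "semantic" space.
grid = np.array([(i, j) for i in range(8) for j in range(8)], dtype=float)
weights = rng.normal(size=(64, 12))

def train_step(weights, x, lr, sigma):
    """One SOM update: find the best-matching unit, then pull it and its
    grid neighbors toward the input, so that nearby units come to encode
    similar patterns (the core of self-organization)."""
    bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
    dist2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
    h = np.exp(-dist2 / (2 * sigma ** 2))       # neighborhood kernel
    weights += lr * h[:, None] * (x - weights)  # in-place update

def quantization_error(weights, xs):
    """Mean squared distance from each input to its best-matching unit,
    used here as an assumed stand-in for parsing accuracy."""
    return np.mean([((weights - x) ** 2).sum(axis=1).min() for x in xs])

# Toy data standing in for encoded semantic frames.
data = rng.normal(size=(500, 12))
for t, x in enumerate(data):
    frac = t / len(data)  # decay learning rate and neighborhood over time
    train_step(weights, x, lr=0.5 * (1 - frac), sigma=4.0 * (1 - frac) + 0.5)

# Damage-tolerance test in the spirit of the abstract: add Gaussian noise
# to the learned weights and check that the error degrades gracefully
# rather than failing catastrophically.
before = quantization_error(weights, data)
weights += rng.normal(scale=0.1, size=weights.shape)
after = quantization_error(weights, data)
print(f"quantization error before damage: {before:.3f}, after: {after:.3f}")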
View: PDF
Citation: PhD Thesis, Department of Computer Sciences, The University of Texas at Austin, Austin, TX, 2003. Technical Report AI-TR-04-310.
Marshall R. Mayberry III, Ph.D. Alumni, marty mayberry [at] gmail com
Risto Miikkulainen, Faculty, risto [at] cs utexas edu