Incremental Nonmonotonic Parsing through Semantic Self-Organization (2003)
Subsymbolic systems have been successfully used to model several aspects of human language processing. Subsymbolic parsers are appealing because they allow combining syntactic, semantic, and thematic constraints in sentence interpretation and revising that interpretation as each word is read in. These parsers are also cognitively plausible: processing is robust, and multiple interpretations are simultaneously activated when the input is ambiguous. Yet, it has been very difficult to scale them up to realistic language: they have limited memory capacity, training takes a long time, and it is difficult to represent linguistic structure. In this study, we propose to scale up the subsymbolic approach by utilizing semantic self-organization. The resulting architecture, INSOMNET, was trained on semantic representations of the newly released LINGO Redwoods HPSG Treebank of annotated sentences from the VerbMobil project. The results show that INSOMNET is able to accurately represent the semantic dependencies while demonstrating expectations and defaults, coactivation of multiple interpretations, and robust parsing of noisy input.
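To illustrate the kind of incremental, nonmonotonic processing the abstract describes (not the actual INSOMNET architecture), the following is a minimal sketch of a simple recurrent network that reads a sentence one word at a time and maintains graded activations over candidate semantic interpretations. An ambiguous prefix can coactivate several readings, and later words can strengthen or weaken them. The toy vocabulary, role inventory, layer sizes, and random (untrained) weights are all illustrative assumptions; a real model would be trained on treebank-derived semantic targets.

# Illustrative sketch only: a simple recurrent network processing words
# incrementally and outputting graded activations over candidate
# semantic-role interpretations. Weights are random and untrained;
# vocabulary and roles are toy assumptions, not from the paper.

import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["the", "boy", "ball", "hit", "with", "bat", "."]
ROLES = ["agent=boy", "patient=ball", "instrument=bat", "modifier=bat"]

IN, HID, OUT = len(VOCAB), 16, len(ROLES)

# Randomly initialized weights; training (e.g., backpropagation through
# time on annotated semantic frames) is omitted from this sketch.
W_in = rng.normal(scale=0.5, size=(HID, IN))
W_rec = rng.normal(scale=0.5, size=(HID, HID))
W_out = rng.normal(scale=0.5, size=(OUT, HID))

def one_hot(word):
    v = np.zeros(IN)
    v[VOCAB.index(word)] = 1.0
    return v

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def parse_incrementally(sentence):
    """Feed words one at a time; after each word, yield graded activations
    over the interpretations (nonmonotonic: they can rise or fall)."""
    hidden = np.zeros(HID)
    for word in sentence:
        hidden = np.tanh(W_in @ one_hot(word) + W_rec @ hidden)
        output = sigmoid(W_out @ hidden)
        yield word, dict(zip(ROLES, (round(float(o), 2) for o in output)))

if __name__ == "__main__":
    # "The boy hit the ball with the bat" is ambiguous between an
    # instrument reading and a modifier reading of "with the bat";
    # both interpretations stay active until the input disambiguates.
    for word, interp in parse_incrementally(
        ["the", "boy", "hit", "the", "ball", "with", "the", "bat", "."]
    ):
        print(f"{word:>5}: {interp}")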
View: PDF
Citation: In Proceedings of the 25th Annual Conference of the Cognitive Science Society, 2003.
Marshall R. Mayberry III, Ph.D. Alumni, marty mayberry [at] gmail com
Risto Miikkulainen, Faculty, risto [at] cs utexas edu
MIR Sentence Processing Package (1998): The MIR Sentence Processing package contains the C source code for the MIR system, as well as a selection of scripts wi...