Parsing to learn
Journal/Book: J Psycholinguist Res. 1998; 27: 339-374. Plenum Publ Corp, 233 Spring St, New York, NY 10013.
Abstract: Learning a language by parameter setting is almost certainly less onerous than composing a grammar from scratch. But recent computational modeling of how parameters are set has shown that it is not at all the simple mechanical process sometimes imagined. Sentences must be parsed to discover the properties that select between parameter values. But the sentences that drive learning cannot be parsed with the learner's current grammar. And there is not much point in parsing them with just one new grammar. They must apparently be parsed with all possible grammars, in order to find out which one is most successful at licensing the language. The research task is to reconcile this with the fact that the human sentence parsing mechanism, even in adults, has only very limited parallel parsing capacity. I have proposed that all possible grammars can be folded into one, if parameter values are fragments of sentential tree structures that the parser can make use of where necessary to assign a structure to an input sentence. However, the problem of capacity limitations remains. The combined grammar will afford multiple analyses for some sentences, too many to be computed on-line. I propose that the parser computes only one analysis per sentence but can detect ambiguity, and that the learner makes use of unambiguous input only. This provides secure information but relatively little of it, particularly at early stages of learning where few grammars have been excluded and ambiguity is rife. I consider three solutions: improving the parser's ability to extract unambiguous information from partially ambiguous sentences, assuming default parameter values to temporarily eliminate ambiguity, and reconfiguring the parameters so that some are subordinate to others and do not present themselves to the learner until the others have been set. A more radical alternative is to give up the quest for error-free learning and permit parameters to be set without regard for whether the parser may have overlooked an alternative analysis of the sentence. If it can be assumed that the human parser keeps a running tally of the parameter values it has accessed, then the learner would do nothing other than parse sentences for comprehension, as adults do. The most useful parameter values would become more and more easily accessed; the noncontributors would drop out of the running. There would be no learning mechanism at all, over and above the parser. But how accurate this system would be remains to be established.
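The tally-based alternative described at the end of the abstract lends itself to a small simulation. The sketch below is not from the paper; it assumes a toy setting with three hypothetical binary parameters, represents each input sentence simply as the set of parameters it happens to disambiguate, and lets a single-analysis "parser" credit the parameter values its chosen analysis used, so that the target values gradually come to dominate the tally while the noncontributors drop out.

```python
# Illustrative sketch (not from the paper): a toy version of the "running tally" idea.
# Grammars are combinations of binary parameter values; a sentence is licensed by any
# grammar that agrees with the target on the parameters that sentence exposes. The
# parser returns one licensing analysis per sentence, and the learner credits the
# parameter values that analysis used.

import random
from collections import defaultdict
from itertools import product

PARAMETERS = ["head_direction", "null_subject", "wh_movement"]  # hypothetical names
VALUES = [0, 1]

def licensing_grammars(exposed_params, target):
    """Toy stand-in for parsing: a grammar licenses the sentence if it agrees
    with the target grammar on the parameters the sentence happens to expose."""
    return [g for g in product(VALUES, repeat=len(PARAMETERS))
            if all(g[i] == target[i] for i in exposed_params)]

def learn(target, n_sentences=200, seed=0):
    rng = random.Random(seed)
    tally = defaultdict(int)  # (parameter index, value) -> access count
    for _ in range(n_sentences):
        # Each input exposes a random nonempty subset of parameters;
        # the rest remain ambiguous for this sentence.
        exposed = rng.sample(range(len(PARAMETERS)),
                             rng.randint(1, len(PARAMETERS)))
        candidates = licensing_grammars(exposed, target)
        chosen = rng.choice(candidates)       # the parser computes one analysis only
        for i, v in enumerate(chosen):
            tally[(i, v)] += 1                # credit the values that analysis accessed
    # The learner's grammar is whichever value of each parameter has the higher tally;
    # no learning mechanism beyond the parser plus this bookkeeping.
    return tuple(max(VALUES, key=lambda v: tally[(i, v)])
                 for i in range(len(PARAMETERS)))

if __name__ == "__main__":
    target = (1, 0, 1)
    print("learned:", learn(target), "target:", target)
```

In this toy setup the learner can be credited with a wrong value whenever the single analysis it computes happens to resolve an ambiguous parameter the "wrong" way, but the correct values are credited every time a sentence exposes them, so they win out over time; how well this scales to realistic grammars is exactly the open question the abstract raises.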
Note: Article. Fodor JD, CUNY Grad Sch & Univ Ctr, Dept Linguist, 33 W 42nd St, New York, NY 10036 USA
Keyword(s): TRIGGERS