COMPUTATIONAL COMPLEXITY AND LEXICAL FUNCTIONAL GRAMMAR

Robert C. Berwick
MIT Artificial Intelligence Laboratory
Cambridge, MA

1. INTRODUCTION

An important goal of modern linguistic theory is to characterize as narrowly as possible the class of natural languages. An adequate linguistic theory should be broad enough to cover observed variation in human languages, and yet narrow enough to account for what might be dubbed "cognitive demands" -- among these, perhaps, the demands of learnability and parsability. If cognitive demands are to carry any real theoretical weight, then presumably a language may be a (theoretically) possible human language, and yet be "inaccessible" because it is not learnable or parsable.

Formal results along these lines have already been obtained for certain kinds of Transformational Generative Grammars: for example, Peters and Ritchie [1] showed that Aspects-style unrestricted transformational grammars can generate any recursively enumerable set, while Rounds [2, 3] extended this work by demonstrating that modestly restricted transformational grammars (TGs) can generate languages whose recognition time is provably exponential. In Rounds' proof, transformations are subject to a terminal-length non-decreasing condition, as suggested by Peters and Myhill. Thus, in the worst case, TGs generate languages whose recognition is widely recognized to be computationally intractable.

Whether this worst-case complexity analysis has any real import for actual linguistic study has been the subject of some debate; for discussion, see Chomsky [4] and Berwick and Weinberg [5]. Without resolving that controversy here, however, one thing can be said: to make TGs efficiently parsable, one might provide additional constraints. For instance, these additional strictures could be roughly of the sort advocated in Marcus' work on parsing [6] -- constraints specifying that TG-based languages must have parsers that meet certain "locality conditions". The Marcus constraints apparently amount to an extension of ...
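To make the phrase "recognition time is provably exponential" concrete, one standard complexity-theoretic reading (a gloss supplied here, not a formula taken from the paper itself) is that a language L generated by such a grammar admits a proven exponential lower bound on recognition:

    \exists\, c > 1 \ \text{such that every recognition procedure for } L \ \text{requires at least } c^{n} \text{ steps on infinitely many inputs of length } n.

In particular, such a bound rules out any polynomial-time parser for L, which is the sense in which these worst-case languages are computationally intractable.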
