tailieunhanh - Scientific report: "Two-Level, Many-Paths Generation"

Two-Level, Many-Paths Generation

Kevin Knight, USC Information Sciences Institute, 4676 Admiralty Way, Marina del Rey, CA 90292, knight@
Vasileios Hatzivassiloglou, Department of Computer Science, Columbia University, New York, NY 10027, vh@

Abstract

Large-scale natural language generation requires the integration of vast amounts of knowledge: lexical, grammatical, and conceptual. A robust generator must be able to operate well even when pieces of knowledge are missing. It must also be robust against incomplete or inaccurate inputs. To attack these problems, we have built a hybrid generator, in which gaps in symbolic knowledge are filled by statistical methods. We describe algorithms and show experimental results. We also discuss how the hybrid generation model can be used to simplify current generators and enhance their portability, even when perfect knowledge is in principle obtainable.

1 Introduction

A large-scale natural language generation (NLG) system for unrestricted text should be able to operate in an environment of 50,000 conceptual terms and 100,000 words or phrases. Turning conceptual expressions into English requires the integration of large knowledge bases (KBs), including grammar, ontology, lexicon, collocations, and mappings between them. The quality of an NLG system depends on the quality of its inputs and knowledge bases. Given that perfect KBs do not yet exist, an important question arises: can we build high-quality NLG systems that are robust against incomplete KBs and inputs? Although robustness has been heavily studied in natural language understanding (Weischedel and Black, 1980; Hayes, 1981; Lavie, 1994), it has received much less attention in NLG (Robin, 1995). We describe a hybrid model for natural language generation which offers improved performance in the presence of knowledge gaps in the generator (the grammar and the lexicon) and of errors in the semantic input. The model comes out of our practical experience in building a large Japanese-English .
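To make the hybrid idea above concrete, the following is a minimal, hypothetical sketch of how a statistical component might choose among many candidate realizations proposed by a symbolic generator when its lexical or grammatical knowledge is incomplete. The toy corpus, add-alpha smoothed bigram model, and candidate sentences are illustrative assumptions only; they are not taken from the system described in this paper, which operates at a much larger scale.

```python
import math
from collections import Counter

# Tiny illustrative corpus standing in for the data a real statistical
# language model would be trained on (all sentences are made up).
corpus = [
    "<s> he eats rice with chopsticks </s>",
    "<s> she eats soup with a spoon </s>",
    "<s> he eats the rice </s>",
]

unigrams = Counter()
bigrams = Counter()
for sent in corpus:
    toks = sent.split()
    unigrams.update(toks)
    bigrams.update(zip(toks, toks[1:]))

def log_prob(sentence, alpha=0.5):
    """Add-alpha smoothed bigram log-probability of one candidate realization."""
    toks = ["<s>"] + sentence.split() + ["</s>"]
    vocab = len(unigrams)
    score = 0.0
    for prev, cur in zip(toks, toks[1:]):
        score += math.log((bigrams[(prev, cur)] + alpha) /
                          (unigrams[prev] + alpha * vocab))
    return score

# Candidate "paths" a symbolic generator might emit when it is unsure
# about, e.g., article choice; the statistical level picks the most fluent.
candidates = [
    "he eats rice with chopsticks",
    "he eats a rice with chopsticks",
    "he eats the rice with the chopsticks",
]
print(max(candidates, key=log_prob))
```

A full-scale version of this idea would score a far larger space of candidates (many paths packed into a shared structure) with a model trained on a large corpus, rather than a flat list of three sentences.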
