Temporal Restricted Boltzmann Machines for Dependency Parsing

Nikhil Garg
Department of Computer Science, University of Geneva, Switzerland

James Henderson
Department of Computer Science, University of Geneva, Switzerland

Abstract

We propose a generative model based on Temporal Restricted Boltzmann Machines for transition-based dependency parsing. The parse tree is built incrementally using a shift-reduce parse, and an RBM is used to model each decision step. The RBM at the current time step induces latent features with the help of temporal connections to the relevant previous steps, which provide context information. Our parser achieves labeled and unlabeled attachment scores of and respectively, which compare well with similar previous models and the state-of-the-art.

1 Introduction

There has been significant interest recently in machine learning methods that induce generative models with high-dimensional hidden representations, including neural networks (Bengio et al., 2003; Collobert and Weston, 2008), Bayesian networks (Titov and Henderson, 2007a), and Deep Belief Networks (Hinton et al., 2006). In this paper, we investigate how these models can be applied to dependency parsing. We focus on the Shift-Reduce transition-based parsing proposed by Nivre et al. (2004). In this class of algorithms, at any given step the parser has to choose among a set of possible actions, each representing an incremental modification to the partially built tree. To assign probabilities to these actions, previous work has proposed memory-based classifiers (Nivre et al., 2004), SVMs (Nivre et al., 2006b), and Incremental Sigmoid Belief Networks (ISBNs) (Titov and Henderson, 2007b). In related earlier work, Ratnaparkhi (1999) proposed a maximum entropy model for transition-based constituency parsing. Of these approaches, only ISBNs induce high-dimensional latent representations to encode the parse history, but they suffer from either very approximate or slow inference.
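The shift-reduce setting described above can be made concrete with a small sketch. The following is a minimal, arc-standard-style transition system in Python; the action inventory, state layout, and greedy control loop are illustrative assumptions for exposition, not the exact system of Nivre et al. (2004) or of this paper.

```python
# Minimal sketch of a shift-reduce (arc-standard-style) transition system.
# The action set and state representation are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class ParserState:
    stack: list = field(default_factory=list)    # indices of partially processed words
    buffer: list = field(default_factory=list)   # indices of remaining input words
    arcs: list = field(default_factory=list)     # (head, label, dependent) triples

def shift(state):
    """Move the next input word onto the stack."""
    state.stack.append(state.buffer.pop(0))

def left_arc(state, label):
    """Attach the second-top stack item as a dependent of the top item."""
    dep = state.stack.pop(-2)
    state.arcs.append((state.stack[-1], label, dep))

def right_arc(state, label):
    """Attach the top stack item as a dependent of the second-top item."""
    dep = state.stack.pop()
    state.arcs.append((state.stack[-1], label, dep))

def parse(words, choose_action):
    """Greedy parse: at each step a scoring model picks one action.
    Assumes choose_action only returns actions valid in the current state."""
    state = ParserState(buffer=list(range(len(words))))
    while state.buffer or len(state.stack) > 1:
        action, label = choose_action(state)     # e.g. a TRBM-based scorer
        if action == "shift":
            shift(state)
        elif action == "left_arc":
            left_arc(state, label)
        else:
            right_arc(state, label)
    return state.arcs
```

Each call to choose_action corresponds to one decision step; the quality of the parser then rests entirely on how those per-step action probabilities are modeled.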
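To illustrate how an RBM with temporal connections could assign probabilities to such actions, here is a small numpy sketch: the hidden biases at the current step are shifted by the previous step's hidden activations through directed temporal weights, and each candidate action (encoded as a one-hot visible vector) is scored by its negative free energy. The dimensions, parameter names, and single-previous-step conditioning are assumptions for illustration only, not the paper's exact model.

```python
# Sketch: conditioning an RBM's hidden layer on a previous hidden state via
# directed temporal connections, and scoring candidate actions by free energy.
# All sizes and parameter choices here are toy, illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n_actions, n_hidden = 4, 8

W  = rng.normal(scale=0.1, size=(n_actions, n_hidden))   # visible-hidden weights
b  = np.zeros(n_actions)                                  # visible (action) biases
c  = np.zeros(n_hidden)                                   # hidden biases
Wt = rng.normal(scale=0.1, size=(n_hidden, n_hidden))     # temporal hidden-hidden weights

def action_probabilities(h_prev):
    """Score each candidate action with the RBM's negative free energy,
    with hidden biases shifted by the previous step's hidden activations."""
    c_t = c + Wt @ h_prev                          # dynamic hidden bias
    scores = np.empty(n_actions)
    for a in range(n_actions):
        v = np.zeros(n_actions)
        v[a] = 1.0                                 # one-hot encoding of the action
        # -F(v) = b.v + sum_j softplus(c_t_j + (W^T v)_j)
        scores[a] = b @ v + np.sum(np.logaddexp(0.0, c_t + W.T @ v))
    p = np.exp(scores - scores.max())              # softmax over actions
    return p / p.sum()

def hidden_activation(v, h_prev):
    """Mean-field hidden activations, carried forward as context."""
    return 1.0 / (1.0 + np.exp(-(c + Wt @ h_prev + W.T @ v)))

# One toy step: score actions given an all-zero previous hidden state.
h0 = np.zeros(n_hidden)
probs = action_probabilities(h0)
```

Normalizing the per-action scores with a softmax gives a distribution over the next parser action, and chaining hidden_activation across steps is one simple way the latent layer can carry context from relevant previous decisions.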
