Scientific paper: "Word Vectors and Two Kinds of Similarity"

Akira Utsumi and Daisuke Suzuki
Department of Systems Engineering
The University of Electro-Communications
1-5-1 Chofugaoka, Chofu-shi, Tokyo 182-8585, Japan
utsumi@ dajie@

Abstract

This paper examines what kind of similarity between words can be represented by what kind of word vectors in the vector space model. Through two experiments, three methods for constructing word vectors, i.e., LSA-based, cooccurrence-based, and dictionary-based methods, were compared in terms of their ability to represent two kinds of similarity, i.e., taxonomic similarity and associative similarity. The result of the comparison was that the dictionary-based word vectors better reflect taxonomic similarity, while the LSA-based and the cooccurrence-based word vectors better reflect associative similarity.

1 Introduction

Recently, geometric models have been used to represent words and their meanings, and they have proven highly useful both for many NLP applications associated with semantic processing (Widdows, 2004) and for human modeling in cognitive science (Gärdenfors, 2000; Landauer and Dumais, 1997). There are also good reasons for studying geometric models in the field of computational linguistics. First, geometric models are cost-effective, in that it takes much less time and effort to construct a large-scale geometric representation of word meanings than to construct dictionaries or thesauri. Second, they can represent implicit knowledge of word meanings that dictionaries and thesauri cannot. Finally, geometric representations are easy to revise and extend.

A vector space model is the most commonly used geometric model for the meanings of words. The basic idea of a vector space model is that words are represented by high-dimensional vectors, i.e., word vectors, and that the degree of semantic similarity between any two words can be computed as the cosine of the angle formed by their vectors. A number of methods have been ...
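To make the cosine measure concrete: for word vectors v1 and v2, similarity is (v1 · v2) / (||v1|| ||v2||), which is 1 when the vectors point in the same direction and 0 when they are orthogonal. The sketch below is illustrative only, not the authors' implementation; the toy corpus, the window size, and the function names are all invented here. Assuming Python with NumPy, it builds cooccurrence-based word vectors of the kind the abstract mentions and compares two words with the cosine measure:

    import numpy as np

    def cooccurrence_vectors(corpus, window=2):
        """Build cooccurrence-based word vectors: each word is represented
        by counts of its neighbors within +/- `window` positions."""
        vocab = sorted({w for sentence in corpus for w in sentence})
        index = {w: i for i, w in enumerate(vocab)}
        counts = np.zeros((len(vocab), len(vocab)))
        for sentence in corpus:
            for i, w in enumerate(sentence):
                lo, hi = max(0, i - window), min(len(sentence), i + window + 1)
                for j in range(lo, hi):
                    if j != i:
                        counts[index[w], index[sentence[j]]] += 1
        return vocab, index, counts

    def cosine(v1, v2):
        """Cosine of the angle between two word vectors."""
        denom = np.linalg.norm(v1) * np.linalg.norm(v2)
        return float(v1 @ v2 / denom) if denom else 0.0

    # Toy corpus (tokenized sentences), purely for illustration.
    corpus = [
        "the doctor examined the patient".split(),
        "the nurse helped the doctor".split(),
        "the patient thanked the nurse".split(),
    ]
    vocab, index, M = cooccurrence_vectors(corpus)
    print(cosine(M[index["doctor"]], M[index["nurse"]]))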
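The LSA-based method named in the abstract can be sketched under the same illustrative assumptions by applying a truncated singular value decomposition and keeping the top k dimensions as the word vectors. Strictly, LSA applies SVD to a word-by-document matrix; the same mechanics are shown here on the cooccurrence matrix M from the previous sketch for brevity:

    # LSA-style sketch: reduce the cooccurrence matrix M to k dimensions
    # via truncated SVD; rows of `lsa` are k-dimensional word vectors.
    U, S, Vt = np.linalg.svd(M, full_matrices=False)
    k = 2                      # illustrative dimensionality, chosen arbitrarily
    lsa = U[:, :k] * S[:k]     # scale left singular vectors by singular values
    print(cosine(lsa[index["doctor"]], lsa[index["nurse"]]))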
