Ebook Fundamentals of Neural Networks: Part 2

An exceptionally clear, thorough introduction to neural networks written at an elementary level. Written with the beginning student in mind, the text features systematic discussions of all major neural networks and fortifies the reader's understanding with many examples.

CHAPTER 5
Adaptive Resonance Theory

INTRODUCTION

Adaptive resonance theory (ART) was developed by Carpenter and Grossberg (1987a). One form, ART1, is designed for clustering binary vectors; another, ART2 (Carpenter & Grossberg, 1987b), accepts continuous-valued vectors. These nets cluster inputs by using unsupervised learning. Input patterns may be presented in any order. Each time a pattern is presented, an appropriate cluster unit is chosen, and that cluster's weights are adjusted to let the cluster unit learn the pattern. As is often the case in clustering nets, the weights on a cluster unit may be considered to be an exemplar (or code vector) for the patterns placed on that cluster.

Motivation

Adaptive resonance theory nets are designed to allow the user to control the degree of similarity of patterns placed on the same cluster. However, since input patterns may differ in their level of detail (the number of components that are nonzero), the relative similarity of an input pattern to the weight vector for a cluster unit, rather than the absolute difference between the vectors, is used. (A difference in one component is more significant in patterns that have very few nonzero components than it is in patterns with many nonzero components.)

As the net is trained, each training pattern may be presented several times. A pattern may be placed on one cluster unit the first time it is presented and then placed on a different cluster when it is presented later, due to changes in the weights for the first cluster if it has learned other patterns in the meantime. A stable net will not return a pattern to a previous cluster; in other words, a pattern oscillating among different cluster units at different stages of training indicates an unstable net. Some nets achieve stability by gradually reducing the learning rate as the same set of training patterns is presented many times. However, this does not allow the net to learn readily a new pattern that is presented for the first time after a number of training epochs have occurred.
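The relative-similarity criterion mentioned under Motivation can be made concrete with a minimal sketch. Assuming binary input and weight vectors (as in ART1), the quantity compared against a user-chosen vigilance threshold is the fraction of the input's nonzero components that the cluster's weight vector shares. The names match_ratio and rho below are illustrative, not taken from the text; the example simply shows why a single mismatched component matters far more in a sparse pattern than in a dense one.

    import numpy as np

    def match_ratio(x, w):
        """Relative similarity of binary input x to binary weight vector w:
        the fraction of x's nonzero components that w also has."""
        x = np.asarray(x, dtype=bool)
        w = np.asarray(w, dtype=bool)
        return (x & w).sum() / max(x.sum(), 1)

    # Vigilance parameter: minimum relative similarity required to accept
    # a cluster (value chosen only for illustration).
    rho = 0.8

    # Sparse pattern: one mismatched component is highly significant ...
    x_sparse = [1, 1, 0, 0, 0, 0, 0, 0]
    w_sparse = [1, 0, 0, 0, 0, 0, 0, 0]
    print(match_ratio(x_sparse, w_sparse))   # 0.5   -> fails the rho test

    # ... while in a dense pattern the same single mismatch barely matters.
    x_dense = [1, 1, 1, 1, 1, 1, 1, 1]
    w_dense = [1, 0, 1, 1, 1, 1, 1, 1]
    print(match_ratio(x_dense, w_dense))     # 0.875 -> passes the rho test

An absolute measure (such as the count of differing components) would treat both cases identically; the ratio instead reflects how much of the input pattern the cluster actually accounts for, which is the behavior the chapter motivates.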
