Lecture Machine learning (2014-2015) - Lecture 4: Nonlinear ridge regression risk, regularization, and cross-validation
This lecture will teach you how to fit nonlinear functions by using basis functions and how to control model complexity. The goal is for you to: learn how to derive ridge regression; understand the trade-off between fitting the data and regularizing it; learn polynomial regression; understand that, if the basis functions are given, the problem of learning the parameters is still linear; learn cross-validation; understand model complexity and generalization.

Nonlinear ridge regression: risk, regularization and cross-validation
Nando de Freitas, University of Oxford

Regularization
All the answers so far are of the form
\theta = (X^T X)^{-1} X^T y
They require the inversion of X^T X. This can lead to problems if the system of equations is poorly conditioned. A solution is to add a small element to the diagonal:
\theta_{ridge} = (X^T X + \delta^2 I_d)^{-1} X^T y
This is the ridge regression estimate. It is the solution to the following regularised quadratic cost function:
J(\theta) = (y - X\theta)^T (y - X\theta) + \delta^2 \theta^T \theta
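To make the closed-form estimate above concrete, here is a minimal NumPy sketch of ridge regression on a polynomial basis expansion, i.e. \theta = (\Phi^T \Phi + \delta^2 I)^{-1} \Phi^T y. The function names, the synthetic sine data, the polynomial degree, and the value delta = 0.1 are assumptions made for illustration; they are not taken from the slides.

```python
import numpy as np

def poly_basis(x, degree):
    # Basis expansion: columns 1, x, x^2, ..., x^degree.
    # The model is nonlinear in x but still linear in the parameters theta.
    return np.vander(x, degree + 1, increasing=True)

def ridge_fit(Phi, y, delta):
    # Closed-form ridge estimate: theta = (Phi^T Phi + delta^2 I)^{-1} Phi^T y.
    d = Phi.shape[1]
    A = Phi.T @ Phi + (delta ** 2) * np.eye(d)
    # Solving the linear system is numerically preferable to explicit inversion.
    return np.linalg.solve(A, Phi.T @ y)

# Illustrative usage on synthetic data (assumed example, not from the lecture).
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 30)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.shape)

Phi = poly_basis(x, degree=9)
theta = ridge_fit(Phi, y, delta=0.1)
y_hat = Phi @ theta  # fitted values; delta controls the fit/regularization trade-off
```

A larger delta shrinks the coefficients toward zero and smooths the fitted curve; choosing delta is the kind of model-selection problem that the cross-validation material mentioned in the outline addresses.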