Lecture Machine learning (2014-2015) - Lecture 5: Optimisation

Many machine learning problems can be cast as optimization problems. This lecture introduces optimization (University of Oxford, lecturer: Nando de Freitas). The objective is for you to learn:

- The definitions of gradient and Hessian.
- The gradient descent algorithm.
- Newton's algorithm.
- Stochastic gradient descent (SGD) for online learning.
- Popular variants, such as AdaGrad and Asynchronous SGD.
- Improvements, such as momentum and Polyak averaging.
- How to apply all these algorithms to linear regression.

Calculus background: partial derivatives and the gradient.
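To make the first two algorithms concrete, here is a minimal sketch (not taken from the slides; variable names and step sizes are my own) of gradient descent and Newton's method applied to least-squares linear regression, where the gradient is X^T(X theta - y) and the Hessian is the constant matrix X^T X:

```python
import numpy as np

# Least-squares objective: J(theta) = 0.5 * ||X @ theta - y||^2
rng = np.random.default_rng(0)
X = np.c_[np.ones(50), rng.normal(size=50)]  # design matrix with a bias column
true_theta = np.array([2.0, -3.0])
y = X @ true_theta

def gradient(theta):
    return X.T @ (X @ theta - y)  # gradient of J at theta

# Gradient descent: theta <- theta - eta * grad
theta = np.zeros(2)
eta = 0.01  # illustrative step size
for _ in range(2000):
    theta -= eta * gradient(theta)

# Newton's method: theta <- theta - H^{-1} grad.
# Because J is quadratic, a single Newton step lands on the minimiser.
H = X.T @ X  # Hessian of J (constant here)
theta_newton = np.zeros(2) - np.linalg.solve(H, gradient(np.zeros(2)))

print(theta, theta_newton)
```

Note the trade-off the lecture highlights: gradient descent needs many cheap iterations and a tuned step size, while Newton's method uses the Hessian to converge in one step on this quadratic objective but must solve a linear system.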
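For the online-learning part of the outline, the following is a hedged sketch (my own illustrative code, not the lecture's exact pseudocode) of SGD on streaming linear-regression examples, combined with a momentum term and Polyak (iterate) averaging after a burn-in period:

```python
import numpy as np

# Online linear regression with SGD: update from one (x, y) pair at a time.
rng = np.random.default_rng(1)
true_theta = np.array([2.0, -3.0])

theta = np.zeros(2)
velocity = np.zeros(2)
theta_avg = np.zeros(2)
eta, beta = 0.01, 0.9  # illustrative step size and momentum coefficient
burn_in = 1000         # start averaging once iterates have roughly settled

for t in range(1, 5001):
    x = np.array([1.0, rng.normal()])          # one streaming example
    y = x @ true_theta + 0.01 * rng.normal()   # noisy target
    grad = (x @ theta - y) * x                 # gradient of 0.5*(x@theta - y)^2
    velocity = beta * velocity - eta * grad    # momentum smooths the noisy steps
    theta += velocity
    if t > burn_in:                            # Polyak averaging of the iterates
        theta_avg += (theta - theta_avg) / (t - burn_in)

print(theta_avg)
```

The averaged iterate is typically a better estimate than the last raw iterate, since averaging cancels much of the gradient noise that keeps plain SGD bouncing around the minimiser.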
