ADVANCES IN THEORY AND APPLICATIONS OF STEREO VISION - P2

9  Address-Event based Stereo Vision with Bio-inspired Silicon Retina Imagers

Jürgen Kogler¹, Christoph Sulzbachner¹, Martin Humenberger¹ and Florian Eibensteiner²
¹ AIT Austrian Institute of Technology
² Upper Austria University of Applied Sciences
Austria

1. Introduction

Several industrial, home, and automotive applications need 3D, or at least range, data of the observed environment in order to operate. Such applications are, e.g., driver assistance systems, home care systems, or 3D sensing and measurement for industrial production. State-of-the-art range sensors are laser range finders or laser scanners (LIDAR, light detection and ranging), time-of-flight (TOF) cameras, and ultrasonic sound sensors. All of them are embedded, which means that the sensors operate independently and have an integrated processing unit. This is advantageous because the processing power in the mentioned applications is limited and the sensors are computationally intensive anyway. Other benefits of embedded systems are low power consumption and a small form factor. Furthermore, embedded systems are fully customizable by the developer and can be adapted to the specific application in an optimal way.

A promising alternative to the mentioned sensors is stereo vision. Classic stereo vision uses a stereo camera setup built up of two cameras (the stereo camera head), mounted in parallel and separated by the baseline. It captures a synchronized stereo pair consisting of the left camera's image and the right camera's image. The main challenge of stereo vision is the reconstruction of 3D information of a scene captured from two different points of view. Each visible scene point is projected onto the image planes of the cameras. Pixels which represent the same scene point on different image planes correspond to each other. These correspondences can then be used to determine the three-dimensional position of the projected scene point in a defined coordinate system. In more detail, the horizontal displacement, called the disparity, is ...
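To make the correspondence and disparity idea concrete, the following C++ sketch (not taken from the chapter; the window size, calibration values, and scanlines are illustrative assumptions) searches a rectified scanline for the best sum-of-absolute-differences match and converts the resulting disparity into depth using the parallel-camera relation Z = f · b / d. Because the images are assumed to be rectified, corresponding pixels share a row, so the search is one-dimensional.

#include <cstdint>
#include <cstdio>
#include <cstdlib>
#include <limits>
#include <vector>

// Find the disparity of the pixel at column xLeft by searching the same
// scanline of the right image with a sum-of-absolute-differences (SAD) window.
// Assumes the image pair is rectified, so corresponding pixels share a row.
int matchDisparity(const std::vector<uint8_t>& leftRow,
                   const std::vector<uint8_t>& rightRow,
                   int xLeft, int halfWindow, int maxDisparity)
{
    int bestDisparity = 0;
    long bestCost = std::numeric_limits<long>::max();
    for (int d = 0; d <= maxDisparity; ++d) {
        if (xLeft - d - halfWindow < 0)
            break;                                  // window would leave the image
        long cost = 0;
        for (int k = -halfWindow; k <= halfWindow; ++k)
            cost += std::abs(int(leftRow[xLeft + k]) - int(rightRow[xLeft - d + k]));
        if (cost < bestCost) { bestCost = cost; bestDisparity = d; }
    }
    return bestDisparity;
}

// Parallel stereo triangulation: depth Z = f * b / d, with the focal length f
// in pixels, the baseline b in metres, and the disparity d in pixels.
double depthFromDisparity(double focalPx, double baselineM, int disparityPx)
{
    return disparityPx > 0 ? focalPx * baselineM / disparityPx : -1.0;
}

int main()
{
    // Toy scanlines: the bright "scene point" at column 8 of the left row
    // appears shifted 3 pixels to the left in the right row (disparity = 3).
    std::vector<uint8_t> leftRow  = {10,10,10,10,10,10,10,90,200,90,10,10,10,10,10,10};
    std::vector<uint8_t> rightRow = {10,10,10,10,90,200,90,10,10,10,10,10,10,10,10,10};

    int d = matchDisparity(leftRow, rightRow, 8, 1, 6);
    // Hypothetical calibration: f = 500 px, baseline = 0.12 m.
    std::printf("disparity = %d px, depth = %.2f m\n",
                d, depthFromDisparity(500.0, 0.12, d));
    return 0;
}

The sketch reflects the geometry described above: the larger the horizontal displacement between the two projections of a scene point, the closer that point is to the camera head, which is why the depth is inversely proportional to the disparity.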
