Displaying Stereo Images
Humans have long been fascinated with seeing images and movies in "3D". Let’s look at how human stereo vision works and some of the technologies used to present images to our eyes in "3D".
One very powerful trick used by humans is binocular vision. The images from each eye are quite similar, but there is a small horizontal shift, a disparity, between them and that shift is a function of the object distance.
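For the idealized case of two identical, parallel cameras with baseline b and focal length f, a point at distance Z appears with disparity d = f*b / Z, so the disparity is inversely proportional to distance: halving the distance to the point doubles the shift between the two views.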
Given two images of a scene taken from slightly different viewpoints, a stereo image pair, it’s possible to determine the disparity for every pixel using template matching. The disparity image is one where the value of each pixel is inversely related to the distance between that point in the scene and the camera.
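The Toolbox provides functions that do this efficiently; purely as an illustration of the idea, a brute-force sum-of-squared-differences matcher in plain MATLAB might look like the sketch below (the function name and parameters are made up for the example).

    % Plain-MATLAB sketch of SSD block matching for a rectified stereo pair.
    %   L, R    rectified left and right greyscale images (double, same size)
    %   half    half-width of the square template
    %   drange  maximum disparity searched, in pixels
    function D = ssd_disparity(L, R, half, drange)
        [rows, cols] = size(L);
        D = zeros(rows, cols);
        for v = 1+half : rows-half
            for u = 1+half+drange : cols-half
                T = L(v-half:v+half, u-half:u+half);   % template from the left image
                best = inf;
                for d = 0:drange                       % search along the same row
                    C = R(v-half:v+half, u-d-half:u-d+half);
                    s = sum((T(:) - C(:)).^2);         % SSD similarity measure
                    if s < best
                        best = s;
                        D(v, u) = d;                   % keep the best-matching shift
                    end
                end
            end
        end
    end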
Let’s recap the important points from the topics we have covered about human depth perception, display of 3D images and estimating 3D scene structure using stereo and other types of sensors.
The relationship between world coordinates, image coordinates and camera spatial velocity has some interesting ramifications. Some very different camera motions cause identical motion of points in the image, and some camera motions lead to no change at all in some parts of the image. Let’s explore these phenomena and how we […]
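For example, a slow sideways translation and a slow pan can produce almost the same motion of a distant image point. A rough numerical illustration in plain MATLAB, using the image Jacobian written out later on this page (the point position, depth and velocities are invented for the example):

    % Two quite different camera motions, nearly identical image motion.
    jac = @(x, y, Z) [-1/Z, 0, x/Z, x*y, -(1 + x^2), y;
                       0, -1/Z, y/Z, 1 + y^2, -x*y, -x];
    J = jac(0.05, 0.02, 20);                % a point near the optical axis, 20 m away
    flow_sideways = J * [1 0 0 0 0 0]';     % translate along x at 1 m/s
    flow_panning  = J * [0 0 0 0 0.05 0]';  % rotate about y at 0.05 rad/s
    disp([flow_sideways flow_panning])      % the two columns are almost the same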
An image is a two-dimensional projection of a three-dimensional world. The big problem with this projection is that big distant objects appear the same size as small close objects. For people, and robots, it’s important to distinguish these different situations. Let’s look at how humans and robots can determine the scale of objects […]
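Concretely, with the pinhole model an object of width W at distance Z projects to an image width of roughly w = f*W / Z, so a 1 m object at 10 m and a 2 m object at 20 m produce exactly the same image width; the image alone cannot tell them apart.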
We previously learnt how to derive a Jacobian which relates the velocity of a point, defined relative to one coordinate frame, to the velocity relative to a different coordinate frame. Now we extend that to the 3D case.
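One simple case is worth keeping in mind: if the two frames differ by a constant pose with rotation matrix R, then differentiating p_A = R*p_B + t gives v_A = R*v_B, i.e. the point’s velocity simply rotates between frames and the fixed translation drops out. When the frames themselves are moving relative to one another, extra terms appear.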
We use MATLAB and some Toolbox functions to create a robot controller that moves a camera so the image matches what we want it to look like. We call this an image-based visual servoing system.
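As a minimal sketch of the idea (plain MATLAB rather than the Toolbox functions used in the lesson; the point, gain and time step are invented for illustration), a single image point can be driven to a desired location with the proportional law v = -lambda * pinv(J) * e:

    % Minimal image-based visual servoing loop for one point.
    P = [0.2; -0.1; 2];                 % world point, expressed in the camera frame (m)
    pstar = [0; 0];                     % desired normalized image coordinates
    lambda = 1; dt = 0.1;               % control gain and time step
    for k = 1:100
        Z = P(3);
        p = P(1:2) / Z;                 % perspective projection (normalized coordinates)
        x = p(1); y = p(2);
        J = [-1/Z, 0, x/Z, x*y, -(1 + x^2), y;
              0, -1/Z, y/Z, 1 + y^2, -x*y, -x];    % image Jacobian for this point
        v = -lambda * pinv(J) * (p - pstar);       % camera velocity (vx vy vz wx wy wz)
        P = P - (v(1:3) + cross(v(4:6), P)) * dt;  % static point seen from the moving camera
    end
    disp(p)                             % ends up very close to pstar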
A critical part of a visual servoing system is establishing correspondence between points in the scene observed by the camera, and points in our desired image of the scene.
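A toy illustration of the idea in plain MATLAB: if the two point sets are already roughly aligned, each observed point can simply be paired with the nearest desired point (real systems match feature descriptors rather than raw coordinates; the numbers here are invented):

    % Nearest-neighbour pairing of observed and desired image points.
    obs = [10 12; 50 48; 30 80];            % observed points, one (u, v) per row
    des = [31 79; 11 13; 49 47];            % desired points, in arbitrary order
    match = zeros(1, size(obs, 1));
    for i = 1:size(obs, 1)
        d2 = sum((des - obs(i, :)).^2, 2);  % squared distance to every candidate
        [~, match(i)] = min(d2);            % index of the closest desired point
    end
    disp(match)                             % expected pairing: 2 3 1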
The relationship between world coordinates, image coordinates and camera spatial velocity is elegantly summed up by a single matrix equation that involves what we call the image Jacobian.
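For a single point with normalized image coordinates p = (x, y) at depth Z, the standard form of that equation is (sign conventions vary between texts, and the pixel-coordinate version also involves the focal length and principal point):

    pdot = Jp(p, Z) * nu,   nu = (vx, vy, vz, wx, wy, wz)

    Jp = [ -1/Z    0      x/Z    x*y       -(1 + x^2)   y
            0     -1/Z    y/Z    1 + y^2   -x*y        -x ]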