Image-Based Visual Servoing
lesson
We use MATLAB and some Toolbox functions to create a robot controller that moves a camera so that the image it sees matches the image we want to see. We call this an image-based visual servoing system.
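As a rough sketch of the idea (in plain MATLAB rather than the Toolbox functions the lesson uses, and with made-up point coordinates, depth and gain), one iteration of such a controller might look like this:

```matlab
% One iteration of a simple image-based visual servoing law in plain MATLAB.
% The point data, depth and gain below are illustrative assumptions only.
p     = [ 0.10 -0.10 -0.10  0.10;      % observed points (normalized image plane), 2xN
          0.10  0.10 -0.10 -0.10 ];
pstar = [ 0.15 -0.15 -0.15  0.15;      % desired points, 2xN
          0.15  0.15 -0.15 -0.15 ];
Z      = 2;                            % assumed depth of the points (m)
lambda = 0.1;                          % proportional gain

e = pstar(:) - p(:);                   % stacked feature error, 2Nx1

J = [];                                % stacked image Jacobian, 2Nx6
for i = 1:size(p, 2)
    x = p(1, i);  y = p(2, i);
    J = [J; -1/Z,    0,  x/Z,  x*y,    -(1+x^2),  y;
             0,   -1/Z,  y/Z,  1+y^2,   -x*y,    -x ];
end

v = lambda * pinv(J) * e               % commanded camera velocity [vx vy vz wx wy wz]'
```

The commanded velocity moves the camera so that the observed points drift toward the desired points; repeating this at every frame closes the loop.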
lesson
A critical part of a visual servoing system is establishing correspondence between points in the scene observed by the camera, and points in our desired image of the scene.
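As a deliberately simple illustration of what correspondence means, the sketch below (with made-up point data) pairs each desired point with its nearest observed point; practical systems usually rely on feature descriptors or on tracking points from frame to frame.

```matlab
% Naive nearest-neighbour association between observed and desired points
% (hypothetical data, for illustration only).
pObs  = [ 0.12 -0.09 -0.11  0.08;      % observed points, 2xN
          0.11  0.09 -0.12 -0.10 ];
pStar = [ 0.10 -0.10 -0.10  0.10;      % desired points, 2xN
          0.10  0.10 -0.10 -0.10 ];

match = zeros(1, size(pStar, 2));
for i = 1:size(pStar, 2)
    d = vecnorm(pObs - pStar(:, i));   % distance from each observed point
    [~, match(i)] = min(d);            % index of the closest observed point
end
match                                  % correspondence: desired i <-> observed match(i)
```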
lesson
The relationship between world coordinates, image coordinates and camera spatial velocity is elegantly summed up by a single matrix equation that involves what we call the image Jacobian.
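For a single point feature with normalized image-plane coordinates (x, y) at depth Z, one common form of that equation is:

```latex
\begin{bmatrix} \dot{x} \\ \dot{y} \end{bmatrix}
=
\underbrace{\begin{bmatrix}
-\frac{1}{Z} & 0 & \frac{x}{Z} & x y & -(1 + x^2) & y \\
0 & -\frac{1}{Z} & \frac{y}{Z} & 1 + y^2 & -x y & -x
\end{bmatrix}}_{\mathbf{J}_p(x,\,y,\,Z)}
\begin{bmatrix} v_x \\ v_y \\ v_z \\ \omega_x \\ \omega_y \\ \omega_z \end{bmatrix}
```

The 2×6 matrix is the image Jacobian for that point: it maps the camera's spatial velocity to the velocity of the point on the image plane.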
lesson
Let’s recap the important points from the topics we have covered about homogeneous coordinates, image formation, camera modeling and planar homographies.
lesson
Let’s learn how to import a color image into MATLAB and see how the data is organized as a matrix with three dimensions.
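For example, using one of the demo images that ships with MATLAB:

```matlab
% Read a color image and look at how it is stored.
img = imread('peppers.png');    % a demo image included with MATLAB
size(img)                       % rows x columns x 3 (the R, G and B planes)
class(img)                      % uint8: each element is an 8-bit value, 0..255

red = img(:, :, 1);             % the first plane is the red channel
```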
lesson
It is common to think of an assembly task as being specified in terms of coordinates in the 3D world. An alternative is to specify the task in terms of the relative position of objects in one or more views of the scene; this is the idea behind visual servoing.
lesson
We can describe the relationship between a 3D world point and a 2D image plane point, both expressed in homogeneous coordinates, using a linear transformation – a 3×4 matrix. Then we can extend this to account for an image plane which is a regular grid of discrete pixels.
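A minimal sketch of this in plain MATLAB, with made-up intrinsic parameters and camera pose, might look like this:

```matlab
% Build a 3x4 camera matrix and project a world point to pixel coordinates.
% Focal length, pixel size, principal point and pose are illustrative assumptions.
f   = 8e-3;                    % focal length (m)
rho = 10e-6;                   % pixel size (m)
u0  = 500;  v0 = 500;          % principal point (pixels)

K = [ f/rho,   0,    u0;
        0,   f/rho,  v0;
        0,     0,     1 ];     % intrinsic parameters: maps onto the pixel grid

R = eye(3);  t = [0; 0; 0];    % camera pose: world frame coincident with camera frame
C = K * [R t];                 % the 3x4 camera matrix

P  = [0.2; 0.3; 5; 1];         % world point in homogeneous form
ph = C * P;                    % homogeneous image-plane point
p  = ph(1:2) / ph(3)           % divide through: pixel coordinates (u, v)
```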
lesson
How is an image formed? The real world has three dimensions but an image has only two. We can use linear algebra and homogeneous coordinates to understand what’s going on. This more general approach allows us to model the positions of pixels in the sensor array and to derive relationships between points on the image […]
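At its heart is the perspective projection, written as a linear operation on homogeneous coordinates for a camera with focal length f:

```latex
\lambda \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
=
\begin{bmatrix}
f & 0 & 0 & 0 \\
0 & f & 0 & 0 \\
0 & 0 & 1 & 0
\end{bmatrix}
\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}
\quad\Longrightarrow\quad
x = f\,\frac{X}{Z}, \qquad y = f\,\frac{Y}{Z}
```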
lesson
The image Jacobian depends not only on the image plane coordinates but also on the distance from the camera to the points of interest. If this distance is not known, what can we do? Let’s look at how we can determine this distance, and how the optical flow equation can be rearranged to convert from observed […]
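As a sketch of one possible approach (not necessarily the one the lesson develops, and with illustrative numbers): if the camera's own velocity is known, the flow equation can be rearranged so that the only unknown for a point is its inverse depth, which a least-squares solution then recovers.

```matlab
% Recover the depth of a point from its optical flow, given the camera velocity.
% All values here are illustrative assumptions.
x = 0.10;  y = 0.05;                  % point, normalized image-plane coordinates
v = [0.10; 0; 0];                     % known translational velocity (m/s)
w = [0; 0; 0.05];                     % known angular velocity (rad/s)

Jt = [ -1,  0, x;                     % translational part of the Jacobian (scaled by 1/Z)
        0, -1, y ];
Jw = [ x*y,    -(1+x^2),  y;          % rotational part (independent of depth)
       1+y^2,    -x*y,   -x ];

Ztrue = 3;                            % depth used here only to simulate a measurement
pdot  = (1/Ztrue) * Jt * v + Jw * w;  % "measured" optical flow

invZ = (Jt * v) \ (pdot - Jw * w);    % least-squares solution for 1/Z
Zest = 1 / invZ                       % recovers Ztrue
```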
lesson
When a camera moves in the world, points in the image move in a very specific way. The image plane or pixel velocity is a function of the camera’s motion and the position of the points in the world. This is known as optical flow. Let’s explore the link between camera and image motion.
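A quick numerical illustration, with made-up values: for a point on the optical axis, moving the camera forward produces no flow at that point, while moving it sideways does.

```matlab
% Optical flow at a point on the optical axis (x = y = 0) at depth Z,
% for two different camera motions (illustrative values).
Z = 2;  x = 0;  y = 0;
L = [ -1/Z,   0,  x/Z,  x*y,    -(1+x^2),  y;
        0, -1/Z,  y/Z,  1+y^2,   -x*y,    -x ];

flow_forward  = L * [0; 0; 0.1; 0; 0; 0]    % move along the optical axis: zero flow
flow_sideways = L * [0.1; 0; 0; 0; 0; 0]    % move sideways: the point appears to move
```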