We previously introduced the Gaussian kernel. It is very frequently used in image processing because smoothing an image reduces the noise within it, and noise is particularly significant when we are computing the derivative of an image.

Taking a derivative tends to amplify any noise that is within the image, so it is important to reduce the noise in advance, and we can do that by smoothing with a Gaussian. The operation of computing an image derivative can be expressed as a convolution between a derivative kernel D and the input image I. We denote the result as ∇I; the upside-down triangle (nabla) in front of the I represents the derivative of the image I.
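To make this concrete, here is a minimal Python sketch (the lesson itself uses MATLAB) of computing a horizontal derivative as a convolution. The test image and the simple central-difference kernel are illustrative assumptions, not from the lesson:

```python
import numpy as np
from scipy.ndimage import convolve

# Hypothetical test image: a vertical step edge, dark on the left,
# bright on the right.
I = np.zeros((5, 5))
I[:, 3:] = 1.0

# A simple central-difference derivative kernel D for the horizontal
# direction. Convolution flips the kernel, so this computes
# (I[j+1] - I[j-1]) / 2 at each pixel.
D = np.array([[0.5, 0.0, -0.5]])

# The horizontal derivative of the image, one component of ∇I.
Iu = convolve(I, D, mode='nearest')
```

The derivative response is non-zero only near the step edge, which is why derivative kernels are the basis of edge detection.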

And we want to smooth the image before we apply the derivative to it, so we represent it like this: we convolve a Gaussian kernel of standard deviation σ with the image I, and then convolve the result with the derivative kernel.

Using the rule of associativity we can rewrite this so that the derivative kernel is applied to the Gaussian first. Those two together we refer to as a derivative of Gaussian (DoG) kernel, so convolving the DoG kernel with the image performs both smoothing and gradient computation. This is what the DoG kernel looks like, and this is its analytical form. We can compute the DoG kernel using the MATLAB Toolbox function kdgauss, where the first argument is σ and the second argument is the half width of the kernel.
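As a hedged Python sketch of what kdgauss computes, the kernel below evaluates the analytic derivative of the 2D Gaussian with respect to u on a grid. The default half width of 3σ is an assumption for this sketch, not necessarily the Toolbox's choice:

```python
import numpy as np

def kdgauss_sketch(sigma, w=None):
    """Derivative of Gaussian (DoG) kernel in the horizontal (u) direction.
    A Python sketch of the Toolbox function kdgauss; the default half
    width of 3*sigma is an assumption."""
    if w is None:
        w = int(np.ceil(3 * sigma))
    u, v = np.meshgrid(np.arange(-w, w + 1), np.arange(-w, w + 1))
    # Analytic derivative of the 2D Gaussian with respect to u.
    return -u / (2 * np.pi * sigma**4) * np.exp(-(u**2 + v**2) / (2 * sigma**2))

K = kdgauss_sketch(2)   # sigma = 2, half width 6, so a 13 x 13 kernel
```

Because the kernel is an odd function of u, its elements sum to zero: a DoG kernel gives no response on a region of constant brightness. Convolving K with an image performs smoothing and horizontal gradient computation in a single pass.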

Now sometimes it is useful to compute the second derivative, and the Laplacian is an isotropic second derivative operator: it tells us where the gradient has a maximum value, regardless of whether that is in the U or the V direction. The notation I_UU indicates the second derivative in the U direction, and I_VV the second derivative in the V direction. We can compute the second derivative by convolving the image with the Laplacian kernel, which is simply the symmetric 3 × 3 matrix shown here. In the MATLAB Toolbox we can obtain this kernel using the function klaplace.
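A small Python sketch of this operation, with the 3 × 3 Laplacian kernel written out by hand and a made-up test image of a single bright pixel:

```python
import numpy as np
from scipy.ndimage import convolve

# The symmetric 3 x 3 Laplacian kernel, as returned by the Toolbox
# function klaplace (reconstructed here by hand).
L = np.array([[0,  1, 0],
              [1, -4, 1],
              [0,  1, 0]], dtype=float)

# Hypothetical image: a single bright pixel on a dark background.
I = np.zeros((5, 5))
I[2, 2] = 1.0

# Second derivative of the image.
I2 = convolve(I, L, mode='nearest')
```

The response is −4 at the bright pixel and +1 at each of its four neighbours; since the kernel elements sum to zero, the operator gives no response over constant regions.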

Computing a second derivative is even more noise enhancing than computing the first derivative, so it is even more important that we smooth the image first. We can pull the same trick that we did with the first derivative: we write the second derivative operation as the Laplacian kernel convolved with the input image, and if we smooth the image first with a Gaussian kernel then, using the rule of associativity, we can rewrite this as the convolution of the Laplacian kernel with the Gaussian kernel, which is then convolved with the input image.

The convolution of the Laplacian and Gaussian kernels is referred to as the Laplacian of Gaussian (LoG) kernel, and it is shown here. It has a shape that is often referred to as a Mexican hat; sometimes it is even called the Mexican hat function, although here it is an upside-down Mexican hat. We can see its analytic form here, and we can compute this kernel using the Toolbox function klog, where the first argument is the standard deviation and the second argument is the half width of the kernel.
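A hedged Python sketch of what klog computes, evaluating the standard LoG analytic form on a grid; as with the DoG sketch, the default half width of 3σ is an assumption:

```python
import numpy as np

def klog_sketch(sigma, w=None):
    """Laplacian of Gaussian (LoG) kernel. A Python sketch of the
    Toolbox function klog; the default half width of 3*sigma is an
    assumption."""
    if w is None:
        w = int(np.ceil(3 * sigma))
    u, v = np.meshgrid(np.arange(-w, w + 1), np.arange(-w, w + 1))
    r2 = u**2 + v**2
    return ((1 / (np.pi * sigma**4)) * (r2 / (2 * sigma**2) - 1)
            * np.exp(-r2 / (2 * sigma**2)))

K = klog_sketch(2)
# The centre element is negative, surrounded by a positive ring:
# the upside-down Mexican hat shape.
```

The kernel depends only on the radius r² = u² + v², which is why the LoG operator is isotropic.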


There is no code in this lesson.

Using the properties of convolution we can combine a simple derivative kernel with Gaussian smoothing to create a derivative of Gaussian (DoG) kernel which is very useful for edge detection, or a Laplacian of Gaussian (LoG) kernel which is useful for detecting regions.

Professor Peter Corke

Professor of Robotic Vision at QUT and Director of the Australian Centre for Robotic Vision (ACRV). Peter is also a Fellow of the IEEE, a senior Fellow of the Higher Education Academy, and on the editorial board of several robotics research journals.

Skill level

This content assumes an understanding of high school level mathematics, for example trigonometry, algebra, calculus and physics (optics), as well as experience with the MATLAB command line and programming, for example workspace, variables, arrays, types, functions and classes.

