Marr–Hildreth algorithm

In computer vision, the Marr–Hildreth algorithm is a method of detecting edges in digital images, that is, continuous curves along which image brightness varies strongly and rapidly.[1] The Marr–Hildreth edge detection method is simple: it convolves the image with the Laplacian of the Gaussian function or, as a fast approximation, with a difference of Gaussians, and then detects zero crossings in the filtered result to obtain the edges. The Laplacian-of-Gaussian image operator is sometimes also referred to as the Mexican hat wavelet because of its visual shape when turned upside down. David Marr and Ellen C. Hildreth are two of its inventors.[2]
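
A minimal sketch of this pipeline in Python, assuming NumPy and SciPy are available; the function name marr_hildreth_edges, the default sigma, and the neighbour-pair zero-crossing test are illustrative choices rather than part of the original formulation.

import numpy as np
from scipy import ndimage

def marr_hildreth_edges(image, sigma=2.0, threshold=0.0):
    """Boolean edge map from zero crossings of the Laplacian-of-Gaussian response."""
    # Smooth and differentiate in one step: gaussian_laplace applies the
    # Laplacian of a Gaussian with standard deviation sigma.
    log = ndimage.gaussian_laplace(image.astype(float), sigma=sigma)

    # A difference of Gaussians whose standard deviations differ by a ratio of
    # about 1.6 gives a fast approximation of the same filter (the overall sign
    # is immaterial, since only zero crossings are used):
    # dog = ndimage.gaussian_filter(image, 1.6 * sigma) - ndimage.gaussian_filter(image, sigma)

    # Mark a pixel as an edge where the filtered response changes sign between
    # the pixel and its right or lower neighbour.
    sign = np.sign(log)
    edges = np.zeros(log.shape, dtype=bool)
    edges[:, :-1] |= (sign[:, :-1] * sign[:, 1:]) < 0
    edges[:-1, :] |= (sign[:-1, :] * sign[1:, :]) < 0

    # Optional: suppress weak zero crossings ("false edges", see Limitations)
    # by requiring a minimum local swing in the filtered response.
    if threshold > 0:
        swing = (ndimage.maximum_filter(log, size=3)
                 - ndimage.minimum_filter(log, size=3))
        edges &= swing > threshold
    return edges

For an image loaded as a 2-D NumPy array img, marr_hildreth_edges(img, sigma=2.0) returns a boolean mask tracing the detected zero-crossing curves.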

Limitations

The Marr–Hildreth operator suffers from two main limitations: it generates responses that do not correspond to edges (so-called "false edges"), and its localization error can be severe at curved edges. Today there are much better edge detection methods, such as the Canny edge detector, based on the search for local directional maxima in the gradient magnitude, or the differential approach, based on the search for zero crossings of the differential expression that corresponds to the second-order derivative in the gradient direction (both of these operations preceded by a Gaussian smoothing step); a common form of the latter condition is sketched below. For more details, see the article on edge detection.
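
The differential condition mentioned above can be written, in a standard scale-space formulation (not taken from the cited references), with L denoting the Gaussian-smoothed image and subscripts denoting partial derivatives: an edge point is marked where the second-order derivative along the gradient direction vanishes,

    L_x^2 L_{xx} + 2 L_x L_y L_{xy} + L_y^2 L_{yy} = 0,

together with a sign condition on the corresponding third-order derivative in the gradient direction so that only maxima of the gradient magnitude, rather than minima, are retained.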

References

  1. Marr, D.; Hildreth, E. (29 February 1980). "Theory of Edge Detection". Proceedings of the Royal Society of London. Series B, Biological Sciences. 207 (1167): 187–217. doi:10.1098/rspb.1980.0020. PMID 6102765.
  2. Umbaugh, Scott E. (2010). Digital Image Processing and Analysis: Human and Computer Vision Applications with CVIPtools (2nd ed.). Boca Raton, Florida: CRC Press. ISBN 978-1-4398-0205-2.