
Image moment

In image processing, computer vision and related fields, an image moment is a certain particular weighted average (moment) of the image pixels' intensities, or a function of such moments, usually chosen to have some attractive property or interpretation.

Image moments are useful to describe objects after segmentation. Simple properties of the image which are found via image moments include its area (or total intensity), its centroid, and information about its orientation.

For a 2D continuous function f(x,y) the moment (sometimes called the 'raw moment') of order (p + q) is defined as

M_{pq} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^p y^q f(x,y) \, dx \, dy

for p,q = 0,1,2,...

Adapting this to a scalar (greyscale) image with pixel intensities I(x,y), the raw image moments M_{ij} are calculated by

M_{ij} = \sum_x \sum_y x^i y^j I(x,y)

In some cases, this may be calculated by considering the image as a probability density function, i.e., by dividing the above by

\sum_x \sum_y I(x,y)

A uniqueness theorem (Hu) states that if f(x,y) is piecewise continuous and has nonzero values only in a finite part of the xy-plane, then moments of all orders exist, and the moment sequence (M_{pq}) is uniquely determined by f(x,y). Conversely, (M_{pq}) uniquely determines f(x,y). In practice, the image is summarized with functions of a few lower-order moments.

Simple image properties derived via raw moments include:

- Area (for binary images) or sum of grey level (for greytone images): M_{00}
- Centroid: (\bar{x}, \bar{y}) = (M_{10}/M_{00}, M_{01}/M_{00})

Central moments are defined as

\mu_{pq} = \sum_x \sum_y (x - \bar{x})^p (y - \bar{y})^q f(x,y)

where \bar{x} = M_{10}/M_{00} and \bar{y} = M_{01}/M_{00} are the components of the centroid. Because they are computed relative to the centroid, central moments are invariant to translation of the image.
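As a minimal sketch of these definitions, the following NumPy snippet computes raw moments, the centroid, and central moments of a small synthetic greyscale image. The function names `raw_moment` and `central_moment` are illustrative helpers, not part of any library API (in practice one might use OpenCV's `cv2.moments`).

```python
import numpy as np

def raw_moment(img, p, q):
    """Raw image moment M_pq = sum_x sum_y x^p y^q I(x, y)."""
    h, w = img.shape
    y, x = np.mgrid[:h, :w]  # pixel coordinate grids (row = y, col = x)
    return np.sum((x ** p) * (y ** q) * img)

def central_moment(img, p, q):
    """Central moment mu_pq, computed about the centroid (xbar, ybar)."""
    m00 = raw_moment(img, 0, 0)
    xbar = raw_moment(img, 1, 0) / m00
    ybar = raw_moment(img, 0, 1) / m00
    h, w = img.shape
    y, x = np.mgrid[:h, :w]
    return np.sum(((x - xbar) ** p) * ((y - ybar) ** q) * img)

# Tiny synthetic image: a bright 2x2 block inside a 5x5 frame.
img = np.zeros((5, 5))
img[1:3, 2:4] = 1.0

M00 = raw_moment(img, 0, 0)                 # area / total intensity -> 4.0
cx = raw_moment(img, 1, 0) / M00            # centroid x -> 2.5
cy = raw_moment(img, 0, 1) / M00            # centroid y -> 1.5
mu20 = central_moment(img, 2, 0)            # spread about the centroid in x

print(M00, (cx, cy), mu20)
```

Note that central moments of the same block are unchanged if it is shifted within the frame, which is exactly the translation invariance mentioned above.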
