Properties of a Digital Image
A digitized image has three basic properties: spatial resolution, image definition, and number of planes.
Spatial Resolution
The spatial resolution of an image is the number of rows and columns of pixels it contains.
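For illustration only, the following sketch uses NumPy (an assumption; the concept does not depend on any particular library) to represent a grayscale image as a two-dimensional array whose dimensions give its spatial resolution.

```python
import numpy as np

# A grayscale image stored as a 2-D array of pixels.
# 480 rows x 640 columns gives a spatial resolution of 640 x 480.
image = np.zeros((480, 640), dtype=np.uint8)

rows, columns = image.shape
print(f"Spatial resolution: {columns} x {rows} pixels")
```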
Image Definition
The definition of an image indicates the number of shades that you can see in an image. The bit depth of an image is the number of bits used to encode the value of a pixel. For a given bit depth of n, the image has an image definition of 2^n, meaning a pixel can have 2^n different values. For example, for a bit depth of 8 bits, a pixel can take one of 256 different values ranging from 0 to 255. For a bit depth of 16 bits, a pixel can take one of 65,536 different values ranging from 0 to 65,535 or from -32,768 to 32,767.
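As a small worked example (illustrative only; not part of the document), the relationship between bit depth and image definition can be computed directly as 2^n values for n bits, which reproduces the 8-bit and 16-bit ranges above.

```python
# Image definition for a given bit depth n: a pixel can take 2**n values.
for n in (8, 16):
    values = 2 ** n
    print(f"{n}-bit: {values} values, "
          f"unsigned 0 to {values - 1}, "
          f"signed {-(values // 2)} to {values // 2 - 1}")
```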
The manner in which you encode your image depends on the nature of the image acquisition device, the type of image processing you need to use, and the type of analysis you need to perform. For example, 8-bit encoding is sufficient if you need to obtain the shape information of objects in an image. However, if you need to precisely measure the light intensity of an image or region in an image, you should use 16-bit or floating-point encoding.
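To make the precision point concrete, the hypothetical sketch below (NumPy and the sample intensity values are assumptions, not from this document) stores three closely spaced intensity measurements with 8-bit and 16-bit encoding; the coarser encoding can collapse distinct measurements to the same level.

```python
import numpy as np

# Three hypothetical, closely spaced intensity measurements (0.0 to 1.0 scale).
intensity = np.array([0.5000, 0.5001, 0.5010])

# Scale to the full range of each encoding and round to the nearest level.
as_8bit  = np.round(intensity * 255).astype(np.uint8)
as_16bit = np.round(intensity * 65535).astype(np.uint16)

print(as_8bit)   # all three collapse to the same 8-bit level
print(as_16bit)  # the three measurements remain distinguishable
```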
Use color encoded images when your machine vision or image processing application depends on the color content of the objects you are inspecting or analyzing.
Number of Planes
The number of planes in an image corresponds to the number of arrays of pixels that compose the image. A grayscale or pseudo-color image is composed of one plane. A true-color image is composed of three planes: one each for the red, green, and blue components.
In a true-color image, the intensities of the color components of a pixel are encoded as three separate values. An RGB color image is the combination of three arrays of pixels corresponding to the red, green, and blue components. HSL images are similarly defined by their hue, saturation, and luminance values. Refer to Color Basics for more information about color images and when they are used.
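As a rough sketch of the plane structure (again assuming NumPy; the 2 x 2 pixel values are made up for illustration), a true-color image can be stored as a three-plane array and separated into its red, green, and blue planes:

```python
import numpy as np

# A 2 x 2 true-color image: each pixel holds a red, green, and blue value.
rgb = np.array([[[255,   0,   0], [  0, 255,   0]],
                [[  0,   0, 255], [255, 255, 255]]], dtype=np.uint8)

# Each color component forms one plane: a single 2-D array of pixels.
red_plane   = rgb[..., 0]
green_plane = rgb[..., 1]
blue_plane  = rgb[..., 2]

print(rgb.shape)        # (2, 2, 3): rows, columns, and 3 planes
print(red_plane.shape)  # (2, 2): each plane has the image's spatial resolution
```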