For further support information, refer to the Technical Support and Professional Services appendix. To comment
on National Instruments documentation, refer to the National Instruments Web site at ni.com/info and enter
the info code feedback.
The media on which you receive National Instruments software are warranted not to fail to execute programming instructions, due to defects
in materials and workmanship, for a period of 90 days from date of shipment, as evidenced by receipts or other documentation. National
Instruments will, at its option, repair or replace software media that do not execute programming instructions if National Instruments receives
notice of such defects during the warranty period. National Instruments does not warrant that the operation of the software shall be
uninterrupted or error free.
A Return Material Authorization (RMA) number must be obtained from the factory and clearly marked on the outside of the package before
any equipment will be accepted for warranty work. National Instruments will pay the shipping costs of returning to the owner parts which are
covered by warranty.
National Instruments believes that the information in this document is accurate. The document has been carefully reviewed for technical
accuracy. In the event that technical or typographical errors exist, National Instruments reserves the right to make changes to subsequent
editions of this document without prior notice to holders of this edition. The reader should consult National Instruments if errors are suspected.
In no event shall National Instruments be liable for any damages arising out of or related to this document or the information contained in it.
EXCEPT AS SPECIFIED HEREIN, NATIONAL INSTRUMENTS MAKES NO WARRANTIES, EXPRESS OR IMPLIED, AND SPECIFICALLY DISCLAIMS ANY WARRANTY OF
MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE. CUSTOMER'S RIGHT TO RECOVER DAMAGES CAUSED BY FAULT OR NEGLIGENCE ON THE PART OF
NATIONAL INSTRUMENTS SHALL BE LIMITED TO THE AMOUNT THERETOFORE PAID BY THE CUSTOMER. NATIONAL INSTRUMENTS WILL NOT BE LIABLE FOR
DAMAGES RESULTING FROM LOSS OF DATA, PROFITS, USE OF PRODUCTS, OR INCIDENTAL OR CONSEQUENTIAL DAMAGES, EVEN IF ADVISED OF THE POSSIBILITY
THEREOF. This limitation of the liability of National Instruments will apply regardless of the form of action, whether in contract or tort, including
negligence. Any action against National Instruments must be brought within one year after the cause of action accrues. National Instruments
shall not be liable for any delay in performance due to causes beyond its reasonable control. The warranty provided herein does not cover
damages, defects, malfunctions, or service failures caused by owner’s failure to follow the National Instruments installation, operation, or
maintenance instructions; owner’s modification of the product; owner’s abuse, misuse, or negligent acts; and power failure or surges, fire,
flood, accident, actions of third parties, or other events outside reasonable control.
Copyright
Under the copyright laws, this publication may not be reproduced or transmitted in any form, electronic or mechanical, including photocopying,
recording, storing in an information retrieval system, or translating, in whole or in part, without the prior written consent of National
Instruments Corporation.
Trademarks
CVI™, IMAQ™, LabVIEW™, National Instruments™, National Instruments Alliance Partner™, NI™, ni.com™, NI Developer Zone™,
and NI-IMAQ™ are trademarks of National Instruments Corporation.
Product and company names mentioned herein are trademarks or trade names of their respective companies.
Members of the National Instruments Alliance Partner Program are business entities independent from National Instruments and have no
agency, partnership, or joint-venture relationship with National Instruments.
Patents
For patents covering National Instruments products, refer to the appropriate location: Help»Patents in your software, the patents.txt file
on your CD, or ni.com/patents.
WARNING REGARDING USE OF NATIONAL INSTRUMENTS PRODUCTS
(1) NATIONAL INSTRUMENTS PRODUCTS ARE NOT DESIGNED WITH COMPONENTS AND TESTING FOR A LEVEL OF
RELIABILITY SUITABLE FOR USE IN OR IN CONNECTION WITH SURGICAL IMPLANTS OR AS CRITICAL COMPONENTS IN
ANY LIFE SUPPORT SYSTEMS WHOSE FAILURE TO PERFORM CAN REASONABLY BE EXPECTED TO CAUSE SIGNIFICANT
INJURY TO A HUMAN.
(2) IN ANY APPLICATION, INCLUDING THE ABOVE, RELIABILITY OF OPERATION OF THE SOFTWARE PRODUCTS CAN BE
IMPAIRED BY ADVERSE FACTORS, INCLUDING BUT NOT LIMITED TO FLUCTUATIONS IN ELECTRICAL POWER SUPPLY,
COMPUTER HARDWARE MALFUNCTIONS, COMPUTER OPERATING SYSTEM SOFTWARE FITNESS, FITNESS OF COMPILERS
AND DEVELOPMENT SOFTWARE USED TO DEVELOP AN APPLICATION, INSTALLATION ERRORS, SOFTWARE AND
HARDWARE COMPATIBILITY PROBLEMS, MALFUNCTIONS OR FAILURES OF ELECTRONIC MONITORING OR CONTROL
DEVICES, TRANSIENT FAILURES OF ELECTRONIC SYSTEMS (HARDWARE AND/OR SOFTWARE), UNANTICIPATED USES OR
MISUSES, OR ERRORS ON THE PART OF THE USER OR APPLICATIONS DESIGNER (ADVERSE FACTORS SUCH AS THESE ARE
HEREAFTER COLLECTIVELY TERMED “SYSTEM FAILURES”). ANY APPLICATION WHERE A SYSTEM FAILURE WOULD
CREATE A RISK OF HARM TO PROPERTY OR PERSONS (INCLUDING THE RISK OF BODILY INJURY AND DEATH) SHOULD
NOT BE RELIANT SOLELY UPON ONE FORM OF ELECTRONIC SYSTEM DUE TO THE RISK OF SYSTEM FAILURE. TO AVOID
DAMAGE, INJURY, OR DEATH, THE USER OR APPLICATION DESIGNER MUST TAKE REASONABLY PRUDENT STEPS TO
PROTECT AGAINST SYSTEM FAILURES, INCLUDING BUT NOT LIMITED TO BACK-UP OR SHUT DOWN MECHANISMS.
BECAUSE EACH END-USER SYSTEM IS CUSTOMIZED AND DIFFERS FROM NATIONAL INSTRUMENTS' TESTING
PLATFORMS AND BECAUSE A USER OR APPLICATION DESIGNER MAY USE NATIONAL INSTRUMENTS PRODUCTS IN
COMBINATION WITH OTHER PRODUCTS IN A MANNER NOT EVALUATED OR CONTEMPLATED BY NATIONAL
INSTRUMENTS, THE USER OR APPLICATION DESIGNER IS ULTIMATELY RESPONSIBLE FOR VERIFYING AND VALIDATING
THE SUITABILITY OF NATIONAL INSTRUMENTS PRODUCTS WHENEVER NATIONAL INSTRUMENTS PRODUCTS ARE
INCORPORATED IN A SYSTEM OR APPLICATION, INCLUDING, WITHOUT LIMITATION, THE APPROPRIATE DESIGN,
PROCESS AND SAFETY LEVEL OF SUCH SYSTEM OR APPLICATION.
About This Manual
In addition to this manual, the following documentation resources are available to help you create your vision application.
IMAQ Vision
• IMAQ Vision Concepts Manual—If you are new to machine vision and imaging, read this manual to understand the concepts behind IMAQ Vision.
• IMAQ Vision for LabWindows/CVI Function Reference—If you need information about IMAQ Vision functions while creating your application, refer to this help file.
NI Vision Assistant
• NI Vision Assistant Tutorial—If you need to install NI Vision Assistant and learn the fundamental features of the software, follow the instructions in this tutorial.
• NI Vision Assistant Help—If you need descriptions or step-by-step guidance about how to use any of the functions or features of NI Vision Assistant, refer to this help file.
NI Vision Builder for Automated Inspection
• NI Vision Builder for Automated Inspection Tutorial—If you have little experience with machine vision, and you need information about how to solve common inspection tasks with NI Vision Builder AI, follow the instructions in this tutorial.
• NI Vision Builder for Automated Inspection: Configuration Help—If you need descriptions or step-by-step guidance about how to use any of the NI Vision Builder AI functions to create an automated vision inspection system, refer to this help file.
• NI Vision Builder for Automated Inspection: Inspection Help—If you need information about how to run an automated vision inspection system using NI Vision Builder AI, refer to this help file.
Other Documentation
• Your National Instruments image acquisition (IMAQ) device user manual—If you need installation instructions and device-specific information, refer to your device user manual.
• Getting Started With Your IMAQ System—If you need instructions for installing the NI-IMAQ software and your IMAQ hardware, connecting your camera, running Measurement & Automation Explorer (MAX) and the NI-IMAQ Diagnostics, selecting a camera file, and acquiring an image, refer to this getting started document.
• NI-IMAQ User Manual—If you need information about how to use NI-IMAQ and IMAQ image acquisition devices to capture images for processing, refer to this manual.
• NI-IMAQ VI or function reference guides—If you need information about the features, functions, and operation of the NI-IMAQ image acquisition VIs or functions, refer to these help files.
• IMAQ Vision Deployment Engine Note to Users—If you need information about how to deploy your custom IMAQ Vision applications on target computers, read this CD insert.
• Example programs—If you want examples of how to create specific applications, go to <CVI>\samples\vision.
• Application Notes—If you want to know more about advanced IMAQ Vision concepts and applications, refer to the Application Notes located on the National Instruments Web site at ni.com/appnotes.nsf.
• NI Developer Zone (NIDZ)—If you want even more information about developing your vision application, visit the NI Developer Zone at ni.com/zone. The NI Developer Zone contains example programs, tutorials, technical presentations, the Instrument Driver Network, a measurement glossary, an online magazine, a product advisor, and a community area where you can share ideas, questions, and source code with vision developers around the world.
Introduction to IMAQ Vision
This chapter describes the IMAQ Vision for LabWindows/CVI software, outlines the IMAQ Vision function organization, and lists the steps for making a machine vision application.
Note Refer to the Vision Development Module Release Notes that came with your
software for information about the system requirements and installation procedure for
IMAQ Vision for LabWindows/CVI.
About IMAQ Vision
IMAQ Vision for LabWindows/CVI—a part of the Vision Development
Module—is a library of C functions that you can use to develop machine
vision and scientific imaging applications. The Vision Development
Module also includes the same imaging functions for LabVIEW,
and ActiveX controls for Microsoft Visual Basic. Vision Assistant, another
Vision Development Module software product, enables you to prototype
your application strategy quickly without having to do any programming.
Additionally, NI offers Vision Builder for Automated Inspection:
configurable machine vision software that you can use to prototype,
benchmark, and deploy applications.
Application Development Environments
This release of IMAQ Vision for LabWindows/CVI supports the
following Application Development Environments (ADEs) for
Windows 2000/NT/XP.
• LabWindows/CVI version 6.0 and later
• Microsoft Visual C/C++ version 6.0 and later
Note IMAQ Vision has been tested and found to work with these ADEs, although other ADEs may also work.
IMAQ Vision Function Tree
The IMAQ Vision function tree (NIVision.lfp) contains separate classes corresponding to groups or types of functions. Table 1-1 lists the
IMAQ Vision function types and gives a description of each type.
Table 1-1. IMAQ Vision Function Types

Function Type: Description
Image Management: Functions that create space in memory for images and perform basic image manipulation.
Memory Management: Function that returns, to the operating system, previously used memory that is no longer needed.
Error Management: Functions that set the current error, return the name of the function in which the last error occurred, return the error code of the last error, and clear any pending errors.
Acquisition: Functions that acquire images through an IMAQ hardware device.
Display: Functions that cover all aspects of image visualization and image window management.
Overlay: Functions that create and manipulate overlays.
Regions of Interest: Functions that create and manipulate regions of interest.
File I/O: Functions that read and write images from and to files.
Calibration: Functions that learn calibration information and correct distorted images.
Image Analysis: Functions that compute the centroid of an image, profile of a line of pixels, and the mean line profile. This type also includes functions that calculate the pixel distribution and statistical parameters of an image.
Grayscale Processing: Functions for grayscale image processing and analysis.
Binary Processing: Functions for binary image processing and analysis.
Color Processing: Functions for color image processing and analysis.
Pattern Matching: Functions that learn patterns and search for patterns in images.
Caliper: Functions designed for gauging, measurement, and inspection applications.
Operators: Functions that perform arithmetic, logic, and comparison operations with two images or with an image and a constant value.
Analytic Geometry: Functions that perform basic geometric calculations on an image.
Frequency Domain Analysis: Functions for the extraction and manipulation of complex planes. Functions of this type perform Fast Fourier Transform (FFT), inverse FFT, truncation, attenuation, addition, subtraction, multiplication, and division of complex images.
Barcode I/O: Functions that find and read barcodes.
LCD: Functions that find and read seven-segment LCD characters.
Meter: Functions that return the arc information of a meter and read the meter.
Utilities: Functions that return structures, and a function that returns a pointer to predefined convolution matrices.
OCR: Functions that perform optical character recognition on an image.
Classification: Functions that classify an image or feature vector.
Obsolete: Functions that are no longer necessary but may exist in older applications.
IMAQ Machine Vision Function Tree
The IMAQ Machine Vision function tree (NIMachineVision.fp)
contains separate classes corresponding to groups or types of functions.
Table 1-2 lists the IMAQ Machine Vision function types and gives a
description of each type.
Table 1-2. IMAQ Machine Vision Function Types

Function Type: Description
Coordinate Transform: Functions that find coordinate transforms based on image contents.
Count and Measure Objects: Function that counts and measures objects in an image.
Find Patterns: Function that finds patterns in an image.
Locate Edges: Functions that locate different types of edges in an image.
Measure Distances: Functions that measure distances between objects in an image.
Measure Intensities: Functions that measure light intensities in various shaped regions within an image.
Select Region of Interest: Functions that allow a user to select a specific region of interest in an image.
Creating IMAQ Vision Applications
Figures 1-1 and 1-2 illustrate the steps for creating an application with
IMAQ Vision. Figure 1-1 describes the general steps to designing a Vision
application. The last step in Figure 1-1 is expanded upon in Figure 1-2.
You can use a combination of the items in the last step to create your IMAQ
Vision application. Refer to the corresponding chapter listed to the side of
the item for more information about items in either diagram.
[Figure 1-1 lists the following steps: Set Up Your Imaging System; Calibrate Your Imaging System; Create an Image; Acquire or Read an Image; Display an Image; Attach Calibration Information; Analyze an Image; Improve an Image; and Make Measurements or Identify Objects in an Image Using Grayscale or Color Measurements, and/or Particle Analysis, and/or Machine Vision. Chapter 2, Getting Measurement-Ready Images, and Chapter 6, Calibration, are referenced next to the corresponding steps.]
Figure 1-1. General Steps for Designing a Vision Application
Note Diagram items enclosed with dashed lines are optional steps.
[Figure 1-2 expands the last step of Figure 1-1 into the individual inspection steps.]
Figure 1-2. Inspection Steps for Building a Vision Application
Note Diagram items enclosed with dashed lines are optional steps.
Getting Measurement-Ready Images
This chapter describes how to set up your imaging system, acquire and
display an image, analyze the image, and prepare the image for additional
processing.
Set Up Your Imaging System
Before you acquire, analyze, and process images, you must set up your
imaging system. How you set up your system depends on your imaging
environment and the type of analysis and processing you need to do. Your
imaging system should produce images with high enough quality so that
you can extract the information you need from the images.
Complete the following steps to set up your imaging system.
1. Determine the type of equipment you need given your space constraints and the size of the object you need to inspect. Refer to Chapter 3, System Setup and Calibration, of the IMAQ Vision Concepts Manual for more information.
   a. Make sure your camera sensor is large enough to satisfy your minimum resolution requirement.
   b. Make sure your lens has a depth of field high enough to keep all of your objects in focus regardless of their distance from the lens. Also, make sure your lens has a focal length that meets your needs.
   c. Make sure your lighting provides enough contrast between the object under inspection and the background for you to extract the information you need from the image.
2. Position your camera so that it is perpendicular to the object under inspection. If your camera acquires images of the object from an angle, perspective errors occur. Even though you can compensate for these errors with software, NI recommends that you use a perpendicular inspection angle to obtain the most accurate results.
3. Select an IMAQ device that meets your needs. National Instruments offers several IMAQ devices, including analog color and monochrome devices as well as digital devices. Visit ni.com/imaq for more information about IMAQ devices.
4. Configure the driver software for your image acquisition device. If you have a National Instruments image acquisition device, configure the NI-IMAQ driver software through MAX. Open MAX by double-clicking the Measurement & Automation Explorer icon on your desktop. Refer to the NI-IMAQ User Manual and the Measurement & Automation Explorer Help for IMAQ for more information.
Calibrate Your Imaging System
After you set up your imaging system, you may want to calibrate your
system to assign real-world coordinates to pixel coordinates. This allows
you to compensate for perspective and nonlinear errors inherent in your
imaging system.
Perspective errors occur when your camera axis is not perpendicular to the
object under inspection. Nonlinear distortion may occur from aberrations
in the camera lens. Perspective errors and lens aberrations cause images to
appear distorted. This distortion misplaces information in an image, but it
does not necessarily destroy the information in the image.
Use simple calibration if you only want to assign real-world coordinates to
pixel coordinates. Use perspective and nonlinear distortion calibration if
you need to compensate for perspective errors and nonlinear lens distortion.
For detailed information about calibration, refer to Chapter 6, Calibrating Images.
Create an Image
To create an image in IMAQ Vision for LabWindows/CVI, call
imaqCreateImage(). This function returns an image reference you can
use when calling other IMAQ Vision functions. The only limitation to the
size and number of images you can acquire and process is the amount of
memory on your computer. When you create an image, specify the type of
the image. Table 2-1 lists the valid image types.
Table 2-1. IMAQ Vision for LabWindows/CVI Image Types

Value: Description
IMAQ_IMAGE_U8: 8 bits per pixel—unsigned, standard monochrome
IMAQ_IMAGE_I16: 16 bits per pixel—signed, monochrome
IMAQ_IMAGE_SGL: 32 bits per pixel—floating point, monochrome
IMAQ_IMAGE_COMPLEX: 2 × 32 bits per pixel—floating point, native format after a Fast Fourier Transform (FFT)
IMAQ_IMAGE_RGB: 32 bits per pixel—standard color
IMAQ_IMAGE_HSL: 32 bits per pixel—color
IMAQ_IMAGE_RGB_U64: 64 bits per pixel—standard color
You can create multiple images by executing imaqCreateImage() as
many times as you want. Determine the number of required images through
an analysis of your intended application. The decision is based on different
processing phases and your need to keep the original image after each
processing step. The decision to keep an image occurs before each
processing step.
When you create an image, IMAQ Vision creates an internal image
structure to hold properties of the image, such as its name and border size.
However, no memory is allocated to store the image pixels at this time.
IMAQ Vision functions automatically allocate the appropriate amount of
memory when the image size is modified. For example, functions that
acquire or resample an image alter the image size, so they allocate the
appropriate memory space for the image pixels. The return value of
imaqCreateImage() is a pointer to the image structure. Supply this
pointer as an input to all subsequent IMAQ Vision functions.
Most functions in the IMAQ Vision library require one or more image
pointers. The number of image pointers a function takes depends on the
image processing function and the type of image you want to use. Some
IMAQ Vision functions act directly on the image and require only one
image pointer. Other functions that process the contents of images require
pointers to the source image(s) and to a destination image.
At the end of your application, dispose of each image that you created using imaqDispose().
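The following minimal sketch shows this create-use-dispose pattern. It assumes the standard NI Vision C header (nivision.h); the border size of 3 is an arbitrary illustrative value, and error handling is reduced to a simple check.

#include <nivision.h>
#include <stdio.h>

int main(void)
{
    /* Create an 8-bit monochrome image structure. No pixel memory is
       allocated yet; memory is allocated when the image is sized, for
       example by an acquisition or a file read. The 3-pixel border is
       an illustrative choice for neighborhood operations. */
    Image *image = imaqCreateImage(IMAQ_IMAGE_U8, 3);
    if (image == NULL)
    {
        printf("imaqCreateImage failed, error %d\n", imaqGetLastError());
        return -1;
    }

    /* ... acquire into, read into, or process the image here ... */

    /* Return the image structure and its pixel memory to the system. */
    imaqDispose(image);
    return 0;
}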
Some IMAQ Vision functions that modify the contents of an image have
source image and destination image input parameters. The source image
receives the image to process. The destination image receives the
processing results. The destination image can receive either another image
or the original, depending on your goals. If you do not want the contents of
the original image to change, use separate source and destination images.
If you want to replace the original image with the processed image, pass the
same image as both the source and destination.
Depending on the function, the image type of the destination image can be
the same or different than the image type of the source image. The function
descriptions in the IMAQ Vision for LabWindows/CVI Function Reference
include the type of images you can use as image inputs and outputs. IMAQ
Vision resizes the destination image to hold the result if the destination is
not the appropriate size.
The following examples illustrate source and destination images with imaqTranspose():
• imaqTranspose(myImage, myImage);
  This function creates a transposed image using the same image for the source and destination. The contents of myImage change.
• imaqTranspose(myTransposedImage, myImage);
  This function creates a transposed image and stores it in a destination different from the source. The myImage image remains unchanged, and myTransposedImage contains the result.
Functions that perform arithmetic or logical operations between two
images have two source images and a destination image. You can perform
an operation between two images and then either store the result in a
separate destination image or in one of the two source images. In the
latter case, make sure you no longer need the original data in the source
image before storing the result over the data.
The following examples show the possible combinations using imaqAdd():
• imaqAdd(myResultImage, myImageA, myImageB);
  This function adds two source images (myImageA and myImageB) and stores the result in a third image (myResultImage). Both source images remain intact after processing.
• imaqAdd(myImageA, myImageA, myImageB);
  This function adds two source images and stores the result in the first source image.
• imaqAdd(myImageB, myImageA, myImageB);
  This function adds two source images and stores the result in the second source image.
Most operations between two images require that the images have the
same type and size. However, some arithmetic operations can work
between two different types of images, such as 8-bit and 16-bit images.
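As an illustration, the following sketch turns the first imaqAdd() combination above into a complete program. It assumes the two source images are populated from files; the file paths are placeholders, and the trailing color-table arguments of imaqReadFile() are left NULL.

#include <nivision.h>

int main(void)
{
    Image *imageA = imaqCreateImage(IMAQ_IMAGE_U8, 0);
    Image *imageB = imaqCreateImage(IMAQ_IMAGE_U8, 0);
    Image *result = imaqCreateImage(IMAQ_IMAGE_U8, 0);
    if (imageA == NULL || imageB == NULL || result == NULL)
        return -1;

    /* Populate the two source images; the paths are placeholders. */
    if (!imaqReadFile(imageA, "C:\\images\\frameA.png", NULL, NULL) ||
        !imaqReadFile(imageB, "C:\\images\\frameB.png", NULL, NULL))
        return -1;

    /* Add the two source images and store the result in a third image.
       Both source images remain intact after processing. */
    imaqAdd(result, imageA, imageB);

    /* ... use the result image here ... */

    imaqDispose(result);
    imaqDispose(imageB);
    imaqDispose(imageA);
    return 0;
}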
Some functions perform operations that populate an image. Examples of
this type of operation include reading a file, acquiring an image from an
IMAQ device, or transforming a 2D array into an image. This type of
function can modify the size of an image.
Some functions take an additional mask parameter. The presence of this
parameter indicates that the processing or analysis is dependent on the
contents of another image, the image mask.
Note The image mask must be an 8-bit image.
If you want to apply a processing or analysis function to the entire image,
pass NULL for the image mask. Passing the same image to both the source
image and image mask also gives the same effect as passing NULL for the
image mask, except in this case the source image must be an 8-bit image.
Acquire or Read an Image
After you create an image reference, you can acquire an image into your
imaging system in three ways. You can acquire an image with a camera
through your IMAQ device, load an image from a file stored on your
computer, or convert data stored in a 2D array to an image. Functions that
acquire images, load images from file, or convert data from a 2D array to
an image automatically allocate the memory space required to
accommodate the image data.
Use one of the following methods to acquire images with a National
Instruments IMAQ device.
• Acquire a single image using imaqEasyAcquire(). When you call this function, it initializes the IMAQ device and acquires the next incoming video frame. Use this function for low-speed single capture applications where ease of programming is essential.
• Acquire a single image using imaqSnap(). When you call this function, it acquires the next incoming video frame on an IMAQ device you have already initialized using imgInterfaceOpen() and imgSessionOpen(). Use this function for high-speed single capture applications.
• Acquire images continually through a grab acquisition. Grab functions perform high-speed acquisitions that loop continually on one buffer. Use imaqSetupGrab() to start the acquisition. Use imaqGrab() to return a copy of the current image. Use imaqStopAcquisition() to stop the acquisition.
• Acquire a fixed number of images using a sequence acquisition. Set up the acquisition using imaqSetupSequence(). Use imaqStartAcquisition() to acquire the number of images you requested during setup. If you want to acquire only certain images, supply imaqSetupSequence() with a table describing the number of frames to skip after each acquired frame.
• Acquire images continually through a ringed buffer acquisition. Set up the acquisition using imaqSetupRing(). Use imaqStartAcquisition() to start acquiring images into the acquired ring buffer. To get an image from the ring, call imaqExtractFromRing() or imaqCopyRing(). Use imaqStopAcquisition() to stop the acquisition.
Note You must use imgClose() to release resources associated with the image
acquisition device.
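The following sketch shows the simplest of these methods, a single-frame capture with imaqEasyAcquire(). The prototype shown here (taking the MAX interface name and returning a new image) and the interface name "img0" are assumptions; check the IMAQ Vision for LabWindows/CVI Function Reference and your MAX configuration, and refer to the note above about releasing acquisition resources.

#include <nivision.h>

int main(void)
{
    /* Low-speed single capture: initialize the device configured in MAX
       and acquire the next incoming video frame. The interface name and
       the exact prototype are assumptions; see the function reference. */
    Image *frame = imaqEasyAcquire("img0");
    if (frame == NULL)
        return -1;

    /* ... analyze or process the acquired frame here ... */

    imaqDispose(frame);   /* release the acquired image */
    return 0;
}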
Reading a File
Use imaqReadFile() to open and read data from a file stored on your
computer into the image reference. You can read from image files stored
in several standard formats: BMP, TIFF, JPEG, PNG, and AIPD. The
software automatically converts the pixels it reads into the type of image
you pass in.
Use imaqReadVisionFile() to open an image file containing additional
information, such as calibration information, template information for
pattern matching, or overlay information. For more information about
pattern matching templates and overlays, refer to Chapter 5, Performing
Machine Vision Tasks.
You can also use imaqGetFileInfo() to retrieve image properties—such as image size, recommended image type, and calibration units—without actually reading all the image data.
Converting an Array to an Image
Use imaqArrayToImage() to convert a 2D array to an image. You can
also use
imaqImageToArray() to convert an image to a 2D array.
Display an Image
Display an image in an external window using imaqDisplayImage(). You can display images in 16 different external windows. Use the other display functions to configure the appearance of each external window. Properties you can set include whether the window has scroll bars, a title bar, and whether it is resizable. You can also use imaqMoveWindow() to position the external image window at a particular location on your monitor. Refer to the IMAQ Vision for LabWindows/CVI Function Reference for a complete list of Display functions.
Note Image windows are not LabWindows/CVI panels. They are managed directly by IMAQ Vision.
You can use a color palette to display grayscale images by applying a color palette to the window. Use imaqSetWindowPalette() to set predefined color palettes. For example, if you need to display a binary image—an image containing particle regions with pixel values of 1 and a background region with pixel values of 0—apply the predefined binary palette. For more information about color palettes, refer to Chapter 2, Display, of the IMAQ Vision Concepts Manual.
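The following sketch ties file reading and display together: it reads an image from disk and shows it in external window 0. The file path is a placeholder, and the trailing arguments of imaqReadFile() (color table) and imaqDisplayImage() (resize flag) are assumptions; verify them against the IMAQ Vision for LabWindows/CVI Function Reference.

#include <nivision.h>

int main(void)
{
    Image *image = imaqCreateImage(IMAQ_IMAGE_U8, 0);
    if (image == NULL)
        return -1;

    /* Read the file; the pixels are converted to the image type passed
       in. The path is a placeholder. */
    if (!imaqReadFile(image, "C:\\images\\part.png", NULL, NULL))
    {
        imaqDispose(image);
        return -1;
    }

    /* Display the image in external window 0. The last argument
       (1 = resize the window to fit the image) is an assumed flag. */
    imaqDisplayImage(image, 0, 1);

    /* ... keep the program running or process further while the
       window is open ... */

    imaqDispose(image);
    return 0;
}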
Note At the end of your application, close all open external windows using the appropriate Display function.
Attach Calibration Information
If you want to attach the calibration information of the current setup to each image you acquire, use imaqCopyCalibrationInfo(). This function takes in a source image containing the calibration information and a destination image that you want to calibrate. The output image is your inspection image with the calibration information attached to it. For detailed information about calibration, refer to Chapter 6, Calibrating Images.
Note Because calibration information is part of the image, it is propagated throughout the processing and analysis of the image. Functions that modify the image size, such as geometrical transforms, void the calibration information. Use imaqWriteVisionFile() to save the image and all of the attached calibration information to a file.
Analyze an Image
After you acquire and display an image, you may want to analyze the
contents of the image for the following reasons:
• To determine whether the image quality is high enough for your inspection task.
• To obtain the values of parameters that you want to use in processing functions during the inspection process.
The histogram and line profile tools can help you analyze the quality of
your images.
Use
imaqHistogram() to analyze the overall grayscale distribution in the
image. Use the histogram of the image to analyze two important criteria
that define the quality of an image—saturation and contrast. If your image
is underexposed, or does not have enough light, the majority of your pixels
will have low intensity values, which appear as a concentration of peaks on
the left side of your histogram. If your image is overexposed, or has too
much light, the majority of your pixels will have high intensity values,
which appear as a concentration of peaks on the right side of your
histogram. If your image has an appropriate amount of contrast, your
histogram will have distinct regions of pixel concentrations. Use the
histogram information to decide if the image quality is high enough to
separate objects of interest from the background.
If the image quality meets your needs, use the histogram to determine the
range of pixel values that correspond to objects in the image. You can use
this range in processing functions, such as determining a threshold range
during particle analysis.
If the image quality does not meet your needs, try to improve the imaging
conditions to get the necessary image quality. You may need to re-evaluate
and modify each component of your imaging setup, including lighting
equipment and setup, lens tuning, camera operation mode, and acquisition
device parameters. If you reach the best possible conditions with your setup
but the image quality still does not meet your needs, try to improve the
image quality using the image processing techniques described in the
Improve an Image section of this chapter.
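As a sketch of this histogram-based analysis, the following code computes a 256-class histogram over the whole image (NULL mask) and prints the resulting statistics. The return type (a report structure released with imaqDispose()) and the field names used here are assumptions about the C API; check the function reference for the exact prototype.

#include <nivision.h>
#include <stdio.h>

int main(void)
{
    Image *image = imaqCreateImage(IMAQ_IMAGE_U8, 0);
    if (image == NULL || !imaqReadFile(image, "C:\\images\\part.png", NULL, NULL))
        return -1;

    /* Histogram of the whole 8-bit image: 256 classes over [0, 255];
       a NULL mask means every pixel is considered. The prototype and
       the report fields are assumptions; verify against the reference. */
    HistogramReport *report = imaqHistogram(image, 256, 0, 255, NULL);
    if (report != NULL)
    {
        printf("mean = %f, std dev = %f\n", report->mean, report->stdDev);
        imaqDispose(report);   /* reports are released with imaqDispose() */
    }

    imaqDispose(image);
    return 0;
}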
Use imaqLineProfile() to get the pixel distribution along a line in the image, or use imaqROIProfile() to get the pixel distribution along a one-dimensional path in the image. By looking at the pixel distribution, you can determine if the image quality is high enough to provide you with sharp edges at object boundaries. Also, you can determine if the image is noisy and identify the characteristics of the noise.
If the image quality meets your needs, use the pixel distribution information to determine some parameters of the inspection functions you want to use. For example, use the information from the line profile to determine the strength of the edge at the boundary of an object. You can input this information into imaqEdgeTool2() to find the edges of objects along the line.
Improve an Image
Using the information you gathered from analyzing your image, you may
want to improve the quality of your image for inspection. You can improve
your image with lookup tables, filters, grayscale morphology, and FFTs.
Lookup Tables
Apply lookup table (LUT) transformations to highlight image details in
areas containing significant information at the expense of other areas.
A LUT transformation converts input grayscale values in the source image
into other grayscale values in the transformed image. IMAQ Vision
provides four functions that directly or indirectly apply lookup tables to
images.
• imaqMathTransform()—Converts the pixel values of an image by replacing them with values from a predefined lookup table. IMAQ Vision has seven predefined lookup tables based on mathematical transformations. For more information about these lookup tables, refer to Chapter 5, Image Processing, of the IMAQ Vision Concepts Manual.
• imaqLookup()—Converts the pixel values of an image by replacing them with values from a user-defined lookup table.
• imaqEqualize()—Distributes the grayscale values evenly within a given grayscale range. Use imaqEqualize() to increase the contrast in images containing few grayscale values.
• imaqInverse()—Inverts the pixel intensities of an image to compute the negative of the image. For example, use imaqInverse() before applying an automatic threshold to your image if the background pixels are brighter than the object pixels.
Filters
Filter your image when you need to improve the sharpness of transitions in
the image or increase the overall signal-to-noise ratio of the image. You can
choose either a lowpass or highpass filter depending on your needs.
Lowpass filters remove insignificant details by smoothing the image,
removing sharp details, and smoothing the edges between the objects
and the background. You can use imaqLowPass() or define your own lowpass filter with imaqConvolve() or imaqNthOrderFilter().
Highpass filters emphasize details, such as edges, object boundaries, or cracks. These details represent sharp transitions in intensity value. You can define your own highpass filter with imaqConvolve() or imaqNthOrderFilter(), or you can use a predefined highpass filter with imaqEdgeFilter() or imaqCannyEdgeFilter(). The imaqEdgeFilter() function allows you to find edges in an image using predefined edge detection kernels, such as the Sobel, Prewitt, and Roberts kernels.
Convolution Filter
The imaqConvolve() function allows you to use a predefined set of lowpass and highpass filters. Each filter is defined by a kernel of coefficients. Use imaqGetKernel() to retrieve predefined kernels. If the predefined kernels do not meet your needs, define your own custom filter using a 2D array of floating point numbers.
Nth Order Filter
The imaqNthOrderFilter() function allows you to define a lowpass or highpass filter depending on the value of N that you choose. One specific Nth order filter, the median filter, removes speckle noise, which appears as small black and white dots. Use imaqMedianFilter() to apply a median filter. For more information about Nth order filters, refer to Chapter 5, Image Processing, of the IMAQ Vision Concepts Manual.
Grayscale Morphology
Perform grayscale morphology when you want to filter grayscale
features of an image. Grayscale morphology helps you remove or
enhance isolated features, such as bright pixels on a dark background.
Use these transformations on a grayscale image to enhance non-distinct
features before thresholding the image in preparation for particle analysis.
Grayscale morphological transformations compare a pixel to those pixels
surrounding it. The transformation keeps the smallest pixel values when
performing an erosion or keeps the largest pixel values when performing
a dilation.
Refer to Chapter 5, Image Processing, of the IMAQ Vision Concepts Manual for more information about grayscale morphology
transformations.
Use
imaqGrayMorphology() to perform one of the following seven
transformations:
•Erosion—Reduces the brightness of pixels that are surrounded by
neighbors with a lower intensity.
•Dilation—Increases the brightness of pixels surrounded by neighbors
with a higher intensity. A dilation produces the opposite effect of an
erosion.
•Opening—Removes bright pixels isolated in dark regions and smooths
boundaries.
•Closing—Removes dark pixels isolated in bright regions and smooths
boundaries.
•Proper-opening—Removes bright pixels isolated in dark regions and
smooths the inner contours of particles.
•Proper-closing—Removes dark pixels isolated in bright regions and
smooths the inner contours of particles.
•Auto-median—Generates simpler particles that have fewer details.
FFT
Use the Fast Fourier Transform (FFT) to convert an image into its
frequency domain. In an image, details and sharp edges are associated
with mid to high spatial frequencies because they introduce significant
gray-level variations over short distances. Gradually varying patterns are
associated with low spatial frequencies.
An image can have extraneous noise, such as periodic stripes, introduced
during the digitization process. In the frequency domain, the periodic
pattern is reduced to a limited set of high spatial frequencies. Also, the
imaging setup may produce non-uniform lighting of the field of view,
which produces an image with a light drift superimposed on the
information you want to analyze. In the frequency domain, the light drift
appears as a limited set of low frequencies around the average intensity of
the image, the DC component.
You can use algorithms working in the frequency domain to isolate and
remove these unwanted frequencies from your image. Complete the
following steps to obtain an image in which the unwanted pattern has
disappeared but the overall features remain.
1. Use imaqFFT() to convert an image from the spatial domain to the
frequency domain. This function computes the FFT of the image and
results in a complex image representing the frequency information of
your image.
2. Improve your image in the frequency domain with a lowpass or
highpass frequency filter. Specify which type of filter to use with
imaqAttenuate() or imaqTruncate(). Lowpass filters smooth
noise, details, textures, and sharp edges in an image. Highpass filters
emphasize details, textures, and sharp edges in images, but they also
emphasize noise.
•Lowpass attenuation—The amount of attenuation is directly
proportional to the frequency information. At low frequencies,
there is little attenuation. As the frequencies increase, the
attenuation increases. This operation preserves all of the zero
frequency information. Zero frequency information corresponds
to the DC component of the image or the average intensity of
the image in the spatial domain.
•Highpass attenuation—The amount of attenuation is inversely
proportional to the frequency information. At high frequencies,
there is little attenuation. As the frequencies decrease, the
attenuation increases. The zero frequency component is removed
entirely.
•Lowpass truncation—Frequency components above the ideal
cutoff frequency are removed, and the frequencies below it remain
unaltered.
•Highpass truncation—Frequency components above the ideal
cutoff frequency remain unaltered, and the frequencies below it
are removed.
3. To transform your image back to the spatial domain, use
imaqInverseFFT().
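The following sketch outlines the three steps above: a forward FFT into a complex image, a placeholder for the frequency-domain filter, and the inverse FFT back to the spatial domain. The two-argument (destination, source) prototypes assumed for imaqFFT() and imaqInverseFFT(), as well as the destination image types, should be verified against the function reference; the file path is a placeholder.

#include <nivision.h>

int main(void)
{
    Image *spatial = imaqCreateImage(IMAQ_IMAGE_U8, 0);
    Image *freq    = imaqCreateImage(IMAQ_IMAGE_COMPLEX, 0);
    if (spatial == NULL || freq == NULL)
        return -1;

    /* Load the image that contains the unwanted periodic pattern;
       the path is a placeholder. */
    if (!imaqReadFile(spatial, "C:\\images\\striped.png", NULL, NULL))
        return -1;

    /* Step 1: convert to the frequency domain (assumed prototype). */
    imaqFFT(freq, spatial);

    /* Step 2: remove the unwanted frequencies here with
       imaqAttenuate() or imaqTruncate(), as described above. */

    /* Step 3: convert back to the spatial domain (assumed prototype). */
    imaqInverseFFT(spatial, freq);

    imaqDispose(freq);
    imaqDispose(spatial);
    return 0;
}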
Complex Image Operations
The imaqExtractComplexPlane() and imaqReplaceComplexPlane() functions allow you to access, process, and update independently the real and imaginary planes of a complex image. You can also convert planes of a complex image to and from 2D arrays; refer to the IMAQ Vision for LabWindows/CVI Function Reference for the corresponding functions.
This chapter describes how to take measurements from grayscale and color
images. You can make inspection decisions based on image statistics, such
as the mean intensity level in a region. Based on the image statistics, you
can perform many machine vision inspection tasks on grayscale or color
images, such as detecting the presence or absence of components, detecting
flaws in parts, and comparing a color component with a reference.
Figure 3-1 illustrates the basic steps involved in making grayscale and
color measurements.
[Figure 3-1 begins with the Define Regions of Interest step.]
Figure 3-1. Steps to Taking Grayscale and Color Measurements
Define Regions of Interest
A region of interest (ROI) is an area of an image in which you want
to focus your image analysis. You can define an ROI interactively,
programmatically, or with an image mask.
Defining Regions Interactively
You can interactively define an ROI in a window that displays an image.
Use the tools from the IMAQ Vision tools palette to interactively define and
manipulate an ROI.