All information provided in this manual is believed to be accurate and reliable. No
responsibility is assumed by Photonfocus AG for its use. Photonfocus AG reserves the right to
make changes to this information without notice.
Reproduction of this manual in whole or in part, by any means, is prohibited without prior
permission having been obtained from Photonfocus AG.
The Swiss company Photonfocus is one of the leading specialists in the development of CMOS
image sensors and corresponding industrial cameras for machine vision.
Photonfocus is dedicated to making the latest generation of CMOS technology commercially
available. Active Pixel Sensor (APS) and global shutter technologies enable high speed and
high dynamic range (120 dB) applications, while avoiding disadvantages like image lag,
blooming and smear.
Photonfocus’ product range is complemented by custom design solutions in the area of camera
electronics and CMOS image sensors.
Photonfocus is ISO 9001 certified. All products are produced with the latest techniques in order
to ensure the highest degree of quality.
Photonfocus products are available through an extensive international distribution network
and through our key account managers. Contact us via email at sales@photonfocus.com.
1.5 Further information
Photonfocus reserves the right to make changes to its products and documentation without notice. Photonfocus products are neither intended nor certified for
use in life support systems or in other critical systems. The use of Photonfocus
products in such applications is prohibited.
Photonfocus and LinLog® are registered trademarks of Photonfocus AG.
CameraLink® and GigE Vision® are registered marks of the Automated Imaging Association. Product and company names mentioned herein are trademarks
or trade names of their respective companies.
Reproduction of this manual in whole or in part, by any means, is prohibited
without prior permission having been obtained from Photonfocus AG.
Photonfocus cannot be held responsible for any technical or typographical errors.
MAN076 09/2017 V1.0
1.6 Legend
In this documentation the reader’s attention is drawn to the following icons:
Important note, additional information
Important instructions
General warning, possible component damage hazard
Warning, electric shock hazard
Warning, fire hazard
2 Introduction
2.1 Introduction
This manual describes the standard Photonfocus MV3-D640I-M01-144-G2-12 camera, which has a
Gigabit Ethernet (GigE) interface. The abbreviated name MV3-D640I-M01 is used in most parts
of this manual. The camera contains the Snake SW InGaAs sensor from Sofradir.
2.2 Camera Naming Convention
The naming convention of the MV3-D640I-M01 camera is summarized in Fig. 2.1.
Figure 2.1: Camera naming convention
Prefix1 Camera platform and usage prefix. The following prefix applies to this camera: "MV3".
Prefix2 Camera family specifier. The following specifier is used in this manual: "D".
Sensor Family Sensor family of the previously indicated manufacturer. "01": Snake series.
Camera speed The camera speed is usually the product of the camera interface clock in MHz
and the number of parallel interface channels (taps).
Interface type Interface type specification: "G2": Gigabit Ethernet (GigE Vision)®.
Interface resolution Maximal resolution (bit width) of the camera interface.
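The naming convention can be illustrated with a small parser. The following Python sketch splits the full camera name into the fields of Fig. 2.1; the field boundaries and dictionary keys are our illustrative reading of the figure, not part of the camera software:

```python
def parse_camera_name(name):
    """Decompose a camera name such as "MV3-D640I-M01-144-G2-12" into its
    dash-separated groups (illustrative reading of Fig. 2.1)."""
    prefix1, sensor, sensor_code, speed, interface, resolution = name.split("-")
    return {
        "prefix1": prefix1,                       # camera platform, e.g. "MV3"
        "prefix2": sensor[0],                     # camera family specifier, "D"
        "sensor_width": int(sensor[1:4]),         # e.g. 640
        "sensor_type": sensor[4:],                # e.g. "I" (assumed: InGaAs)
        "sensor_manufacturer": sensor_code[0],    # assumed: "M" for the manufacturer
        "sensor_family": sensor_code[1:],         # "01": Snake series
        "camera_speed": int(speed),               # interface clock x taps, in MHz
        "interface_type": interface,              # "G2": Gigabit Ethernet
        "interface_resolution": int(resolution),  # bit width, e.g. 12
    }

fields = parse_camera_name("MV3-D640I-M01-144-G2-12")
```

For the camera covered by this manual, the sketch yields a sensor width of 640 and the interface type "G2".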
2.3 Camera list
A list of all cameras covered in this manual is shown in Table 2.1.
Name                      Resolution   Frame Rate   Notes
MV3-D640I-M01-144-G2-12   640 x 512    300 fps 1)   Gigabit Ethernet SWIR standard camera.
Table 2.1: Camera models covered by this manual (Footnotes: 1) frame rate at full resolution)
3 Product Specification
3.1 Introduction
The Photonfocus MV3-D640I-M01 camera series is built around the Snake SW CMOS InGaAs image
sensor from Sofradir, which provides a resolution of 640 x 512 pixels. The cameras are aimed
at applications in industrial image processing where high sensitivity within the SWIR band from
0.9 to 1.7 µm is required.
The principal advantages are:
• Resolution of 640 x 512 pixels
• Optimized for high sensitivity, low noise and low dark current
• Spectral range: SWIR from 900 nm to 1700 nm
• Global shutter
• Gigabit Ethernet interface, GigE Vision and GenICam compliant
• Frame rate of 300 fps at maximal resolution
• I/O capabilities: 2 isolated inputs and 2 isolated outputs
• Up to 512 regions of interest (MROI)
• 2 look-up tables (12-to-8 bit) on user-defined image regions (Region-LUT)
• Crosshairs overlay on the image
• Image information and camera settings inside the image (status line)
• Image binning
• Thermoelectric cooling of the image sensor to stabilize the sensor temperature and avoid temperature drifts
• Software provided for setting and storage of camera parameters
• Rugged, compact housing (60 x 60 x 59.3 mm) with integrated TEC module, making the Photonfocus MV3-D640I-M01-G2 camera a perfect solution for applications in which space is at a premium
The general specification and features of the camera are listed in the following sections.
Figure 3.1: Photonfocus MV3-D640I-M01-144-G2 GigE interface camera
3.2 Feature Overview
The general specification and features of the camera are listed in the following sections. The
detailed description of the camera features is given in the following chapters.
Characteristics   Photonfocus MV3-D640I-M01 GigE Series
Interface         Gigabit Ethernet (PoE), GigE Vision and GenICam compliant
Table 3.3: Physical characteristics and operating ranges of the MV3-D640I-M01-G2 camera series
3.3.1 Absolute Maximum Ratings
Parameter                                                   Value
Camera Control Input Signal Voltage Single Ended            0 V ... +24 V
Camera Control Output Signal Voltage Single Ended           0 V ... +24 V
Camera Control Output Signal Output Current Single Ended    0.5 A
Camera Control Output Signal Output Power Single Ended      0.35 W
ESD Contact Discharge Camera Control Signals                4 kV
ESD Air Discharge Camera Control Signals                    8 kV
Fast Transients/Bursts Data and Camera Control Signals      1 kV
Surge Immunity Data and Camera Control Signals              1 kV
Table 3.4: Absolute Maximum Ratings
3.3.2 Electrical Characteristics
Parameter                           Value
Camera Control Input Single Ended   +5 V ... +20 V
Table 3.5: Electrical Characteristics
3.3.3 Spectral Response
Fig. 3.2 shows the quantum efficiency curve of the Snake SW sensors from Sofradir measured in
the wavelength range from 900 nm to 1800 nm.
Figure 3.2: Quantum efficiency (QE) [%] of the Snake SW image sensors
4 Image Acquisition
This chapter gives detailed information about the control of the image acquisition. It
shows how the camera can be triggered or run in free-running mode, and how the frame rate
can be configured.
The structure offers a lot of flexibility in the configuration and follows the GenICam naming convention. Typical camera configurations are included in the chapter "Use Cases" in Appendix C.
4.1 Overview
The overview shows the major camera elements which are involved in the image acquisition.
The section starts with a description of the vocabulary and terms used to explain the
acquisition-related features.
4.1.1 Vocabulary
An acquisition is composed of one or many frames. A frame is a single acquired image, which
consists of an exposure and an image readout. A burst of frames is defined as the capture of
a group of one or many frames within an acquisition. An acquisition can thus be grouped into
N single frames or N bursts of frames.
4.1.2 Structure
The camera contains a block that controls the image acquisition; it contains the sub-blocks
for acquisition control, frame control and exposure control. Furthermore, there are controller
blocks for the I/O signals, counters, timers and the software interface. All of these elements are
connected through an interconnection, which allows the image acquisition to be controlled by
digital input signals, by counters, by timers and/or by software access.
Figure 4.1: Structure
Acquisition Control The acquisition control block takes care of the acquisition function. The
camera can only capture frames, when the acquisition has been started and is active (see
Section 4.2 for more information).
Frame Control The frame control block takes care of the capturing of one or many frames and
burst of frames (see Section 4.3 for more information).
Exposure Control The exposure control block takes care of the exposure time of a frame (see
Section 4.4 for more information).
Counter The camera has four independent counters. They count events from a selectable
source (see Section 5.1 for more information about counter configuration and usage).
Timer The camera has four independent timers. A timer delay and duration are configurable
and the timers are triggered by a selectable source (see Section 5.2 for more information
about timer configuration and usage).
I/O Control The I/O control unit manages the physical camera inputs and outputs and the LED. A
switch matrix within this block allows connecting internal status signals to the output
lines or the LED. Status signals can come from the acquisition, frame or exposure control
block, or from a timer or counter; input lines can even be routed directly to an output. The
input lines can be used to control the acquisition, frame and/or exposure, or to start a
timer or count events with a counter (see Chapter 6 for more information).
Software Signal Pulse and User Output The camera has user outputs, which can be set to 1 or
0 by software access, and a software signal pulse generator block (see Section 4.7). These
user outputs and signal pulses can be used to control camera functions by software
access, such as acquisition and frame capture or counters and timers.
Signal Routing All these elements are connected to each other by the signal routing block. The
following sections show which signals are available and how they can be used in the
other blocks.
4.1.3 Image Acquisition, Frame and Exposure Control Parameters
Mainly, the following commands and settings are involved in controlling and configuring the
camera acquisition and frame capturing:
•Acquisition Start and Stop Command
•Acquisition Mode (Single Frame, Multi Frame, Continuous Frame)
•Acquisition Frame Count
•Acquisition Frame Burst Count
•Acquisition Frame Rate
•Exposure Mode (Timed, Trigger Controlled)
•Exposure Time
This list shows only an overview of the available parameters. Their function and usage are
explained in the following chapters.
4.1.4 Image Acquisition, Frame and Exposure Trigger
The camera can run in "free-running" mode, which means that it captures images
automatically at full frame rate once the acquisition has been started. However, the acquisition
and image capturing can also be controlled by triggers. For this purpose, there are seven triggers
available:
•Acquisition Start Trigger
•Acquisition End Trigger
•Frame Start Trigger
•Frame Burst Start Trigger
•Frame Burst End Trigger
•Exposure Start Trigger
•Exposure End Trigger
The source of these triggers can be set for every trigger individually; it can come from an
external line, from an internally generated pulse from the counters or timers, or from a pulse
generated by a software command. Each of these triggers can be switched on or off
individually. The camera generates the trigger pulses internally for triggers which are switched
off. The following sections (Acquisition, Frame & Exposure Control) show the usage of these
triggers. Each trigger has its own source signal processing path. Section 4.6 gives more
information about the configuration.
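The per-trigger source and mode selection can be pictured as a lookup table. The following sketch uses illustrative placeholder names for triggers and sources (not the camera's actual GenICam node names):

```python
# Each of the seven triggers has its own source selection and on/off mode.
# Trigger and source names below are illustrative placeholders only.
trigger_config = {
    "AcquisitionStart": {"source": "Line1",    "mode": "On"},
    "FrameStart":       {"source": "Timer0",   "mode": "On"},
    "ExposureEnd":      {"source": "Software", "mode": "Off"},
}

def is_externally_triggered(config, trigger):
    """A trigger that is switched off is generated internally by the camera;
    only a trigger switched on reacts to its selected source."""
    return config.get(trigger, {}).get("mode") == "On"
```

With this configuration, the frame start trigger reacts to its selected source, while the exposure end trigger (mode off) is generated internally by the camera.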
4.1.5 Image Acquisition, Frame and Exposure Status
The following list shows the acquisition, frame and exposure related status signals:
•Acquisition Trigger Wait
•Acquisition Active
•Frame Trigger Wait
•Frame Active
•Exposure Active
These status signals are used within the camera to control the camera timing. The current state
of these signals can be read out by software. Furthermore, they can be connected to an output
line or the LED through the I/O control block (see Chapter 6), which allows the status to be
tracked from an external device. The timing of these signals is explained in the following sections.
4.2 Acquisition Control
4.2.1 Acquisition Start and Stop Commands, Acquisition Mode and Acquisition Frame Count
The camera can only capture frames when the acquisition is started. The acquisition start
command, which is executed by the software, starts the acquisition and prepares the camera to
acquire frames. The acquisition is stopped when the acquisition stop command is executed or,
depending on the acquisition mode parameter, when a certain number of frames has been captured.
The following acquisition mode parameters are available:
Figure 4.2: Acquisition Control Block
Acquisition Mode = Single Frame When the acquisition is started, the camera stops the
acquisition automatically as soon as one frame has been captured. To capture another
frame, the acquisition start command needs to be executed again.
Acquisition Mode = Multi Frame When the acquisition is started, the camera stops the
acquisition automatically after a certain number of frames, which is defined by the
acquisition frame count parameter, or when the acquisition stop command is executed.
Acquisition Mode = Continuous Frame When the acquisition is started, the camera captures
images continuously until the acquisition stop command is executed.
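The three acquisition modes can be summarized in a small model. The following sketch counts the frames captured before the acquisition ends; it is an illustrative simulation, not camera code:

```python
def run_acquisition(mode, frame_count=1, stop_after=None):
    """Return the number of frames captured before the acquisition ends
    (illustrative model of Section 4.2.1).
    'stop_after' models an acquisition stop command arriving after that
    many frames; ContinuousFrame requires it, otherwise capture never ends."""
    captured = 0
    while True:
        captured += 1
        if mode == "SingleFrame":
            return captured                 # stops after exactly one frame
        if mode == "MultiFrame" and captured == frame_count:
            return captured                 # stops after the frame count
        if stop_after is not None and captured == stop_after:
            return captured                 # acquisition stop command

print(run_acquisition("SingleFrame"))                    # 1
print(run_acquisition("MultiFrame", frame_count=5))      # 5
print(run_acquisition("ContinuousFrame", stop_after=9))  # 9
```

Note that in multi frame mode a stop command can still end the acquisition early, before the configured frame count is reached.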
4.2.2 Acquisition Frame Rate and Acquisition Frame Rate Enable
With frame start trigger mode=off (see Section 4.3.1) and frame burst start trigger mode=off
(see Section 4.3.2), and when exposure mode is set to "timed" (see Section 4.4.1), the camera
generates frame start triggers internally according to the acquisition frame rate enable and
acquisition frame rate configuration:
Acquisition Frame Rate Enable = off The camera runs in "free-running" mode. This means
frame start triggers are generated internally as soon as the camera is ready to start a new
image capture. The frame rate is defined by the exposure time and the ROI settings (see
Chapter 8).
Acquisition Frame Rate Enable = on The camera generates frame start triggers internally
according to the acquisition frame rate configuration. See Chapter 8 for more
information about the range of supported frame rates.
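The interaction of the two settings can be modeled as follows. The clamping of the configured rate to the free-running maximum is an assumption for illustration; see Chapter 8 for the actually supported frame rate range:

```python
def effective_frame_rate(max_free_run_fps, rate_enable, configured_fps=0.0):
    """Frame rate generated by the camera (illustrative model of Section 4.2.2).
    rate_enable off: free-running at the maximum rate allowed by exposure
    time and ROI. rate_enable on: the configured rate applies; we assume it
    cannot exceed the free-running maximum."""
    if not rate_enable:
        return max_free_run_fps
    return min(configured_fps, max_free_run_fps)

# Example with the 300 fps full-resolution maximum from Table 2.1:
print(effective_frame_rate(300.0, rate_enable=True, configured_fps=100.0))  # 100.0
```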
Fig. 4.3 shows the procedure of a capture of N frames. The modes of the acquisition start trigger,
frame start trigger and frame burst start trigger are set to off and the exposure mode is timed.
Once an acquisition start command has been executed, the camera starts capturing frames
until the acquisition stop command is received.
Figure 4.3: Free-running Image Capture, when Acquisition Start and End Trigger Mode is Off
4.2.3 Acquisition Start Trigger
The acquisition start trigger can be used to control the acquisition start procedure. The main
property of this trigger is the trigger mode. It can be set to on or off:
Acquisition Start Trigger Mode = on As soon as the acquisition is started by executing the
acquisition start command, the camera goes into the state "Acquisition Trigger Wait". In
this state, the camera can’t start capturing images; it waits for an acquisition start trigger.
As soon as the trigger has been received, the camera goes into the state
"Acquisition Active" and is ready to capture frames.
Acquisition Start Trigger Mode = off As soon as the acquisition is started by executing the
acquisition start command, the camera goes immediately into the state "Acquisition
Active" and is ready to capture frames.
Fig. 4.4 shows an example where the trigger mode of the acquisition start trigger is on. The
acquisition status goes to acquisition trigger wait once an acquisition start command has been
executed. As soon as an acquisition start trigger has arrived, the acquisition status goes to
acquisition active.
Figure 4.4: Free-running Image Capture, when Acquisition Start and End Trigger Mode is On
4.2.4 Acquisition End Trigger
The acquisition end trigger can be used to control the acquisition end procedure. The main
property of this trigger is the trigger mode. It can be set to on or off:
Acquisition End Trigger Mode = on When the acquisition status is "Acquisition Active", it goes
to "Acquisition Trigger Wait" as soon as an acquisition end trigger has been received. The
camera stops capturing images and waits until an acquisition start trigger has been issued
again.
Acquisition End Trigger Mode = off The acquisition end triggers are ignored and the
acquisition status remains active until an acquisition stop command has been executed or
a certain number of frames has been captured, depending on the acquisition mode
parameter (see Section 4.2.1).
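The acquisition start and end trigger behaviour forms a simple state machine. The following sketch models the states described above; the class, state names and method names are illustrative, not camera software:

```python
class AcquisitionControl:
    """Illustrative state machine for Sections 4.2.3 and 4.2.4."""

    def __init__(self, start_trigger_on, end_trigger_on):
        self.start_trigger_on = start_trigger_on
        self.end_trigger_on = end_trigger_on
        self.state = "Idle"

    def acquisition_start(self):
        # AcquisitionStart() command: wait for a trigger, or go active directly.
        self.state = "TriggerWait" if self.start_trigger_on else "Active"

    def start_trigger(self):
        # An acquisition start trigger pulse is only honoured while waiting.
        if self.start_trigger_on and self.state == "TriggerWait":
            self.state = "Active"

    def end_trigger(self):
        # An acquisition end trigger returns the camera to the wait state.
        if self.end_trigger_on and self.state == "Active":
            self.state = "TriggerWait"

    def acquisition_stop(self):
        # AcquisitionStop() command ends the acquisition entirely.
        self.state = "Idle"

ctrl = AcquisitionControl(start_trigger_on=True, end_trigger_on=True)
ctrl.acquisition_start()   # -> "TriggerWait": no frames captured yet
ctrl.start_trigger()       # -> "Active": ready to capture frames
```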
4.2.5 Acquisition Control Output Signals
The acquisition control block has the following output signals:
Acquisition Trigger Wait An asserted acquisition trigger wait indicates that the acquisition
control is waiting for an acquisition start trigger (see Section 4.2.3).
Acquisition Active Acquisition active is a status signal which is asserted when the acquisition
has been started and deasserted when the acquisition is stopped.
Acquisition Start The acquisition start is an event which is generated when the acquisition is
started.
Acquisition End The acquisition end is an event which is generated when the acquisition is
stopped.
Acquisition Trigger The acquisition trigger is an event which is generated when the
acquisition is started by the acquisition start trigger (see Section 4.2.3).
The acquisition trigger wait and acquisition active status signals are routed to the I/O control
block, where they can be selected for output on the physical output line or on one of the
available LEDs (see Chapter 6). The acquisition start, acquisition end and acquisition trigger
events can be used to trigger or to control other functions in the camera, such as counters or timers.
4.3 Frame Control
Figure 4.5: Frame Control Block
As soon as the acquisition is active, the camera is ready to capture frames, which is controlled by
the frame control block. The behaviour depends on the frame start trigger, frame burst start
trigger and frame burst end trigger configuration.
4.3.1 Frame Start Trigger
The frame start trigger can be used to start a single frame capture. The main property of this
trigger is the trigger mode. It can be set to on or off:
Frame Start Trigger Mode = on As soon as a frame start trigger has been received, a capture of
one frame is started and the frame status goes to "Frame Active". Once the frame
has been processed, the camera status goes to "Frame Trigger Wait" again. The camera
is ready to process frame start triggers when the acquisition is active, indicated by the
"Acquisition Active" status, and when the "Frame Trigger Wait" status is asserted.
Frame Start Trigger Mode = off The frame start triggers are ignored.
The camera runs in free-running mode when the mode of both triggers, the
frame start trigger and frame burst start trigger, is set to off and the exposure
mode is set to "Timed" (see Section 4.4.1 for more information about the exposure mode).
Fig. 4.6 shows the procedure of a single frame capture started with a frame start trigger. The
acquisition start trigger mode is off, so the camera waits for a frame start trigger once the
acquisition command has been executed. Each received frame start trigger starts the capture of
one single frame.
Figure 4.6: Triggered Image Acquisition of single Frames
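The single frame trigger behaviour can be modeled by checking, for each frame start trigger, whether the camera is in "Frame Trigger Wait". The frame duration parameter below is an illustrative stand-in for the exposure plus readout time of one frame:

```python
def frames_from_triggers(trigger_times_ms, frame_duration_ms):
    """Count frames started by frame start triggers (illustrative sketch of
    Section 4.3.1). A trigger starts a frame only while the camera is in
    "Frame Trigger Wait"; triggers arriving while a frame is still active
    are ignored."""
    frames = 0
    busy_until = float("-inf")   # time until which the current frame is active
    for t in trigger_times_ms:
        if t >= busy_until:      # camera is in "Frame Trigger Wait"
            frames += 1
            busy_until = t + frame_duration_ms
    return frames

# Triggers at 0, 1 and 2 ms with a 5 ms frame: only the first starts a frame,
# then the trigger at 10 ms starts a second one.
print(frames_from_triggers([0, 1, 2, 10], frame_duration_ms=5))  # 2
```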
4.3.2 Frame Burst Start Trigger
The frame burst start trigger can be used to start a burst of frame capture. The main property
of this trigger is the trigger mode. It can be set to on or off:
Frame Burst Start Trigger Mode = on As soon as a frame burst start trigger has been received,
a capture of a burst of frames is started and the frame status goes to "Frame
Active". The number of frames is defined by the "Acquisition Burst Frame Count" value.
The camera goes to "Frame Trigger Wait" again when the configured number of burst
frames has been captured. The camera is ready to process a frame burst start trigger
when the acquisition is active, indicated by the "Acquisition Active" status, and when the
"Frame Trigger Wait" status is asserted. The frame rate of a burst sequence is configured
by the acquisition frame rate value and the acquisition frame rate enable configuration
(see Section 4.2.2 for more information).
Frame Burst Start Trigger Mode = off The frame burst start triggers are ignored. No capturing
of bursts of frames is started.
It is possible to set the modes of both the frame start trigger and the frame burst
start trigger to on. In this case, the camera starts a single frame or a burst of frames
capture, depending on which trigger arrives first.
Fig. 4.7 shows the procedure of a burst of frames and a single frame capture. The acquisition
start and end trigger modes are on, so the camera waits for an acquisition trigger once the
acquisition command has been executed. During this period any frame burst start or frame
start triggers are ignored. As soon as an acquisition start trigger has been received, the camera
waits for frame triggers; this can be a frame start trigger or a frame burst start trigger.
Depending on which trigger arrives first, the camera starts a single frame or a burst of frames
acquisition.
Figure 4.7: Triggered Acquisition of a Burst of Frame and of a single Frame
4.3.3 Frame Burst End Trigger
The frame burst end trigger can be used to abort a currently running burst capture cycle. The
main property of this trigger is the trigger mode. It can be set to on or off:
Frame Burst End Trigger Mode = on The frame burst end trigger is processed only when a burst
acquisition cycle is active. Once this trigger has been received, the camera waits until the current
frame has been processed and then aborts the burst cycle. The camera status goes to
"Frame Trigger Wait" again.
Frame Burst End Trigger Mode = off The frame burst end triggers are ignored.
The value "Acquisition Burst Frame Count" is ignored when the mode of the
frame burst end trigger is set to on. This means that once a burst capture has been
started, the camera captures frames until a frame burst end trigger arrives.
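The burst length rules of Sections 4.3.2 and 4.3.3 can be summarized in a small model; the function and its parameters are illustrative:

```python
def burst_length(acquisition_burst_frame_count, end_trigger_on,
                 frames_before_end_trigger=None):
    """Number of frames captured in one burst cycle (illustrative sketch).
    With the frame burst end trigger off, the camera captures the configured
    "Acquisition Burst Frame Count" frames. With it on, that count is
    ignored and capture continues until the end trigger arrives."""
    if not end_trigger_on:
        return acquisition_burst_frame_count
    if frames_before_end_trigger is None:
        raise ValueError("with the end trigger on, the burst runs until "
                         "a frame burst end trigger arrives")
    return frames_before_end_trigger

print(burst_length(8, end_trigger_on=False))                            # 8
print(burst_length(8, end_trigger_on=True, frames_before_end_trigger=3))  # 3
```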
4.3.4 Frame Control Output Signals
The frame control block has the following output signals:
Frame Trigger Wait An asserted frame trigger wait indicates that the frame control is waiting
for a frame start trigger (see Section 4.3.1), a frame burst start trigger (see Section 4.3.2) or
an exposure start trigger (see Section 4.4.2).
Frame Active An asserted frame active signal indicates that one or more frames are being
captured.
Frame Start The frame start is an event which is generated when a new frame starts.
Frame End The frame end is an event which is generated when a frame is finished.
Frame Burst Start The frame burst start is an event which is generated when a new burst of
frames starts.
Frame Burst End The frame burst end is an event which is generated when a burst of frames is
finished.
Frame Trigger The frame trigger is an event which is generated when a frame is started by a
frame start trigger (see Section 4.3.1), by a frame burst start trigger (see Section 4.3.2) or
by an exposure start trigger (see Section 4.4.2).
The frame trigger wait and frame active status signals are routed to the I/O control block, where
they can be selected for output on the physical output line or on one of the available LEDs. The
frame start, frame end and frame trigger events can be used to trigger or to control other
functions in the camera, such as counters and timers.
4.4 Exposure Control
Figure 4.8: Exposure Control Block
A frame consists of an exposure cycle and an image readout. The exposure control block takes
care of the exposure cycle. The camera has two modes of exposure time operation, which are
defined by the exposure mode settings:
•Timed Exposure
•Trigger Controlled Exposure
4.4.1 Exposure Mode
The exposure mode configuration defines whether the exposure time is controlled by the
exposure time register or by triggers. The following configurations are available:
Timed Exposure The exposure time is defined by the exposure time register. This value
determines the exposure time for each frame.
Trigger Controlled Exposure The exposure time is controlled by exposure start and exposure
end triggers (see Section 4.4.2 and Section 4.4.3 for more information).
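The two exposure modes can be expressed as a small model of the effective exposure time; the function, its parameter names and the microsecond unit are illustrative:

```python
def exposure_time_us(exposure_mode, exposure_time_reg_us=0.0,
                     start_trigger_us=None, end_trigger_us=None):
    """Effective exposure time of one frame (illustrative model of 4.4.1).
    "Timed": taken from the exposure time register.
    "TriggerControlled": spans from the exposure start trigger to the
    exposure end trigger."""
    if exposure_mode == "Timed":
        return exposure_time_reg_us
    if exposure_mode == "TriggerControlled":
        return end_trigger_us - start_trigger_us
    raise ValueError("unknown exposure mode: %s" % exposure_mode)

print(exposure_time_us("Timed", exposure_time_reg_us=250.0))            # 250.0
print(exposure_time_us("TriggerControlled",
                       start_trigger_us=100.0, end_trigger_us=600.0))   # 500.0
```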
4.4.2 Exposure Start Trigger
The exposure start trigger is used to start an exposure.
Exposure Start Trigger Mode = on As soon as an exposure start trigger has been received, the
camera starts the exposure cycle, which is indicated by the "Exposure Active" status.
The exposure remains active until an exposure end trigger has been received (see Section 4.4.3).
The camera is ready to process exposure start triggers when the acquisition is active,
indicated by the "Acquisition Active" status, and when the "Frame Trigger Wait" status is
asserted.
Exposure Start Trigger Mode = off The exposure start triggers are ignored.
The exposure start trigger is only available when the exposure mode is set to
"trigger controlled". The camera automatically sets the mode of this trigger to on when
the exposure mode is set to "trigger controlled", and to off when the exposure mode is
set to "timed".
4.4.3 Exposure End Trigger
The exposure end trigger is used to terminate an active exposure cycle.
Exposure End Trigger Mode = on An active exposure cycle is terminated and the
image readout is started as soon as an exposure end trigger has been received. The
"Exposure Active" status goes to inactive. Exposure end triggers are only processed when
an exposure start trigger has previously started a trigger controlled exposure cycle.
Exposure End Trigger Mode = off The exposure end triggers are ignored.
The exposure end trigger is only available when the exposure mode is set to
"trigger controlled". The camera automatically sets the mode of this trigger to on when
the exposure mode is set to "trigger controlled", and to off when the exposure mode is
set to "timed".
Fig. 4.9 shows the procedure of the trigger controlled exposure mode. The acquisition start
trigger mode is off, so the camera goes directly into the status "Frame Trigger Wait" once the
acquisition start command has been executed. An exposure of a new frame is started as soon
as an exposure start trigger has been received. The exposure ends with an exposure end trigger.
Figure 4.9: Trigger Controlled Exposure Mode
4.4.4 Exposure Control Output Signals
The frame control block has the following output signals:
Exposure Active An asserted exposure active signal indicates that the exposure time of the
current frame is active.
28 of 121MAN076 09/2017 V1.0
Exposure Start The exposure start is an event which is generated when a new exposure time
has been started.
Exposure End The exposure end is an event which is generated when a currently running
exposure time is finished.
The exposure active status signal is routed to the I/O control block, where it can be selected
for output on the physical output line or on one of the available LEDs. The exposure start and
exposure end events can be used to trigger or to control other functions in the camera, such as
acquisition, frame and exposure control or the counters and timers.
4.5 Overlapped Image Acquisition Timing
The camera is able to perform overlapped image acquisition: a new exposure can be started
during the read out of the previous image. Fig. 4.10 shows an image acquisition procedure
when images are captured in overlapped mode. The "Frame Active" status remains active
during the acquisition of the three frames since there is no gap between two frames. "Frame
Trigger Wait" is asserted during the frame read out in order to indicate that a new exposure
can be started.
All examples in the previous sections show a non-overlapping frame timing, in which a new
frame is always started after the read out of the previous image has finished. All these
examples also work in overlapped mode; the diagrams omit the overlap only in order to keep
them simple.
The overlapped image acquisition is not available when the trigger controlled
exposure mode is configured.
Figure 4.10: Overlapped Image acquisition Timing
There is one restriction on overlapped image acquisition: the exposure time of the current
frame must not end before the end of the read out of the previous frame. Two different
timing situations need to be distinguished:
Exposure Time > Read Out Time A new exposure cycle can be started as soon as the read out
of the previous frame has been started (see Fig. 4.11).
Exposure Time < Read Out Time The start of a new exposure is delayed by a certain amount
of time in order to ensure that the exposure does not end before the end of the read out
of the previous frame (see Fig. 4.12).
The camera adjusts the timing automatically and ensures that this restriction is met. Frame
start or frame burst start triggers which arrive too early and would violate the overlapping
restriction are ignored by the camera and indicated by a missed trigger event, which can be
counted by a counter (see Section 5.1.5 for more information on how to count missed
triggers). Fig. 4.11 and Fig. 4.12 show both timing situations: the exposure time longer than
the read out time, and the exposure time shorter than the read out time.
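Under the assumption that a new exposure may start as soon as the previous read out starts, the delay the camera inserts in the second case can be estimated as follows. This is illustrative only; the camera performs this correction internally:

```python
def exposure_start_delay(exposure_time_us, readout_time_us):
    """Estimated delay before a new exposure in overlapped mode, so that
    the exposure does not end before the previous read out ends.
    Assumes the earliest possible start is the read out start; the real
    camera applies its own (automatic) correction."""
    return max(0, readout_time_us - exposure_time_us)

print(exposure_start_delay(5000, 2000))  # exposure > read out: 0
print(exposure_start_delay(1000, 2000))  # exposure < read out: 1000
```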
Figure 4.11: Overlapped Image acquisition when exposure time > read out time
Figure 4.12: Overlapped Image acquisition when exposure time < read out time
4.6 Acquisition-, Frame- and Exposure-Trigger Configuration
The acquisition, frame and exposure timing can be controlled by 7 triggers:
•Acquisition Start Trigger
•Acquisition End Trigger
•Frame Start Trigger
•Frame Burst Start Trigger
•Frame Burst End Trigger
•Exposure Start Trigger
•Exposure End Trigger
Fig. 4.13 shows the signal path which is available for each of the 7 triggers. It contains:
•Trigger Source Selection
•Trigger Software
•Trigger Mode
•Trigger Activation
•Trigger Divider
•Trigger Delay
Figure 4.13: Trigger Path
4.6.1 Trigger Source Selection
The user can select the source which is used to generate the corresponding trigger. The source
can be an external line input, an internal pulse generated by a counter or timer, or a pulse
generated by software. The following signal sources are available:
Line In A trigger is generated by a line input signal. The activation configuration defines
whether the rising edge, the falling edge or both edges are taken into account (see
Section 4.6.4).
Software A trigger is generated by the local trigger software command register (see Section
4.6.2 for more information).
Software Signal Pulse A trigger is generated by a software signal pulse, which comes from a
common software signal pulse register (see Section 4.7 for more information).
Counter Start A trigger signal is generated by the counter start event (see Section 5.1.3 for
more information about the counter start event).
Counter End A trigger signal is generated by the counter end event (see Section 5.1.3 for more
information about the counter end event).
Timer Start A trigger signal is generated by the timer start event (see Section 5.2.3 for more
information about the timer start event).
Timer End A trigger signal is generated by the timer end event (see Section 5.2.3 for more
information about the timer end event).
4.6.2 Trigger Software
The trigger software is a software command register, which is available in the signal path of
each trigger. Accessing this register generates an internal trigger, which is then processed in
the signal path.
The trigger source must be set to Software in order for a trigger software command
to be processed.
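A typical host-side sequence for firing a trigger by software might look like the sketch below. The feature names follow this manual; the nodemap object and its set/execute methods are placeholders for whatever GenICam SDK is in use (a mock is included so the sketch is self-contained):

```python
# Sketch of issuing a frame start trigger by software. Feature names
# follow this manual; the nodemap API is a placeholder, not a real SDK.

def fire_software_frame_trigger(nodemap):
    # The trigger source must be Software, otherwise the command is ignored.
    nodemap.set("FrameStartTriggerSource", "Software")
    nodemap.set("FrameStartTriggerMode", "On")
    nodemap.execute("FrameStartTriggerSoftware")

class MockNodeMap:
    """Stand-in for an SDK node map, for illustration only."""
    def __init__(self):
        self.values = {}
        self.executed = []
    def set(self, name, value):
        self.values[name] = value
    def execute(self, name):
        # A real SDK would write the trigger software command register here.
        self.executed.append(name)

nm = MockNodeMap()
fire_software_frame_trigger(nm)
print(nm.executed)  # ['FrameStartTriggerSoftware']
```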
4.6.3 Trigger Mode
The trigger mode defines whether the trigger is active. The following settings are available:
•On
•Off
4.6.4 Trigger Activation
The trigger activation defines which edge of the selected trigger signal is processed in the
trigger signal path. The following configurations are available:
•Rising Edge
•Falling Edge
•Both Edges
The trigger activation configuration is only available when the trigger source is
set to Line In.
4.6.5 Trigger Divider
The trigger divider specifies a division factor for the incoming trigger pulses. A division factor
of 1 processes every incoming trigger, a division factor of 2 processes every second trigger,
and so on.
4.6.6 Trigger Delay
The trigger delay lets the user specify a delay that is applied between the reception of a
trigger event and the moment the trigger becomes active.
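The combined effect of the divider and the delay on a train of incoming trigger pulses can be illustrated with a few lines of code. Timestamps are in microseconds; this is a behavioural sketch, not camera code:

```python
def processed_triggers(pulse_times_us, divider=1, delay_us=0):
    """Apply the trigger divider and trigger delay described above to a
    list of incoming trigger timestamps (microseconds). Sketch only."""
    kept = pulse_times_us[::divider]       # divider N keeps every Nth pulse
    return [t + delay_us for t in kept]    # delay shifts each kept pulse

print(processed_triggers([0, 100, 200, 300], divider=2, delay_us=10))
# [10, 210]
```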
4.7 Software Signal Pulse and User Output
The software signal pulse block contains eight general purpose registers which allow internal
pulse signals to be generated by software access. These pulse signals are internally connected
to the following functions, where they can be used to start a procedure:
•Acquisition Start Trigger
•Acquisition End Trigger
•Frame Start Trigger
•Frame Burst Start Trigger
•Frame Burst End Trigger
•Exposure Start Trigger
•Exposure End Trigger
•Counter Trigger
•Counter Event
•Counter Reset
•Timer Trigger
The user output block is an eight bit status register. The bits can be set to 0 or 1 by software
and are available in the following firmware blocks/functions:
•Counter Trigger
•Counter Reset
•LED S0, S1 & S2
•Line Out
5 Counter & Timer
5.1 Counter
Figure 5.1: Counter Structure
Four general purpose counters are available (Counter0 ... Counter3), which are used for
counting events. Each counter can be individually configured by software and controlled by
the counter trigger, counter event and counter reset signals. This section describes the
configuration and the function of the counters.
Fig. 5.1 shows the structure of one counter. Along with the counter function itself it contains
a counter trigger source, a counter event source and a counter reset source selection and
activation block. Furthermore there is a counter active status signal, which is available in the
I/O control block (see Chapter 6), and the counter generates counter start and counter end
events, which can be used to trigger other blocks, such as counter, timer, acquisition, frame or
exposure control.
5.1.1 Counter Usage
In basic usage, at least a counter event source needs to be selected (see Section 5.1.5 for more
information about the counter event source selection). Once an event source has been
selected, the counter needs to be reset in order to start it. When started, it counts from a
defined start value, which is configurable by the CounterStartValue property, and stops
counting events after a certain number of events, which is configurable by the
CounterDuration property.
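The start value/duration behaviour can be captured in a toy model. The property names follow this manual; the class itself is only a sketch of the described behaviour:

```python
# Toy model: after a reset the counter counts from CounterStartValue and
# stops after CounterDuration events. Property names follow the manual.

class Counter:
    def __init__(self, start_value=0, duration=5):
        self.start_value = start_value   # CounterStartValue
        self.duration = duration         # CounterDuration
        self.value = start_value         # CounterValue
        self.status = "Idle"             # CounterStatus

    def reset(self):
        self.value = self.start_value
        self.status = "Active"

    def event(self):
        if self.status != "Active":
            return                       # events ignored unless active
        self.value += 1
        if self.value - self.start_value >= self.duration:
            self.status = "Completed"

c = Counter(start_value=10, duration=3)
c.reset()
for _ in range(5):                       # extra events are ignored
    c.event()
print(c.value, c.status)                 # 13 Completed
```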
If additionally a trigger source is selected, the counter waits for a trigger and ignores counter
events until a valid trigger event has been received. A valid trigger signal on the selected
trigger source starts the counter, which then counts the predefined number of events from
the start value according to the counter start value and duration configuration.
The current counter value can be read with the CounterValue property.
5.1.2 Counter Status
The counter has different states, which depend on the configuration and usage. The current
state can be read out via the CounterStatus register. The following list shows the available
states:
Counter Idle The counter is idle and doesn't count any events.
Counter Trigger Wait As soon as a counter trigger source is selected (counter trigger source !=
Off), the counter goes into the counter trigger wait state and waits for a trigger signal
on the selected source. In this state, the counter doesn't count any events.
Counter Active The counter is active and counts every event. The counter can be temporarily
stopped by setting the counter event source to Off.
Counter Completed The counter has stopped because it reached its programmed duration.
Counter Overflow The counter has reached its limit of 2^32-1 and an additional counter
event signal has been received. Once the counter is in the overflow state, it has to be
reset by software or by the counter reset signal in order to recover from this state.
5.1.3 Counter Active, -Start and -End Signal
Each counter has the following output signals:
Counter Active An asserted counter active signal indicates that the counter is active and is
counting events. The counter active signal is internally routed to the I/O control block,
where it can be selected for output on the physical output line or on one of the
available LEDs.
Counter Start The counter start is an event which is generated when the counter goes into
the counter active state or is restarted during the counter active period.
Counter End The counter end is an event which is generated when the counter reaches its
configured end condition and changes to the counter completed state.
The counter start and end events can be used to trigger or to control other functions in the
camera, such as acquisition, frame and exposure control or the counters and timers. They can
also be used to cascade counters in order to get a bigger counting range.
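As a sketch of such a cascade: if counter1 counts the end events of counter0, and counter0 resets itself on its own end event, the total number of counted events follows from both counter values. The wiring is the one described in this chapter; the arithmetic below is an illustration, not camera firmware:

```python
def cascaded_count(c0_value, c1_value, c0_duration):
    """Total events seen by a two-stage cascade: counter1 counts
    counter0's end events, counter0 resets itself on its own end.
    c0_value and c1_value would be read via the CounterValue property;
    c0_duration is counter0's CounterDuration."""
    # Each counter1 increment represents one full counter0 cycle.
    return c1_value * c0_duration + c0_value

# 3 complete counter0 cycles of 1000 events plus 7 residual events:
print(cascaded_count(c0_value=7, c1_value=3, c0_duration=1000))  # 3007
```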
5.1.4 Counter Reset
There are two ways to reset a counter: either by a software reset command or by a hardware
reset source signal. The reset behaviour depends on the current counter event and trigger
source configuration:
Counter Trigger Source = Off and Counter Event Source = Off The state of the counter
changes to "idle".
Counter Trigger Source = Off and Counter Event Source != Off The state of the counter
changes to "active".
Counter Trigger Source != Off The state of the counter changes to "trigger wait".
The counter reset source can be set to the counter end signal of the same
counter. This allows the counter to be restarted automatically as soon as it
reaches its end condition.
The counter value at the time the reset is performed is stored in the
CounterValueAtReset property, which can be read out by software.
5.1.5 Counter Event Source
The counter event source selects the signal event which is used to increment the counter.
The following signal sources are available:
Off The counter is idle or has been stopped temporarily and doesn't count any events.
Acquisition Trigger Counts the number of acquisition triggers.
Acquisition Start Counts the number of acquisition start events.
Acquisition End Counts the number of acquisition end events.
Frame Trigger Counts the number of frame triggers.
Frame Start Counts the number of frame start events.
Frame End Counts the number of frame end events.
Frame Burst Start Counts the number of frame burst start events.
Frame Burst End Counts the number of frame burst end events.
Exposure Start Counts the number of exposure start events.
Exposure End Counts the number of exposure end events.
Counter 0 ... 3 Start Counts the number of the chosen counter start events.
Counter 0 ... 3 End Counts the number of the chosen counter end events.
Timer 0 ... 3 Start Counts the number of the chosen timer start events.
Timer 0 ... 3 End Counts the number of the chosen timer end events.
Software Signal Pulse 0 ... 7 Counts the number of the chosen software signal pulse events
(see Section 4.7).
Line Input Counts the number of transitions on the line input according to the signal input
activation configuration.
Missed Trigger Counts the number of missed triggers. A missed trigger is generated when a
new trigger (frame start, frame burst start or exposure start) is received while the frame
control block is busy and not ready to process it.
Time Stamp Tick Counts the number of time stamp ticks. A time stamp tick generator is
available for every counter. It generates events at a configurable rate. For instance, if the
rate is set to 1 us and the counter event source is set to count time stamp ticks, the
counter increments every 1 us.
The line input source additionally has an activation configuration, which needs to be set
accordingly. The following configuration values are available:
Rising Edge The counter counts rising edges on the selected line input.
Falling Edge The counter counts falling edges on the selected line input.
Both Edges The counter counts both edges on the selected line input.
The activation configuration only has an effect when the line input is selected as
the counter event source; otherwise this configuration is ignored.
The counter can be temporarily switched off by setting the counter event source
to Off. It continues counting events as soon as a counter event source has been
selected again.
5.1.6 Counter Trigger Source
The counter trigger source selects the signal which is used to start the counter. The following
signal sources are available:
Off Disables the counter trigger.
Acquisition Trigger Starts with the reception of the acquisition trigger event.
Acquisition Start Starts with the reception of the acquisition start event.
Acquisition End Starts with the reception of the acquisition end event.
Frame Trigger Starts with the reception of the frame trigger event.
Frame Start Starts with the reception of the frame start event.
Frame End Starts with the reception of the frame end event.
Frame Burst Start Starts with the reception of the frame burst start event.
Frame Burst End Starts with the reception of the frame burst end event.
Exposure Start Starts with the reception of the exposure start event.
Exposure End Starts with the reception of the exposure end event.
User Output 0 ... 7 Starts and counts events as long as the selected user output bit is asserted.
When the counter is started, it ignores counter events as long as the corresponding user
output bit is deasserted (see Section 4.7).
Counter 0 ... 3 Start Starts with the reception of the chosen counter start event.
Counter 0 ... 3 End Starts with the reception of the chosen counter end event.
Timer 0 ... 3 Start Starts with the reception of the chosen timer start event.
Timer 0 ... 3 End Starts with the reception of the chosen timer end event.
Software Signal Pulse 0 ... 7 Starts with the reception of the chosen software signal pulse
event (see Section 4.7).
Line Input Starts when the specified counter trigger activation condition is met on the chosen
line.
The line input source additionally has an activation configuration, which needs to be set
accordingly. The following configuration values are available:
Rising Edge The counter starts with the rising edge on the selected line input.
Falling Edge The counter starts with the falling edge on the selected line input.
Both Edges The counter starts with any edge on the selected line input.
Level High The counter starts and counts events as long as the level is high. When the counter
is started, it ignores counter events as long as the corresponding line input is low.
Level Low The counter starts and counts events as long as the level is low. When the counter
is started, it ignores counter events as long as the corresponding line input is high.
The activation configuration only has an effect when the line input is selected as
the counter trigger source; otherwise this configuration is ignored.
5.1.7 Counter Reset Source
The counter reset source selects the signal which is used to reset the counter. The following
signal sources are available:
Off Disables the counter reset.
Counter Trigger Resets with the reception of a trigger on the counter trigger source (see
Section 5.1.6).
Acquisition Trigger Resets with the reception of the acquisition trigger.
Acquisition Start Resets with the reception of the acquisition start event.
Acquisition End Resets with the reception of the acquisition end event.
Frame Trigger Resets with the reception of the frame trigger.
Frame Start Resets with the reception of the frame start event.
Frame End Resets with the reception of the frame end event.
Frame Burst Start Resets with the reception of the frame burst start event.
Frame Burst End Resets with the reception of the frame burst end event.
Exposure Start Resets with the reception of the exposure start event.
Exposure End Resets with the reception of the exposure end event.
User Output 0 ... 7 Resets the counter as long as the selected user output bit is asserted. The
counter remains in the reset state until the selected user output bit is deasserted again
(see Section 4.7).
Counter 0 ... 3 Start Resets with the reception of the chosen counter start event.
Counter 0 ... 3 End Resets with the reception of the chosen counter end event.
Timer 0 ... 3 Start Resets with the reception of the chosen timer start event.
Timer 0 ... 3 End Resets with the reception of the chosen timer end event.
Software Signal Pulse 0 ... 7 Resets with the reception of the chosen software signal pulse
event (see Section 4.7).
Line Input Resets when the specified counter reset activation condition is met on the chosen
line.
The line input source additionally has an activation configuration, which needs to be set
accordingly. The following configuration values are available:
Rising Edge Resets with the rising edge on the selected line input.
Falling Edge Resets with the falling edge on the selected line input.
Both Edges Resets with any edge on the selected line input.
The activation configuration only has an effect when the line input is selected as
the counter reset source; otherwise this configuration is ignored.
5.2 Timer
Figure 5.2: Timer Structure
Four general purpose timers are available in the camera (Timer0 ... Timer3). A timer can be
used to generate a timed pulse, for instance a strobe signal, or to generate an event after a
predetermined duration. Each timer can be configured individually by software and controlled
by timer trigger events. This section describes the configuration and the function of the timers.
Fig. 5.2 shows the structure of one timer. It contains a timer trigger source and activation
block. Furthermore it has three output signals: a timer active status signal, which is routed to
the I/O control block (see Chapter 6), and timer start and end event signals, which can be used
to trigger other blocks, such as counter, timer, acquisition, frame or exposure control (see
Section 5.2.3).
5.2.1 Timer Usage
The timer delay and timer duration values need to be configured accordingly. When a timer
trigger source is selected, the timer waits for a trigger signal. Once this trigger signal has been
received, the timer first runs through the timer delay period, if the timer delay value is > 0,
and then counts for the specified timer duration.
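The delay-then-duration sequence can be expressed as a small timeline calculation. Times are in microseconds; the property names follow this manual, the calculation itself is a behavioural sketch, not camera code:

```python
def timer_events(trigger_time_us, delay_us, duration_us):
    """One timer cycle as described above: TimerStart fires after the
    TimerDelay period, TimerEnd after the TimerDuration period."""
    start = trigger_time_us + delay_us   # delay phase (skipped if delay is 0)
    end = start + duration_us            # active phase
    return {"TimerStart": start, "TimerEnd": end}

print(timer_events(trigger_time_us=0, delay_us=100, duration_us=500))
# {'TimerStart': 100, 'TimerEnd': 600}
```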
When the timer trigger source is set to Off (see Section 5.2.5), a software timer
reset command starts the timer immediately (see Section 5.2.4).
The current timer value can be read via software. A start value can also be set, which means
that the timer starts from this configured value instead of from zero.
5.2.2 Timer Status
The timer has different states, which depend on the configuration and usage. The current
state can be read out via the TimerStatus register. The following list shows the available states:
Timer Idle The timer is idle and the trigger source selection is set to Off.
Timer Trigger Wait The timer is waiting for a timer trigger signal.
Timer Delay The timer is in the timer delay count period.
Timer Active The timer is active and counts for the specified timer duration.
Timer Completed Indicates that the timer has reached the timer duration count. The timer
remains in this state until a new timer trigger event has been received or the timer is
reset by a software command (see Section 5.2.4).
5.2.3 Timer Active, -Start and -End Signal
Timer Active An asserted timer active signal indicates that the timer has started counting the
configured duration period. The timer active signal is internally routed to the I/O control
block, where it can be selected for output on the physical output line or on one of the
available LEDs.
Timer Start The timer start is an event which is generated when the timer starts the timer
duration period.
Timer End The timer end is an event which is generated when the timer reaches its configured
timer duration value.
The timer start and end event signals can be used to trigger other blocks, such as counter,
timer, acquisition, frame or exposure control.
5.2.4 Timer Reset
The timer can be reset by a software command. It performs a software reset of the timer
counter and starts the timer immediately (change to the timer active state) when the trigger
source is set to Off; otherwise the timer goes into the timer trigger wait state.
5.2.5 Timer Trigger Source
The timer trigger source selects the event which is used to reset and to start the timer. The
following signals are available:
Off Disables the timer trigger.
Acquisition Trigger Starts with the reception of the acquisition trigger event.
Acquisition Start Starts with the reception of the acquisition start event.
Acquisition End Starts with the reception of the acquisition end event.
Frame Trigger Starts with the reception of the frame trigger event.
Frame Start Starts with the reception of the frame start event.
Frame End Starts with the reception of the frame end event.
Frame Burst Start Starts with the reception of the frame burst start event.
Frame Burst End Starts with the reception of the frame burst end event.
Exposure Start Starts with the reception of the exposure start event.
Exposure End Starts with the reception of the exposure end event.
User Output 0 ... 7 Starts when the selected user output bit is set to 1.
Counter 0 ... 3 Start Starts with the reception of the chosen counter start event.
Counter 0 ... 3 End Starts with the reception of the chosen counter end event.
Timer 0 ... 3 Start Starts with the reception of the chosen timer start event.
Timer 0 ... 3 End Starts with the reception of the chosen timer end event.
Software Signal Pulse 0 ... 7 Starts with the reception of the chosen software signal pulse
event (see Section 4.7).
Line Input Starts when the specified timer trigger activation condition is met on the chosen
line.
The line input source additionally has an activation configuration, which needs to be set
accordingly. The following configuration values are available:
Rising Edge The timer starts with the rising edge on the selected line input.
Falling Edge The timer starts with the falling edge on the selected line input.
Both Edges The timer starts with any edge on the selected line input.
A self re-triggering timer can be configured by setting the trigger source to the
timer's own timer end event. The timer needs a software reset in order to start
the self re-trigger mode.
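Ignoring any internal re-arm overhead (an assumption), a self re-triggered timer behaves like a free-running pulse generator whose rate follows directly from the delay and duration values:

```python
def strobe_rate_hz(delay_us, duration_us):
    """Approximate pulse rate of a self re-triggered timer, assuming the
    period is exactly TimerDelay + TimerDuration (any re-arm overhead in
    the real camera is neglected here)."""
    period_us = delay_us + duration_us
    return 1e6 / period_us

# 500 us off-time + 500 us pulse -> roughly a 1 kHz strobe:
print(round(strobe_rate_hz(delay_us=500, duration_us=500)))  # 1000
```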
6 I/O Control
This chapter shows the structure of the physical input and output lines and describes the
signal path and how to configure it.
The I/O control block contains a signal input path and a signal output path.
6.1 Input Signal Path
The camera has two physical signal input lines, which come from the input opto-isolators of
the camera hardware interface (see Section 12.6) and which are fed into the input signal path
of the camera. Fig. 6.1 shows the structure of the input signal path. The user can invert these
signals, and their current status can be read by software.
Figure 6.1: Input Signal Path
The output of the signal input path is distributed to the following camera functions:
•Acquisition Start Trigger
•Acquisition End Trigger
•Frame Start Trigger
•Frame Burst Start Trigger
•Frame Burst End Trigger
•Exposure Start Trigger
•Exposure End Trigger
•Counter Trigger
•Counter Event
•Counter Reset
•Timer Trigger
•LineOut0, LineOut1
•LED S0, S1 or S2
Table 6.1: Line Input Mapping

Line Input | Camera Input
Line0      | ISO_IN0
Line1      | ISO_IN1

Figure 6.2: Output Signal Path
6.2 Output Signal Path
The camera has two physical signal output lines and three LEDs. The physical output lines are
connected to the output opto-isolators in the camera hardware interface (see Section 12.6).
Each output line and each of the LEDs S0, S1 and S2 has its own output signal path, which is
shown in Fig. 6.2. The source can be selected, and the selected signal can be inverted before it
is sent out of the camera. Furthermore, the current status of the output signal can be read by
software.
The following signals are available and can be selected for the output:
•Input Line
•Counter Active Status
•Timer Active Status
•User Output
•Acquisition Trigger Wait Status
•Acquisition Active Status
•Frame Trigger Wait Status
•Frame Active Status
•Exposure Active Status
•Frame Valid Status
•Line Valid Status
•Heartbeat
•Serial Communication (MB_DIG0)
•Constant 0 Value
•Constant 1 Value
If the heartbeat source is selected, the LED shows a pulsating behaviour when the camera is
idle (no image capturing is active): the intensity starts from dark, goes slowly to bright and
slowly back to dark again. Whenever the camera is acquiring and sending images, the LED
changes to a blinking characteristic.
If the serial communication source is selected, the LED flashes every time the host
communicates with the camera, i.e. when camera parameters are changed or read.
7 Image Format Control
There are several possibilities to focus on the interesting parts of an image:
•Region of Interest (ROI) (see Section 7.1)
•Multiple Regions of Interest (MROI) (see Section 7.2)
These features can help to reduce the number of pixels read from the image sensor, which
leads to an increased frame rate (see Chapter 8 for more information about the available
frame rate).
7.1 Region of Interest (ROI)
Some applications do not need full image resolution. The image size can be reduced by setting
the horizontal image offset (OffsetX) and image width (Width), and by setting the vertical
image offset (OffsetY) and image height (Height). A region of interest can be almost any
rectangular window, but it must be placed within the sensor area: OffsetX + Width must not
exceed the sensor width and OffsetY + Height must not exceed the sensor height.
The ROI horizontal size and position in the D640I-M01 camera must be a multiple
of 32 pixels. The vertical size and position must be a multiple of 4 pixels.
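A host application may want to validate an ROI before writing it to the camera. The sketch below encodes the D640I-M01 alignment rules quoted above; the default sensor dimensions (640x512, taken from the example in Section 7.2) are an assumption and would in general be model dependent:

```python
def roi_is_valid(offset_x, width, offset_y, height,
                 sensor_w=640, sensor_h=512):
    """Check the D640I-M01 ROI rules quoted above: horizontal position
    and size multiples of 32, vertical multiples of 4, window inside the
    sensor. Sensor size defaults are assumptions."""
    return (offset_x % 32 == 0 and width % 32 == 0 and
            offset_y % 4 == 0 and height % 4 == 0 and
            offset_x + width <= sensor_w and
            offset_y + height <= sensor_h)

print(roi_is_valid(32, 320, 4, 200))   # True
print(roi_is_valid(10, 320, 4, 200))   # False: OffsetX not a multiple of 32
```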
7.2 Multiple Regions of Interest (MROI)
Up to 512 different regions of interest are configurable. This feature can be used to reduce
the amount of image data and increase the frame rate.
An individual MROI region is defined by its starting value in the y-direction and its height.
The starting value in the horizontal direction and the width are the same for all MROI regions
and are defined by the ROI settings. The maximum frame rate in MROI mode depends mainly
on the number of rows and columns being read out.
The individual ROIs in an MROI must not overlap.
The MROI regions must be specified in order from the top to the bottom of the
image.
The vertical starting position (MROI_Y) and height (MROI_H) of an MROI region
must be a multiple of 2.
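The three MROI rules above (alignment, ordering, no overlap) can be checked host-side with a short function. This is a sketch; regions are given as (MROI_Y, MROI_H) pairs:

```python
def mroi_is_valid(regions):
    """Check the MROI rules quoted above: MROI_Y and MROI_H multiples
    of 2, regions ordered top to bottom, no overlap. `regions` is a
    list of (y, h) tuples."""
    prev_end = 0
    for y, h in regions:
        if y % 2 or h % 2:      # alignment rule
            return False
        if y < prev_end:        # ordering / overlap rule
            return False
        prev_end = y + h
    return True

print(mroi_is_valid([(0, 10), (20, 4), (30, 20)]))  # True
print(mroi_is_valid([(0, 10), (8, 4)]))             # False: regions overlap
```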
Fig. 7.1 compares ROI and MROI: the setups (visualized on the image sensor area) are displayed
in the upper half of the drawing. The lower half shows the dimensions of the resulting image.
On the left-hand side an example of ROI is shown and on the right-hand side an example of
MROI. It can be readily seen that the resulting image with MROI is smaller than the resulting
image with ROI only and the former will result in an increase in image frame rate.
Figure 7.1: Multiple Regions of Interest
Fig. 7.2 shows another MROI drawing illustrating the effect of MROI on the image content.
Figure 7.2: Multiple Regions of Interest with 5 ROIs
Fig. 7.3 shows an example from hyperspectral imaging where the presence of spectral lines at
known regions needs to be inspected. By using MROI, only a 640x54 region needs to be
read out, and a frame rate of 2600 fps can be achieved. Without MROI, the resulting frame
rate would be 300 fps for a 640x512 ROI.
Figure 7.3: Multiple Regions of Interest in hyperspectral imaging
8 Frame Rate
8.1 Introduction
The maximal frame rate of the camera depends on the camera settings. The following factors
influence the maximal frame rate (see also Table 8.2):
•The read-out mode configuration: Sequential [non-overlapping] or Interleave
[overlapping] (Section 8.2).
•The length of the exposure time: A shorter exposure time can lead to an increase in the
maximal frame rate.
•ROI height: a smaller ROI height can lead to an increase in the maximal frame rate.
•ROI width: a smaller ROI width can lead to an increase in the maximal frame rate.
•PixelFormat: the maximal frame rate might be smaller than the maximal sensor frame rate
because otherwise the maximal data rate at the interface (MaxDataRateInterface) would be
exceeded. The maximal sensor frame rate can be achieved at the default settings of
PixelFormat=Mono8 and MaxDataRateInterface=896 MBit/sec.
The maximal frame rate of the camera can be determined by a frame rate calculator in the
support section of the Photonfocus web page www.photonfocus.com. The maximal frame rate
with the current camera settings can be read out by a camera register
(AcquisitionFrameRateMax). A table with values for common ROI settings is shown in Section 8.8.
8.2 Read-out Mode
The MV3-D640I CMOS cameras provide two different readout modes:
Sequential (non overlapping) Read-out [Interleave = False]: Exposure time of the next image
can only start if the readout time of the current image is finished. Frame time is the sum
of exposure time and readout time.
Interleave (overlapping) Read-out [Interleave = True]: Exposure time of the next image can
start during the readout time of the current image. The frame time is determined by the
maximum of the exposure time and the readout time, whichever is longer.
In interleave read-out mode the following two situations need to be distinguished in order to
calculate the maximal allowed frame rate:
Interleave Read-out Timing 1: Exposure Time <= Read Out Time + TIntSampling (see Section
8.5).
Interleave Read-out Timing 2: Exposure Time > Read Out Time + TIntSampling (see Section 8.6).
A formula for the calculation of the maximal frame rate is given in the next sections.
8.3 Common parameters
Some parameters and formulas that are used in the following sections are:
BitsPerPixel is 8 in the Mono8 pixel format, 12 in the Mono10Packed and Mono12Packed pixel
formats and 16 in the Mono10 and Mono12 pixel formats.
Some common constants are listed in Table 8.1.
Camera      TIntSampling   TReadoutDel
D640I-M01   0.016 ms       0.006 ms
Table 8.1: Frame overhead times
8.4 Sequential Read-out Timing
This timing is applied when the camera is in sequential read-out mode. Exposure starts after
the sensor read-out of the previous frame has finished.
The maximal frame rate is in this case:
MaxFrameRate = 1 / (Exposure Time + TReadoutDel + ReadoutTime)
Figure 8.1: Sequential Read-out Timing
8.5 Interleave Read-out Timing 1
This timing is applied if Exposure Time <= (Read Out Time + TIntSampling). Exposure is started
during the sensor read-out of the previous frame and the exposure time is smaller than or
equal to the read-out time plus the integration sampling time (see Fig. 8.2). During the integration
sampling time (TIntSampling), the sensor prepares the next frame for reading out.
To avoid a sensor artifact, the exposure must start at a fixed position from the start of the read
out of one row. Therefore the exposure start must be delayed by a time which can be as long
as the read out of one row.
Figure 8.2: Read-out Timing 1
8.6 Interleave Read-out Timing 2
This timing is applied if Exposure Time > (Read Out Time + TIntSampling). Exposure is started
during the sensor read-out of the previous frame and the exposure time is larger than the
read-out time (see Fig. 8.3).
The maximal frame rate is in this case:
MaxFrameRate = min(1 / (Exposure Time + TReadoutDel), MaxFrameRateData)
Figure 8.3: Read-out Timing 2
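The sequential and interleave timing-2 expressions above can be sketched as follows. TReadoutDel is taken from Table 8.1 for the D640I-M01; readout_time and max_frame_rate_data are placeholder inputs that depend on the ROI and pixel format settings.

```python
T_READOUT_DEL = 0.006e-3  # s, Table 8.1 (D640I-M01)

def max_frame_rate_sequential(exposure_time, readout_time):
    # Section 8.4: frame time = exposure time + readout delay + readout time
    return 1.0 / (exposure_time + T_READOUT_DEL + readout_time)

def max_frame_rate_interleave_t2(exposure_time, max_frame_rate_data):
    # Section 8.6: exposure time dominates; the interface data rate may still cap it
    return min(1.0 / (exposure_time + T_READOUT_DEL), max_frame_rate_data)
```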
8.7 MROI
The formulas in Section 8.5 and in Section 8.6 can be used, but the read-out time in MROI mode
is slightly different from the read-out time in ROI mode:
The Height parameter is the total height of the MROI. The MroiOH overhead depends on the number of
rows between the individual ROIs and is calculated from the following parameters:
NbrMroi: number of active ROIs. MroiGapRows: sum of the number of rows between individual ROIs.
Note that if two ROIs have no gap in between, they count as one ROI in the NbrMroi parameter.
Example: given the MROI setting: Mroi[0].Y=0, Mroi[0].H=100, Mroi[1].Y=160, Mroi[1].H=100,
Mroi[2].Y=320, Mroi[2].H=100. In this case NbrMroi=3 and MroiGapRows =
(Mroi[1].Y-Mroi[0].Y-Mroi[0].H) + (Mroi[2].Y-Mroi[1].Y-Mroi[1].H) = 120; MroiOH =2*118 +16*2
= 268.
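The gap calculation from this example can be sketched in a few lines. The region list mirrors the Mroi[n].Y and Mroi[n].H parameters above; the MroiOH formula itself is not reproduced here.

```python
def mroi_gap_rows(regions):
    # regions: ordered list of (Y, H) pairs, top to bottom, non-overlapping.
    # Sums the rows between the end of one region and the start of the next.
    return sum(regions[i + 1][0] - (regions[i][0] + regions[i][1])
               for i in range(len(regions) - 1))
```

With the example settings above, `mroi_gap_rows([(0, 100), (160, 100), (320, 100)])` yields 120, matching MroiGapRows in the text.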
8.8 Maximum Frame Rate Table
A list of commonly used image dimensions and their frame rates is shown in Table 8.2. It shows the
maximum possible frame rates when the exposure time is smaller than the read-out time.
There is a frame rate calculator in the support section of the Photonfocus web
page www.photonfocus.com, which allows the frame rates to be determined for any
available image dimension and exposure time setting.
ROI Dimension     MV3-D640I-M01, Mono8   MV3-D640I-M01, Mono12Packed
640 x 512         300 fps                225 fps
512 x 512         345 fps                280 fps
640 x 480 (VGA)   320 fps                240 fps
256 x 256         1000 fps               1000 fps
640 x 4           16390 fps              16390 fps
Table 8.2: Frame rates of different ROI settings (minimal exposure time, MaxDataRateInterface = 896
MBit/sec).
9 Pixel Data Processing
9.1Overview
The pixels read out of the image sensor are processed in the camera's data path. The
sequence of processing blocks is shown in Fig. 9.1.
Figure 9.1: Camera data path (blocks include: image sensor, non-uniformity correction, digital offset/gain, binning 2^m x 2^n, look-up table (LUT), crosshairs insertion, test images insertion, status line insertion, camera interface, image output)
9.2 Image Correction
9.2.1 Overview
The camera possesses image pre-processing features that compensate for non-uniformities
caused by the sensor, the lens or the illumination. This method of improving the image quality
is generally known as ’Shading Correction’ or ’Flat Field Correction’ and consists of a
combination of offset correction, gain correction and pixel interpolation.
Since the correction is performed in hardware, it does not limit the camera's performance at high frame rates.
The offset correction subtracts a positive or negative calibration value from the live image and
thus reduces the fixed pattern noise of the CMOS sensor. In addition, defect pixels can be
removed by interpolation. The gain correction can be used to flatten uneven illumination or to
compensate shading effects of a lens. Both offset and gain correction work on a pixel-per-pixel
basis, i.e. every pixel is corrected separately. For the correction, a black reference and a grey
reference image are required. Then, the correction values are determined automatically in the
camera.
Do not set any reference images when gain or LUT is enabled! Read the following sections very carefully.
Correction values of both reference images can be saved into the internal flash memory, but
this overwrites the factory presets. The reference images that were set at the factory
can then no longer be restored.
9.2.2 Offset Correction (FPN, Hot Pixels)
The offset correction is based on a black reference image, which is taken at no illumination
(e.g. lens aperture completely closed). The black reference image contains the fixed-pattern
noise of the sensor, which can be subtracted from the live images in order to minimise the
static noise.
Offset correction algorithm
After configuring the camera with a black reference image, the camera is ready to apply the
offset correction:
1.Determine the average value of the black reference image.
2.Subtract the average value from the black reference image.
3.Mark pixels that have a grey level higher than the hot-pixel-threshold (default 1008 DN @
12 bit) as hot pixels.
4.Store the result in the camera as the offset correction matrix.
5.During image acquisition, subtract the correction matrix from the acquired image and
interpolate the hot pixels (see Section 9.2.4).
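The steps above can be sketched as follows, using NumPy arrays. The left/right neighbour interpolation follows Section 9.2.4; function names are illustrative, not camera API calls.

```python
import numpy as np

HOT_PIXEL_THRESHOLD = 1008  # default hot-pixel threshold, DN @ 12 bit

def calibrate_offset(black_ref):
    avg = black_ref.mean()                        # step 1
    offset_matrix = black_ref - avg               # step 2: per-pixel fixed-pattern noise
    hot_pixels = black_ref > HOT_PIXEL_THRESHOLD  # step 3
    return offset_matrix, hot_pixels

def apply_offset(image, offset_matrix, hot_pixels):
    corrected = image - offset_matrix             # step 5
    out = corrected.copy()
    for y, x in zip(*np.nonzero(hot_pixels)):     # interpolate hot pixels from
        left = corrected[y, max(x - 1, 0)]        # their row neighbours
        right = corrected[y, min(x + 1, corrected.shape[1] - 1)]
        out[y, x] = (left + right) / 2.0
    return out
```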
Figure 9.2: Schematic presentation of the offset correction algorithm
How to Obtain a Black Reference Image
In order to improve the image quality, the black reference image must meet certain demands.
The detailed procedure to set the black reference image is described in Section
??.
•The black reference image must be obtained at no illumination, e.g. with the lens aperture
closed or the lens opening covered.
•It may be necessary to adjust the black level offset of the camera. In the histogram of the
black reference image, ideally there are no grey levels at value 0 DN after adjustment of
the black level offset. All pixels that are saturated black (0 DN) will not be properly
corrected (see Fig. 9.3). The peak in the histogram should be well below the
hot-pixel-threshold.
•Camera settings may influence the grey level. Therefore, for best results the camera
settings of the black reference image must be identical with the camera settings of the
image to be corrected.
Figure 9.3: Histogram of a proper black reference image for offset correction
9.2.3 Gain Correction
The gain correction is based on a grey reference image, which is taken at uniform illumination
to give an image with a mid grey level.
Gain correction is not a trivial feature. The quality of the grey reference image
is crucial for proper gain correction.
Gain correction algorithm
After configuring the camera with a black and grey reference image, the camera is ready to
apply the gain correction:
1.Determine the average value of the grey reference image.
2.Subtract the offset correction matrix from the grey reference image.
3.Divide the average value by the offset corrected grey reference image.
4.Pixels that have a grey level higher than a certain threshold are marked as hot pixels.
5.Store the result in the camera as the gain correction matrix.
6.During image acquisition, multiply the offset-corrected acquired image by the gain
correction matrix and interpolate the hot pixels (see Section 9.2.4).
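A minimal sketch of these steps, continuing the offset-correction sketch above. The cold-pixel threshold value used here is an assumption; the manual only states that such a threshold exists.

```python
import numpy as np

def calibrate_gain(grey_ref, offset_matrix, cold_threshold=100):
    avg = grey_ref.mean()                        # step 1
    offset_corrected = grey_ref - offset_matrix  # step 2
    gain_matrix = avg / offset_corrected         # step 3
    cold_pixels = grey_ref < cold_threshold      # step 4 (threshold value assumed)
    return gain_matrix, cold_pixels

def apply_flat_field(image, offset_matrix, gain_matrix):
    # step 6: multiply the offset-corrected image by the gain matrix
    return (image - offset_matrix) * gain_matrix
```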
Figure 9.4: Schematic presentation of the gain correction algorithm
Gain correction always needs an offset correction matrix. Thus, the offset correction always has to be performed before the gain correction.
How to Obtain a Grey Reference Image
In order to improve the image quality, the grey reference image must meet certain demands.
The detailed procedure to set the grey reference image is described in Section
??.
•The grey reference image must be obtained at uniform illumination.
Use a high quality light source that delivers uniform illumination. Standard illumination will not be appropriate.
•When looking at the histogram of the grey reference image, ideally there are no grey
levels at full scale (4095 DN @ 12 bit). All pixels that are saturated white will not be
properly corrected (see Fig. 9.5).
•Camera settings may influence the grey level. Therefore, the camera settings of the grey
reference image must be identical with the camera settings of the image to be corrected.
9.2.4 Hot and cold pixel correction
Calibration
The positions of the hot and cold pixels are determined in the Calculate step.
Hot (white) defect pixels are identified using the black reference image: every pixel that
exceeds the hot-pixel-threshold in the black reference image is marked as a defect pixel.
Cold (black) defect pixels are identified using the grey reference image: every pixel with an
intensity value below the cold-pixel-threshold in the grey reference image is marked as a
defect pixel.
Figure 9.5: Proper grey reference image for gain correction
Defect pixel correction
If the hot pixel correction is switched on, the camera replaces the value of a defect pixel pn by the
average of its horizontal neighbours, pn = (pn-1 + pn+1) / 2 (see Fig. 9.6).
The hot pixel correction also corrects cold (black) defect pixels if a grey reference
image has been set prior to the Calculate step.
Figure 9.6: Defect pixel interpolation by the hot pixel correction
9.2.5 Corrected Image
Offset, gain and hot pixel correction can be switched on separately. The following
configurations are possible:
•No correction
•Offset correction only
•Offset and hot pixel correction
•Hot pixel correction only
•Offset and gain correction
•Offset, gain and hot pixel correction
In addition, the black reference image and grey reference image that are currently stored in
the camera RAM can be output.
Figure 9.7: Schematic presentation of the corrected image using gain correction algorithm
9.2.6 Correction Ranges
Table 9.1 shows the minimum and maximum values of the correction matrices, i.e. the range
that the offset and gain algorithm can correct.
                    Minimum             Maximum
Offset correction   -1023 DN @ 12 bit   +1023 DN @ 12 bit
Gain correction     0.42                2.67
Table 9.1: Offset and gain correction ranges
9.3 Gain and Offset
There are different gain settings on the camera:
Gain (Digital Fine Gain) Digital fine gain accepts fractional values from 0.01 up to 15.99. It is
implemented as a multiplication operation.
Digital Gain Digital Gain is a coarse gain with the settings x1, x2, x4 and x8. It is implemented
as a binary shift of the image data where ’0’ is shifted into the LSBs of the grey values. E.g.
for gain x2, the output value is shifted left by 1 bit and bit 0 is set to ’0’.
The resulting gain is the product of all the gain values, which means that the image data is
multiplied in the camera by this factor.
Digital Fine Gain and Digital Gain may result in missing codes in the output image data.
A user-defined value can be subtracted from the grey value in the digital offset block. If digital
gain is applied and the image is too bright, then the interesting part of the output image might
be saturated. By subtracting an offset from the input of the gain block, this saturation can be
avoided.
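The offset-before-gain ordering described above can be sketched for a single pixel value. The function name and the clamping to the 12 bit range are illustrative assumptions.

```python
def digital_gain_offset(pixel, fine_gain=1.0, shift=0, offset=0):
    # Digital offset is subtracted before the gain stages so that a bright
    # image does not saturate the interesting part of the output (Section 9.3).
    v = (pixel - offset) * fine_gain  # digital fine gain, 0.01 .. 15.99
    v = int(v) << shift               # digital coarse gain x1/x2/x4/x8 = shift 0..3
    return max(0, min(v, 4095))       # clamp to the 12 bit range
```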
9.4 Grey Level Transformation (LUT)
Grey level transformation is remapping of the grey level values of an input image to new
values. The look-up table (LUT) is used to convert the greyscale value of each pixel in an image
into another grey value. It is typically used to implement a transfer curve for contrast
expansion. The camera performs a 12-to-8-bit mapping, so that 4096 input grey levels can be
mapped to 256 output grey levels. The use of the three available modes is explained in the next
sections. Two LUT and a Region-LUT feature are available in the camera (see Section 9.4.4).
The output grey level resolution of the look-up table (independent of gain,
gamma or user-defined mode) is always 8 bit.
There are 2 predefined functions, which generate a look-up table and transfer it
to the camera. For other transfer functions the user can define his own LUT file.
Some commonly used transfer curves are shown in Fig. 9.8. Line a denotes a negative or
inverse transformation, line b enhances the image contrast between grey values x0 and x1,
line c shows brightness thresholding resulting in an image with only black and white grey
levels, and line d applies a gamma correction (see also Section 9.4.2).
9.4.1 Gain
The ’Gain’ mode performs a digital, linear amplification with clamping (see Fig. 9.9). It is
configurable in the range from 1.0 to 4.0 (e.g. 1.234).
(Plot: grey level transformation with gain, y = (255/1023) · a · x for a = 1.0, 2.0, 3.0 and 4.0; x: grey level input value (10 bit) [DN], y: grey level output value (8 bit) [DN].)
Figure 9.8: Commonly used LUT transfer curves
Figure 9.9: Applying a linear gain with clamping to an image
9.4.2 Gamma
The ’Gamma’ mode performs an exponential amplification, configurable in the range from 0.4
to 4.0. Gamma > 1.0 results in an attenuation of the image (see Fig. 9.10), gamma < 1.0 results
in an amplification (see Fig. 9.11). Gamma correction is often used for tone mapping and
better display of results on monitor screens.
Figure 9.10: Applying gamma correction to an image (gamma > 1)
Figure 9.11: Applying gamma correction to an image (gamma < 1)
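A gamma transfer curve of this kind can be sketched as a 12-to-8-bit look-up table. The normalization to the 12 bit input range is an assumption; the text only states the 0.4 to 4.0 gamma range and the 8 bit output.

```python
def gamma_lut(gamma):
    # 12-to-8-bit gamma transfer curve: y = 255 * (x / 4095) ** gamma
    return [round(255 * (x / 4095) ** gamma) for x in range(4096)]
```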
9.4.3 User-defined Look-up Table
In the ’User’ mode, the mapping of input to output grey levels can be configured arbitrarily by
the user. There is an example file in the PFRemote folder. LUT files can easily be generated
with a standard spreadsheet tool. The file has to be stored as a tab-delimited text file.
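As a hedged sketch, such a file can also be generated programmatically. The exact column layout the camera software expects is an assumption here; the example file in the PFRemote folder shows the real format.

```python
# Write a 12-to-8-bit user LUT as a tab-delimited text file.
# The two-column "input<TAB>output" layout is an assumption.
def write_user_lut(path):
    with open(path, "w") as f:
        for x in range(4096):         # 12 bit input grey level
            y = min(255, x // 16)     # example curve: linear, clamped to 8 bit
            f.write(f"{x}\t{y}\n")
```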
Figure 9.12: Data path through LUT
9.4.4 Region LUT and LUT Enable
Two LUTs and a Region-LUT feature are available in the camera. Both LUTs can be enabled
independently (see Table 9.2). LUT 0 supersedes LUT 1.
Enable LUT 0   Enable LUT 1   Enable Region LUT   Description
-              -              -                   LUTs are disabled.
X              don't care     -                   LUT 0 is active on the whole image.
-              X              -                   LUT 1 is active on the whole image.
X              -              X                   LUT 0 active in Region 0.
X              X              X                   LUT 0 active in Region 0 and LUT 1 active in Region 1. LUT 0 supersedes LUT 1.
Table 9.2: LUT Enable and Region LUT
When the Region-LUT feature is enabled, the LUTs are only active in a user-defined region.
Examples are shown in Fig. 9.13 and Fig. 9.14.
Fig. 9.13 shows an example of overlapping Region-LUTs. LUT 0, LUT 1 and Region LUT are
enabled. LUT 0 is active in region 0 ((x00, x01), (y00, y01)) and it supersedes LUT 1 in the
overlapping region. LUT 1 is active in region 1 ((x10, x11), (y10, y11)).
Fig. 9.14 shows an example of keyhole inspection in a laser welding application. LUT 0 and LUT
1 are used to enhance the contrast by applying optimized transfer curves to the individual
regions. LUT 0 is used for keyhole inspection. LUT 1 is optimized for seam finding.
Figure 9.13: Overlapping Region-LUT example
Figure 9.14: Region-LUT in keyhole inspection
66 of 121MAN076 09/2017 V1.0
9.4 Grey Level Transformation (LUT)
Fig. 9.15 shows the application of the Region-LUT to a camera image. The original image
without image processing is shown on the left-hand side. The result of the application of the
Region-LUT is shown on the right-hand side. One Region-LUT was applied on a small region on
the lower part of the image where the brightness has been increased.
Figure 9.15: Region-LUT example with camera image; left: original image; right: gain 4 region in the area
of the date print of the bottle
9.5 Binning
9.5.1 Description
Binning sums the pixels in subsequent columns and rows, according to the binning
configuration. The result is then divided by the number of binned pixels. The binning feature
will result in images with lower resolution but significantly higher SNR. For instance, 2x2
binning will result in roughly twice the SNR (in bright areas of the image).
Binning is done in the digital domain of the camera.
Fig. 9.16 shows a schematic of 2x2 binning: pixels in a 2x2 neighbourhood (displayed as pixels
with the same color in the schematic) are binned together: their intensity values are summed
and divided by four. The output image has half the height and half the width of the input
image.
Figure 9.16: Example of 2x2 binning
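The sum-and-divide operation described above can be sketched with NumPy. This is an illustrative host-side equivalent; in the camera the binning is performed in hardware.

```python
import numpy as np

def bin_image(img, bh=2, bv=2):
    # Sum bv x bh neighbourhoods, then divide by the number of binned pixels.
    # The output has 1/bv the height and 1/bh the width of the input.
    h, w = img.shape
    blocks = img[:h - h % bv, :w - w % bh].reshape(h // bv, bv, w // bh, bh)
    return blocks.sum(axis=(1, 3)) / (bh * bv)
```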
9.5.2 Camera settings
The camera supports binning settings of 1, 2, 4 or 8 in horizontal and vertical direction. The
relevant parameters for binning are shown in Table 9.3.
Property            Type      Description
BinningHorizontal   Integer   Number of pixels combined in binning in horizontal direction.
BinningVertical     Integer   Number of pixels combined in binning in vertical direction.
Binning_Bitshift    Integer   Additional left bitshift after binning (overflow is ignored).
Height              Integer   Height of the output image.
Width               Integer   Width of the output image.
Table 9.3: Binning parameters
Binning might increase the maximal frame rate if it is currently limited by the setting of the
maximal data rate (MaxDataRateInterface).
9.6 Crosshairs
9.6.1 Functionality
The crosshairs feature inserts a vertical and a horizontal line into the image. The width of these
lines is one pixel. The grey level is defined by a 12 bit value (0 means black, 4095 means white). This
makes it possible to set any grey level to get the maximum contrast depending on the acquired image.
The x/y position and the grey level can be set via the camera software. Fig. 9.17 shows
two examples of the activated crosshairs with different grey values: one with white lines and
the other with black lines.
Figure 9.17: Crosshairs Example with different grey values
The x- and y-position is absolute to the sensor pixel matrix. It is independent of the ROI, MROI
or decimation configurations. Fig. 9.18 shows two situations of the crosshairs configuration.
The same MROI settings are used in both situations. The crosshairs, however, are set differently.
The crosshairs are not visible in the image on the right, because the x- and y-position is set
outside the MROI regions.
Figure 9.18: Crosshairs absolute position
9.7 Status Line and Image Information
There are camera properties available that give information about the acquired images, such
as integration time, ROI settings or average image value. These properties can be queried by
software. Alternatively, a status line within the image data can be switched on that contains all
the available image information.
9.7.1 Image Average Value
The average image value gives the average of an image in 12 bit format (0 .. 4095 DN),
regardless of the currently used grey level resolution. Note that the 12 bit format was chosen
to be compatible with other Photonfocus cameras.
9.7.2 Status Line Format
If enabled, the status line replaces the last row of the image with camera status information.
Every parameter is coded into fields of 4 pixels (LSB first) and uses the lower 8 bits of the pixel
value, so that the total size of a parameter field is 32 bit (see Fig. 9.19). The assignment of the
parameters to the fields is listed in Table 9.4.
The status line is available in all camera modes.
Figure 9.19: Status line parameters replace the last row of the image
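The field layout described above (4 pixels per field, LSB first, lower 8 bits of each pixel) can be decoded as follows; the function name is illustrative.

```python
def decode_status_field(last_row, field_index):
    # Each status line field occupies 4 pixels, LSB first; only the lower
    # 8 bits of each pixel value carry data, giving a 32 bit field value.
    base = field_index * 4
    value = 0
    for i in range(4):
        value |= (last_row[base + i] & 0xFF) << (8 * i)
    return value
```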
...        Custom value 0: value of register StatusLineCustomValue0 that can be set by the user
96   32    Custom value 1: value of register StatusLineCustomValue1 that can be set by the user
Table 9.4: Assignment of status line fields
9.7.3 Camera Type Codes
Camera Model              Camera Type Code
MV3-D640I-M01-144-G2-12   502
Table 9.5: Type codes of Photonfocus MV3-D640I-M01 GigE camera series
9.8 Test Images
Test images are generated in the camera FPGA, independent of the image sensor. They can be
used to check the transmission path from the camera to the frame grabber. Independent from
the configured grey level resolution, every possible grey level appears the same number of
times in a test image. Therefore, the histogram of the received image must be flat.
A test image is a useful tool to find data transmission errors that are caused most
often by a defective cable between camera and frame grabber.
The analysis of the test images with a histogram tool gives the correct result at a
resolution of 512 x 512 pixels only.
9.8.1 Ramp
Depending on the configured grey level resolution, the ramp test image outputs a constant
pattern with increasing grey level from the left to the right side (see Fig. 9.20).
Figure 9.20: Ramp test images: 8 bit output (left), 10 bit output (middle), 12 bit output (right)
9.8.2 LFSR
The LFSR (linear feedback shift register) test image outputs a constant pattern with a
pseudo-random grey level sequence containing every possible grey level that is repeated for
every row. The LFSR test pattern was chosen because it leads to a very high data toggling rate,
which stresses the interface electronic and the cable connection.
In the histogram you can see that the number of pixels of all grey values is the same.
Figure 9.21: LFSR (linear feedback shift register) test image
9.8.3 Troubleshooting using the LFSR
To control the quality of your complete imaging system enable the LFSR mode, set the camera
window to 512 x 512 pixels (x=0 and y=0) and check the histogram. If your frame grabber
application does not provide a real-time histogram, store the image and use a graphic software
tool to display the histogram.
In the LFSR (linear feedback shift register) mode the camera generates a constant
pseudo-random test pattern containing all grey levels. If the data transmission is correctly
histogram of the image will be flat (Fig. 9.22). On the other hand, a non-flat
histogram (Fig. 9.23) indicates problems that may be caused by a defective camera, by
problems in the acquisition software or by errors in the transmission path.
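If no real-time histogram tool is at hand, the flatness check can be sketched in a few lines. A 512 x 512 image at 8 bit should contain each of the 256 grey levels exactly 1024 times when the transmission is error-free.

```python
from collections import Counter

def histogram_is_flat(pixels, levels=256):
    # Flat histogram check for a received LFSR test image (Section 9.8.3).
    counts = Counter(pixels)
    expected = len(pixels) / levels
    return len(counts) == levels and all(c == expected for c in counts.values())
```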
Figure 9.22: LFSR test pattern received at the frame grabber and typical histogram for error-free data
transmission
Figure 9.23: LFSR test pattern received at the frame grabber and histogram containing transmission errors
In robot applications, the stress that is applied to the camera cable is especially high due to
the fast movement of the robot arm. For such applications, special drag chain capable cables
are available. Please contact the Photonfocus Support for consulting expertise.
10 Thermoelectric Cooler (TEC)
10.1 TEC Description
The sensor has an integrated thermoelectric cooler (TEC) which is driven by the camera electronics
to stabilize the temperature of the sensor. The performance of the camera TEC is mainly
limited by the heat transfer capability of the system setup. The sensor TEC is controlled by a
PID controller which can cool and heat the sensor. The PID controller is implemented as an
inverted OPV PID controller where the parameters are related to specific resistors and capacitors.
The values of the controller parameters are determined by the response of the camera in
standard setups.
The user can set the target temperature for the sensor TEC through software. The current
through the TEC is limited in the camera to TBDX ≤ 1A. Table 10.1 gives the minimum
temperature set point in dependency of ambient temperature and the thermal resistivity. The
actual sensor temperature and the TEC current can be read out by software. The data of several
measurements are averaged. The higher the ambient temperature and the higher the thermal
resistivity in the vision system setup, the higher is the minimal stable temperature of the SWIR
sensor.
Thermal resistivity   Sensor Temperature @ RT   Sensor Temperature @ 50°C
0.7 K/W               TBD                       TBD
0.9 K/W               TBD                       TBD
Table 10.1: Minimal sensor temperature for different thermal resistivities as a function of ambient temperature
11 Precautions
11.1 IMPORTANT NOTICE!
READ THE INSTRUCTIONS FOR USE BEFORE OPERATING THE CAMERA
STORE THE INSTRUCTIONS FOR USE FOR FURTHER READING
The installation of the camera in the vision system should be performed by trained
and instructed personnel.
DANGER - Electric Shock Hazard
Unapproved power supplies may cause electric shock. Serious injury or death
may occur.
•You must use camera power supplies which meet the Safety Extra Low Voltage (SELV) and Limited Power Source (LPS) requirements.
•If you use a powered hub or a powered switch in PoE or USB vision systems,
these devices must meet the SELV and LPS requirements.
WARNING - Fire Hazard
Unapproved power supplies may cause fire and burns.
•You must use camera power supplies which meet the Limited Power Source
(LPS) requirements.
•If you use a powered hub or a powered switch in PoE or USB vision systems,
these devices must meet the LPS requirements.
Supply voltages outside of the specified range will cause damage. Check the
supply voltage range given in this manual. Avoid reverse supply voltages.
Respect the voltage limits and the common mode rails of the camera control
signals. Ensure that the output signals are not overloaded. Respect the power
limitations of the outputs. Carefully design the vision system before you connect
electronic devices to the camera. Use simulation tools to check your design.
Avoid compensation currents over data cables. Use appropriate ground connections
and grounding materials in the installation of your vision system to ensure
equal potential of all chassis grounds in your system.
Incorrect plugs can damage the camera connectors. Use only the connectors
specified by Photonfocus in this manual. Using plugs designed for a smaller or a
larger number of pins can damage the connectors.
The cameras deliver the data to the vision system over interfaces with high bandwidth.
Use only shielded data cables to avoid EMC and data transmission issues.
High speed data cables are susceptible to mechanical stress. Avoid mechanical
stress and bending the cables below their minimum bending radius during
installation of your vision system. For robot applications, appropriate
cables have to be used.
Inappropriate software code to control the cameras may cause unexpected camera behaviour.
•The code examples provided in the Photonfocus software package are included
as sample code only. Inappropriate code may cause your camera to function
differently than expected and may compromise your application. The Photonfocus
software package is available on the Photonfocus website: www.photonfocus.com.
•To ensure that the examples will work properly in your application, you must
adjust them to meet your specific needs and must test them thoroughly prior
to use.
Avoid dust on the sensor.
The camera is shipped with a plastic cap on the lens mount. To avoid collecting
dust on the camera’s IR cut filter (colour cameras) or sensor (mono and mono NIR
cameras), make sure that you always put the plastic cap in place when there is
no lens mounted on the camera. Follow these general rules:
•Always put the plastic cap in place when there is no lens mounted on the
camera.
•Make sure that the camera is pointing down every time you remove or replace the plastic cap, a lens or a lens adapter.
•Never apply compressed air to the camera. This can easily contaminate optical components, particularly the sensor.
Cleaning of the sensor
Avoid cleaning the surface of the camera sensor or filters if possible. If you must
clean it:
•Before cleaning, disconnect the camera from the power supply and I/O
connectors.
•Follow the instructions given in the section “Cleaning the Sensor” in this
manual.
Cleaning of the housing
To clean the surface of the camera housing:
•Before cleaning, disconnect the camera from the power supply and I/O
connectors.
•Do not use aggressive solvents or thinners which can damage the surface,
the serial number label and electronic parts.
•Avoid the generation of ESD during cleaning.
•Take only a small amount of detergent to clean the camera body. Keep in
mind that the camera body complies with the IP30 standard.
•Make sure the detergent has evaporated after cleaning before reconnecting the camera to the power supply.
12 Hardware Interface
12.1 Absolute Maximum Ratings
Parameter                                                    Value
Camera Control Input Signal Voltage Single Ended             0 V ... +24 V
Camera Control Output Signal Voltage Single Ended            0 V ... +24 V
Camera Control Output Signal Output Current Single Ended     0.5 A
Camera Control Output Signal Output Power Single Ended       0.35 W
ESD Contact Discharge Camera Control Signals                 4 kV
ESD Air Discharge Camera Control Signals                     8 kV
Fast Transients/Bursts Data and Camera Control Signals       1 kV
Surge Immunity Data and Camera Control Signals               1 kV
Table 12.1: Absolute Maximum Ratings
12.2 Electrical Characteristics
Parameter                             Value
Camera Control Input Single Ended     +5 V ... +20 V
Table 12.2: Electrical Characteristics
12.3 GigE Camera Connector
The GigE cameras are interfaced to external components via:
•an X-coded M12 connector to transmit configuration, image data and trigger.
•a 12-pol. Fischer connector for two I/O inputs and two I/O outputs.
The connectors are located on the back of the camera. Fig. 12.1 shows the plugs and the status
LED which indicates camera operation.
12.4 Power Supply
The camera requires a single voltage input (see Table 3.3). The camera meets all performance
specifications using standard switching power supplies, although well-regulated linear power
supplies provide optimum performance.
Figure 12.1: Rear view of the GigE camera
It is extremely important that you apply the appropriate voltages to your camera.
Incorrect voltages will damage the camera.
For further details including the pinout please refer to Appendix A.
12.5 Status Indicator (GigE cameras)
Six LEDs on the back of the camera give information about the current status of the GigE
CMOS cameras. LEDs S0, S1 and S2 (from left to right) are configurable: the camera status
information shown by these LEDs can be selected (see Section 6.2 for the available camera
status signals).
LED designator    Default function
LED S0            Pulsates when the camera is not grabbing images, i.e. the intensity slowly
                  rises from dark to bright and slowly falls back to dark. When the camera is
                  grabbing images, the LED blinks at a rate equal to the frame rate. At slow
                  frame rates the blinking is visible; at high frame rates the LED changes to an
                  apparently continuous green light, with intensity proportional to the ratio of
                  readout time to frame time.
LED S1            Indicates active serial communication with the camera.
LED S2            Dark; not used in the default configuration.
Table 12.3: Default LED S0, S1 and S2 configuration of the GigE CMOS cameras
LED designator    Function
LED P0            Off
LED P1            Off: no link; solid on: 1 Gbps link, no activity; blinking: 1 Gbps link, activity
LED P2            Solid on
Table 12.4: Meaning of LED P0, P1 and P2 of the GigE CMOS cameras
12.6 I/O Connector
12.6.1 Overview
The 12-pol. Fischer I/O connector contains two external single-ended line inputs
and two external single-ended line outputs.
The pinout of the I/O connector is described in Appendix A.
A suitable trigger breakout cable for the 12-pol. Fischer connector can be ordered
from your Photonfocus dealership.
Simulation with LTSpice is possible; a simulation model can be downloaded from
our website www.photonfocus.com on the software download page (in the Support
section). It is filed under "Third Party Tools".
Fig. 12.2 shows the schematic of the input and output for the I/O interface. The input and
output are isolated.
Figure 12.2: Schematic of inputs and output
12.6.2 Single-ended Line Input
ISO_IN is a single-ended isolated input (see Fig. 12.2).
Fig. 12.3 shows a direct connection to the ISO_IN input.
Figure 12.3: Direct connection to ISO_IN
Fig. 12.4 shows how to connect ISO_IN to a TTL logic output device.
Figure 12.4: Connection to ISO_IN from a TTL logic device
12.6.3 Single-ended Line Output
ISO_OUT is a single-ended isolated output.
Fig. 12.5 shows the connection from the ISO_OUT output to a TTL logic device.
Figure 12.5: Connection example to ISO_OUT
Fig. 12.6 shows the connection from ISO_OUT to a LED.
Figure 12.6: Connection from ISO_OUT to a LED
Respect the limits of the opto-isolator in the connection to ISO_OUT. Maximum
ratings that must not be exceeded: voltage 24 V, current 50 mA, power 150 mW
(see also Fig. 12.7). The opto-isolator type is Everlight EL3H7C.
Figure 12.7: Limits of ISO_OUT output
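When sizing the series resistor R in Fig. 12.6, these limits bound the permissible operating point. A minimal sizing sketch follows; the supply voltage and LED forward voltage are example assumptions, not Photonfocus specifications, and the saturation voltage of the opto-isolator output is neglected.

```python
# Series-resistor sizing sketch for the ISO_OUT LED circuit of Fig. 12.6.
# Opto-isolator limits from the manual: 24 V, 50 mA (150 mW).
# YOUR_PWR and the LED forward voltage below are example assumptions.

V_MAX, I_MAX = 24.0, 0.050

def series_resistor(v_pwr, v_led, i_led):
    """Resistor value that sets the LED current to i_led (V_CE neglected)."""
    if v_pwr > V_MAX or i_led > I_MAX:
        raise ValueError("outside opto-isolator limits")
    return (v_pwr - v_led) / i_led

# Example: 12 V supply, 2 V LED forward drop, 10 mA target current.
r = series_resistor(v_pwr=12.0, v_led=2.0, i_led=0.010)
```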
12.6.4 Master / Slave Camera Connection
The trigger input of one Photonfocus MV3 camera can easily be connected to the strobe
output of another Photonfocus MV3 camera as shown in Fig. 12.8. This results in a master/slave
mode where the slave camera operates synchronously to the master camera.
Figure 12.8: Master / slave connection of two Photonfocus MV3 cameras
12.6.5 I/O Wiring
The Photonfocus cameras include electrically isolated inputs and outputs. Take great care
when wiring trigger and strobe signals to the camera, especially over long distances (several
meters) and in noisy environments. Improper wiring can introduce ground loops which lead to
malfunction of triggers and strobes.
There are two ways to avoid ground loops:
•Separating I/O ground and power supply (ISO_GND and ISO_PWR) from camera power
(CAM_GND, CAM_PWR)
•Using a common power supply for camera and I/O signals with star-wiring
Separate Grounds
Separating the signal and ground connections of the camera (CAM_GND, CAM_PWR, data
connections) from the I/O connections (ISO_GND, ISO_PWR, ISO_IN, ISO_OUT) is one way to
avoid ground loops. Fig. 12.9 shows a schematic of this setup. In this setup the power supplies
for the camera and for the ISO power must be separate devices.
Figure 12.9: I/O wiring using separate ground
Common Grounds with Star Wiring
Ground loops can be avoided by using "star wiring", i.e. all power and ground connections
originate from one "star point", which is typically a power supply. Fig. 12.10 shows
a schematic of the star-wiring concept.
Fig. 12.11 shows a schematic of the star-wiring concept applied to a Photonfocus GigE
camera. The power supply and ground connections for the camera and for the I/O are
connected to the same power supply, which acts as the "star point".
Figure 12.10: Star-wiring principle
Figure 12.11: I/O wiring using star-wiring
Fig. 12.12 shows an example of how to connect a flash light and a trigger source to the camera
using star-wiring. The trigger in this example is generated from a light barrier. Note how the
power and ground cables are connected to the same power supply.
Figure 12.12: I/O wiring using star-wiring example
An example of improper wiring that causes a ground loop is shown in Fig. 12.13.
Figure 12.13: Improper I/O wiring causing a ground loop
13 Miscellaneous
13.1 Mechanical Interface
During storage and transport, the camera should be protected against vibration, shock,
moisture and dust. The original packaging protects the camera adequately from vibration and
shock during storage and transport. Please either retain this packaging for possible later use or
dispose of it according to local regulations.
13.1.1 MV3 cameras with GigE Interface
Fig. 13.1 shows the mechanical drawing of the camera housing for the Photonfocus
MV3-D640I-M01-G2 cameras with GigE interface (all values in mm).
Figure 13.1: Mechanical dimensions of the MV3-D640I-M01-G2 GigE model
13.2 Optical Interface
13.2.1 Cleaning the Sensor
The sensor is part of the optical path and should be handled like other optical components:
with extreme care.
Dust can obscure pixels, producing dark patches in the images captured. Dust is most visible
when the illumination is collimated. Dark patches caused by dust or dirt shift position as the
angle of illumination changes. Dust is normally not visible when the sensor is positioned at the
exit port of an integrating sphere, where the illumination is diffuse.
1. The camera should only be cleaned in ESD-safe areas by ESD-trained personnel using wrist
straps. Ideally, the sensor should be cleaned in a clean environment. Otherwise, in dusty
environments, the sensor will immediately become dirty again after cleaning.
2. Use a high quality, low pressure air duster (e.g. Electrolube EAD400D, pure compressed
inert gas, www.electrolube.com) to blow off loose particles. This step alone is usually
sufficient to clean the sensor of the most common contaminants.
Workshop air supply is not appropriate and may cause permanent damage to
the sensor.
3. If further cleaning is required, use a suitable lens wiper or Q-Tip moistened with an
appropriate cleaning fluid to wipe the sensor surface as described below. Examples of
suitable lens cleaning materials are given in Table 13.1. Cleaning materials must be
ESD-safe, lint-free and free from particles that may scratch the sensor surface.
Do not use ordinary cotton buds. These do not fulfil the above requirements and
permanent damage to the sensor may result.
4. Wipe the sensor carefully and slowly. First remove coarse particles and dirt from the
sensor using Q-Tips soaked in 2-propanol, applying as little pressure as possible. Using a
method similar to that used for cleaning optical surfaces, clean the sensor by starting at
any corner of the sensor and working towards the opposite corner. Finally, repeat the
procedure with methanol to remove streaks. It is imperative that no pressure be applied
to the surface of the sensor or to the black globe-top material (if present) surrounding the
optically active surface during the cleaning process.
Product                      Type     Supplier                        Remarks
Anticon Gold 9" x 9"         Wiper    Milliken, USA                   ESD safe, suitable for class 100 environments; www.milliken.com
TX4025                       Wiper    Texwipe                         www.texwipe.com
Transplex                    Swab     Texwipe
Small Q-Tips SWABS BB-003    Q-tips   Hans J. Michael GmbH, Germany   www.hjm-reinraum.de
Large Q-Tips SWABS CA-003    Q-tips   Hans J. Michael GmbH, Germany
Point Slim HUBY-340          Q-tips   Hans J. Michael GmbH, Germany
Methanol                     Fluid    Johnson Matthey GmbH, Germany   Semiconductor Grade, 99.9% min (Assay), Merck 12,6024, UN1230, slightly flammable and poisonous; www.alfa-chemcat.com
2-Propanol (Iso-Propanol)    Fluid    Johnson Matthey GmbH, Germany   Semiconductor Grade, 99.5% min (Assay), Merck 12,5227, UN1219, slightly flammable; www.alfa-chemcat.com
Table 13.1: Recommended materials for sensor cleaning
For cleaning the sensor, Photonfocus recommends the products available from the suppliers as
listed in Table 13.1.
Cleaning tools (except chemicals) can be purchased directly from Photonfocus
(www.photonfocus.com).
13.3 Temperature Monitor
The camera contains a temperature monitor system to protect the camera electronics from
damage due to excessive heat. There are 3 temperature sensing diodes in the camera:
Sensor PCB: Temperature diode on the sensor board.
Proc PCB: Temperature diode on the processor board.
FPGA: Temperature diode on the camera FPGA.
The camera reads these 3 temperature values periodically and compares them to user-defined
higher and lower limits. These limits can be set individually for every temperature diode. If any
of the temperature diodes reports a temperature above its higher limit, the camera is put into
a power-down state where the TEC and the image sensor are shut down. No images are
acquired in the power-down state. If the temperatures of all diodes are below their higher
limits and at least one temperature is below its lower limit, the camera resumes its normal
operating state.
It is recommended not to set the higher limits above the predefined values: Sensor
PCB HighLimit = 60.0 °C, Proc PCB HighLimit = 60.0 °C, FPGA HighLimit = 79.99 °C.
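The power-down decision described above can be sketched as follows. The higher limits are the recommended values from this section; the lower limits are invented example values for illustration only, since the actual logic runs inside the camera.

```python
# Sketch of the temperature-monitor decision logic described above.
# HIGH holds the recommended higher limits from this manual; LOW holds
# invented example lower limits for illustration.

HIGH = {"sensor_pcb": 60.0, "proc_pcb": 60.0, "fpga": 79.99}
LOW  = {"sensor_pcb": 55.0, "proc_pcb": 55.0, "fpga": 75.0}   # example values

def next_power_down(power_down, temps):
    """True if the camera should enter or stay in the power-down state."""
    if any(temps[k] > HIGH[k] for k in temps):
        return True              # overtemperature: shut down TEC and sensor
    if power_down:
        # Here all temperatures are already below their higher limits;
        # resume only once at least one temperature is below its lower limit.
        if any(temps[k] < LOW[k] for k in temps):
            return False
        return True
    return False

# Example: the sensor PCB exceeds its higher limit -> power-down.
state = next_power_down(False, {"sensor_pcb": 61.0, "proc_pcb": 50.0, "fpga": 70.0})
```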
14 Troubleshooting
14.1 No images can be acquired
If no images can be acquired then the cause could be one of the following:
1.Camera is not triggered: see Section 14.1.1
First proceed with the action listed above. If still no images can be acquired, write an
e-mail to Photonfocus support (support@photonfocus.com).
14.1.1 No acquisition due to missing triggers
Set the camera to the free-running trigger mode (TriggerMode=Off).
If no images can be acquired in free-running mode then triggering is not the main cause of the
acquisition problem.
If images can be acquired in free-running mode but not in your chosen trigger mode then
check the electrical connection of the trigger signal (see also Section 12.6).
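This troubleshooting step can be scripted. The sketch below uses the GenICam SFNC feature name TriggerMode; the dict is a stand-in for a real camera node map, not the Photonfocus API.

```python
# Troubleshooting sketch: switch the camera to free-running mode by
# disabling the trigger. "TriggerMode" follows the GenICam SFNC; the
# dict below is a stand-in for a real camera node map.

def set_free_running(node_map):
    """Disable triggering so the camera acquires images continuously."""
    node_map["TriggerMode"] = "Off"
    return node_map

# Example: a camera that was left in triggered mode.
cam = set_free_running({"TriggerMode": "On"})
```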