Photon Focus MV1-D1312C CameraLink Series User Manual

User Manual
MV1-D1312C CameraLink® Series
CMOS Area Scan Colour Camera
MAN046 04/2010 V1.0
All information provided in this manual is believed to be accurate and reliable. No responsibility is assumed by Photonfocus AG for its use. Photonfocus AG reserves the right to make changes to this information without notice. Reproduction of this manual in whole or in part, by any means, is prohibited without prior permission having been obtained from Photonfocus AG.
Contents
1 Preface 7
1.1 About Photonfocus . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.2 Contact . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.3 Sales Offices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.4 Further information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.5 Legend . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2 How to get started (CameraLink®) 9
3 Product Specification 13
3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
3.2 Feature Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
3.3 Technical Specification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
3.4 RGB Bayer Pattern Filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.5 Frame Grabber relevant Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
4 Functionality 21
4.1 Image Acquisition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
4.1.1 Readout Modes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
4.1.2 Readout Timing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
4.1.3 Exposure Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
4.1.4 Maximum Frame Rate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
4.2 Pixel Response . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
4.2.1 Linear Response . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
4.2.2 LinLog®. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
4.3 Reduction of Image Size . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
4.3.1 Region of Interest (ROI) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
4.3.2 ROI configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
4.3.3 Calculation of the maximum frame rate . . . . . . . . . . . . . . . . . . . . . . 34
4.3.4 Multiple Regions of Interest . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
4.4 Trigger and Strobe . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
4.4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
4.4.2 Trigger Source . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
4.4.3 Exposure Time Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
4.4.4 Trigger Delay . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
4.4.5 Burst Trigger . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
4.4.6 Software Trigger . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
4.4.7 Strobe Output . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
4.5 Data Path Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
4.6 Image Correction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
4.6.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
4.6.2 Offset Correction (FPN, Hot Pixels) . . . . . . . . . . . . . . . . . . . . . . . . . 46
4.6.3 Corrected Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
4.7 Digital Gain and Offset . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
4.8 Fine Gain . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
4.9 Channel Colour Level Transformation (LUT) . . . . . . . . . . . . . . . . . . . . . . . . 49
4.9.1 Gain . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
4.9.2 Gamma . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
4.9.3 User-defined Look-up Table . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
4.9.4 Region LUT and LUT Enable . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
4.10 Image Information and Status Line . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
4.10.1 Counters and Average Value . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
4.10.2 Status Line . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
4.11 Test Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
4.11.1 Ramp . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
4.11.2 LFSR . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
4.11.3 Troubleshooting using the LFSR . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
4.12 Configuration Interface (CameraLink®) . . . . . . . . . . . . . . . . . . . . . . . . . . 60
5 Hardware Interface 61
5.1 Connectors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
5.1.1 CameraLink®Connector . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
5.1.2 Power Supply . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
5.1.3 Trigger and Strobe Signals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
5.1.4 Status Indicator (CameraLink®cameras) . . . . . . . . . . . . . . . . . . . . . . 63
5.2 CameraLink®Data Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
6 The PFRemote Control Tool 65
6.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
6.2 PFRemote and PFLib . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
6.3 Operating System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
6.4 Installation Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
6.5 Graphical User Interface (GUI) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
6.5.1 Port Browser . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
6.5.2 Ports, Device Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
6.5.3 Main Buttons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
6.6 Device Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
7 Graphical User Interface (GUI) 69
7.1 MV1-D1312C-160 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
7.1.1 Exposure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
7.1.2 Window . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
7.1.3 Trigger . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
7.1.4 Data Output . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
7.1.5 LUT (Look-Up-Table) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
7.1.6 LinLog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
7.1.7 Correction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
7.1.8 Info . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
8 Mechanical and Optical Considerations 85
8.1 Mechanical Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
8.1.1 MV1 cameras with CameraLink®Interface . . . . . . . . . . . . . . . . . . . . . 85
8.2 Optical Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
8.2.1 Cleaning the Sensor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
8.3 Compliance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
9 Warranty 89
9.1 Warranty Terms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
9.2 Warranty Claim . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
10 References 91
A Pinouts 93
A.1 Power Supply Connector . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
A.2 CameraLink®Connector . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
B Revision History 97
1 Preface
1.1 About Photonfocus
The Swiss company Photonfocus is one of the leading specialists in the development of CMOS image sensors and corresponding industrial cameras for machine vision, security & surveillance and automotive markets. Photonfocus is dedicated to making the latest generation of CMOS technology commercially available. Active Pixel Sensor (APS) and global shutter technologies enable high speed and high dynamic range (120 dB) applications, while avoiding disadvantages like image lag, blooming and smear. Photonfocus has proven that the image quality of modern CMOS sensors is now appropriate for demanding applications. Photonfocus’ product range is complemented by custom design solutions in the area of camera electronics and CMOS image sensors. Photonfocus is ISO 9001 certified. All products are produced with the latest techniques in order to ensure the highest degree of quality.
1.2 Contact
Photonfocus AG, Bahnhofplatz 10, CH-8853 Lachen SZ, Switzerland
Sales Phone: +41 55 451 07 45 Email: sales@photonfocus.com
Support Phone: +41 55 451 01 37 Email: support@photonfocus.com
Table 1.1: Photonfocus Contact
1.3 Sales Offices
Photonfocus products are available through an extensive international distribution network and through our key account managers. Details of the distributor nearest you and contacts to our key account managers can be found at www.photonfocus.com.
1.4 Further information
Photonfocus reserves the right to make changes to its products and documentation without notice. Photonfocus products are neither intended nor certified for use in life support systems or in other critical systems. The use of Photonfocus products in such applications is prohibited.
Photonfocus is a trademark and LinLog® is a registered trademark of Photonfocus AG. CameraLink® and GigE Vision® are registered trademarks of the Automated Imaging Association. Product and company names mentioned herein are trademarks or trade names of their respective companies.
Reproduction of this manual in whole or in part, by any means, is prohibited without prior permission having been obtained from Photonfocus AG.
Photonfocus cannot be held responsible for any technical or typographical errors.
1.5 Legend
In this documentation the reader’s attention is drawn to the following icons:
Important note
Alerts and additional information
Attention, critical warning
Notification, user guide
2 How to get started (CameraLink®)
1. Install a suitable frame grabber in your PC.
To find a compliant frame grabber, please see the frame grabber compatibility list at www.photonfocus.com.
2. Install the frame grabber software.
Without the frame grabber software installed, the camera configuration tool PFRemote will not be able to communicate with the camera. Please follow the instructions of the frame grabber supplier.
3. Remove the camera from its packaging. Please make sure the following items are included with your camera:
Power supply connector (7-pole power plug)
Camera body cap
If any items are missing or damaged, please contact your dealer.
4. Remove the camera body cap from the camera and mount a suitable lens.
When removing the camera body cap or when changing the lens, the camera should always be held with the opening facing downwards to prevent dust or debris falling onto the CMOS sensor.
Figure 2.1: Camera with protective cap and lens.
Do not touch the sensor surface. Protect the image sensor from particles and dirt!
To choose a lens, see the Lens Finder in the ’Support’ area at www.photonfocus.com.
5. Connect the camera to the frame grabber with a suitable CameraLink® cable (see Fig. 2.2). CameraLink® cables can be purchased directly from Photonfocus (www.photonfocus.com).
Figure 2.2: Camera with frame grabber, power supply and cable.
Do not connect or disconnect the CameraLink® cable while the camera power is on! For more information about CameraLink® see Section 4.12.
6. Connect a suitable power supply to the provided 7-pole power plug. For the connector assembly see Fig. A.1. The pinout of the connector is shown in Appendix A.
Check the correct supply voltage and polarity! Do not exceed the maximum operating voltage of +12V DC (± 10%).
7. Connect the power supply to the camera (see Fig. 2.2).
The status LED on the rear of the camera will light red for a short moment, and then flash green. For more information see Section 5.1.4.
8. Download the camera software PFRemote to your computer.
You can find the latest version of PFRemote on the support page at www.photonfocus.com.
9. Install the camera software PFRemote. Please follow the instructions of the PFRemote setup wizard.
Figure 2.3: Screenshot of the PFRemote setup wizard
10. Start the camera software PFRemote and choose the communication port.
Figure 2.4: PFRemote start window
11. Check the status LED on the rear of the camera.
The status LED lights green when an image is being produced, and it is red when serial communication is active. For more information see Section 5.1.4.
12. You may display images using the software that is provided by the frame grabber manufacturer.
3 Product Specification
3.1 Introduction
The MV1-D1312C CMOS camera series is built around the colour A1312C CMOS image sensor from Photonfocus, which provides a resolution of 1312 x 1082 pixels over a wide range of spectral sensitivity. It is aimed at standard applications in industrial image processing. The principal advantages are:
Resolution of 1312 x 1082 pixels.
Bayer pattern filter and cut-off filter at 660 nm
High quantum efficiency (between 25% and 45%).
High pixel fill factor (> 60%).
Superior signal-to-noise ratio (SNR).
Low power consumption at high speeds.
Very high resistance to blooming.
High dynamic range of up to 120 dB.
Ideal for high speed applications: Global shutter.
Colour resolution of up to 12 bit.
On camera shading correction.
Up to 512 regions of interest (MROI).
2 look-up tables (12-to-8 bit) on user-defined image region (Region-LUT).
Image information and camera settings inside the image (status line).
Software provided for setting and storage of camera parameters.
The camera has a digital CameraLink®interface.
The compact size of 60 x 60 x 45 mm³ makes the MV1-D1312C CMOS cameras the perfect solution for applications in which space is at a premium.
The general specification and features of the camera are listed in the following sections.
This manual applies only to MV1-D1312C cameras with revision 2.0 or higher. The camera revision information is displayed as uC Revision in the Info tab of the PFRemote application.
3.2 Feature Overview
Characteristics MV1-D1312C Series

Interfaces: CameraLink® base configuration
Camera control: PFRemote (Windows GUI) or programming library (SDK)
Configuration interface: CLSERIAL (9'600 baud or 57'600 baud, user selectable)
Trigger modes: interface trigger / external opto-isolated trigger input
Image pre-processing: shading correction (offset); 2 look-up tables (12-to-8 bit) on user-defined image region (Region-LUT)
Features: colour resolution 12 bit / 10 bit / 8 bit; Region of Interest (ROI); up to 512 regions of interest (MROI); test pattern (LFSR and grey level ramp); image information and camera settings inside the image (status line); high blooming resistance; opto-isolated trigger input and opto-isolated strobe output
Table 3.1: Feature overview (see Chapter 4 for more information)
Figure 3.1: MV1-D1312C CMOS camera series with C-mount lens.
3.3 Technical Specification
Technical Parameters MV1-D1312C Series

Technology: CMOS active pixel (APS)
Scanning system: progressive scan
Optical format / diagonal: 1" (13.6 mm diagonal) @ maximum resolution; 2/3" (11.6 mm diagonal) @ 1024 x 1024 resolution
Resolution: 1312 x 1082 pixels
Pixel size: 8 µm x 8 µm
Active optical area: 10.48 mm x 8.64 mm (maximum)
Random noise: < 0.3 DN @ 8 bit 1)
Fixed pattern noise (FPN): 3.4 DN @ 8 bit / correction OFF 1)
Fixed pattern noise (FPN): < 1 DN @ 8 bit / correction ON 1) 2)
Dark current: 0.65 fA / pixel @ 27 °C
Full well capacity: ~ 100 ke⁻
Spectral range: 390 to 670 nm (to 10% of peak responsivity) (see Fig. 3.3)
Responsivity: 190 x 10³ DN/(J/m²) @ 625 nm / 8 bit / gain = 1 (approximately 560 DN/(lux·s) @ 625 nm / 8 bit / gain = 1) (see Fig. 3.3)
Quantum efficiency: > 40% (see Fig. 3.2)
Optical fill factor: > 60%
Dynamic range: 60 dB in linear mode
Colour format: RGB Bayer raw data pattern
Characteristic curve: linear
Shutter mode: global shutter
Colour resolution: 12 bit / 10 bit / 8 bit

Table 3.2: General specification of the MV1-D1312C camera series (Footnotes: 1) Indicated values are typical values. 2) Indicated values are subject to confirmation.)
MV1-D1312C-160

Exposure time: 10 µs ... 0.42 s
Exposure time increment: 25 ns
Frame rate 3) (Tint = 10 µs): 108 fps
Pixel clock frequency: 80 MHz
Pixel clock cycle: 12.5 ns
Camera taps: 2
Readout mode: sequential or simultaneous

Table 3.3: Model-specific parameters (Footnote: 3) Maximum frame rate @ full resolution)
MV1-D1312C-160

Operating temperature: 0 °C ... 50 °C
Camera power supply: +12 V DC (±10 %)
Trigger signal input range: +5 ... +15 V DC
Max. power consumption: < 3.5 W
Lens mount: C-Mount (CS-Mount optional)
Dimensions: 60 x 60 x 45 mm³
Mass: 265 g
Conformity: CE / RoHS / WEEE

Table 3.4: Physical characteristics and operating ranges
Fig. 3.2 shows the quantum efficiency and Fig. 3.3 the responsivity of the A1312C CMOS sensor, both displayed as a function of wavelength. For more information on photometric and radiometric measurements see the Photonfocus application notes AN006 and AN008, available in the support area of our website www.photonfocus.com. The A1312C colour sensor is equipped with a cover glass that incorporates an infra-red cut-off filter to avoid false colours arising when an infra-red component is present in the illumination. Fig. 3.4 shows the transmission curve of the cut-off filter.
3.4 RGB Bayer Pattern Filter
Fig. 3.5 shows the Bayer filter arrangement on the pixel matrix of the MV1-D1312C camera series. The numbers in the figure represent (pixel position x, pixel position y).
The fixed Bayer pattern arrangement has to be considered when the ROI configuration is changed or the MROI feature is used (see Section 4.3): the colour sequence of the first delivered line depends on whether the ROI starts at an even or an odd line number.
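The even/odd line dependency can be sketched in a few lines of Python. The actual colour arrangement is fixed by the sensor and shown in Fig. 3.5; the RG/GB layout below is only an assumed example used to illustrate how shifting the ROI start changes the colour sequence.

```python
# Assumed example layout; the real arrangement is given in Fig. 3.5.
BAYER = [["R", "G"],   # even sensor lines
         ["G", "B"]]   # odd sensor lines

def pixel_colour(x, y, roi_x=0, roi_y=0):
    """Colour seen at position (x, y) inside an ROI whose top-left
    corner lies at (roi_x, roi_y) on the full sensor. Moving the ROI
    start by one line swaps the colour sequence of the first row."""
    return BAYER[(roi_y + y) % 2][(roi_x + x) % 2]
```

With this assumed layout, an ROI starting at an even line delivers R, G, R, G, ... as its first row, while one starting at an odd line delivers G, B, G, B, ...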
3.5 Frame Grabber relevant Configuration
The parameters and settings essential for configuring the frame grabber are shown in the following table. The timing diagrams of the camera are given in Section 4.1.2.
[Plot: quantum efficiency (0% to 45%) vs. wavelength (300 nm to 800 nm), with separate curves QE (red), QE (green1), QE (green2) and QE (blue).]
Figure 3.2: Quantum efficiency of the A1312C CMOS image sensor (standard) in the MV1-D1312C camera series
MV1-D1312C-160
Pixel Clock per Tap 80 MHz
Number of Taps 2
Colour resolution 12 bit / 10 bit / 8 bit
Line pause 18 clock cycles
CC1 EXSYNC
CC2 not used
CC3 not used
CC4 not used
Table 3.5: Summary of parameters needed for frame grabber configuration
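As a plausibility check, the readout-limited frame rate can be estimated from the numbers in Table 3.3 and Table 3.5. The sketch below accounts only for the per-line pixel clocks and the line pause and ignores any further sensor-internal overhead, so it merely approximates the specified 108 fps.

```python
def readout_limited_fps(width=1312, height=1082, taps=2,
                        pixel_clock_hz=80e6, line_pause_clk=18):
    """Estimate the maximum frame rate at negligible exposure time.
    Each line takes width/taps pixel clocks plus the line pause."""
    clocks_per_line = width / taps + line_pause_clk
    readout_time_s = clocks_per_line * height / pixel_clock_hz
    return 1.0 / readout_time_s

print(round(readout_limited_fps(), 1))  # ~109.7, close to the specified 108 fps
```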
CameraLink® port and bit assignments are compliant with the CameraLink® standard (see [CL]). Table 3.6 shows the tap configuration for the MV1-D1312C-160 cameras.
[Plot: responsivity (0 to 900 V/J/m²) vs. wavelength (300 nm to 800 nm), with separate curves for responsivity (red), (green1), (green2) and (blue).]
Figure 3.3: Responsivity of the A1312C CMOS image sensor (standard) in the MV1-D1312C camera series
Figure 3.4: Transmission curve of the cut-off filter in the MV1-D1312C camera series
Figure 3.5: Bayer Pattern Arrangement in the MV1-D1312C camera series
Bit   Tap 0 (8 bit)   Tap 1 (8 bit)   Tap 0 (10 bit)   Tap 1 (10 bit)   Tap 0 (12 bit)   Tap 1 (12 bit)
0 (LSB) A0 B0 A0 C0 A0 C0
1 A1 B1 A1 C1 A1 C1
2 A2 B2 A2 C2 A2 C2
3 A3 B3 A3 C3 A3 C3
4 A4 B4 A4 C4 A4 C4
5 A5 B5 A5 C5 A5 C5
6 A6 B6 A6 C6 A6 C6
7 (MSB of 8 Bit) A7 B7 A7 C7 A7 C7
8 - - B0 B4 B0 B4
9 (MSB of 10 Bit) - - B1 B5 B1 B5
10 - - - - B2 B6
11 (MSB of 12 Bit) - - - - B3 B7
Table 3.6: CameraLink®2 Tap port and bit assignments for the MV1-D1312C-160 camera
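The bit assignments of Table 3.6 can be expressed as a small unpacking routine. In the 12-bit configuration, ports A and C carry the low 8 bits of tap 0 and tap 1 respectively, while port B carries both high nibbles. This is a sketch of the bit shuffling only; in practice the frame grabber performs this reassembly.

```python
def unpack_12bit(port_a, port_b, port_c):
    """Reassemble the two 12-bit pixel values of one clock cycle from
    CameraLink ports A, B and C, following Table 3.6:
    tap 0 = A7..A0 with B3..B0 as bits 8..11,
    tap 1 = C7..C0 with B7..B4 as bits 8..11."""
    tap0 = port_a | ((port_b & 0x0F) << 8)
    tap1 = port_c | ((port_b >> 4) << 8)
    return tap0, tap1
```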
4 Functionality
This chapter serves as an overview of the camera configuration modes and explains camera features. The goal is to describe what can be done with the camera. The setup of the MV1-D1312C series cameras is explained in later chapters.
4.1 Image Acquisition
4.1.1 Readout Modes
The MV1-D1312C CMOS cameras provide two different readout modes:
Sequential readout: The frame time is the sum of exposure time and readout time. Exposure of the next image can only start once the readout of the current image has finished.
Simultaneous readout (interleave): The frame time is determined by the exposure time or the readout time, whichever is longer. Exposure of the next image can start during the readout of the current image.
Readout Mode MV1-D1312C Series
Sequential readout available
Simultaneous readout available
Table 4.1: Readout mode of MV1-D1312C Series camera
The following figure illustrates the effect on the frame rate when using either the sequential readout mode or the simultaneous readout mode (interleave exposure).
[Plot: frame rate (fps) vs. exposure time for both readout modes. Sequential readout: fps = 1 / (readout time + exposure time). Simultaneous readout: fps = 1 / readout time while the exposure time is shorter than the readout time, and fps = 1 / exposure time once it is longer; the curves meet where exposure time = readout time.]
Figure 4.1: Frame rate in sequential readout mode and simultaneous readout mode
Sequential readout mode: A single formula applies for the frame rate: frames per second equals the inverse of the sum of exposure time and readout time.
Simultaneous readout mode (exposure time < readout time): The frame rate is limited by the readout time; frames per second equals the inverse of the readout time.
Simultaneous readout mode (exposure time > readout time): The frame rate is limited by the exposure time; frames per second equals the inverse of the exposure time.
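The three formulas above can be combined into one small helper:

```python
def frame_rate(exposure_s, readout_s, simultaneous):
    """Frame rate in fps for the two readout modes. In sequential mode
    exposure and readout add up; in simultaneous mode the longer of
    the two phases sets the frame time."""
    if simultaneous:
        return 1.0 / max(exposure_s, readout_s)
    return 1.0 / (exposure_s + readout_s)

# 5 ms exposure, 10 ms readout:
print(frame_rate(0.005, 0.010, simultaneous=False))  # ~66.7 fps
print(frame_rate(0.005, 0.010, simultaneous=True))   # ~100 fps
```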
The simultaneous readout mode allows higher frame rates. However, if the exposure time greatly exceeds the readout time, the gain in frame rate is negligible.
In simultaneous readout mode the image output faces minor limitations: the overall linear sensor response is partially restricted in the lower grey-scale region.
When changing readout mode from sequential to simultaneous readout mode or vice versa, new settings of the BlackLevelOffset and of the image correction are required.
Sequential readout
By default the camera continuously delivers images as fast as possible ("Free-running mode") in the sequential readout mode. Exposure time of the next image can only start if the readout time of the current image is finished.
[Diagram: exposure and readout phases alternate; each exposure starts only after the previous readout has finished.]
Figure 4.2: Timing in free-running sequential readout mode
When the acquisition of an image needs to be synchronised to an external event, an external trigger can be used (refer to Section 4.4). In this mode, the camera is idle until it gets a signal to capture an image.
[Diagram: the camera is idle until an external trigger starts the exposure, which is followed by the readout.]
Figure 4.3: Timing in triggered sequential readout mode
Simultaneous readout (interleave exposure)
To achieve highest possible frame rates, the camera must be set to "Free-running mode" with simultaneous readout. The camera continuously delivers images as fast as possible. Exposure time of the next image can start during the readout time of the current image.
[Diagram: exposure n overlaps readout n-1 and readout n overlaps exposure n+1; idle phases fill the gaps and the frame time equals the readout time.]
Figure 4.4: Timing in free-running simultaneous readout mode (readout time > exposure time)
[Diagram: readout n takes place during exposure n; the frame time equals the exposure time, with idle phases around the readout.]
Figure 4.5: Timing in free-running simultaneous readout mode (readout time < exposure time)
When the acquisition of an image needs to be synchronised to an external event, an external trigger can be used (refer to Section 4.4). In this mode, the camera is idle until it gets a signal to capture an image.
Figure 4.6: Timing in triggered simultaneous readout mode
4.1.2 Readout Timing
Sequential readout timing
By default, the camera is in free running mode and delivers images without any external control signals. The sensor is operated in sequential readout mode, which means that the sensor is read out after the exposure time. Then the sensor is reset, a new exposure starts and the readout of the image information begins again. The data is output on the rising edge of the pixel clock. The signals FRAME_VALID (FVAL) and LINE_VALID (LVAL) mask valid image information. The signal SHUTTER indicates the active exposure period of the sensor and is shown for clarity only.
Simultaneous readout timing
To achieve highest possible frame rates, the camera must be set to "Free-running mode" with simultaneous readout. The camera continuously delivers images as fast as possible. Exposure time of the next image can start during the readout time of the current image. The data is output on the rising edge of the pixel clock. The signals FRAME_VALID (FVAL) and LINE_VALID (LVAL) mask valid image information. The signal SHUTTER indicates the active integration phase of the sensor and is shown for clarity only.
[Timing diagram: PCLK, SHUTTER, FVAL, LVAL, DVAL and DATA signals; a line pause precedes the first line and follows every line; exposure time and frame time are indicated, with CPRE preceding the exposure.]
Figure 4.7: Timing diagram of sequential readout mode
[Timing diagram: PCLK, SHUTTER, FVAL, LVAL, DVAL and DATA signals with line pauses; the next exposure (with its CPRE phase) starts during the readout of the current frame.]
Figure 4.8: Timing diagram of simultaneous readout mode (readout time > exposure time)
[Timing diagram: PCLK, SHUTTER, FVAL, LVAL, DVAL and DATA signals with line pauses; the exposure time exceeds the readout time and defines the frame time.]
Figure 4.9: Timing diagram of simultaneous readout mode (readout time < exposure time)
Frame time: Inverse of the frame rate.
Exposure time: Period during which the pixels integrate the incoming light.
PCLK: Pixel clock on the CameraLink® interface.
SHUTTER: Internal signal, shown only for clarity. Is 'high' during the exposure time.
FVAL (Frame Valid): Is 'high' while the data of one complete frame are transferred.
LVAL (Line Valid): Is 'high' while the data of one line are transferred. Example: to transfer an image with 640 x 480 pixels, there are 480 LVAL periods within one FVAL active-high period; one LVAL lasts 640 pixel clock cycles.
DVAL (Data Valid): Is 'high' while data are valid.
DATA: Transferred pixel values. Example: for a 100 x 100 pixel image, 100 values are transferred within one LVAL active-high period, or 100 x 100 values within one FVAL period.
Line pause: Delay before the first line and after every following line when reading out the image data.

Table 4.2: Explanation of control and data signals used in the timing diagrams
These terms are also used in the timing diagrams of Section 4.4.
4.1.3 Exposure Control
The exposure time defines the period during which the image sensor integrates the incoming light. Refer to Table 3.3 for the allowed exposure time range.
4.1.4 Maximum Frame Rate
The maximum frame rate depends on the exposure time and on the size of the image (see Section 4.3).
4.2 Pixel Response
4.2.1 Linear Response
The camera offers a linear response between the input light signal and the output colour level. This can be modified by the use of LinLog® as described in the following sections. In addition, a linear digital gain may be applied. Please see Table 3.2 for more model-dependent information.
Black Level Adjustment
The black level is the average image value at no light intensity. It can be adjusted in software by changing the black level offset, which makes the overall image brighter or darker. Use a histogram to check the black level setting.
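The check described above can be automated: capture a frame with the lens capped and compare its mean value to the desired black level. The target of 8 DN below is an arbitrary example, and the mapping from the result to BlackLevelOffset steps is camera-specific and not assumed here.

```python
def black_level_error(dark_frame, target_dn=8):
    """Mean grey value of a dark frame minus the desired black level.
    A positive result means the image is too bright (lower the offset);
    a negative one means dark detail may be clipped (raise it)."""
    values = [p for row in dark_frame for p in row]
    return sum(values) / len(values) - target_dn

dark = [[9, 11], [10, 10]]        # tiny synthetic dark frame, mean = 10
print(black_level_error(dark))    # 2.0 -> image slightly too bright
```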
4.2.2 LinLog®
Overview
The LinLog®technology from Photonfocus allows a logarithmic compression of high light intensities inside the pixel. In contrast to the classical non-integrating logarithmic pixel, the LinLog®pixel is an integrating pixel with global shutter and the possibility to control the transition between linear and logarithmic mode.
The images delivered by the camera in LinLog® mode will require special processing by the user. Photonfocus is unable to provide support for this function.
In situations involving high intrascene contrast, a compression of the upper grey level region can be achieved with the LinLog®technology. At low intensities each pixel shows a linear response. At high intensities the response changes to logarithmic compression (see Fig. 4.10). The transition region between linear and logarithmic response can be smoothly adjusted by software and is continuously differentiable and monotonic.
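The shape of such a response can be illustrated with a toy model: linear up to a knee point and logarithmic above it, with matching slopes at the knee so that the combined curve is continuously differentiable. This is only a conceptual sketch, not the sensor's actual transfer function; the real transition is set by the Time1/Time2/Value1/Value2 parameters described below.

```python
import math

def toy_linlog(intensity, knee):
    """Linear below the knee, logarithmic compression above it.
    Both branches have slope 1 at the knee, so the combined curve is
    continuous, monotonic and continuously differentiable."""
    if intensity <= knee:
        return intensity
    return knee + knee * math.log(intensity / knee)
```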
[Plot: grey value (0% to 100%) vs. light intensity: a linear response runs into saturation, while weak compression (Value2) and strong compression (Value1) branches combine into the resulting LinLog response.]
Figure 4.10: Resulting LinLog2 response curve
LinLog®is controlled by up to 4 parameters (Time1, Time2, Value1 and Value2). Value1 and Value2 correspond to the LinLog®voltage that is applied to the sensor. The higher the parameters Value1 and Value2 respectively, the stronger the compression for the high light intensities. Time1 and Time2 are normalised to the exposure time. They can be set to a maximum value of 1000, which corresponds to the exposure time. Examples in the following sections illustrate the LinLog®feature.
LinLog1
In the simplest case the pixels are operated with a constant LinLog® voltage, which defines the knee point of the transition. This procedure has the drawback that the linear response curve changes directly into a logarithmic curve, leading to a poor grey resolution in the logarithmic region (see Fig. 4.12).
Figure 4.11: Constant LinLog voltage in the LinLog1 mode (V_LinLog = Value1 = Value2 over the whole exposure time t_exp; Time1 = Time2 = max. = 1000)
Figure 4.12: Response curve for different LinLog settings in LinLog1 mode (output grey level (8 bit) vs. illumination intensity; Value1 varied from 15 to 19 with Time1 = Time2 = 1000 and Value2 = Value1)
LinLog2
To get more grey resolution in LinLog® mode, the LinLog2 procedure was developed. In LinLog2 mode the camera switches between two different logarithmic compressions during the exposure time (see Fig. 4.13). The exposure starts with strong compression at a high LinLog® voltage (Value1). At Time1 the LinLog® voltage is switched to a lower voltage, resulting in weaker compression. This procedure gives a LinLog® response curve with more grey resolution. Fig. 4.14 and Fig. 4.15 show how the response curve is controlled by the three parameters Value1, Value2 and the LinLog® time Time1.
Settings in LinLog2 mode enable fine tuning of the slope in the logarithmic region.
Figure 4.13: Voltage switching in the LinLog2 mode (V_LinLog switches from Value1 to Value2 at Time1; Time2 = max. = 1000)
Figure 4.14: Response curve for different LinLog settings in LinLog2 mode (output grey level (8 bit) vs. illumination intensity; Time1 varied from 840 to 999 with Time2 = 1000, Value1 = 19, Value2 = 14)
Figure 4.15: Response curve for different LinLog settings in LinLog2 mode (output grey level (8 bit) vs. illumination intensity; Time1 varied from 880 to 1000 with Time2 = 1000, Value1 = 19, Value2 = 18)
LinLog3
To enable more flexibility, the LinLog3 mode with four parameters was introduced. Fig. 4.16 shows the timing diagram for the LinLog3 mode and the control parameters.
Figure 4.16: Voltage switching in the LinLog3 mode (V_LinLog switches from Value1 to Value2 at Time1 and to Value3 = constant = 0 at Time2, both within the exposure time t_exp)
Figure 4.17: Response curve for different LinLog settings in LinLog3 mode (output grey level (8 bit) vs. illumination intensity; Time2 varied from 950 to 990 with Time1 = 850, Value1 = 19, Value2 = 18)
4.3 Reduction of Image Size
With Photonfocus cameras there are several possibilities to focus on the interesting parts of an image, thus reducing the data rate and increasing the frame rate. The most commonly used feature is Region of Interest (ROI).
4.3.1 Region of Interest (ROI)
Both reductions in x- and y-direction result in a higher frame rate.
The Bayer pattern arrangement in the image influences the ROI and MROI settings (see Section 3.4).
The minimum width of the region of interest depends on the model of the MV1-D1312C camera series. For more details please consult Table 4.4 and Table 4.5.
The minimum width must be positioned symmetrically with respect to the vertical center line of the sensor, as shown in Fig. 4.18. A list of possible settings of the ROI for each camera model is given in Table 4.5.
Figure 4.18: Possible configurations of the region of interest with the MV1-D1312C-160 CMOS camera: a) the ROI covers ≥ 272 pixels on each side of the vertical center line; b) ≥ 272 pixels plus a multiple of 32 pixels
It is recommended to re-adjust the settings of the shading correction each time a new region of interest is selected.
ROI Dimension [Standard] MV1-D1312C-160
1312 x 1082 (full resolution) 108 fps
1280 x 1024 (SXGA) 117 fps
1280 x 768 (WXGA) 156 fps
800 x 600 (SVGA) 310 fps
640 x 480 (VGA) 472 fps
544 x 2 10590 fps
544 x 1082 249 fps
1312 x 544 214 fps
1312 x 256 445 fps
544 x 544 485 fps
1024 x 1024 145 fps
1056 x 1056 136 fps
1312 x 2 9613 fps
Table 4.3: Frame rates of different ROI settings (exposure time 10 µs; correction on, and sequential readout mode).
A region of interest may NOT be placed off the center of the sensor. The examples shown in Fig. 4.19 illustrate configurations of the ROI that are NOT allowed.
Figure 4.19: ROI configuration examples that are NOT allowed
4.3.2 ROI configuration
In the MV1-D1312C camera series the following two restrictions have to be respected for the ROI configuration:
The minimum width (w) of the ROI is 544 pixels in the MV1-D1312C-160 camera.
The region of interest must overlap a minimum number of pixels centered to the left and to the right of the vertical middle line of the sensor (ovl).
For any camera model of the MV1-D1312C camera series the allowed ranges for the ROI settings can be deduced from the following formulas:

x_min = max(0, 656 + ovl - w)
x_max = min(656 - ovl, 1312 - w)

where "ovl" is the overlap over the middle line and "w" is the width of the region of interest.
Any ROI setting in x-direction exceeding the minimum ROI width must be a multiple of 32 pixels (modulo 32).
MV1-D1312C-160
ROI width (w) 544 ... 1312
overlap (ovl) 272
width condition modulo 32
Table 4.4: Summary of the ROI configuration restrictions for the MV1-D1312C camera series indicating the minimum ROI width (w) and the required number of pixel overlap (ovl) over the sensor middle line
The settings of the region of interest in x-direction are restricted to modulo 32 (see Table 4.5).
There are no restrictions for the settings of the region of interest in y-direction.
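The restrictions above can be checked programmatically. The following sketch applies the MV1-D1312C-160 values from Table 4.4 (sensor width 1312, overlap ovl = 272, width modulo 32); the helper name is illustrative, not part of any camera API:

```python
# Sketch of the allowed ROI x-position range from Section 4.3.2,
# assuming the MV1-D1312C-160 restrictions of Table 4.4.

def roi_x_range(w: int, ovl: int = 272, sensor_w: int = 1312):
    """Return (x_min, x_max) for a ROI of width w, or raise if w is invalid."""
    if w < 2 * ovl or w > sensor_w or w % 32 != 0:
        raise ValueError("invalid ROI width")
    half = sensor_w // 2                       # 656, the vertical middle line
    x_min = max(0, half + ovl - w)
    x_max = min(half - ovl, sensor_w - w)
    return x_min, x_max

print(roi_x_range(544))   # -> (384, 384), matching Table 4.5
print(roi_x_range(1312))  # -> (0, 0)
```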
4.3.3 Calculation of the maximum frame rate
The frame rate mainly depends on the exposure time and the readout time. The frame rate is the inverse of the frame time:

fps = 1 / t_frame
Calculation of the frame time (sequential mode)

t_frame = t_exp + t_ro
Width ROI-X (MV1-D1312C-160)
544 384
576 352 ... 384
608 320 ... 352
640 288 ... 384
672 256 ... 384
704 224 ... 384
736 192 ... 384
768 160 ... 384
800 128 ... 384
832 96 ... 384
864 64 ... 384
896 32 ... 384
... ...
1248 0 ... 64
1312 0
Table 4.5: Some possible ROI-X settings
Typical values of the readout time t_ro are given in Table 4.6.

Calculation of the frame time (simultaneous mode)

The calculation of the frame time in simultaneous readout mode requires more detailed input data and is skipped here for the purpose of clarity.
ROI Dimension MV1-D1312C-160
1312 x 1082 t_ro = 9.12 ms
1248 x 1082 t_ro = 8.68 ms
1024 x 512 t_ro = 3.39 ms
1056 x 512 t_ro = 3.49 ms
1024 x 256 t_ro = 1.70 ms
1056 x 256 t_ro = 1.75 ms
Table 4.6: Read out time at different ROI settings for the MV1-D1312C CMOS camera series in sequential read out mode.
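As a sketch, the sequential-mode frame time formula above can be combined with the readout times from Table 4.6 to estimate the maximum frame rate. Real cameras add small overheads, so treat the result as approximate:

```python
# Rough frame-rate estimate for sequential readout mode, using
# t_frame = t_exp + t_ro with the readout times from Table 4.6.
# This is a simplification; camera settings can add further overhead.

def max_fps_sequential(t_exp_s: float, t_ro_s: float) -> float:
    return 1.0 / (t_exp_s + t_ro_s)

# Full resolution (1312 x 1082): t_ro = 9.12 ms, exposure 10 us
print(round(max_fps_sequential(10e-6, 9.12e-3)))
# -> 110 (Table 4.3 lists 108 fps; extra overhead accounts for the difference)
```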
A frame rate calculator for calculating the maximum frame rate is available in the support area of the Photonfocus website.
An overview of the resulting frame rates for different exposure time settings is given in Table 4.7.
Exposure time MV1-D1312C-160
10 µs 108 / 108 fps
100 µs 107 / 108 fps
500 µs 103 / 108 fps
1 ms 98 / 108 fps
2 ms 89 / 108 fps
5 ms 70 / 108 fps
10 ms 52 / 99 fps
12 ms 47 / 82 fps
Table 4.7: Frame rates for different exposure times [sequential readout mode / simultaneous readout mode], resolution 1312 x 1082 pixels, FPN correction on.
4.3.4 Multiple Regions of Interest
The MV1-D1312C camera series can handle up to 512 different regions of interest. This feature can be used to reduce the image data and increase the frame rate. An application example for using multiple regions of interest (MROI) is a laser triangulation system with several laser lines. The multiple ROIs are joined together and form a single image, which is transferred to the frame grabber.

An individual MROI region is defined by its starting value in y-direction and its height. The starting value in horizontal direction and the width are the same for all MROI regions and are defined by the ROI settings. The maximum frame rate in MROI mode depends on the number of rows and columns being read out. Overlapping ROIs are allowed. See Section 4.3.3 for information on the calculation of the maximum frame rate.

Fig. 4.20 compares ROI and MROI: the setups (visualized on the image sensor area) are displayed in the upper half of the drawing. The lower half shows the dimensions of the resulting image. On the left-hand side an example of ROI is shown and on the right-hand side an example of MROI. It can readily be seen that the resulting image with MROI is smaller than the resulting image with ROI only, and the former therefore yields a higher frame rate. Fig. 4.21 shows another MROI drawing illustrating the effect of MROI on the image content.

Fig. 4.22 shows an example from hyperspectral imaging where the presence of spectral lines at known positions needs to be inspected. By using MROI, only a 656x54 region needs to be read out, and a frame rate of 4300 fps can be achieved. Without MROI, the resulting frame rate would be 216 fps for a 656x1082 ROI.
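As a minimal sketch of the data reduction, the hyperspectral example can be modelled by summing the heights of the individual MROI regions; the region list here is illustrative, chosen to total 54 rows:

```python
# Sketch: MROI regions share the ROI x-settings, so only the row count
# changes. Each region is (y_start, height); values are illustrative.

regions = [
    (100, 20),
    (400, 26),
    (700, 8),
]

total_height = sum(h for _, h in regions)
print(total_height)  # -> 54 rows transferred instead of 1082 for the full frame
```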
Figure 4.20: Multiple Regions of Interest (sensor area (0, 0) to (1311, 1081); left: a single ROI and the resulting image; right: three MROIs joined into a single, smaller output image)
Figure 4.21: Multiple Regions of Interest with 5 ROIs
Figure 4.22: Multiple Regions of Interest in hyperspectral imaging (656 pixel wide ROI on the sensor area (0, 0) to (1311, 1081); MROI heights of 20, 26, 2, 2, 2, 1 and 1 pixel placed on the spectral lines of chemical agents A, B and C)
4.4 Trigger and Strobe
4.4.1 Introduction
The start of the exposure of the camera’s image sensor is controlled by the trigger. The trigger can either be generated internally by the camera (free running trigger mode) or by an external device (external trigger mode). This section refers to the external trigger mode if not otherwise specified.

In external trigger mode, the trigger can be applied through the CameraLink® interface (interface trigger) or directly through the power supply connector of the camera (I/O trigger) (see Section 4.4.2). The trigger signal can be configured to be active high or active low. When the frequency of the incoming triggers is higher than the maximal frame rate of the current camera settings, some trigger pulses will be missed. A missed trigger counter counts these events and can be read out by the user.

The exposure time in external trigger mode can be defined by the setting of the exposure time register (camera controlled exposure mode) or by the width of the incoming trigger pulse (trigger controlled exposure mode) (see Section 4.4.3). An external trigger pulse starts the exposure of one image. In Burst Trigger Mode, however, a trigger pulse starts the exposure of a user defined number of images (see Section 4.4.5). The start of the exposure is shortly after the active edge of the incoming trigger. An additional trigger delay can be applied that delays the start of the exposure by a user defined time (see Section 4.4.4). This is often used to delay the exposure start relative to the trigger of a flash lighting source.
4.4.2 Trigger Source
The trigger signal can be configured to be active high or active low. One of the following trigger sources can be used:
Free running The trigger is generated internally by the camera. Exposure starts immediately after the camera is ready, and the maximal possible frame rate is attained if Constant Frame Rate mode is disabled. In Constant Frame Rate mode, exposure starts after a user-specified time (Frame Time) has elapsed since the previous exposure start, so the frame rate is set to a user defined value.
Interface Trigger In the interface trigger mode, the trigger signal is applied to the camera through the CameraLink® interface. Fig. 4.23 shows a diagram of the interface trigger setup. The trigger is generated by the frame grabber board and sent on the CC1 signal through the CameraLink® interface. Some frame grabbers allow the connection of external trigger devices through an I/O card. A schematic diagram of this setup is shown in Fig. 4.24.
I/O Trigger In the I/O trigger mode, the trigger signal is applied directly to the camera through the power supply connector (via an optocoupler). A setup of this mode is shown in Fig. 4.25. The electrical interface of the I/O trigger input and the strobe output is described in Section 5.1.3.
4.4.3 Exposure Time Control
Depending on the trigger mode, the exposure time can be determined either by the camera or by the trigger signal itself:
Camera-controlled Exposure time In this trigger mode the exposure time is defined by the camera. For an active high trigger signal, the camera starts the exposure with a positive trigger edge and stops it when the preprogrammed exposure time has elapsed. The exposure time is defined by the software.
Figure 4.23: Interface trigger source (machine vision system PC with CameraLink® frame grabber; EXSYNC (CC1) / soft trigger and data via CameraLink®, separate power supply to the camera)
Figure 4.24: Interface trigger with 2 cameras and frame grabber I/O card (external trigger source and flash connected to the I/O board; EXSYNC (CC1) / soft trigger and data via CameraLink®)
Trigger-controlled Exposure time In this trigger mode the exposure time is defined by the pulse width of the trigger pulse. For an active high trigger signal, the camera starts the exposure with the positive edge of the trigger signal and stops it with the negative edge.
Trigger-controlled exposure time is not available in simultaneous readout mode.
External Trigger with Camera controlled Exposure Time
In the external trigger mode with camera controlled exposure time, the rising edge of the trigger pulse starts the camera state machine, which controls the sensor and an optional external strobe output.

Figure 4.25: I/O trigger source (trigger source and flash connected to the camera via TTL signals; data via CameraLink® to the machine vision system PC)

Fig. 4.26 shows the detailed timing diagram for the external trigger mode with camera controlled exposure time.
Figure 4.26: Timing diagram for the camera controlled exposure time (signals from top to bottom: external trigger pulse input, trigger after isolator, trigger pulse internal camera control, delayed trigger for shutter control, internal shutter control, delayed trigger for strobe control, internal strobe control, external strobe pulse output; delays: t_d-iso-input, t_jitter, t_trigger-delay, t_trigger-offset, t_exposure, t_strobe-delay, t_strobe-offset, t_strobe-duration, t_d-iso-output)
The rising edge of the trigger signal is detected in the camera control electronics, which are implemented in an FPGA. Before the trigger signal reaches the FPGA, it is isolated from the camera environment to allow robust integration of the camera into the vision system. In the signal isolator the trigger signal is delayed by the time t_d-iso-input. This signal is clocked into the FPGA, which leads to a jitter of t_jitter. The pulse can be delayed by the time t_trigger-delay, which can be configured by a user defined value via the camera software. The trigger offset delay t_trigger-offset then results from the synchronous design of the FPGA state machines. The exposure time t_exposure is controlled with an internal exposure time controller.
The trigger pulse from the internal camera control also starts the strobe control state machines. The strobe can be delayed by t_strobe-delay with an internal counter, which can be controlled by the customer via software settings. The strobe offset delay t_strobe-offset then results from the synchronous design of the FPGA state machines. A second counter determines the strobe duration t_strobe-duration. For a robust system design the strobe output is also isolated from the camera electronics, which leads to an additional delay of t_d-iso-output. Table 4.8 gives an overview of the minimum and maximum values of the parameters.
External Trigger with Pulsewidth controlled Exposure Time
In the external trigger mode with pulse width controlled exposure time, the rising edge of the trigger pulse starts the camera state machine, which controls the sensor. The falling edge of the trigger pulse stops the image acquisition. Additionally, the optional external strobe output is controlled by the rising edge of the trigger pulse. Fig. 4.27 shows the detailed timing for the external trigger mode with pulse width controlled exposure time.
Figure 4.27: Timing diagram for the pulse width controlled exposure time (as Fig. 4.26, but with separate rising-edge and falling-edge trigger paths: the delayed rising edge sets the shutter and the delayed falling edge resets it, each subject to t_d-iso-input, t_jitter, t_trigger-delay and t_trigger-offset)
The timing from the rising edge of the trigger pulse until the start of exposure and strobe is the same as for the camera controlled exposure time (see Section 4.4.3). In this mode, however, the end of the exposure is controlled by the falling edge of the trigger pulse: the falling edge of the trigger pulse is delayed by the time t_d-iso-input, which results from the signal isolator. This signal is clocked into the FPGA, which leads to a jitter of t_jitter. The pulse is then delayed by t_trigger-delay, the user defined value which can be configured via the camera software. After the trigger offset time t_trigger-offset the exposure is stopped.
4.4.4 Trigger Delay
The trigger delay is a programmable delay in milliseconds between the incoming trigger edge and the start of the exposure. This feature may be required to synchronize an external strobe with the exposure of the camera.
4.4.5 Burst Trigger
The camera includes a burst trigger engine. When enabled, a single trigger pulse starts the exposure of a predefined number of acquisitions. The time between two acquisitions and the number of acquisitions can be configured by user defined values via the camera software. The burst trigger feature works only in the mode "Camera controlled Exposure Time". The burst trigger signal can be configured to be active high or active low. When incoming burst triggers arrive faster than the programmed burst sequence can complete, some trigger pulses will be missed. A missed burst trigger counter counts these events and can be read out by the user.
Figure 4.28: Timing diagram for the burst trigger mode (as Fig. 4.26, with an additional delayed trigger for the burst trigger engine; additional delays: t_burst-trigger-delay and t_burst-period-time)
The timing diagram of the burst trigger mode is shown in Fig. 4.28. The timing from the "external trigger pulse input" to the "trigger pulse internal camera control" is the same as in Fig. 4.27. This trigger pulse then starts, after a user configurable burst trigger delay time t_burst-trigger-delay, the internal burst engine, which generates n internal triggers for the shutter and strobe control. A user configurable value defines the time t_burst-period-time between two acquisitions.
Timing Parameter | Minimum | Maximum
t_d-iso-input | 45 ns | 60 ns
t_jitter | 0 | 25 ns
t_trigger-delay | 0 | 0.42 s
t_burst-trigger-delay | 0 | 0.42 s
t_burst-period-time | depends on camera settings | 0.42 s
t_trigger-offset (non burst mode) | 100 ns | 100 ns
t_trigger-offset (burst mode) | 125 ns | 125 ns
t_exposure | 10 µs | 0.42 s
t_strobe-delay | 0 | 0.42 s
t_strobe-offset (non burst mode) | 100 ns | 100 ns
t_strobe-offset (burst mode) | 125 ns | 125 ns
t_strobe-duration | 200 ns | 0.42 s
t_d-iso-output | 45 ns | 60 ns
t_trigger-pulse-width | 200 ns | n/a
Number of bursts n | 1 | 30000

Table 4.8: Summary of timing parameters relevant in the external trigger mode (MV1-D1312C-160)
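As a sketch, the fixed contributions in Table 4.8 can be summed to bound the worst-case delay between the external trigger edge and the start of exposure (non-burst mode, user trigger delay set to zero):

```python
# Worst-case trigger-to-exposure latency from the Table 4.8 maxima
# (non-burst mode, t_trigger-delay = 0). Values in nanoseconds.
# This is an illustration, not a guaranteed specification.

t_d_iso_input_max = 60    # isolator delay (max)
t_jitter_max = 25         # FPGA clocking jitter (max)
t_trigger_offset = 100    # fixed offset, non-burst mode

worst_case_ns = t_d_iso_input_max + t_jitter_max + t_trigger_offset
print(worst_case_ns)  # -> 185 ns
```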
4.4.6 Software Trigger
The software trigger makes it possible to emulate an external trigger pulse from the camera software through the serial data interface. It works with burst mode both enabled and disabled. As soon as it is issued via the camera software, it starts the image acquisition(s), depending on the usage of the burst mode and the burst configuration. The trigger mode must be set to Interface Trigger or I/O Trigger.
4.4.7 Strobe Output
The strobe output is an opto-isolated output located on the power supply connector that can be used to trigger a strobe. The strobe output can be used both in free-running and in trigger mode. There is a programmable delay available to adjust the strobe pulse to your application.
The strobe output needs a separate power supply. Please see Section 5.1.3, Fig. 4.24 and Fig. 4.25 for more information.
4.5 Data Path Overview
The data path is the path of the image data from the output of the image sensor to the output of the camera. The sequence of processing blocks is shown in Fig. 4.29.
Figure 4.29: Camera data path (blocks: image sensor, FPN correction, digital offset, digital gain, fine gain, colour channel fine gain, look-up table (LUT), status line insertion, test image insertion, data resolution, image output)
4.6 Image Correction
4.6.1 Overview
The camera possesses image pre-processing features that compensate for non-uniformities caused by the sensor, the lens or the illumination. This method of improving the image quality is generally known as ’Shading Correction’ or ’Flat Field Correction’ and consists of an offset correction and a pixel interpolation.
Since the correction is performed in hardware, there is no performance limitation of the cameras for high frame rates.
The offset correction subtracts a configurable positive or negative value from the live image and thus reduces the fixed pattern noise of the CMOS sensor. In addition, hot pixels can be removed by interpolation. The offset correction works on a pixel-per-pixel basis, i.e. every pixel is corrected separately. For the correction, a black reference image is required. Then, the correction values are determined automatically in the camera.
Do not set any reference images when gain or LUT is enabled! Read the following sections very carefully.
Correction values of the reference image can be saved into the internal flash memory, but this overwrites the factory presets. The reference images delivered from the factory can then no longer be restored.
4.6.2 Offset Correction (FPN, Hot Pixels)
The offset correction is based on a black reference image, which is taken at no illumination (e.g. lens aperture completely closed). The black reference image contains the fixed-pattern noise of the sensor, which can be subtracted from the live images in order to minimise the static noise.
Offset correction algorithm
After configuring the camera with a black reference image, the camera is ready to apply the offset correction:
1. Determine the average value of the black reference image.
2. Subtract the black reference image from the average value.
3. Mark pixels that have a grey level higher than 1008 DN (@ 12 bit) as hot pixels.
4. Store the result in the camera as the offset correction matrix.
5. During image acquisition, subtract the correction matrix from the acquired image and interpolate the hot pixels (see Section 4.6.2).
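The five steps above can be mirrored in a few lines of NumPy. This is only an illustration of the algorithm the camera runs in hardware; the function names and the sign convention (correction matrix = black reference − average, so that subtracting the matrix removes the FPN) are assumptions:

```python
# Sketch of the offset correction algorithm with NumPy (12-bit data).
# Names and sign convention are assumptions; the camera does this
# internally in hardware.
import numpy as np

HOT_PIXEL_THRESHOLD = 1008  # DN @ 12 bit, from step 3 above

def build_offset_correction(black_ref: np.ndarray):
    """Return (correction_matrix, hot_pixel_mask) from a black reference image."""
    avg = float(black_ref.mean())                          # step 1
    correction = black_ref.astype(np.int32) - int(round(avg))  # step 2 (sign assumed)
    hot = black_ref > HOT_PIXEL_THRESHOLD                  # step 3
    return correction, hot                                 # step 4: stored in camera

def apply_offset_correction(image, correction, hot):
    """Step 5; hot pixels are only clipped here, not interpolated."""
    corrected = image.astype(np.int32) - correction
    return np.clip(corrected, 0, 4095).astype(np.uint16)
```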
Figure 4.30: Schematic presentation of the offset correction algorithm (offset correction matrix = black reference image − average of the black reference image)
How to Obtain a Black Reference Image
In order to improve the image quality, the black reference image must meet certain demands.
The black reference image must be obtained at no illumination, e.g. with the lens aperture completely closed or the lens opening covered.
It may be necessary to adjust the black level offset of the camera. In the histogram of the black reference image, ideally there are no grey levels at value 0 DN after adjustment of the black level offset. All pixels that are saturated black (0 DN) will not be properly corrected (see Fig. 4.31). The peak in the histogram should be well below the hot pixel threshold of 1008 DN @ 12 bit.
Camera settings may influence the grey level. Therefore, for best results the camera settings of the black reference image must be identical with the camera settings of the image to be corrected.
Figure 4.31: Histogram of a proper black reference image for offset correction (black level offset ok vs. black level offset too low)
Hot pixel correction
Every pixel that exceeds a certain threshold in the black reference image is marked as a hot pixel. If the hot pixel correction is switched on, the camera replaces the value of a hot pixel by an average of its neighbour pixels (see Fig. 4.32). The correction algorithm considers the Bayer pattern: if, for example, a blue pixel is hot, the camera calculates the average of the nearest two blue neighbour pixels in the same line. This is done for each colour channel individually.
4.6 Image Correction 47
4 Functionality
Figure 4.32: Hot pixel interpolation: p_n = (p_(n-1) + p_(n+1)) / 2, where p_(n-1) and p_(n+1) are the nearest same-colour neighbours of the hot pixel p_n
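Because same-colour neighbours in a Bayer row lie two sensor columns apart, the interpolation can be sketched per row as follows. Border handling is omitted for brevity and the names are illustrative:

```python
# Sketch of Bayer-aware hot pixel interpolation: a hot pixel is replaced
# by the mean of the same-colour pixels two positions to its left and
# right in the same row. Border pixels are left untouched.

def interpolate_hot_pixels(row, hot_mask):
    """row: list of pixel values; hot_mask: list of bools marking hot pixels."""
    out = list(row)
    for n, is_hot in enumerate(hot_mask):
        if is_hot and 2 <= n < len(row) - 2:
            out[n] = (row[n - 2] + row[n + 2]) // 2
    return out

print(interpolate_hot_pixels([10, 50, 4000, 52, 14],
                             [False, False, True, False, False]))
# -> [10, 50, 12, 52, 14]
```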
4.6.3 Corrected Image
Offset, gain and hot pixel correction can be switched on separately. The following configurations are possible:
No correction
Offset correction only
Offset and hot pixel correction
Hot pixel correction only
Offset and gain correction
Offset, gain and hot pixel correction
In addition, the black reference image that is currently stored in the camera RAM can be output. Table 4.9 shows the minimum and maximum values of the correction matrices, i.e. the range that the offset algorithm can correct.
Minimum Maximum
Offset correction -1023 DN @ 12 bit +1023 DN @ 12 bit
Table 4.9: Offset correction ranges
4.7 Digital Gain and Offset
Gain x1, x2, x4 and x8 are digital amplifications, which means that the digital image data are multiplied in the camera module by a factor of 1, 2, 4 or 8, respectively. The gain is implemented as a binary shift of the image data, which means that there will be missing codes in the output image, as the LSBs of the grey values are set to ’0’. E.g. for gain x2, the output value is shifted by 1 bit and bit 0 is set to ’0’. A user-defined value can be subtracted from the grey value in the digital offset block. This feature is not available in gain x1 mode. If digital gain is applied and the image is too bright, the output image might saturate. By subtracting an offset from the input of the gain block it is possible to avoid the saturation.
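The gain-as-shift behaviour described above can be sketched as follows; the 12-bit clamping and the exact order of offset subtraction are assumptions:

```python
# Sketch of the digital gain block: a binary left shift, optionally
# preceded by an offset subtraction to avoid saturation. 12-bit data
# and clamping behaviour are assumptions.

def digital_gain(value_12bit: int, gain: int, offset: int = 0) -> int:
    assert gain in (1, 2, 4, 8)
    shift = gain.bit_length() - 1      # x1 -> 0, x2 -> 1, x4 -> 2, x8 -> 3
    v = max(0, value_12bit - offset)   # offset not available for gain x1
    return min(v << shift, 4095)       # clamp to the 12-bit range

print(digital_gain(300, 2))          # -> 600 (bit 0 of the result is 0)
print(digital_gain(3000, 2))         # -> 4095 (saturated)
print(digital_gain(3000, 2, 1200))   # -> 3600 (offset avoids saturation)
```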
4.8 Fine Gain
The MV1-D1312C camera series includes two types of fine gain: an overall image fine gain and an RGB channel fine gain. The image fine gain can be used to adjust the brightness of the whole image. The RGB channel fine gain is used to calibrate the white balance of an image and has to be set according to the current lighting conditions.
The RGB gain is multiplied by the image overall fine gain and digital gain.
4.9 Channel Colour Level Transformation (LUT)
Channel colour level transformation is the remapping of the colour level values of an input image to new values. The look-up table (LUT) is used to convert the channel colour value of each pixel in an image into another channel colour value. It is typically used to implement a transfer curve for contrast expansion. The camera performs a 12-to-8-bit mapping, so that 4096 input channel colour levels can be mapped to 256 output channel colour levels. The use of the three available modes is explained in the next sections. Two LUTs and a Region-LUT feature are available in the MV1-D1312C camera series (see Section 4.9.4).
The output channel colour level resolution of the look-up table (independent of gain, gamma or user-defined mode) is always 8 bit.
There are 2 predefined functions, which generate a look-up table and transfer it to the camera. For other transfer functions the user can define his own LUT file.
Some commonly used transfer curves are shown in Fig. 4.33. Line a denotes a negative or inverse transformation; line b enhances the image contrast between colour channel values x0 and x1; line c shows brightness thresholding, resulting in an image with only black and white levels; and line d applies a gamma correction (see also Section 4.9.2).
4.9.1 Gain
The ’Gain’ mode performs a digital, linear amplification with clamping (see Fig. 4.34). It is configurable in the range from 1.0 to 4.0 (e.g. 1.234).
Figure 4.33: Commonly used LUT transfer curves y = f(x) (a: negative/inverse transformation; b: contrast enhancement between x0 and x1; c: brightness thresholding; d: gamma correction)
Figure 4.34: Applying a linear gain with clamping to an image (y = (255/1023) · a · x for a = 1.0, 2.0, 3.0, 4.0; x: 10-bit grey level input value, y: 8-bit grey level output value)
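A sketch of generating the ’Gain’ LUT from the formula in Fig. 4.34, assuming the 10-bit input axis shown there and clamping to the 8-bit output range:

```python
# Sketch of the 'Gain' LUT mode: y = (255/1023) * a * x, clamped to 255.
# The 10-bit input range follows the axis of Fig. 4.34.

def gain_lut(a: float):
    return [min(255, round(255 / 1023 * a * x)) for x in range(1024)]

lut = gain_lut(2.0)
print(lut[0], lut[511], lut[1023])  # -> 0 255 255
```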
4.9.2 Gamma
The ’Gamma’ mode performs an exponential amplification, configurable in the range from 0.4 to 4.0. Gamma > 1.0 results in an attenuation of the image (see Fig. 4.35), gamma < 1.0 results in an amplification (see Fig. 4.36). Gamma correction is often used for tone mapping and better display of results on monitor screens.
Figure 4.35: Applying gamma correction to an image (gamma > 1): y = (255 / 1023^γ) · x^γ for γ = 1.0 to 4.0; x: 10-bit grey level input value, y: 8-bit grey level output value
Figure 4.36: Applying gamma correction to an image (gamma < 1): y = (255 / 1023^γ) · x^γ for γ = 0.4 to 1.0; x: 10-bit grey level input value, y: 8-bit grey level output value
4.9.3 User-defined Look-up Table
In the ’User’ mode, the mapping of input to output channel colour levels can be configured arbitrarily by the user. There is an example file in the PFRemote folder. LUT files can easily be generated with a standard spreadsheet tool. The file has to be stored as tab delimited text file.
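As a sketch of generating such a file, assuming one tab-separated input/output pair per line (this layout should be checked against the example file in the PFRemote folder):

```python
# Sketch of writing a user LUT file (tab delimited text). The assumed layout
# -- one "input<TAB>output" pair per 12 bit input value -- should be verified
# against the example file shipped with PFRemote.
def write_lut_file(path, transfer):
    with open(path, "w") as f:
        for x in range(4096):                              # 12 bit input range
            y = min(255, max(0, int(round(transfer(x)))))  # clamp to 8 bit output
            f.write(f"{x}\t{y}\n")

# example: inverse (negative) transfer curve, like curve a in Fig. 4.33
write_lut_file("user_lut.txt", lambda x: 255 - x * 255.0 / 4095.0)
```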
Figure 4.37: Data path through the LUT (user LUT y = f(x), 12 bit input, 8 bit output)
4.9.4 Region LUT and LUT Enable
Two LUTs and a Region-LUT feature are available in the MV1-D1312C camera series. Both LUTs can be enabled independently (see Table 4.10); LUT 0 supersedes LUT 1. When the Region-LUT feature is enabled, the LUTs are only active in a user-defined region. Examples are shown in Fig. 4.38 and Fig. 4.39. Fig. 4.38 shows an example of overlapping Region-LUTs: LUT 0, LUT 1 and Region-LUT are enabled. LUT 0 is active in region 0 ((x00, x01), (y00, y01)) and supersedes LUT 1 in the overlapping region. LUT 1 is active in region 1 ((x10, x11), (y10, y11)). Fig. 4.39 shows an example of keyhole inspection in a laser welding application. LUT 0 and LUT 1 are used to enhance the contrast by applying optimized transfer curves to the individual regions: LUT 0 is used for keyhole inspection, LUT 1 is optimized for seam finding. Fig. 4.40 shows the application of the Region-LUT to a camera image. The original image without image processing is shown on the left-hand side; the result of applying the Region-LUT is shown on the right-hand side, where one Region-LUT was applied to a small region in the lower part of the image to increase the brightness.
Enable LUT 0 | Enable LUT 1 | Enable Region LUT | Description
-            | -            | -                 | LUTs are disabled.
X            | don't care   | -                 | LUT 0 is active on the whole image.
-            | X            | -                 | LUT 1 is active on the whole image.
X            | -            | X                 | LUT 0 active in Region 0.
X            | X            | X                 | LUT 0 active in Region 0 and LUT 1 active in Region 1. LUT 0 supersedes LUT 1.
Table 4.10: LUT Enable and Region LUT
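The priority rule can be expressed as a simplified software model (in the camera this happens in the FPGA):

```python
# Simplified software model of the Region-LUT priority: LUT 0 supersedes
# LUT 1 where the regions overlap. Regions are inclusive pixel ranges.
def apply_region_luts(img, lut0, region0, lut1, region1):
    def inside(x, y, r):                 # r = (x0, x1, y0, y1)
        return r[0] <= x <= r[1] and r[2] <= y <= r[3]
    out = [row[:] for row in img]
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            if inside(x, y, region0):
                out[y][x] = lut0[v]      # LUT 0 wins in the overlapping region
            elif inside(x, y, region1):
                out[y][x] = lut1[v]
    return out
```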
Figure 4.38: Overlapping Region-LUT example
Figure 4.39: Region-LUT in keyhole inspection
Figure 4.40: Region-LUT example with camera image; left: original image; right: gain 4 region in the area of the date print on the bottle
4.10 Image Information and Status Line
There are camera properties available that give information about the acquired images, such as an image counter, average image value and the number of missed trigger signals. These properties can be queried by software. Alternatively, a status line within the image data can be switched on that contains all the available image information.
4.10.1 Counters and Average Value
Image counter The image counter provides a sequential number of every image that is output.
After camera startup, the counter counts up from 0 (counter width 24 bit). The counter can be reset by the camera control software.
Real Time counter The time counter starts at 0 after camera start and counts real time in units
of 1 microsecond. The time counter can be reset by the software in the SDK (counter width 32 bit).
Missed trigger counter The missed trigger counter counts trigger pulses that were ignored by
the camera because they occurred within the exposure or read-out time of an image. In free-running mode it counts all incoming external triggers (counter width 8 bit / no wrap around).
Missed burst trigger counter The missed burst trigger counter counts trigger pulses that were
ignored by the camera in the burst trigger mode because they occurred while the camera still was processing the current burst trigger sequence.
Average image value The average image value gives the average of an image in 12 bit format
(0 .. 4095 DN), regardless of the currently used grey level resolution.
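Dropped frames can, for example, be detected in software by watching the image counter for gaps (a sketch; the modulo handles the 24 bit wrap-around):

```python
# Sketch: number of frames missed between two consecutively received images,
# derived from the 24 bit image counter (which wraps at 2**24).
def frames_missed(prev_counter, curr_counter):
    return (curr_counter - prev_counter - 1) % (1 << 24)
```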
4.10.2 Status Line
If enabled, the status line replaces the last row of the image with camera status information. Every parameter is coded into fields of 4 pixels (LSB first) and uses the lower 8 bits of the pixel value, so that the total size of a parameter field is 32 bit (see Fig. 4.41). The assignment of the parameters to the fields is listed in 4.11.
The status line is available in all camera modes.
Figure 4.41: Status line parameters replace the last row of the image. Each parameter field occupies 4 pixels (LSB first); the preamble 0x55AA00FF appears in pixels 0 to 3 as 0xFF, 0x00, 0xAA, 0x55.
Start pixel index | Parameter width [bit] | Description
0  | 32 | Preamble: 0x55AA00FF
4  | 24 | Image Counter (see Section 4.10.1)
8  | 32 | Real Time Counter (see Section 4.10.1)
12 | 8  | Missed Trigger Counter (see Section 4.10.1)
16 | 12 | Image Average Value (see Section 4.10.1)
20 | 24 | Integration Time in units of clock cycles (see Table 3.3)
24 | 16 | Burst Trigger Number
28 | 8  | Missed Burst Trigger Counter
32 | 11 | Horizontal start position of ROI (Window.X)
36 | 11 | Horizontal end position of ROI (= Window.X + Window.W - 1)
40 | 11 | Vertical start position of ROI (Window.Y). In MROI mode this parameter is 0.
44 | 11 | Vertical end position of ROI (= Window.Y + Window.H - 1). In MROI mode this parameter is the total height - 1.
48 | 2  | Trigger Source
52 | 2  | Digital Gain
56 | 2  | Digital Offset
60 | 16 | Camera Type Code (see Table 4.12)
64 | 32 | Camera Serial Number
Table 4.11: Assignment of status line fields
Camera Model Camera Type Code
MV1-D1312C-160-CL-12 282
Table 4.12: Type codes of the MV1-D1312C camera series
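Reading a parameter out of the status line can be sketched as follows (assuming `row` holds the pixel values of the last image row):

```python
# Sketch: extract a status line parameter from the last image row. Each field
# is 4 pixels wide; only the lower 8 bits of each pixel carry data, LSB first.
def read_field(row, start_pixel, width_bits):
    value = 0
    for i in range(4):
        value |= (row[start_pixel + i] & 0xFF) << (8 * i)
    return value & ((1 << width_bits) - 1)

def check_preamble(row):
    # preamble 0x55AA00FF occupies pixels 0..3
    return read_field(row, 0, 32) == 0x55AA00FF
```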
4.11 Test Images
Test images are generated in the camera FPGA, independent of the image sensor. They can be used to check the transmission path from the camera to the frame grabber. Independent of the configured grey level resolution, every possible grey level appears the same number of times in a test image. Therefore, the histogram of the received image must be flat.
A test image is a useful tool to find data transmission errors that are caused most often by a defective cable between camera and frame grabber.
The analysis of the test images with a histogram tool gives the correct result at full resolution only.
4.11.1 Ramp
Depending on the configured grey level resolution, the ramp test image outputs a constant pattern with increasing grey level from the left to the right side (see Fig. 4.42).
Figure 4.42: Ramp test images: 8 bit output (left), 10 bit output (middle), 12 bit output (right)
4.11.2 LFSR
The LFSR (linear feedback shift register) test image outputs a constant pattern with a pseudo-random grey level sequence containing every possible grey level that is repeated for every row. The LFSR test pattern was chosen because it leads to a very high data toggling rate, which stresses the interface electronics and the cable connection. In the histogram you can see that the number of pixels of all grey values is the same. Please refer to application note [AN026] for the calculation and the values of the LFSR test image.
4.11.3 Troubleshooting using the LFSR
To control the quality of your complete imaging system enable the LFSR mode and check the histogram at 1024x1024 resolution. If your frame grabber application does not provide a real-time histogram, store the image and use a graphics software tool to display the histogram. In the LFSR (linear feedback shift register) mode the camera generates a constant pseudo-random test pattern containing all grey levels. If the data transmission is error free, the histogram of the received LFSR test pattern will be flat (Fig. 4.44). On the other hand, a non-flat histogram (Fig. 4.45) indicates problems that may be caused either by the cable, by the connectors or by the frame grabber.
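A flat-histogram check of a received test image can be sketched like this (assuming the image is available as a flat list of pixel values):

```python
from collections import Counter

# Sketch: the histogram of an error-free test image is flat -- every grey
# level occurs, and all levels occur equally often.
def histogram_is_flat(pixels, levels=256):
    counts = Counter(pixels)
    return len(counts) == levels and len(set(counts.values())) == 1
```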
Figure 4.43: LFSR (linear feedback shift register) test image
A possible cause of failure is a CameraLink® cable that exceeds the maximum allowed length. CameraLink® cables may also suffer from stress due to incorrect installation or from severe electromagnetic interference.
Some thinner CameraLink® cables have a predefined direction. In these cables, not all twisted pairs are separately shielded to meet the RS644 standard. These pairs are used for the transmission of the RX/TX and the CC1 to CC4 low frequency control signals.
Figure 4.44: LFSR test pattern received at the frame grabber and typical histogram for error-free data transmission
Figure 4.45: LFSR test pattern received at the frame grabber and histogram containing transmission errors
CameraLink® cables contain wire pairs, which are twisted in such a way that the cable impedance matches the LVDS driver and receiver impedance. Excess stress on the cable results in transmission errors which cause distorted images. Therefore, do not stretch or bend a CameraLink® cable.
In robot applications, the stress that is applied to the CameraLink® cable is especially high due to the fast movement of the robot arm. For such applications, special drag chain capable cables are available. Please contact Photonfocus Support for consulting. Appropriate CameraLink® cable solutions are available from Photonfocus.
4.12 Configuration Interface (CameraLink®)
A CameraLink® camera can be controlled by the user via an RS232 compatible asynchronous serial interface. This interface is contained within the CameraLink® interface as shown in Fig. 4.46 and is physically not directly accessible. Instead, the serial communication is usually routed through the frame grabber. For some frame grabbers it might be necessary to connect a serial cable from the frame grabber to the serial interface of the PC.
Figure 4.46: CameraLink serial interface for camera communication
5 Hardware Interface
5.1 Connectors
5.1.1 CameraLink®Connector
The CameraLink®cameras are interfaced to external components via
a CameraLink®connector, which is defined by the CameraLink®standard as a 26 pin, 0.5" Mini Delta-Ribbon (MDR) connector to transmit configuration, image data and trigger.
a subminiature connector for the power supply, 7-pin Binder series 712.
The connectors are located on the back of the camera. Fig. 5.1 shows the plugs and the status LED which indicates camera operation.
Figure 5.1: Rear view of the CameraLink camera
The CameraLink®interface and connector are specified in [CL]. For further details including the pinout please refer to Appendix A. This connector is used to transmit configuration, image data and trigger signals.
5.1.2 Power Supply
The camera requires a single voltage input (see Table 3.4). The camera meets all performance specifications using standard switching power supplies, although well-regulated linear power supplies provide optimum performance.
It is extremely important that you apply the appropriate voltages to your camera. Incorrect voltages will damage the camera.
For further details including the pinout please refer to Appendix A.
5.1.3 Trigger and Strobe Signals
The power connector contains an external trigger input and a strobe output.
The trigger input is equipped with a constant current diode which limits the current of the optocoupler over a wide range of voltages. Trigger signals can thus be connected directly to the input pin; there is no need for a current limiting resistor whose value would depend on the input voltage. The input voltage to the TRIGGER pin must not exceed +15V DC, to avoid damage to the internal ESD protection and the optocoupler!
In order to use the strobe output, the internal optocoupler must be powered with 5 .. 15 V DC. The STROBE signal is an open-collector output, therefore, the user must connect a pull-up resistor (see Table 5.1) to STROBE_VDD (5 .. 15 V DC) as shown in Fig. 5.2. This resistor should be located directly at the signal receiver.
Figure 5.2: Circuit for the trigger input signals
The maximum sink current of the STROBE pin is 8 mA. Do not connect inductive or capacitive loads; such loads may result in damage to the optocoupler! If the application requires such loads, use voltage suppressor diodes in parallel with these components to protect the optocoupler.
STROBE_VDD Pull-up Resistor
15 V > 3.9 kOhm
10 V > 2.7 kOhm
8 V > 2.2 kOhm
7 V > 1.8 kOhm
5 V > 1.0 kOhm
Table 5.1: Pull-up resistor for strobe output and different voltage levels
5.1.4 Status Indicator (CameraLink®cameras)
A dual-color LED on the back of the camera gives information about the current status of the CameraLink®cameras.
LED Green Green when an image is output. At slow frame rates, the LED blinks with the
FVAL signal. At high frame rates the LED changes to an apparently continuous green light, with intensity proportional to the ratio of readout time over frame time.
LED Red Red indicates an active serial communication with the camera.
Table 5.2: Meaning of the LED of the CameraLink®cameras
5.2 CameraLink®Data Interface
The CameraLink®standard contains signals for transferring the image data, control information and the serial communication.
Data signals: CameraLink®data signals contain the image data. In addition, handshaking
signals such as FVAL, LVAL and DVAL are transmitted over the same physical channel.
Camera control information: Camera control signals (CC-signals) can be defined by the camera
manufacturer to provide certain signals to the camera. There are 4 CC-signals available and all are unidirectional with data flowing from the frame grabber to the camera. For example, the external trigger is provided by a CC-signal (see Table 5.3 for the CC assignment).
CC1 EXSYNC External Trigger. May be generated either by the frame grabber itself
(software trigger) or by an external event (hardware trigger).
CC2 CTRL0 Control0. This signal is reserved for future purposes and is not used.
CC3 CTRL1 Control1. This signal is reserved for future purposes and is not used.
CC4 CTRL2 Control2. This signal is reserved for future purposes and is not used.
Table 5.3: Summary of the Camera Control (CC) signals as used by Photonfocus
Pixel clock: The pixel clock is generated on the camera and is provided to the frame grabber
for synchronisation.
Serial communication: A CameraLink®camera can be controlled by the user via a RS232
compatible asynchronous serial interface. This interface is contained within the CameraLink®interface and is physically not directly accessible. Refer to Section 4.12 for more information.
Figure 5.3: CameraLink interface system
The frame grabber needs to be configured with the proper tap and resolution settings, otherwise the image will be distorted or not displayed with the correct aspect ratio. Refer to Table 3.3 and to Section 3.5 for a summary of frame grabber relevant specifications. Fig. 5.3 shows symbolically a CameraLink®system. For more information about taps refer to the relevant application note [AN021] on the Photonfocus website.
6 The PFRemote Control Tool
6.1 Overview
PFRemote is a graphical configuration tool for Photonfocus cameras. The latest release can be downloaded from the support area of www.photonfocus.com. All Photonfocus cameras can be either configured by PFRemote, or they can be programmed with custom software using the PFLib SDK ([PFLIB]).
6.2 PFRemote and PFLib
As shown in Fig. 6.1, the camera parameters can be controlled by PFRemote and PFLib respectively. To grab an image use the software or the SDK that was delivered with your frame grabber.
Figure 6.1: PFRemote and PFLib in context with the CameraLink frame grabber software
6.3 Operating System
The PFRemote GUI is available for Windows OS only. For Linux or QNX operating systems, we provide the necessary libraries to control the camera on request, but there is no graphical user interface available.
If you require support for Linux or QNX operating systems, you may contact us for details of support conditions.
6.4 Installation Notes
Before installing the required software with the PFInstaller, make sure that your frame grabber software is installed correctly. Several DLLs are necessary in order to be able to communicate with the cameras:
PFCAM.DLL: The main DLL file that handles camera detection, switching to specific camera DLL and provides the interface for the SDK.
’CAMERANAME’.DLL: Specific camera DLL, e.g. mv1_d1312_160.dll.
COMDLL.DLL: Communication DLL. This COMDLL is not necessarily CameraLink®specific, but may depend on a CameraLink®API compatible DLL, which should also be provided by your frame grabber manufacturer.
CLALLSERIAL.DLL: Interface to CameraLink®frame grabber which supports the clallserial.dll.
CLSER_USB.DLL: Interface to USB port.
More information about these DLLs is available in the SDK documentation [SW002].
6.5 Graphical User Interface (GUI)
PFRemote consists of a main window (Fig. 6.2) and a configuration dialog. In the main window, the camera port can be opened or closed, and log messages are displayed at the bottom. The configuration dialog appears as a sub window as soon as a camera port was opened successfully. In the sub window of PFRemote the user can configure the camera properties. The following sections describe the general structure of PFRemote.
6.5.1 Port Browser
On start, PFRemote displays a list of available communication ports in the main window.
Figure 6.2: PFRemote main window with PortBrowser and log messages
To open a camera on a specific port double click on the port name (e.g. USB). Alternatively
right click on the port name and choose Open & Configure.... The port is then queried for a
compatible Photonfocus camera. In the PFRemote main window, there are two menus with the following entries available:
File Menu
Clear Log: Clears the log file buffer
Quit: Exit the program
Help Menu
About: Copyright notice and version information
Help F1: Invoke the online help (PFRemote documentation)
6.5.2 Ports, Device Initialization
After starting PFRemote, the main window as shown in Fig. 6.2 will appear. In the PortBrowser in the upper left corner you will see a list of supported ports.
Depending on the configuration, your port names may differ, and not every port may be functional.
If your frame grabber supports clallserial.dll version 1.1 ( CameraLink®compliant standard Oct 2001), the name of the manufacturer is shown in the PortBrowser.
If your frame grabber supports clallserial.dll version 1.0 (CameraLink®compliant standard Oct 2000), the PortBrowser shows either the name of the dll or the manufacturer name or displays "Unknown".
If your frame grabber does not support clallserial.dll, copy the clserXXXX.dll of your frame grabber in the PFRemote directory and rename it to clser.dll. The PortBrowser will then indicate this DLL as "clser.dll at PFRemote directory".
After connecting the camera, the device can be opened with a double click on the port name or by right-clicking on the port name and choosing Open & Configure. If the initialisation of the camera was successful, the configuration dialog will open. The device is closed when PFRemote is closed. Alternatively, e.g. when connecting another camera or evaluation kit, the device can also be closed explicitly by right-clicking on the port name and choosing Close. Make sure that the configuration dialog is closed prior to closing the port.
Errors, warnings or other important activities are logged in a log window at the bottom of the main window.
If the device does not open, check the following:
Is the power LED of the camera active? Do you get an image in the display software of your frame grabber?
Verify all cable connections and the power supply.
Check the communication LED of the camera: do you see some activity when you try to access the camera?
6.5.3 Main Buttons
The buttons on the right side of the configuration dialog store and reset the camera configuration.
Figure 6.3: Main buttons
Reset: Reset the camera and load the default configuration.
Store as defaults: Store the current configuration in the camera flash memory as the default
configuration. After a reset, the camera will load this configuration by default.
Settings file - File Load: Load a stored configuration from a file.
Settings file - File Save: Save current configuration to a file.
Factory Reset: Reset camera and reset the configuration to the factory defaults.
6.6 Device Properties
Cameras or sensor devices are generally addressed as ’device’ in this software. These devices have properties that are accessed by a property name. These property names are translated into register accesses on the driver DLL. The property names are reflected in the GUI as far as practicable. A property name normally has a special mark up throughout this document, for example: ExposureTime. Some properties are grouped into a structure whose member is accessed via dot notation, e.g. Window.X (for the start X value of a region of interest). When changing a property, the property name can always be seen in the log window of the main program window.
7 Graphical User Interface (GUI)
7.1 MV1-D1312C-160
This section describes the parameters of the following camera:
MV1-D1312C-160-CL, CameraLink interface and COLOR sensor
The following sections are grouped according to the tabs in the configuration dialog.
Figure 7.1: MV1-D1312C-160 frame rate and average value
Frame Rate [fps]: Shows the actual frame rate of the camera in frames per second.
Update: To update the value of the frame rate, click on this button.
Average Value: Greyscale average of the current image in 12 bit format (0...4095).
Update: To update the value of the average, click on this button.
7.1.1 Exposure
This tab contains exposure settings.
Figure 7.2: MV1-D1312C-160 exposure panel
Exposure
Exposure time [ms]: Configure the exposure time in milliseconds.
Constant Frame Rate: When the Constant Frame Rate (CFR) is switched on, the frame rate
(number of frames per second) can be varied from almost 0 up to the maximum frame rate. Thus, fewer images can be acquired than would otherwise be possible. When Constant Frame Rate is switched off, the camera delivers images as fast as possible, depending on the exposure time and the read-out time.
Frame time [ms]: Configure the frame time in milliseconds. Only available if Constant Frame
Rate is enabled. The minimum frame time depends on the exposure time and readout time.
Simultaneous readout (Interleave)
The simultaneous readout mode allows a higher frame rate.
Simultaneous readout (Interleave): Enable the simultaneous readout mode.
The property Trigger.Interleave cannot be combined with the property LinLog.Mode, the property Trigger.LevelControlled or the property Trigger.EnBurstTrigger.
7.1.2 Window
This tab contains the settings for the region of interest.
Figure 7.3: MV1-D1312C-160 window panel
Region of Interest
The region of interest (ROI) is defined as a rectangle (X, Y), (W, H) where
X: X - coordinate, starting from 0 in the upper left corner.
Y: Y - coordinate, starting from 0 in the upper left corner.
W: Window width (in steps of 32 pixel).
H: Window height.
Set to max ROI: Set Window to maximal ROI (X=0; Y=0; W=1312; H=1082).
Window width is only available in steps of 32 pixel.
Multi - ROI
This camera can handle up to 512 different regions of interest. The multiple ROIs are joined together and form a single image, which is transferred to the frame grabber. An ROI is defined by its starting value in y-direction and its height. The width and the horizontal offset are specified by X and W settings. The maximum frame rate in MROI mode depends on the number of rows and columns being read out. Overlapping ROIs are allowed.
Enable MROI: Enable MROI. If MROI is enabled, the ROI and MROI settings cannot be changed.
Load File...: Load a user defined MROI file into the camera. There is an example file in the
PFRemote directory.
Save File...: Save the current MROI settings to a *.txt file.
H tot: Shows the sum of all MROIs as the total image height.
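The H tot value can be reproduced from the MROI definitions (a sketch; the (Y, H) pairs below are hypothetical examples):

```python
# Sketch: the total image height (H tot) in MROI mode is the sum of the
# individual ROI heights, since the regions are joined vertically.
mrois = [(0, 100), (400, 50), (900, 64)]   # hypothetical (start row Y, height H) pairs
h_tot = sum(h for _, h in mrois)
```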
7.1.3 Trigger
This tab contains trigger and strobe settings.
Figure 7.4: MV1-D1312C-160 trigger panel
Trigger
Trigger Source:
Free running: The camera continuously delivers images with a certain configurable frame rate.
Interface Trigger: The Trigger signal is applied to the camera by the CameraLink frame grabber
or the USB interface respectively.
I/O Trigger: The trigger signal is applied directly to the camera on the power supply connector.
Exposure time defined by:
Camera: The exposure time is defined by the property ExposureTime.
Trigger Pulse Width: The exposure time is defined by the pulse width of the trigger signal
(level-controlled exposure).
This property disables LinLog, Burst trigger and simultaneous readout mode.
Exposure time defined by "Trigger Pulse Width" is also known as Level controlled trigger.
Further trigger settings:
Trigger Delay: Programmable delay in milliseconds between the incoming trigger edge and
the start of the exposure.
Trigger signal active low: Define the trigger signal to be active high (default) or active low.
Burst Trigger
An external trigger event starts a predefined number of acquisitions. The period between the acquisitions can be configured.
Enable Burst Trigger: Enable the burst trigger mode.
Number of Burst Triggers: Set the number of burst triggers.
Burst Trigger Period [ms]: Set the time between the bursts in milliseconds.
Burst Trigger Delay [ms]: Set the delay of the burst trigger in milliseconds.
Strobe
The camera generates a strobe output signal that can be used to trigger a strobe. The delay, pulse width and polarity can be defined by software. To turn off strobe output, set StrobePulseWidth to 0.
Strobe Delay [ms]: Delay in milliseconds from the input trigger edge to the rising edge of the
strobe output signal.
Strobe Pulse Width [ms]: The pulse width of the strobe trigger in milliseconds.
Strobe signal active low: Define the strobe output to be active high (default) or active low.
7.1.4 Data Output
This tab contains image data settings.
Figure 7.5: MV1-D1312C-160 data output panel
Output Mode
Output Mode:
Normal: Normal mode.
LFSR: Test image. Linear feedback shift register (pseudo-random image). The pattern depends
on the grey level resolution.
Ramp: Test image. Pixel values are incremented by 1, restarting at each row. The pattern
depends on the grey level resolution.
Resolution:
8 Bit: Grey level resolution of 8 bit.
10 Bit: Grey level resolution of 10 bit.
12 Bit: Grey level resolution of 12 bit.
Digital Gain:
1x: No digital gain, normal mode.
2x: Digital gain 2.
4x: Digital gain 4.
8x: Digital gain 8.
Digital Offset: Subtracts an offset from the data.

Fine Gain: The fine gain can be used to adjust the brightness of the whole image in small steps.
Color
The RGB channel fine gain is used to calibrate the white balance of an image; it has to be set according to the current lighting conditions.
Fine gain blue: RGB channel gain for blue.
Fine gain green1: RGB channel gain for green1.
Fine gain green2: RGB channel gain for green2.
Fine gain red: RGB channel gain for red.
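The effect of the four channel fine gains can be modelled on a Bayer mosaic as sketched below; the RGGB layout used here is an assumption, check Section 3.4 for the actual filter arrangement:

```python
# Sketch of white balance via per-channel fine gain on a Bayer mosaic.
# The RGGB layout (R/G1 on even rows, G2/B on odd rows) is an assumption.
def white_balance(img, gain_r, gain_g1, gain_g2, gain_b):
    out = [row[:] for row in img]
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            if y % 2 == 0:
                gain = gain_r if x % 2 == 0 else gain_g1
            else:
                gain = gain_g2 if x % 2 == 0 else gain_b
            out[y][x] = min(4095, int(round(v * gain)))  # clamp to 12 bit
    return out
```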
7.1.5 LUT (Look-Up-Table)
This tab contains LUT settings.
Figure 7.6: MV1-D1312C-160 LUT panel
Grey level transformation is the remapping of the grey level values of an input image to new values which transform the image in some way. The look-up table (LUT) is used to convert the greyscale value of each pixel in an image into another grey value. It is typically used to implement a transfer curve for contrast expansion. This camera performs a 12-to-8-bit mapping, so that 4096 input grey levels (0 to 4095) can be mapped to 256 output grey levels (0 to 255). This camera supports 2 LUTs with identical functionality. The default LUT is a gain function with value = 1. LUT 0 has higher priority than LUT 1.
Both LUTs can be configured with the built-in Gain / Gamma functions or with a LUT file.
LUT X
Enable LUT X: Enable LUT X.
Gain: Linear function. Y = 256 / 4096 * value * X; valid range for value: [1...4].
Gamma: Gamma function. Y = 256 / 4096^value * X^value; valid range for value: [0.4...4].
value: Enter a value. The LUT will be calculated and downloaded to the camera.
Region LUT
Both LUTs can be configured with ROI values. A LUT is only active inside its ROI. Overlapping is possible; LUT 0 has higher priority.
Enable Region LUT: Enable the region LUT functionality.
Region of LUTX:
X: X - coordinate of region LUT, starting from 0 in the upper left corner.
Y: Y - coordinate of region LUT, starting from 0 in the upper left corner.
W: Region LUT window width (in steps of 32 pixel).
H: Region LUT window height.
Set to max ROI: Set Region LUT window to maximal ROI (X=0; Y=0; W=1312; H=1082).
LUT Files
To load or save a LUT file
LUT Index: Select the LUT for which you want to load or save a file.
File functions:
Load File...: Load a user defined LUT file into the camera (*.txt, tab delimited). There is an
example in the PFRemote directory (mv1_d1312_80_lut.txt or mv1_d1312_160_lut.txt).
Save File...: Save LUT from camera into a file.
7.1.6 LinLog
This tab contains LinLog and Skimming settings.
Figure 7.7: MV1-D1312C-160 linlog panel
LinLog
The LinLog technology from Photonfocus allows a logarithmic compression of high light intensities. In contrast to the classical non-integrating logarithmic pixel, the LinLog pixel is an integrating pixel with global shutter and the possibility to control the transition between linear and logarithmic mode (Section 4.2.2). There are 3 predefined LinLog settings available. Alternatively, custom settings can be defined in the User defined Mode.
LinLog Mode: Off: LinLog is disabled. Low/Normal/High compression: three LinLog
presets. User defined: Value1, Time1, Value2 and Time2. The LinLog times are given in thousandths of the exposure time; Time 800 means 80% of the exposure time.
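Converting the per-mille LinLog time values to absolute times is straightforward (a sketch):

```python
# Sketch: user-defined LinLog times are thousandths of the exposure time,
# e.g. Time = 800 with a 10 ms exposure corresponds to 8 ms.
def linlog_time_ms(exposure_ms, time_setting):
    return exposure_ms * time_setting / 1000.0
```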
7.1.7 Correction
This tab contains correction settings.
Figure 7.8: MV1-D1312C-160 correction panel
Correction Mode
This camera has image pre-processing features that compensate for non-uniformities caused by the sensor, the lens or the illumination.
Off: No correction.
Offset: Activate offset correction
Offset + Hotpixel: Activate offset and hot pixel correction.
Hotpixel: Activate hot pixel correction.
Black Level Offset
It may be necessary to adjust the black level offset of the camera.
Black Level Offset: Black level offset value. Use this to adjust the black level.
Calibration
Offset (FPN), Hotpixel Correction: The offset correction is based on a black reference image,
which is taken at no illumination (e.g. lens aperture completely closed). The black reference image contains the fixed-pattern noise of the sensor, which can be subtracted from the live images in order to minimize the static noise. Close the lens of the camera. Click on the Validation button. If the Set Black Ref - button is still inactive, the average of
the image is out of range. Change to the panel Characteristics and adjust the property BlackLevelOffset until the average of the image is between 160 and 400 DN. Click again on the Validation button and then on the Set Black Ref button.
If only offset and hot pixel correction is needed, it is not necessary to calibrate with a grey image (see Calculate).
Calculate: Calculate the correction values and store them in the camera RAM. To make the
correction values permanent, use the ’Save to Flash’ button.
Save to Flash: Save the current correction values to the internal flash memory.
This will overwrite the factory presets.
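The offset (FPN) correction described above amounts to subtracting the stored black reference from each live image. A minimal Python sketch of this idea, assuming NumPy arrays and hypothetical function names (the camera performs this step in its FPGA):

```python
import numpy as np

def average_black_ref(dark_frames):
    """Average several dark frames (lens closed) into a black
    reference image that captures the sensor's fixed-pattern noise."""
    return np.mean(np.stack(dark_frames), axis=0)

def offset_correct(live, black_ref):
    """Subtract the black reference from a live image, clipping at
    zero so that no negative pixel values are produced."""
    corrected = live.astype(np.int32) - np.round(black_ref).astype(np.int32)
    return np.clip(corrected, 0, None).astype(live.dtype)
```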
7.1.8 Info
This panel shows camera-specific information such as the type code, serial number, firmware revisions of the FPGA and microcontroller, and a description of the camera interface.
Figure 7.9: MV1-D1312C-160 info panel
Camera Info
Camera name: Name of the connected camera.
Typecode: Type code of the connected camera.
Serial: Serial number of the connected camera.
FPGA Sensor Revision: Firmware revision of the built-in sensor FPGA of the connected camera.
uC Revision: Firmware revision of the built-in microcontroller of the connected camera.
Interface: Description of the camera interface.
For any support requests, please enclose the information provided on this tab.
Counter
The camera has the following counters.
Image: The image counter is a 24-bit real-time counter and is incremented by 1 for every new image.
Missed Trigger: This is a counter for trigger pulses that were blocked because the trigger pulse
was received during image exposure or readout. In free-running mode it counts all pulses received from the interface trigger or from the I/O trigger interface.
Missed Burst Trigger: This is a counter for burst trigger pulses that were blocked because the
burst trigger pulse was received while the previous burst was not yet finished.
To update the value of the information properties, click on the Update-Button; to reset the properties, click on the Reset-Button.
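Because the image counter is only 24 bits wide, host software that compares two counter readings should account for wrap-around. A small Python sketch (the helper name is illustrative, not part of any camera API):

```python
def counter_delta(prev, curr, bits=24):
    """Number of counts between two reads of a free-running counter of
    the given width, accounting for wrap-around. The image counter on
    this camera is 24 bits wide, so it wraps at 2**24."""
    return (curr - prev) % (1 << bits)

counter_delta(100, 250)            # 150 images acquired
counter_delta((1 << 24) - 10, 5)   # 15 images, across the wrap
```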
Status Line
Enable Status Line: The status line replaces the last line of an image with image information;
please refer to the manual for additional information.
Temperature
Imager PCB [°C]: The temperature of the imager PCB.
Imager [°C]: The temperature of the imager device.
ADC PCB [°C]: The temperature of the analog-to-digital converter (ADC) PCB.
Update: Press this button to update all temperature values.
8
Mechanical and Optical Considerations
8.1 Mechanical Interface
During storage and transport, the camera should be protected against vibration, shock, moisture and dust. The original packaging protects the camera adequately from vibration and shock during storage and transport. Please either retain this packaging for possible later use or dispose of it according to local regulations.
8.1.1 MV1 cameras with CameraLink® Interface
Figure 8.1: Mechanical dimensions of the CameraLink model, displayed without and with C-Mount adapter
Fig. 8.1 shows the mechanical drawing of the camera housing for the MV1-D1312C CMOS cameras. The depth of the camera housing is given in Table 8.1 (all values in mm).
Camera Models MV1-D1312C Series
X (housing depth) 45 mm
Table 8.1: Model-specific parameters
8.2 Optical Interface
8.2.1 Cleaning the Sensor
The sensor is part of the optical path and should be handled like other optical components: with extreme care. Dust can obscure pixels, producing dark patches in the images captured. Dust is most visible when the illumination is collimated. Dark patches caused by dust or dirt shift position as the angle of illumination changes. Dust is normally not visible when the sensor is positioned at the exit port of an integrating sphere, where the illumination is diffuse.
1. The camera should only be cleaned in ESD-safe areas by ESD-trained personnel using wrist straps. Ideally, the sensor should be cleaned in a clean environment; otherwise, in a dusty environment the sensor will immediately become dirty again after cleaning.
2. Use a high quality, low pressure air duster (e.g. Electrolube EAD400D, pure compressed inert gas, www.electrolube.com) to blow off loose particles. This step alone is usually sufficient to clean the sensor of the most common contaminants.
Workshop air supply is not appropriate and may cause permanent damage to the sensor.
3. If further cleaning is required, use a suitable lens wiper or Q-Tip moistened with an appropriate cleaning fluid to wipe the sensor surface as described below. Examples of suitable lens cleaning materials are given in Table 8.2. Cleaning materials must be ESD-safe, lint-free and free from particles that may scratch the sensor surface.
Do not use ordinary cotton buds. These do not fulfil the above requirements and permanent damage to the sensor may result.
4. Wipe the sensor carefully and slowly. First remove coarse particles and dirt from the sensor using Q-Tips soaked in 2-propanol, applying as little pressure as possible. Using a method similar to that used for cleaning optical surfaces, clean the sensor by starting at any corner of the sensor and working towards the opposite corner. Finally, repeat the procedure with methanol to remove streaks. It is imperative that no pressure be applied to the surface of the sensor or to the black globe-top material (if present) surrounding the optically active surface during the cleaning process.
Product | Supplier | Remark
EAD400D Airduster | Electrolube, UK | www.electrolube.com
Anticon Gold 9"x 9" Wiper | Milliken, USA | ESD-safe and suitable for class 100 environments. www.milliken.com
TX4025 Wiper | Texwipe | www.texwipe.com
Transplex Swab | Texwipe |
Small Q-Tips SWABS BB-003 Q-tips | Hans J. Michael GmbH, Germany | www.hjm.de
Large Q-Tips SWABS CA-003 Q-tips | Hans J. Michael GmbH, Germany |
Point Slim HUBY-340 Q-tips | Hans J. Michael GmbH, Germany |
Methanol Fluid | Johnson Matthey GmbH, Germany | Semiconductor Grade, 99.9% min (Assay), Merck 12,6024, UN1230, slightly flammable and poisonous. www.alfa-chemcat.com
2-Propanol (Iso-Propanol) Fluid | Johnson Matthey GmbH, Germany | Semiconductor Grade, 99.5% min (Assay), Merck 12,5227, UN1219, slightly flammable. www.alfa-chemcat.com
Table 8.2: Recommended materials for sensor cleaning
For cleaning the sensor, Photonfocus recommends the products available from the suppliers as listed in Table 8.2.
Cleaning tools (except chemicals) can be purchased directly from Photonfocus (www.photonfocus.com).
8.3 Compliance
CE Compliance Statement

We, Photonfocus AG, CH-8853 Lachen, Switzerland, declare under our sole responsibility that the following products

MV-D1024-28-CL-10, MV-D1024-80-CL-8, MV-D1024-160-CL-8
MV-D752-28-CL-10, MV-D752-80-CL-8, MV-D752-160-CL-8
MV-D640-33-CL-10, MV-D640-66-CL-10, MV-D640-48-U2-8, MV-D640C-33-CL-10, MV-D640C-66-CL-10, MV-D640C-48-U2-8
MV-D1024E-40, MV-D752E-40, MV-D750E-20 (CameraLink and USB 2.0 models), MV-D1024E-80, MV-D1024E-160
MV-D1024E-3D01-160
MV2-D1280-640-CL-8
SM2-D1024-80 / VisionCam PS
DS1-D1024-40-CL, DS1-D1024-40-U2, DS1-D1024-80-CL, DS1-D1024-160-CL
DS1-D1312-160-CL, MV1-D1312(I)-40-CL, MV1-D1312(I)-80-CL, MV1-D1312(I)-160-CL, MV1-D1312(I)-240-CL, EL1-D1312-160-CL
Digipeater CLB26

are in compliance with the below mentioned standards according to the provisions of European Standards Directives:

EN 61000-6-3:2001, EN 61000-6-2:2001, EN 61000-4-6:1996, EN 61000-4-4:1996, EN 61000-4-3:1996, EN 61000-4-2:1995, EN 55022:1994

Photonfocus AG, December 2009
Figure 8.2: CE Compliance Statement
9
Warranty
The manufacturer alone reserves the right to recognize warranty claims.
9.1 Warranty Terms
The manufacturer warrants to the distributor and the end customer that, for a period of two years from the date of shipment from the manufacturer or distributor to the end customer (the "Warranty Period"):
the product will substantially conform to the specifications set forth in the applicable documentation published by the manufacturer and accompanying said product, and
the product shall be free from defects in materials and workmanship under normal use.
The distributor shall not make or pass on to any party any warranty or representation on behalf of the manufacturer other than, or inconsistent with, the above limited warranty.
9.2 Warranty Claim
The above warranty does not apply to any product that has been modified or altered by any party other than the manufacturer, or to any defects caused by use of the product in a manner for which it was not designed, or by the negligence of any party other than the manufacturer.
10
References
All referenced documents can be downloaded from our website at www.photonfocus.com.
CL CameraLink® Specification, January 2004
SW002 PFLib Documentation, Photonfocus, August 2005
MAN025 User Manual "microDisplayUSB2.0", Photonfocus, November 2005
AN001 Application Note "LinLog", Photonfocus, December 2002
AN006 Application Note "Quantum Efficiency", Photonfocus, February 2004
AN007 Application Note "Camera Acquisition Modes", Photonfocus, March 2004
AN008 Application Note "Photometry versus Radiometry", Photonfocus, December 2004
AN010 Application Note "Camera Clock Concepts", Photonfocus, July 2004
AN021 Application Note "CameraLink®", Photonfocus, July 2004
AN026 Application Note "LFSR Test Images", Photonfocus, September 2005
AN030 Application Note "LinLog® Parameter Optimization Strategies", Photonfocus, February 2009
A
Pinouts
A.1 Power Supply Connector
The power supply plugs are available from Binder connectors at www.binder-connector.de. Fig. A.2 shows the power supply plug from the solder side. The pin assignment of the power supply plug is given in Table A.2.
It is extremely important that you apply the appropriate voltages to your camera. Incorrect voltages will damage or destroy the camera.
Figure A.1: Power connector assembly
Connector Type Order Nr.
7-pole, plastic 99-0421-00-07
7-pole, metal 99-0421-10-07
Table A.1: Power supply connectors (Binder subminiature series 712)
Figure A.2: Power supply plug, 7-pole (rear view of plug, solder side)
Pin I/O Type Name Description
1 PWR VDD +12 V DC (± 10%)
2 PWR GND Ground
3 O RESERVED Do not connect
4 PWR STROBE-VDD +5 .. +15 V DC
5 O STROBE Strobe control (opto-isolated)
6 I TRIGGER External trigger (opto-isolated), +5 .. +15 V DC
7 PWR GROUND Signal ground (for opto-isolated strobe signal)
Table A.2: Power supply plug pin assignment
A.2 CameraLink®Connector
The pinout of the CameraLink® 26-pin, 0.5" Mini D-Ribbon (MDR) connector follows the CameraLink® standard ([CL]) and is listed here for reference only (see Table A.3). The drawing of the CameraLink® cable plug is shown in Fig. A.3.
CameraLink®cables can be purchased from Photonfocus directly (www.photonfocus.com).
Figure A.3: CameraLink cable 3M MDR-26 plug (both ends)
PIN IO Name Description
1 PW SHIELD Shield
2 O N_XD0 Negative LVDS Output, CameraLink® Data D0
3 O N_XD1 Negative LVDS Output, CameraLink® Data D1
4 O N_XD2 Negative LVDS Output, CameraLink® Data D2
5 O N_XCLK Negative LVDS Output, CameraLink® Clock
6 O N_XD3 Negative LVDS Output, CameraLink® Data D3
7 I P_SERTOCAM Positive LVDS Input, Serial Communication to the camera
8 O N_SERTOFG Negative LVDS Output, Serial Communication from the camera
9 I N_CC1 Negative LVDS Input, Camera Control 1 (CC1)
10 I P_CC2 Positive LVDS Input, Camera Control 2 (CC2)
11 I N_CC3 Negative LVDS Input, Camera Control 3 (CC3)
12 I P_CC4 Positive LVDS Input, Camera Control 4 (CC4)
13 PW SHIELD Shield
14 PW SHIELD Shield
15 O P_XD0 Positive LVDS Output, CameraLink® Data D0
16 O P_XD1 Positive LVDS Output, CameraLink® Data D1
17 O P_XD2 Positive LVDS Output, CameraLink® Data D2
18 O P_XCLK Positive LVDS Output, CameraLink® Clock
19 O P_XD3 Positive LVDS Output, CameraLink® Data D3
20 I N_SERTOCAM Negative LVDS Input, Serial Communication to the camera
21 O P_SERTOFG Positive LVDS Output, Serial Communication from the camera
22 I P_CC1 Positive LVDS Input, Camera Control 1 (CC1)
23 I N_CC2 Negative LVDS Input, Camera Control 2 (CC2)
24 I P_CC3 Positive LVDS Input, Camera Control 3 (CC3)
25 I N_CC4 Negative LVDS Input, Camera Control 4 (CC4)
26 PW SHIELD Shield
S PW SHIELD Shield
Table A.3: Pinout of the CameraLink® connector
B
Revision History
Revision Date Changes
1.0 March 2010 First release