
Model 2002
8.5 Digit Multimeter
Calibration Manual
2002-905-01 Rev. E September 2022
tek.com/keithley
*P2002B-905-01E*
2002-905-01E
Calibration Manual
© 2022, Keithley Instruments, LLC
Cleveland, Ohio, U.S.A.
All rights reserved.
Any unauthorized reproduction, photocopy, or use of the information herein, in whole or in part, without the prior written approval of Keithley Instruments, LLC, is strictly prohibited.
These are the original instructions in English.
All Keithley Instruments product names are trademarks or registered trademarks of Keithley Instruments, LLC. Other brand names are trademarks or registered trademarks of their respective holders.
Document number: 2002-905-01 Rev. E September 2022

Safety precautions

The following safety precautions should be observed before using this product and any associated instrumentation. Although some instruments and accessories would normally be used with nonhazardous voltages, there are situations where hazardous conditions may be present.
This product is intended for use by personnel who recognize shock hazards and are familiar with the safety precautions required to avoid possible injury. Read and follow all installation, operation, and maintenance information carefully before using the product. Refer to the user documentation for complete product specifications.
If the product is used in a manner not specified, the protection provided by the product warranty may be impaired.
The types of product users are:
Responsible body is the individual or group responsible for the use and maintenance of equipment, for ensuring that the equipment is operated within its specifications and operating limits, and for ensuring that operators are adequately trained.
Operators use the product for its intended function. They must be trained in electrical safety procedures and proper use of the instrument. They must be protected from electric shock and contact with hazardous live circuits.
Maintenance personnel perform routine procedures on the product to keep it operating properly, for example, setting the line voltage or replacing consumable materials. Maintenance procedures are described in the user documentation. The procedures explicitly state if the operator may perform them. Otherwise, they should be performed only by service personnel.
Service personnel are trained to work on live circuits, perform safe installations, and repair products. Only properly trained service personnel may perform installation and service procedures.
Keithley products are designed for use with electrical signals that are measurement, control, and data I/O connections, with low transient overvoltages, and must not be directly connected to mains voltage or to voltage sources with high transient overvoltages. Measurement Category II (as referenced in IEC 60664) connections require protection for high transient overvoltages often associated with local AC mains connections. Certain Keithley measuring instruments may be connected to mains. These instruments will be marked as category II or higher.
Unless explicitly allowed in the specifications, operating manual, and instrument labels, do not connect any instrument to mains.
Exercise extreme caution when a shock hazard is present. Lethal voltage may be present on cable connector jacks or test fixtures. The American National Standards Institute (ANSI) states that a shock hazard exists when voltage levels greater than 30 V RMS, 42.4 V peak, or 60 VDC are present. A good safety practice is to expect that hazardous voltage is present in any unknown circuit before measuring.
Operators of this product must be protected from electric shock at all times. The responsible body must ensure that operators are prevented access and/or insulated from every connection point. In some cases, connections must be exposed to potential human contact. Product operators in these circumstances must be trained to protect themselves from the risk of electric shock. If the circuit is capable of operating at or above 1000 V, no conductive part of the circuit may be exposed.
Do not connect switching cards directly to unlimited power circuits. They are intended to be used with impedance-limited sources. NEVER connect switching cards directly to AC mains. When connecting sources to switching cards, install protective devices to limit fault current and voltage to the card.
Before operating an instrument, ensure that the line cord is connected to a properly-grounded power receptacle. Inspect the connecting cables, test leads, and jumpers for possible wear, cracks, or breaks before each use.
When installing equipment where access to the main power cord is restricted, such as rack mounting, a separate main input power disconnect device must be provided in close proximity to the equipment and within easy reach of the operator.
For maximum safety, do not touch the product, test cables, or any other instruments while power is applied to the circuit under test. ALWAYS remove power from the entire test system and discharge any capacitors before connecting or disconnecting cables or jumpers, installing or removing switching cards, or making internal changes, such as installing or removing jumpers.
Do not touch any object that could provide a current path to the common side of the circuit under test or power line (earth) ground. Always make measurements with dry hands while standing on a dry, insulated surface capable of withstanding the voltage being measured.
For safety, instruments and accessories must be used in accordance with the operating instructions. If the instruments or accessories are used in a manner not specified in the operating instructions, the protection provided by the equipment may be impaired.
Do not exceed the maximum signal levels of the instruments and accessories. Maximum signal levels are defined in the specifications and operating information and shown on the instrument panels, test fixture panels, and switching cards.
When fuses are used in a product, replace with the same type and rating for continued protection against fire hazard.
Chassis connections must only be used as shield connections for measuring circuits, NOT as protective earth (safety ground) connections.
If you are using a test fixture, keep the lid closed while power is applied to the device under test. Safe operation requires the use of a lid interlock.
If a screw is present, connect it to protective earth (safety ground) using the wire recommended in the user documentation.
The symbol on an instrument means caution, risk of hazard. The user must refer to the operating instructions located in the user documentation in all cases where the symbol is marked on the instrument.
The symbol on an instrument means warning, risk of electric shock. Use standard safety precautions to avoid personal contact with these voltages.
The symbol on an instrument shows that the surface may be hot. Avoid personal contact to prevent burns.
The symbol indicates a connection terminal to the equipment frame.
If this symbol is on a product, it indicates that mercury is present in the display lamp. Please note that the lamp must be properly disposed of according to federal, state, and local laws.
The WARNING heading in the user documentation explains hazards that might result in personal injury or death. Always read the associated information very carefully before performing the indicated procedure.
The CAUTION heading in the user documentation explains hazards that could damage the instrument. Such damage may invalidate the warranty.
The CAUTION heading with the symbol in the user documentation explains hazards that could result in moderate or minor injury or damage the instrument. Always read the associated information very carefully before performing the indicated procedure. Damage to the instrument may invalidate the warranty.
Instrumentation and accessories shall not be connected to humans.
Before performing any maintenance, disconnect the line cord and all test cables.
To maintain protection from electric shock and fire, replacement components in mains circuits — including the power transformer, test leads, and input jacks — must be purchased from Keithley. Standard fuses with applicable national safety approvals may be used if the rating and type are the same. The detachable mains power cord provided with the instrument may only be replaced with a similarly rated power cord. Other components that are not safety-related may be purchased from other suppliers as long as they are equivalent to the original component (note that selected parts should be purchased only through Keithley to maintain accuracy and functionality of the product). If you are unsure about the applicability of a replacement component, call a Keithley office for information.
Unless otherwise noted in product-specific literature, Keithley instruments are designed to operate indoors only, in the following environment: Altitude at or below 2,000 m (6,562 ft); temperature 0 °C to 50 °C (32 °F to 122 °F); and pollution degree 1 or 2.
To clean an instrument, use a cloth dampened with deionized water or mild, water-based cleaner. Clean the exterior of the instrument only. Do not apply cleaner directly to the instrument or allow liquids to enter or spill on the instrument. Products that consist of a circuit board with no case or chassis (e.g., a data acquisition board for installation into a computer) should never require cleaning if handled according to instructions. If the board becomes contaminated and operation is affected, the board should be returned to the factory for proper cleaning/servicing.
Safety precaution revision as of June 2018.

Table of Contents

1 Performance Verification
1.1 Introduction......................................................................................................................................................... 1-1
1.2 Environmental conditions ................................................................................................................................... 1-1
1.3 Warm-up period .................................................................................................................................................. 1-1
1.4 Line power .......................................................................................................................................................... 1-2
1.5 Recommended test equipment ............................................................................................................................ 1-2
1.6 Verification limits ............................................................................................................................................... 1-2
1.6.1 Reading limit calculation example.............................................................................................................. 1-2
1.6.2 Additional derating factors.......................................................................................................................... 1-2
1.7 Restoring default conditions ............................................................................................................................... 1-2
1.8 Verification procedures....................................................................................................................................... 1-4
1.8.1 DC volts verification................................................................................................................................... 1-4
1.8.2 AC volts verification................................................................................................................................... 1-6
1.8.3 DC current verification ............................................................................................................................. 1-11
1.8.4 AC current verification ............................................................................................................................. 1-12
1.8.5 Resistance verification .............................................................................................................................. 1-13
1.8.6 Frequency accuracy verification ............................................................................................................... 1-15
1.8.7 Temperature reading checks ..................................................................................................................... 1-17
2 Calibration
2.1 Introduction......................................................................................................................................................... 2-1
2.2 Environmental conditions ................................................................................................................................... 2-2
2.3 Warm-up period .................................................................................................................................................. 2-2
2.4 Line power .......................................................................................................................................................... 2-2
2.5 Calibration lock................................................................................................................................................... 2-2
2.5.1 Comprehensive calibration lock.................................................................................................................. 2-2
2.5.2 Low-level calibration lock .......................................................................................................................... 2-2
2.5.3 IEEE-488 bus calibration lock status .......................................................................................................... 2-2
2.6 IEEE-488 bus calibration commands.................................................................................................................. 2-2
2.7 Calibration errors ................................................................................................................................................ 2-4
2.7.1 Front panel error reporting.......................................................................................................................... 2-4
2.7.2 IEEE-488 bus error reporting...................................................................................................................... 2-4
2.8 Comprehensive calibration ................................................................................................................................. 2-4
2.8.1 Recommended equipment for comprehensive calibration.......................................................................... 2-4
2.8.2 Front panel comprehensive calibration ....................................................................................................... 2-4
2.8.3 IEEE-488 bus comprehensive calibration................................................................................................... 2-9
2.9 AC self-calibration............................................................................................................................................ 2-12
2.9.1 Front panel AC calibration........................................................................................................................ 2-12
2.9.2 IEEE-488 bus AC self-calibration ............................................................................................................ 2-13
2.10 Low-level calibration......................................................................................................................................... 2-13
2.10.1 Recommended equipment for low-level calibration ................................................................................. 2-13
2.10.2 Low-level calibration summary................................................................................................................. 2-13
2.10.3 Front panel low-level calibration procedure.............................................................................................. 2-16
2.10.4 IEEE-488 bus low-level calibration procedure ......................................................................................... 2-20
2.11 Single-point calibration ..................................................................................................................................... 2-24
2.11.1 Front panel single-point calibration........................................................................................................... 2-24
2.11.2 IEEE-488 bus single-point calibration ...................................................................................................... 2-24
3 Calibration Command Reference
3.1 Introduction ......................................................................................................................................................... 3-1
3.2 Commands........................................................................................................................................................... 3-1
3.2.1 Command summary..................................................................................................................................... 3-1
3.3 :CALibration:PROTected Subsystem ................................................................................................................. 3-3
3.3.1 :INIT ............................................................................................................................................................ 3-3
3.3.2 :LOCK ......................................................................................................................................................... 3-3
3.3.3 :SWITch?..................................................................................................................................................... 3-4
3.3.4 :SAVE.......................................................................................................................................................... 3-4
3.3.5 :DATA? ....................................................................................................................................................... 3-4
3.3.6 :DATE ......................................................................................................................................................... 3-5
3.3.7 :NDUE......................................................................................................................................................... 3-5
3.3.8 :DC .............................................................................................................................................................. 3-6
3.3.9 :LLEVel..................................................................................................................................................... 3-11
3.4 :CALibration:UNPRotected Subsystem............................................................................................................ 3-13
3.4.1 :ACCompensation ..................................................................................................................................... 3-13
3.5 Bus error reporting ............................................................................................................................................ 3-13
3.5.1 Calibration error summary ........................................................................................................................ 3-13
3.5.2 Detecting Calibration Errors...................................................................................................................... 3-13
3.6 Detecting calibration step completion ............................................................................................................... 3-14
3.6.1 Using the *OPC? Query ............................................................................................................................ 3-14
3.6.2 Using the *OPC command ........................................................................................................................ 3-14
3.6.3 Generating an SRQ on calibration complete ............................................................................................. 3-14
APPENDICES
A Specifications ............................................................................................................................................. A-1
B Calibration Programs .................................................................................................................................. B-1
C Calibration Messages.................................................................................................................................. C-1
D Calibration Command Summary................................................................................................................ D-1

List of Illustrations

1 Performance Verification
Figure 1-1 Connections for DC volts verification ........................................................................................................ 1-5
Figure 1-2 Connections for AC volts verification (all except 2MHz) .......................................................................... 1-7
Figure 1-3 Connections for 2MHz AC volts verification ............................................................................................. 1-7
Figure 1-4 Connections for DC current verification................................................................................................... 1-11
Figure 1-5 Connections for AC current verification................................................................................................... 1-12
Figure 1-6 Connections for resistance verification (20Ω-2MΩ ranges)..................................................... 1-14
Figure 1-7 Connections for resistance verification (20MΩ and 200MΩ ranges)....................................... 1-14
Figure 1-8 1GΩ resistor test box construction............................................................................................ 1-15
Figure 1-9 Connections for frequency accuracy verification ..................................................................................... 1-16
2 Calibration
Figure 2-1 Low-thermal short connections................................................................................................................... 2-5
Figure 2-2 Calibrator connections for DC volts and ohms portion of comprehensive calibration............................... 2-6
Figure 2-3 Connections for amps comprehensive calibration ...................................................................................... 2-8
Figure 2-4 Calibrator voltage connections.................................................................................................................. 2-18
Figure 2-5 Synthesizer connections............................................................................................................................ 2-20
APPENDICES
Figure B-1 Low-thermal short connections.................................................................................................................. B-3
Figure B-2 Connections for comprehensive calibration............................................................................................... B-4
Figure B-3 Calibrator voltage connections................................................................................................................... B-4
Figure B-4 Calibrator current connections ................................................................................................................... B-4
Figure B-5 Synthesizer connections............................................................................................................................. B-5

List of Tables

1 Performance Verification
Table 1-1 Recommended Test Equipment for Performance Verification.................................................................. 1-3
Table 1-2 Limits for DCV verification....................................................................................................................... 1-5
Table 1-3 Limits for normal mode AC voltage verification....................................................................................... 1-8
Table 1-4 Limits for low-frequency mode AC voltage verification........................................................................... 1-9
Table 1-5 Limits for AC peak voltage verification .................................................................................................. 1-10
Table 1-6 Limits for DC current verification ........................................................................................................... 1-11
Table 1-7 Limits for AC current verification ........................................................................................................... 1-13
Table 1-8 Limits for resistance verification (20Ω-200MΩ ranges) ......................................................... 1-15
Table 1-9 Limits for resistance verification (1GΩ range)........................................................................ 1-15
Table 1-10 Frequency verification limits ................................................................................................................... 1-16
Table 1-11 Thermocouple temperature reading checks ............................................................................................. 1-17
Table 1-12 RTD probe temperature reading checks................................................................................................... 1-18
2 Calibration
Table 2-1 IEEE-488 bus calibration command summary .......................................................................................... 2-3
Table 2-2 Recommended equipment for comprehensive calibration......................................................................... 2-4
Table 2-3 Front panel comprehensive calibration summary ...................................................................................... 2-5
Table 2-4 IEEE-488 bus comprehensive calibration summary.................................................................................. 2-9
Table 2-5 Ohms calibration summary ...................................................................................................................... 2-11
Table 2-6 Amps calibration summary ...................................................................................................................... 2-12
Table 2-7 Recommended equipment for low-level calibration................................................................................ 2-14
Table 2-8 Low-level calibration summary ............................................................................................................... 2-15
Table 2-9 Ohms calibration summary ...................................................................................................................... 2-22
Table 2-10 Amps calibration summary ...................................................................................................................... 2-22
3 Calibration Command Reference
Table 3-1 IEEE-488 bus calibration command summary .......................................................................................... 3-2
Table 3-2 Comprehensive calibration commands ...................................................................................................... 3-6
Table 3-3 Low-level calibration commands............................................................................................................. 3-11
Appendices
Table B-1 Recommended equipment for comprehensive calibration......................................................................... B-2
Table B-2 Recommended equipment for low-level calibration.................................................................................. B-2
Table C-1 Calibration error messages......................................................................................................................... C-2
Table C-2 Calibration constants returned by :CAL:PROT:DATA? query................................................................. C-4
Table D-1 Calibration commands ............................................................................................................................... D-1
1 Performance Verification

1.1 Introduction

The procedures in this section are intended to verify that Model 2002 accuracy is within the limits stated in the instrument one-year accuracy specifications. These procedures can be performed when the instrument is first received to ensure that no damage or misadjustment has occurred during shipment. Verification may also be performed whenever there is a question of instrument accuracy, or following calibration, if desired.
NOTE
If the instrument is still under warranty, and its performance is outside specified limits, contact your Keithley representative or the factory to determine the correct course of action.
This section includes the following:
1.2 Environmental conditions: Covers the temperature and humidity limits for verification.
1.3 Warm-up period: Describes the length of time the Model 2002 should be allowed to warm up before testing.
1.4 Line power: Covers power line voltage ranges during testing.
1.5 Recommended test equipment: Summarizes recommended equipment and pertinent specifications.
1.6 Verification limits: Explains how reading limits were calculated.
1.7 Restoring default conditions: Gives step-by-step procedures for restoring default conditions before each test procedure.
1.8 Verification procedures: Details procedures to verify measurement accuracy of all Model 2002 measurement functions.

1.2 Environmental conditions

Verification measurements should be made at an ambient temperature of 18–28°C (65–82°F), and at a relative humidity of less than 80% unless otherwise noted.

1.3 Warm-up period

The Model 2002 must be allowed to warm up for the following time period before performing the verification procedures:
• DC volts and ohms: four hours
• AC volts, AC current, DC current: one hour
If the instrument has been subjected to temperature extremes (outside the range stated in paragraph 1.2), allow additional time for internal temperatures to stabilize. Typically, it takes one additional hour to stabilize a unit that is 10°C (18°F) outside the specified temperature range.
The test equipment should also be allowed to warm up for the minimum period specified by the manufacturer.

1.4 Line power

The Model 2002 should be tested while operating from a line voltage in the range of 90–134V or 180–250V at a frequency of 50, 60, or 400Hz.

1.5 Recommended test equipment

Table 1-1 lists all test equipment required for verification. Alternate equipment may be used as long as that equipment has specifications at least as good as those listed in the table.
NOTE
The calibrator listed in Table 1-1 is sufficiently accurate to verify Model 2002 accuracy to total factory calibration uncertainty. It is not accurate enough to verify Model 2002 relative accuracy specifications alone.
1.6 Verification limits
The verification limits stated in this section have been calculated using the Model 2002 one-year relative accuracy specifications and the total absolute uncertainty of the factory recommended calibrator (see Table 1-1). DCV, DCI, and ohms limits also include factory calibration uncertainty. (See specifications.) Those who are using calibration sources with better absolute uncertainty should recalculate the limits using the Model 2002 relative accuracy specifications, the absolute uncertainty specifications of the calibration sources, and factory calibration uncertainty (DCV, DCI, and ohms).
1.6.1 Reading limit calculation example
As an example of how reading limits are calculated, assume that the 20VDC range is being tested using a 19V input value, and the various specifications are as follows:
• Model 2002 relative accuracy: ±(10ppm of reading + 0.15ppm of range)
• Model 2002 factory calibration uncertainty: ±2.6ppm of reading
• Calibrator total absolute uncertainty at 19V output: ±5.4ppm
The calculated limits are:
Reading limits = 19V ± [(19V × (10ppm + 2.6ppm)) + (20V × 0.15ppm) + (19V × 5.4ppm)]
Reading limits = 19V ± 0.000345V
Reading limits = 18.999655V to 19.000345V
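The same arithmetic can be carried out in a few lines of code. The following Python sketch simply restates the calculation above; it is provided for illustration only, and the specification values shown are the example figures from this paragraph, to be replaced with the values that apply to the range and calibrator output actually being verified.

# Reading-limit calculation from paragraph 1.6.1 (illustrative sketch only).
def reading_limits(applied, full_scale, rel_acc_ppm_rdg, rel_acc_ppm_rng,
                   factory_cal_ppm, calibrator_ppm):
    """Return (low, high) reading limits in the same units as 'applied'."""
    uncertainty = (applied * (rel_acc_ppm_rdg + factory_cal_ppm) * 1e-6
                   + full_scale * rel_acc_ppm_rng * 1e-6
                   + applied * calibrator_ppm * 1e-6)
    return applied - uncertainty, applied + uncertainty

# Example figures from the text: 19V applied on the 20V range.
low, high = reading_limits(19.0, 20.0, 10.0, 0.15, 2.6, 5.4)
print(f"{low:.6f}V to {high:.6f}V")   # 18.999655V to 19.000345V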
1.6.2 Additional derating factors
Some functions and ranges are subject to derating factors that must be included when calculating reading limits. For example, coupling errors must be added to low-frequency AC limits, while AC voltage limits for inputs above 100V are subject to additional derating factors.
Always read the associated specification notes to determine if any derating factors apply before calculating reading limits.

1.7 Restoring default conditions

Before performing each performance verification procedure, restore instrument bench default conditions as follows:
1. From the normal display mode, press the MENU key. The instrument will display the following:
MAIN MENU
SAVESETUP GPIB CALIBRATION
2. Select SAVESETUP, and press ENTER. The following will be displayed:
SETUP MENU
SAVE RESTORE POWERON RESET
3. Select RESET, and press ENTER. The display will then appear as follows:
RESET ORIGINAL DFLTS
BENCH GPIB
4. Select BENCH, then press ENTER. The following will be displayed:
RESETTING INSTRUMENT
ENTER to confirm; EXIT to abort
5. Press ENTER again to confirm instrument reset. The instrument will return to the normal display with bench defaults restored.
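The steps above restore bench defaults from the front panel. When the Model 2002 is operated over the IEEE-488 bus, a known starting state can be restored programmatically. The short sketch below is not part of the factory procedure: it assumes a Python environment with pyvisa and a GPIB address of 16, and it assumes that the SCPI :SYSTem:PRESet command corresponds to the front-panel BENCH defaults (with *RST restoring the GPIB defaults).

import pyvisa   # assumes a VISA library and a GPIB interface are installed

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::16::INSTR")   # hypothetical GPIB address

dmm.write(":SYSTem:PRESet")   # assumed bus equivalent of RESET -> BENCH
dmm.query("*OPC?")            # wait for the reset to complete before testing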
Table 1-1
Recommended Test Equipment for Performance Verification

Fluke 5700A Calibrator. ±5ppm basic uncertainty.
  DC voltage: 190mV: ±11ppm; 1.9V: ±7ppm; 19V: ±5ppm; 190V: ±7ppm; 1000V: ±9ppm.
  AC voltage, 10Hz-1MHz (40Hz-20kHz specifications): 190mV: ±150ppm; 1.9V: ±78ppm; 19V: ±78ppm; 190V: ±85ppm; 750V: ±85ppm (50Hz-1kHz).
  DC current: 190µA: ±103ppm; 1.9mA: ±55ppm; 19mA: ±55ppm; 190mA: ±65ppm; 1.9A: ±96ppm.
  AC current, 40Hz-10kHz (40Hz-1kHz specifications): 190µA: ±245ppm; 1.9mA: ±160ppm; 19mA: ±160ppm; 190mA: ±170ppm; 1.9A: ±670ppm.
  Resistance: 19Ω: ±26ppm; 190Ω: ±17ppm; 1.9kΩ: ±12ppm; 19kΩ: ±11ppm; 190kΩ: ±13ppm; 1.9MΩ: ±19ppm; 19MΩ: ±47ppm; 100MΩ: ±120ppm.
Fluke 5725A Amplifier. AC voltage, 1kHz-10kHz: 750V: ±85ppm.
Fluke 5700A-03 Wideband AC option. 190mV: ±0.22%, 1.9V: ±0.3% @ 2MHz.
Fluke 5440A-7002 Low-thermal cable set.
Keithley CA-18-1 Low-capacitance cable. Low-capacitance dual banana to dual banana shielded cable (for ACV), 1.2m (4 ft.) in length.
Keithley R-289-1G 1GΩ resistor. NOTE: Resistor should be characterized to within ±1,000ppm and mounted in shielded test box (see procedure).
Metal component box (for 1GΩ resistor).
Banana plugs (2) for test box. One insulated, one non-insulated.
Keithley 3940 Multifunction Synthesizer. 1Hz-15MHz, ±5ppm.
General Radio 1433-T Precision Decade Resistance Box. 10-400Ω, ±0.02%.
Megohmmeter. 1GΩ, ±0.5%.

* 90-day calibrator specifications shown include total absolute uncertainty at specified output.
1.8 Verification procedures
The following paragraphs contain procedures for verifying instrument accuracy specifications for the following measuring functions:
• DC volts
• AC volts
• DC current
• AC current
• Resistance
• Frequency
• Temperature
NOTE
The following verification procedures are intended to verify the accuracy of the Model 2002 and include reading limits based on the Model 2002 relative accuracy specifications and the total uncertainty of the recommended calibrator. DCV, DCI, and ohms limits include factory calibration uncertainty.
1.8.1 DC volts verification
DC voltage accuracy is verified by applying accurate DC voltages from a calibrator to the Model 2002 input and verifying that the displayed readings fall within specified ranges.
Follow the steps below to verify DCV measurement accuracy.
CAUTION
Do not exceed 1100V peak between INPUT HI and INPUT LO, or instrument damage may occur.
1. Connect the Model 2002 to the calibrator, as shown in Figure 1-1. Be sure to connect calibrator HI to Model 2002 INPUT HI and calibrator LO to Model 2002 INPUT LO as shown.
NOTE
Use shielded, low-thermal connections when testing the 200mV and 2V ranges to avoid errors caused by noise or thermal offsets. Connect the shield to calibrator output LO.
If the Model 2002 is out of specifications and not under warranty, refer to the calibration procedures in Section 2.
WARNING
The maximum common-mode voltage (voltage between INPUT LO and chassis ground) is 500V peak. Exceeding this value may cause a breakdown in insulation, creating a shock hazard. Some of the procedures in this section may expose you to dangerous voltages. Use standard safety precautions when such dangerous voltages are encountered to avoid personal injury caused by electric shock.
NOTE
Do not connect test equipment to the Model 2002 through a scanner or other switching equipment.
2. Turn on the Model 2002 and the calibrator, and allow a four-hour warm-up period before making measurements.
3. Restore Model 2002 factory default conditions, as explained in paragraph 1.7.
4. Set Model 2002 operating modes as follows:
   A. From normal display, press CONFIG then DCV.
   B. Select SPEED, then press ENTER.
   C. Select HIACCURACY, then press ENTER.
   D. Select FILTER, then press ENTER.
   E. Select AVERAGING, then press ENTER.
   F. Using the cursor and range keys, set the averaging parameter to 10 readings, then press ENTER.
   G. Press EXIT to return to normal display.
5. Select the Model 2002 200mV DC range. (If the FILT annunciator is off, press the FILTER key to enable the filter.)
NOTE
Do not use auto-ranging for any of the verification tests because auto-range hysteresis may cause the Model 2002 to be on an incorrect range.
6. Set the calibrator output to 0.00000mVDC, and allow the reading to settle.
7. Enable the Model 2002 REL mode. Leave REL enabled for the remainder of the DC volts verification test.
8. Set the calibrator output to +190.00000mVDC, and allow the reading to settle.
9. Verify that the Model 2002 reading is within the limits summarized in Table 1-2.
10. Repeat steps 8 and 9 for the remaining ranges and voltages listed in Table 1-2.
11. Repeat the procedure for each of the ranges with negative voltages of the same magnitude as those listed in Table 1-2.
Table 1-2
Limits for DCV verification
2002 DCV Range
Applied DC Voltage
Reading Limits (1 year, 18° to 28°C)
200mV 190.000000mV 189.991911mV to 190.008089mV
2V 1.90000000V 1.89996058V to 1.90003942V
20V 19.0000000V 18.9996550V to 19.0003450V
200V 190.000000V 189.993691V to 190.006309V
1000V 1000.0000V 999.94640V to 1000.05360V
NOTES:
1. Repeat procedure for negative voltages of same magnitude.
2. Reading limits shown include total absolute uncertainty of recommended calibrator (see Table 1-1) and factory calibration uncertainty (see specifications).
Figure 1-1
Connections for DC volts verification: Model 2002 INPUT HI and INPUT LO to the 5700A calibrator (DC voltage output) OUTPUT HI and OUTPUT LO.
Note: Use shielded, low-thermal cables when testing 200mV and 2V ranges.
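For users who automate verification over the IEEE-488 bus, a minimal sketch of one check point is shown below. It is not part of the factory procedure; the GPIB address is hypothetical, the calibrator is assumed to have been set to +19.0000000VDC as in the procedure above, and the parsing shown assumes a data format that may append unit suffixes to the reading.

import re
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::16::INSTR")          # hypothetical address

raw = dmm.query(":FETCh?")                          # latest reading string
reading = float(re.search(r"[-+]?\d+\.\d+E[-+]\d+", raw).group())

low, high = 18.9996550, 19.0003450                  # Table 1-2, 20V range, 19V applied
print(f"{reading:.7f}V", "PASS" if low <= reading <= high else "FAIL")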
1.8.2 AC volts verification
AC voltage accuracy is checked by applying accurate AC voltages at specific frequencies from an AC calibration source and then verifying that each Model 2002 AC voltage reading falls within the specified range. The two ACV verification procedures that follow include:
• Normal Mode
• Low-frequency Mode
CAUTION
Do not exceed 1100V peak between INPUT HI and INPUT LO, or 2 × 10⁷ V•Hz input, or instrument damage may occur.
Normal mode
1. Turn on the Model 2002, calibrator, and amplifier, and allow a one-hour warm-up period before making measurements.
2. Connect the Model 2002 to the calibrator, as shown in Figure 1-2. Be sure to connect amplifier HI to Model 2002 INPUT HI and amplifier LO to Model 2002 INPUT LO as shown. Connect the power amplifier to the calibrator using the appropriate connector on the rear of the calibrator.
3. Restore Model 2002 factory default conditions, as explained in paragraph 1.7.
4. Select the ACV function and the 200mV range on the Model 2002, and make sure that REL is disabled.
NOTE
Do not use REL to null offsets when performing AC volts tests. Also, do not enable the filter.
5. Set the calibrator output to 190.000mVAC at a frequency of 100Hz, and allow the reading to settle.
6. Verify that the Model 2002 reading is within the limits summarized in Table 1-3.
7. Repeat steps 5 and 6 for 190mVAC at the remaining frequencies listed in Table 1-3 (except 2MHz). Verify that instrument readings fall within the required limits listed in the table.
8. Repeat steps 5 through 7 for the 2V, 20V, 200V, and 750VAC ranges using the input voltages and limits stated in Table 1-3.
9. Connect the Model 2002 to the wideband calibrator output (see Figure 1-3).
10. Set the calibrator output to 190.000mV at a frequency of 2MHz.
11. Verify that the reading is within the limits shown in Table 1-3.
12. Repeat steps 10 and 11 for 1.90000V input on the 2V range.
CAUTION
Do not attempt to test the 20V–1000V ranges at 2MHz.
Figure 1-2
Connections for AC volts verification (all except 2MHz): Model 2002 INPUT HI and INPUT LO to the 5700A calibrator AC voltage output, with the 5725A amplifier connected to the calibrator.
Figure 1-3
Connections for 2MHz AC volts verification: Model 2002 INPUT HI and INPUT LO to the 5700A calibrator wideband output through the CA-18-1 low-capacitance cable, 50Ω coax, BNC-to-dual-banana adapter, and 50Ω terminator.
Table 1-3
Limits for normal mode AC voltage verification

Reading limits (1 year, 18°C to 28°C):

200mV range, 190.000mV applied:
  100Hz: 189.914mV to 190.087mV    1kHz: 189.942mV to 190.058mV    5kHz: 189.942mV to 190.058mV
  25kHz: 189.933mV to 190.068mV    50kHz: 189.885mV to 190.115mV   100kHz: 189.400mV to 190.600mV
  200kHz: 188.525mV to 191.475mV   1MHz: 186.000mV to 194.000mV    2MHz: 180.100mV to 199.900mV**
2V range, 1.90000V applied:
  100Hz: 1.89914V to 1.90087V      1kHz: 1.89942V to 1.90058V      5kHz: 1.89942V to 1.90058V
  25kHz: 1.89933V to 1.90068V      50kHz: 1.89885V to 1.90115V     100kHz: 1.89400V to 1.90600V
  200kHz: 1.88525V to 1.91475V     1MHz: 1.86000V to 1.94000V      2MHz: 1.80100V to 1.99900V**
20V range, 19.0000V applied:
  100Hz: 18.9885V to 19.0116V      1kHz: 18.9913V to 19.0087V      5kHz: 18.9894V to 19.0106V
  25kHz: 18.9875V to 19.0125V      50kHz: 18.9837V to 19.0163V     100kHz: 18.9400V to 19.0600V
  200kHz: 18.8525V to 19.1475V     1MHz: 18.2000V to 19.8000V      2MHz: *
200V range, 190.000V applied:
  100Hz: 189.878V to 190.122V      1kHz: 189.906V to 190.094V      5kHz: 189.887V to 190.113V
  25kHz: 189.868V to 190.132V      50kHz: 189.830V to 190.170V     100kHz: 189.393V to 190.607V
  200kHz, 1MHz, 2MHz: *
750V range, 750.00V applied:
  100Hz: 748.98V to 751.02V        1kHz: 749.09V to 750.91V        5kHz: 749.02V to 750.98V
  25kHz: 748.87V to 751.13V        50kHz through 2MHz: *

*  CAUTION: Do not exceed 2 × 10⁷ V•Hz input.
** Use wideband option and connections for 2MHz tests.
NOTE: Reading limits shown include total absolute uncertainty of recommended calibrator (see Table 1-1). Reading limits also include the adder for AC coupling of the input.
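The asterisked entries in Table 1-3 follow from the 2 × 10⁷ V•Hz input limit: dividing the limit by the applied voltage gives the highest frequency at which that voltage may safely be applied. The short check below is plain arithmetic, not taken from the manual.

VOLT_HZ_LIMIT = 2e7   # maximum V•Hz product from the CAUTION above

def max_safe_frequency(volts_rms):
    """Highest frequency (Hz) at which the given voltage stays within the limit."""
    return VOLT_HZ_LIMIT / volts_rms

for v in (190.0, 750.0):
    print(f"{v:5.0f}V: up to {max_safe_frequency(v) / 1e3:.1f}kHz")
# 190V: up to 105.3kHz (so the 200V range is not tested at 200kHz and above)
# 750V: up to 26.7kHz (so the 750V range is not tested at 50kHz and above)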
Table 1-4
Limits for low-frequency mode AC voltage verification

Reading limits (1 year, 18°C to 28°C):

200mV range, 190.000mV applied:
  10Hz: 189.837mV to 190.163mV    50Hz: 189.904mV to 190.097mV    100Hz: 189.923mV to 190.077mV
2V range, 1.90000V applied:
  10Hz: 1.89875V to 1.90125V      50Hz: 1.89923V to 1.90078V      100Hz: 1.89942V to 1.90058V
20V range, 19.0000V applied:
  10Hz: 18.9837V to 19.0163V      50Hz: 18.9904V to 19.0097V      100Hz: 18.9913V to 19.0087V
200V range, 190.000V applied:
  10Hz: 189.849V to 190.151V      50Hz: 189.906V to 190.094V      100Hz: 189.906V to 190.094V
750V range, 750.00V applied:
  10Hz: *                         50Hz: 749.09V to 750.91V        100Hz: 749.09V to 750.91V

* Recommended calibrator/amplifier cannot source this voltage/frequency.
Notes:
1. Specifications above 100Hz are the same as normal mode.
2. Limits shown include total absolute uncertainty of recommended calibrator (see Table 1-1).
Low-frequency mode
1. Turn on the Model 2002, calibrator, and amplifier, and allow a one-hour warm-up period before making measurements.
2. Connect the Model 2002 to the calibrator, as shown in Figure 1-2. Be sure to connect the amplifier HI to Model 2002 INPUT HI and amplifier LO to Model 2002 INPUT LO as shown. Connect the power amplifier to the calibrator using the appropriate connector on the rear of the calibrator.
3. Restore Model 2002 factory default conditions, as explained in paragraph 1.7.
4. Select the ACV function and the 200mV range on the Model 2002, and make sure that REL is disabled.
NOTE
Do not use REL to null offsets when performing AC volts tests. Also, do not enable the filter.
5. Select the low-frequency mode as follows:
   A. Press CONFIG ACV, select AC-TYPE, then press ENTER.
   B. Select LOW-FREQ-RMS, then press ENTER.
   C. Press EXIT as required to return to normal display.
6. Set the calibrator output to 190.000mVAC at a frequency of 10Hz, and allow the reading to settle.
7. Verify that the Model 2002 reading is within the limits summarized in Table 1-4.
8. Repeat steps 6 and 7 for 190mVAC at the remaining frequencies listed in the table.
9. Repeat steps 6 through 8 for the 2V, 20V, 200V, and 750VAC ranges, using the input voltages and limits stated in Table 1-4.
AC peak mode
1. Turn on the Model 2002, calibrator, and amplifier, and allow a one-hour warm-up period before making measurements.
2. Connect the Model 2002 to the calibrator, as shown in Figure 1-2. Be sure to connect the amplifier HI to Model 2002 INPUT HI, and the amplifier LO to MODEL 2002 INPUT LO as shown. Connect the power amplifier to the calibrator using the appropriate connector on the rear of the calibrator.
3. Restore the Model 2002 factory default conditions.
4. Select the ACV function and the 200mV range on the Model 2002, and make sure that REL is disabled.
NOTE
Do not use REL to null offsets when performing AC volts tests. Use AC coupling for 5kHz-1MHz tests. Use AC+DC coupling for 20Hz tests. (Use CONFIG-ACV to set up coupling.)
5. Select the AC peak and filter modes as follows:
   A. Press CONFIG then ACV, select AC-TYPE, then press ENTER.
   B. Select PEAK, then press ENTER.
   C. Select FILTER, then press ENTER.
   D. Select AVERAGING, then press ENTER.
   E. Using the cursor and range keys, set the averaging parameter to 10 readings, then press ENTER.
   F. Press EXIT as necessary to return to normal display.
   G. If the FLT annunciator is off, press FILTER to enable the filter.
6. Set the calibrator output to 100.000mVAC at a frequency of 5kHz, and allow the reading to settle.
7. Verify that the Model 2002 reading is within the limits summarized in Table 1-5.
8. Repeat steps 6 and 7 for 100mVAC at the remaining frequencies listed in the table.
9. Repeat steps 6 through 8 for the 2V, 20V, 200V, and 750VAC ranges, using the input voltages and limits stated in Table 1-5.
CAUTION
Do not apply more than 400V at 50kHz, 80V at 250kHz, 40V at 500kHz, or 20V at 1MHz, or instrument damage may occur.
10. Set input coupling to AC+DC, then repeat the procedure for a 20Hz input signal.
Table 1-5
Limits for AC peak voltage verification

Allowable readings (1 year, 18°C to 28°C):

200mV range, 100mV applied*:
  20Hz†: 139.9mV to 142.9mV    5kHz: 139.9mV to 142.9mV     25kHz: 139.9mV to 143.0mV
  50kHz: 139.8mV to 143.0mV    100kHz: 139.7mV to 143.2mV   250kHz: 138.6mV to 144.2mV
  500kHz: 136.5mV to 146.4mV   750kHz: 132.2mV to 150.6mV   1MHz: 127.3mV to 155.5mV
2V range, 1V applied*:
  20Hz†: 1.407V to 1.421V      5kHz: 1.407V to 1.421V       25kHz: 1.407V to 1.422V
  50kHz: 1.406V to 1.422V      100kHz: 1.405V to 1.424V     250kHz: 1.394V to 1.434V
  500kHz: 1.373V to 1.456V     750kHz: 1.330V to 1.498V     1MHz: 1.281V to 1.547V
20V range, 10V applied*:
  20Hz†: 13.99V to 14.30V      5kHz: 13.99V to 14.30V       25kHz: 13.98V to 14.30V
  50kHz: 13.98V to 14.31V      100kHz: 13.97V to 14.32V     250kHz: 13.86V to 14.42V
  500kHz: 13.65V to 14.64V     750kHz: 13.22V to 15.06V     1MHz: 12.73V to 15.55V
200V range, 100V applied*:
  20Hz†: 140.7V to 142.2V      5kHz: 140.7V to 142.2V       25kHz: 140.6V to 142.2V
  50kHz: 140.6V to 142.3V      100kHz: 140.5V to 142.4V     250kHz through 1MHz: **
750V range, 500V applied*:
  20Hz†: 701.3V to 712.9V      25kHz: 701.0V to 713.2V      50kHz through 1MHz: **

*  Calibrator voltage is given as an RMS value. Model 2002 reading limits are peak AC values.
** CAUTION: Do not apply more than 2 × 10⁷ V•Hz.
†  Use AC+DC input coupling for 20Hz tests only. (Use CONFIG-ACV to set coupling.)
NOTE: Limits shown include uncertainty of recommended calibrator.
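The applied voltages in Table 1-5 are RMS values, while the reading limits are peak values; for a sinusoidal input the expected peak reading is √2 times the RMS value, which is why, for example, the 100mV RMS input produces limits centered near 141.4mV. The illustration below is plain arithmetic, not taken from the manual.

import math

# Expected peak reading for each sinusoidal RMS input used in Table 1-5.
for v_rms in (0.1, 1.0, 10.0, 100.0, 500.0):
    v_peak = v_rms * math.sqrt(2.0)
    print(f"{v_rms:7.1f}V RMS -> {v_peak:8.3f}V peak")
# 0.1V RMS -> 0.141V peak ... 500.0V RMS -> 707.107V peak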
1.8.3 DC current verification
DC current accuracy is checked by applying accurate DC currents from a calibrator to the instrument AMPS input and then verifying that the current readings fall within appropriate limits.
Follow the steps below to verify DCI measurement accuracy.
CAUTION
Do not apply more than 2A, 250V to the AMPS input, or the amps protection fuse will blow.
1. Connect the Model 2002 to the calibrator, as shown in Figure 1-4. Be sure to connect calibrator HI to the AMPS input, and connect calibrator LO to INPUT LO as shown.
2. Turn on the Model 2002 and the calibrator, and allow a one-hour warm-up period before making measurements. Be sure the calibrator is set for normal current output.
3. Restore Model 2002 factory default conditions, as explained in paragraph 1.7.
4. Set digital filter averaging as follows:
   A. From normal display, press CONFIG then DCI.
   B. Select FILTER, then press ENTER.
   C. Select AVERAGING, then press ENTER.
   D. Using the cursor and range keys, set the averaging parameter to 10 readings, then press ENTER.
   E. Press EXIT as necessary to return to normal display.
5. Select the DC current function (DCI) and the 200µA range on the Model 2002. (If the FILT annunciator is off, press the FILTER key to enable the filter.)
6. Set the calibrator output to +190.0000µADC, and allow the reading to settle.
7. Verify that the Model 2002 reading is within the limits summarized in Table 1-6.
8. Repeat steps 6 and 7 for the remaining ranges and currents listed in Table 1-6.
9. Repeat the procedure for each of the ranges with negative currents of the same magnitude as those listed in Table 1-6.
Table 1-6
Limits for DC current verification
2002 DCI range
Applied DC current
Reading limits (1 year, 18°C to 28°C)
200µA 190.0000µA 189.9010µA to 190.0990µA
2mA 1.900000mA 1.899114mA to 1.900886mA
20mA 19.00000mA 18.99085mA to 19.00915mA
200mA 190.0000mA 189.8816mA to 190.1184mA
2A 1.900000A 1.898108A to 1.901892A
NOTES:
1. Repeat procedure for negative currents.
2. Reading limits shown include total absolute uncertainty of recommended calibrator (see Table 1-1) and factory calibration uncertainty (see specifications).
Figure 1-4
Connections for DC current verification: 5700A calibrator DC current OUTPUT HI to the Model 2002 AMPS input, OUTPUT LO to INPUT LO.
Note: Be sure calibrator is set for normal current output.
1.8.4 AC current verification
AC current verification is performed by applying accurate AC currents at specific frequencies and then verifying that Model 2002 readings fall within specified limits.
Follow the steps below to verify ACI measurement accuracy.
CAUTION
Do not apply more than 2A, 250V to the AMPS input, or the current protection fuse will blow.
1. Connect the Model 2002 to the calibrator, as shown in Figure 1-5. Be sure to connect calibrator HI to the AMPS input, and connect calibrator LO to INPUT LO as shown.
2. Turn on the Model 2002 and the calibrator, and allow a one-hour warm-up period before making measurements. Be sure the calibrator is set for normal current output.
3. Restore Model 2002 factory default conditions, as explained in paragraph 1.7.
4. Select the AC current function and the 200µA range on the Model 2002.
5. Set the calibrator output to 190.000µA AC at a frequency of 40Hz, and allow the reading to settle.
6. Verify that the Model 2002 reading is within the limits for the present current and frequency summarized in Table 1-7.
7. Repeat steps 5 and 6 for each frequency listed in Table 1-7.
8. Repeat steps 5 through 7 for the remaining ranges and frequencies listed in Table 1-7.
Figure 1-5
Connections for AC current verification: 5700A calibrator AC current OUTPUT HI to the Model 2002 AMPS input, OUTPUT LO to INPUT LO.
Note: Be sure calibrator is set for normal current output.
Table 1-7
Limits for AC current verification

Reading limits (1 year, 18°C to 28°C):

200µA range, 190.000µA applied:
  40Hz: 188.260µA to 191.740µA    100Hz: 189.562µA to 190.439µA
  1kHz: 189.210µA to 190.790µA    10kHz: 189.020µA to 190.980µA
2mA range, 1.90000mA applied:
  40Hz: 1.88355mA to 1.91645mA    100Hz: 1.89657mA to 1.90344mA
  1kHz: 1.89742mA to 1.90258mA    10kHz: 1.89742mA to 1.90258mA
20mA range, 19.0000mA applied:
  40Hz: 18.8355mA to 19.1645mA    100Hz: 18.9657mA to 19.0344mA
  1kHz: 18.9742mA to 19.0258mA    10kHz: 18.9742mA to 19.0258mA
200mA range, 190.000mA applied:
  40Hz: 188.355mA to 191.645mA    100Hz: 189.657mA to 190.344mA
  1kHz: 189.742mA to 190.258mA    10kHz: 189.685mA to 190.315mA
2A range, 1.90000A applied:
  40Hz: 1.88250A to 1.91750A      100Hz: 1.89552A to 1.90449A
  1kHz: 1.89390A to 1.90610A      10kHz: 1.89105A to 1.90895A

NOTE: Reading limits shown include total absolute uncertainty of recommended calibrator (see Table 1-1).

1.8.5 Resistance verification
Resistance verification is performed by connecting accurate resistance values to the instrument and verifying that Model 2002 resistance readings are within stated limits.
Follow the steps below to verify resistance measurement accuracy.
CAUTION
Do not apply more than 1100V peak between INPUT HI and LO or more than 150V peak between SENSE HI and LO, or instrument damage may occur.
20Ω-2MΩ range verification
1. Using shielded 4-wire connections, connect the Model 2002 to the calibrator, as shown in Figure 1-6. Be sure to connect calibrator HI and LO terminals to the Model 2002 HI and LO terminals (including SENSE HI and LO) as shown.
2. Turn on the Model 2002 and the calibrator, and allow a four-hour warm-up period before making measurements.
3. Set the calibrator for 4-wire resistance (external sense on).
4. Restore Model 2002 factory default conditions, as explained in paragraph 1.7.
5. Set Model 2002 operating modes as follows:
   A. From normal display, press CONFIG then Ω4.
   B. Select SPEED, then press ENTER.
   C. Select HIACCURACY, then press ENTER.
   D. Select FILTER, then press ENTER.
   E. Select AVERAGING, then press ENTER.
   F. Using the cursor and range keys, set the averaging parameter to 10 readings, then press ENTER.
   G. Select OFFSETCOMP, then press ENTER.
   H. Select ON, then press ENTER. (Note that OFFSETCOMP cannot be used with the 200kΩ and 2MΩ ranges.)
   I. Press EXIT to return to normal display.
6. Select the Ω4 function, and place the instrument on the 20Ω range. (If the FILT annunciator is off, press the FILTER key to enable the filter.)
7. Set the calibrator to output 19Ω, and allow the reading to settle. Verify that the reading is within the limits stated in Table 1-8.
NOTE
Resistance values available in the Model 5700A calibrator may be slightly different than the stated nominal resistance values. Limits stated in Table 1-8 should be recalculated based on actual calibrator resistance values.
8. Set the calibrator output to 190Ω, and allow the reading to settle.
9. Verify that the reading is within the limits stated in Table 1-8. (NOTE: Recalculate limits if calibrator resistance is not exactly as listed.)
10. Repeat steps 8 and 9 for the 2kΩ through 2MΩ ranges using the values listed in Table 1-8. (Do not use offset compensation for the 200kΩ and 2MΩ ranges.)
20MΩ and 200MΩ range verification
1. Connect the DC calibrator and Model 2002 using the 2-wire connections shown in Figure 1-7.
2. Set the calibrator to the 2-wire mode (external sense off).
3. Set Model 2002 operating modes as follows:
   A. From normal display, press CONFIG then Ω2.
   B. Select SPEED, then press ENTER.
   C. Select HIACCURACY, then press ENTER.
   D. Select FILTER, then press ENTER.
   E. Select AVERAGING, then press ENTER.
   F. Using the cursor and range keys, set the averaging parameter to 10 readings, then press ENTER.
   G. Press EXIT to return to normal display.
4. Select the Model 2002 Ω2 function, and change to the 20MΩ range. (If the FILT annunciator is off, press the FILTER key to enable the filter.)
5. Set the calibrator to output 19MΩ, and allow the reading to settle.
6. Verify that the reading is within the limits for the 20MΩ range stated in Table 1-8. (NOTE: Recalculate limits if actual calibrator resistance differs from value shown.)
7. Repeat steps 4 through 6 for the 200MΩ range (output 100MΩ).
Figure 1-6
Connections for resistance verification (20Ω-2MΩ ranges): 5700A calibrator 4-wire resistance OUTPUT HI/LO and SENSE HI/LO to Model 2002 INPUT HI/LO and SENSE HI/LO.
Note: Use shielded cables to minimize noise. Enable calibrator external sense mode.
Figure 1-7
Connections for resistance verification (20MΩ and 200MΩ ranges): 5700A calibrator 2-wire resistance OUTPUT HI/LO to Model 2002 INPUT HI/LO.
Note: Use shielded cable to minimize noise. Disable calibrator external sense mode.
Table 1-8
Limits for resistance verification (20Ω-200MΩ ranges)
2002 range    Nominal applied resistance    Reading limits (1 year, 18°C to 28°C)
20Ω           19Ω                           18.9985025Ω to 19.0014975Ω
200Ω          190Ω                          189.991277Ω to 190.008723Ω
2kΩ           1.9kΩ                         1.89994714kΩ to 1.90005286kΩ
20kΩ          19kΩ                          18.9994638kΩ to 19.0005362kΩ
200kΩ         190kΩ                         189.989313kΩ to 190.010687kΩ
2MΩ           1.9MΩ                         1.89981109MΩ to 1.90018891MΩ
20MΩ          19MΩ                          18.9940619MΩ to 19.0059381MΩ
200MΩ         100MΩ                         99.930910MΩ to 100.069090MΩ
Notes:
1. Limits shown include total absolute calibrator uncertainty (see Table 1-1) and factory calibration uncertainty (see specifications), and are based on the nominal calibration values shown. Recalculate limits using Model 2002 relative accuracy specifications, factory calibration uncertainty, and calibrator absolute uncertainty if calibrator resistance values differ from the nominal values shown.
2. Use 4-wire connections and the Ω4 function for the 20Ω-2MΩ ranges. Use 2-wire connections and the Ω2 function for the 20MΩ and 200MΩ ranges.
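The recalculation described in note 1 can be scripted when the calibrator's actual resistance differs from nominal. The Python sketch below is illustrative only: the ppm figures are placeholders, not published specifications, and it lumps all allowances into ppm-of-reading terms; substitute the Model 2002 relative accuracy (including any ppm-of-range term), factory calibration uncertainty, and calibrator absolute uncertainty for the range being verified.

    def reading_limits(actual_ohms, spec_ppm, factory_cal_ppm, calibrator_ppm):
        # spec_ppm: Model 2002 relative accuracy for the range (from specifications)
        # factory_cal_ppm: factory calibration uncertainty (from specifications)
        # calibrator_ppm: calibrator absolute uncertainty (Table 1-1)
        total_ppm = spec_ppm + factory_cal_ppm + calibrator_ppm
        tol = actual_ohms * total_ppm * 1e-6
        return actual_ohms - tol, actual_ohms + tol

    # Example with placeholder ppm values for a 20kΩ-range point (not actual specifications):
    low, high = reading_limits(19.000152e3, spec_ppm=23, factory_cal_ppm=3, calibrator_ppm=9)
    print(f"{low:.4f} ohms to {high:.4f} ohms")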
1GΩ range verification
1. Mount the 1GΩ resistor and the banana plugs to the test box, as shown in Figure 1-8. Be sure to mount the banana plugs with the correct spacing. The resistor should be completely enclosed in and shielded by the metal test box. The resistor LO lead should be electrically connected to the test box to provide adequate shielding.
Figure 1-8
1GΩ resistor test box construction. The 1GΩ resistor (Keithley part # R-289-1G) is mounted inside a metal test box, with the HI lead connected to an insulated banana plug and the LO lead to a non-insulated banana plug spaced 0.75 in. apart. Note: Resistor must be accurately characterized before use (see text).
2. Characterize the 1GΩ resistor to within ±1,000ppm or better using an accurate megohm bridge or similar equipment. Record the characterized value where indicated in Table 1-9. Also compute the limits based on the value of R using the formula at the bottom of the table.
NOTE
The actual value of the 1GΩ resistor should not exceed 1.05GΩ.
3. Set Model 2002 operating modes as follows:
A. From normal display, press CONFIG then Ω2.
B. Select SPEED, then press ENTER.
C. Select HIACCURACY, then press ENTER.
D. Select FILTER, then press ENTER.
E. Select AVERAGING, then press ENTER.
F. Using the cursor and range keys, set the averaging parameter to 10 readings, then press ENTER.
G. Press EXIT to return to normal display.
4. Select the 2-wire ohms function (Ω2) and the 1GΩ range on the Model 2002. (If the FILT annunciator is off, press the FILTER key to enable the filter.)
5. Connect the 1GΩ resistor test box (from steps 1 and 2) to the INPUT HI and LO terminals of the Model 2002. (Be sure that the box shield is connected to INPUT LO.) Allow the reading to settle.
6. Verify that the Model 2002 reading is within the limits you calculated and recorded in Table 1-9.
Table 1-9
Limits for resistance verification (1GΩ range)
Characterized resistor (R)    Reading limits* (1 year, 18°C to 28°C)
_________ GΩ                  _________ GΩ to _________ GΩ
* 1 year limits = R ± (0.002065R + 15,000), where R = characterized value of the 1GΩ resistor in ohms.
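The limit formula in Table 1-9 is easy to evaluate in a few lines of code. The Python sketch below is illustrative only; the 1.002GΩ characterized value is an arbitrary example, not a measured figure.

    def one_gig_limits(r_ohms):
        # 1-year limits from Table 1-9: R ± (0.002065 * R + 15,000), with R in ohms.
        tol = 0.002065 * r_ohms + 15_000
        return r_ohms - tol, r_ohms + tol

    low, high = one_gig_limits(1.002e9)  # example characterized value: 1.002 GΩ
    print(f"{low / 1e9:.6f} G-ohm to {high / 1e9:.6f} G-ohm")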
1.8.6 Frequency accuracy verification
Frequency accuracy verification is performed by connecting an accurate frequency source to Model 2002 inputs, and then verifying that the frequency readings are within stated limits.
Use the procedure below to verify the frequency measurement accuracy of the Model 2002.
1. Connect the frequency synthesizer to the Model 2002 INPUT terminals, as shown in Figure 1-9.
2. Turn on both instruments, and allow a one-hour warm-up period before measurement.
3. Set the synthesizer operating modes as follows:
FREQ: 1Hz
AMPTD: 5V p-p
OFFSET: 0V
MODE: CONT
FCTN: sine
4. Restore Model 2002 factory defaults, as explained in paragraph 1.7.
5. Set maximum signal level to 10V as follows:
A. Press CONFIG then FREQ.
B. Select MAX-SIGNAL-LEVEL, then press ENTER.
C. Choose 10V, then press ENTER.
D. Press EXIT to return to normal display.
6. Press the FREQ key to select the frequency function.
7. Verify that the Model 2002 frequency reading is within the limits shown in the first line of Table 1-10.
8. Set the synthesizer to each of the frequencies listed in Table 1-10, and verify that the Model 2002 frequency reading is within the required limits.
Table 1-10
Frequency verification limits
Synthesizer frequency
Reading limits (1 year, 18°C to 28°C)
1Hz 0.9997Hz to 1.0003Hz
10Hz 9.9970Hz to 10.003Hz
100Hz 99.970Hz to 100.03Hz
1kHz 0.9997kHz to 1.0003kHz
10kHz 9.9970kHz to 10.003kHz
100kHz 99.970kHz to 100.03kHz
1MHz 0.9997MHz to 1.0003MHz
10MHz 9.9970MHz to 10.003MHz
15MHz 14.996MHz to 15.004MHz
Figure 1-9
Connections for frequency accuracy verification. The Model 3930A or 3940 synthesizer main function output connects to the Model 2002 INPUT through a 50Ω BNC coaxial cable and a BNC-to-dual banana plug adapter.
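If the verification readings are logged over the IEEE-488 bus rather than read from the front panel, a short script can compare each reading against the limits in Table 1-10. The Python sketch below is illustrative only: it assumes the pyvisa library, a GPIB address of 16, and that a reading can be fetched with the SCPI :FETCh? query; none of these are specified by this procedure, so adapt them to your own setup.

    import pyvisa

    # A subset of the reading limits from Table 1-10, in Hz: frequency -> (low, high).
    LIMITS_HZ = {
        1.0: (0.9997, 1.0003),
        1e3: (999.7, 1000.3),
        1e6: (0.9997e6, 1.0003e6),
        15e6: (14.996e6, 15.004e6),
    }

    rm = pyvisa.ResourceManager()
    dmm = rm.open_resource("GPIB0::16::INSTR")  # assumed GPIB address

    for freq, (low, high) in LIMITS_HZ.items():
        input(f"Set the synthesizer to {freq} Hz, then press Enter...")
        reading = float(dmm.query(":FETCh?"))  # assumed read command
        status = "PASS" if low <= reading <= high else "FAIL"
        print(f"{freq} Hz: {reading} Hz -> {status}")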
1.8.7 Temperature reading checks
When using thermocouples, the Model 2002 displays temperature by measuring the DC thermocouple voltage, and then calculating the corresponding temperature. Similarly, the instrument computes RTD temperature readings by measuring the resistance of the RTD probe and calculating temperature from the resistance value.
Since the instrument computes temperature from DCV and resistance measurements, verifying the accuracy of those DCV and resistance measurement functions guarantees the accuracy of corresponding temperature measurements. Thus, it is not necessary to perform a comprehensive temperature verification procedure if the DCV and resistance verification procedures show the instrument meets its specifications in those areas. However, those who wish to verify that the Model 2002 does in fact properly display temperature can use the following procedure to do so.
Selecting the temperature sensor
Follow the steps below to select the type of temperature sensor:
1. From normal display, press CONFIG then TEMP.
2. Select SENSOR, then press ENTER.
3. Select 4-WIRE-RTD or THERMOCOUPLE as desired, then press ENTER.
4. Select the type of RTD probe or thermocouple you wish to test, then return to the CONFIG TEMPERATURE menu.
5. Select UNITS, then press ENTER.
6. Select DEG-C, then press ENTER.
7. Press EXIT as necessary to return to normal display.
8. Press the TEMP key to place the Model 2002 in the temperature display mode. Refer to further information below on how to check thermocouple and RTD probe readings.
Thermocouple temperature reading checks
To check thermocouple readings, simply apply the appropriate DC voltage listed in Table 1-11 to the Model 2002 INPUT jacks using a precision DC voltage source (such as the one used to verify DC voltage accuracy in paragraph 1.8.1), and check the displayed temperature reading. Be sure to use low-thermal cables for connections between the DC calibrator and the Model 2002 when making these tests.
NOTE
The voltages shown are based on a 0°C reference junction temperature. Use the CONFIG-TEMP menu to set the default reference junction temperature to 0°C.
Table 1-11
Thermocouple temperature reading checks
Thermocouple type    Applied DC voltage*    Reading limits (°C) (1 year, 18°C to 28°C)
J                    -7.659mV               -190.5 to -189.5
J                    0mV                    -0.5 to +0.5
J                    1.277mV                24.5 to 25.5
J                    5.269mV                99.5 to 100.5
J                    42.280mV               749.5 to 750.5
K                    -5.730mV               -190.5 to -189.5
K                    0mV                    -0.5 to +0.5
K                    1.000mV                24.5 to 25.5
K                    4.096mV                99.5 to 100.5
K                    54.138mV               1349.5 to 1350.5
T                    -5.439mV               -190.5 to -189.5
T                    0mV                    -0.5 to +0.5
T                    0.992mV                24.5 to 25.5
T                    4.278mV                99.5 to 100.5
T                    20.255mV               389.5 to 390.5
E                    -8.561mV               -190.6 to -189.4
E                    0mV                    -0.6 to +0.6
E                    1.495mV                24.4 to 25.6
E                    6.319mV                99.4 to 100.6
E                    75.621mV               989.4 to 990.6
R                    0.054mV                7 to 13
R                    0.647mV                97 to 103
R                    4.471mV                497 to 503
R                    20.877mV               1747 to 1753
S                    0.055mV                7 to 13
S                    0.646mV                97 to 103
S                    4.233mV                497 to 503
S                    18.503mV               1747 to 1753
B                    0.632mV                355 to 365
B                    1.241mV                495 to 505
B                    4.834mV                995 to 1005
B                    13.591mV               1795 to 1805
* Voltages shown are based on the ITS-90 standard using a 0°C reference junction temperature. Use the CONFIG-TEMP menu to set the default reference junction to 0°C.
NOTE: Reading limits shown do not include DCV calibrator uncertainty.
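A displayed temperature can be compared against the table limits in the same way as any other reading. The short Python sketch below is illustrative only: it encodes the type J rows from Table 1-11 and reports whether a displayed reading is within limits; extend the dictionary for the other thermocouple types as needed.

    # Type J rows from Table 1-11: applied mV -> (low °C, high °C).
    TYPE_J_LIMITS = {
        -7.659: (-190.5, -189.5),
        0.000: (-0.5, 0.5),
        1.277: (24.5, 25.5),
        5.269: (99.5, 100.5),
        42.280: (749.5, 750.5),
    }

    def check_type_j(applied_mv, displayed_c):
        low, high = TYPE_J_LIMITS[applied_mv]
        return low <= displayed_c <= high

    print(check_type_j(1.277, 25.02))  # True: 25.02 °C falls within 24.5 to 25.5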
RTD temperature reading checks
Use a precision decade resistance box (see Table 1-1) to simulate probe resistances at various temperatures (Table 1-12). Be sure to use 4-wire connections between the decade resistance box and the Model 2002.
Table 1-12
RTD probe temperature reading checks
RTD probe type       Applied resistance    Reading limits (°C) (1 year, 18°C to 28°C)
PT385 (α=0.00385)    22.80Ω                -190.068 to -189.932
PT385 (α=0.00385)    60.25Ω                -100.021 to -99.979
PT385 (α=0.00385)    100Ω                  -0.021 to +0.021
PT385 (α=0.00385)    109.73Ω               24.979 to 25.021
PT385 (α=0.00385)    138.50Ω               99.979 to 100.021
PT385 (α=0.00385)    313.59Ω               599.932 to 600.068
PT392 (α=0.00392)    63.68Ω                -90.021 to -89.979
PT392 (α=0.00392)    100Ω                  -0.021 to +0.021
PT392 (α=0.00392)    109.90Ω               24.979 to 25.021
PT392 (α=0.00392)    139.16Ω               99.979 to 100.021
PT392 (α=0.00392)    266.94Ω               449.932 to 450.068
NOTE: Reading limits shown do not include the uncertainty of the resistance standards.
2

Calibration

2.1 Introduction

This section gives detailed procedures for calibrating the Model 2002. Basically, there are three types of calibration procedures:
• Comprehensive calibration
• AC self-calibration
• Low-level calibration
Comprehensive calibration requires accurate calibration equipment to supply precise DC voltages, DC currents, and resistance values. AC self-calibration requires no external equipment and can be performed at any time by the operator. Low-level calibration is normally performed only at the factory when the instrument is manufactured and is not usually required in the field.
NOTE
Low-level calibration is required in the field only if the Model 2002 has been repaired, or if the other calibration procedures cannot bring the instrument within stated specifications.
A single-point calibration feature is also available to allow the user to calibrate a single function or range without having to perform the entire calibration procedure.
Section 2 includes the following information:
2.2 Environmental conditions: States the temperature and humidity limits for calibration.
2.3 Warm-up period: Discusses the length of time the Model 2002 should be allowed to warm up before calibration.
2.4 Line power: States the power line voltage limits when calibrating the unit.
2.5 Calibration lock: Explains how to unlock calibration with the CAL switch.
2.6 IEEE-488 bus calibration commands: Summarizes bus commands used for calibration, lists a simple calibration program, and also discusses other important aspects of calibrating the instrument over the bus.
2.7 Calibration errors: Details front panel error messages that might occur during calibration and also explains how to check for errors over the bus.
2.8 Comprehensive calibration: Covers comprehensive (user) calibration from the front panel and over the IEEE-488 bus.
2.9 AC self-calibration: Discusses the AC user calibration process, both from the front panel and over the IEEE-488 bus.
2.10 Low-level calibration: Explains how to perform the low-level calibration procedure, which is normally required only at the factory.
2.11 Single-point calibration: Outlines the basic methods for calibrating only a single function or range instead of having to go through the entire calibration procedure.

2.2 Environmental conditions

Calibration procedures should be performed at an ambient temperature of 23°C ± 5°C, and at a relative humidity of less than 80%, unless otherwise noted.
NOTE
If the instrument is normally used over a different ambient temperature range, calibrate the instrument at the center of that temperature range.
If the internal temperature of the Model 2002 drifts excessively during calibration, an error will be generated. See Appendix C for additional information.

2.3 Warm-up period

The Model 2002 must be allowed to warm up for at least four hours before calibration. If the instrument has been subjected to temperature extremes (outside the range stated in paragraph 2.2), allow additional time for internal temperatures to stabilize. Typically, it takes one additional hour to stabilize a unit that is 10°C (18°F) outside the specified temperature range.
NOTE
Placement of the OPTION SLOT cover affects the internal temperature of the Model 2002. To achieve TCAL ± 1°C specifications, the OPTION SLOT cover must be in the same position (on or off) as when the Model 2002 is to be used.
The calibration equipment should also be allowed to warm up for the minimum period specified by the manufacturer.

2.4 Line power

The Model 2002 should be calibrated while operating from a line voltage in the range of 90-134V or 180-250V at 50, 60, or 400Hz.

2.5 Calibration lock

2.5.1 Comprehensive calibration lock
Before performing comprehensive calibration, you must first unlock calibration by momentarily pressing in on the recessed CAL switch. The instrument will display the following message:
CALIBRATION UNLOCKED
Calibration can now be performed
If you attempt comprehensive or low-level calibration without performing the unlocking procedure, the following message will be displayed:
CALIBRATION LOCKED
Press the CAL switch to unlock.
Note that it is not necessary to unlock calibration for the AC-only self-calibration procedure. Also, IEEE-488 bus calibration command queries such as the :DATE and :DATA commands are not protected by the calibration lock.
2.5.2 Low-level calibration lock
To unlock low-level calibration, press in and hold the CAL switch while turning on the power. Low-level calibration can then be performed.
NOTE
Do not unlock low-level calibration unless you have the appropriate equipment and intend to perform low-level calibration. See paragraph 2.10 for low-level calibration details.
2.5.3 IEEE-488 bus calibration lock status
You can determine the status of either calibration lock over the bus by using the appropriate query. To determine comprehensive calibration lock status, send the following query:
:CAL:PROT:SWIT?
The instrument will respond with the calibration lock status:
0: comprehensive calibration locked
1: comprehensive calibration unlocked
To determine the status of the low-level calibration lock, send the following query:
:CAL:PROT:LLEV:SWIT?
Responses to this calibration lock query include:
0: low-level calibration locked
1: low-level calibration unlocked
Refer to paragraph 2.6 below and Section 3 for more details on calibration commands.
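For example, the two lock-status queries in paragraph 2.5.3 can be sent from a short program. The Python sketch below is illustrative only: it assumes the pyvisa library and a GPIB address of 16; only the two query strings come from this manual.

    import pyvisa

    rm = pyvisa.ResourceManager()
    dmm = rm.open_resource("GPIB0::16::INSTR")  # assumed GPIB address

    # 0 = locked, 1 = unlocked (see paragraph 2.5.3)
    comp = int(dmm.query(":CAL:PROT:SWIT?"))
    llev = int(dmm.query(":CAL:PROT:LLEV:SWIT?"))
    print("Comprehensive calibration:", "unlocked" if comp else "locked")
    print("Low-level calibration:", "unlocked" if llev else "locked")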

2.6 IEEE-488 bus calibration commands

Table 2-1 summarizes the calibration commands used to calibrate the instrument over the IEEE-488 bus (GPIB).
Table 2-1
IEEE-488 bus calibration command summary
Command                              Description
:CALibration                         Calibration root command.
  :PROTected                         All commands in this subsystem are protected by the CAL switch (except queries).
    :INITiate                        Initiate calibration.
    :LOCK                            Lock out calibration (opposite of enabling cal with CAL switch).
    :SWITch?                         Request comprehensive CAL switch state. (0 = locked; 1 = unlocked)
    :SAVE                            Save cal constants to EEROM.
    :DATA?                           Download cal constants from 2002.
    :DATE <yr>, <mon>, <day>         Send cal date to 2002.
    :DATE?                           Request cal date from 2002.
    :NDUE <yr>, <mon>, <day>         Send next due cal date to 2002.
    :NDUE?                           Request next due cal date from 2002.
    :DC                              Comprehensive calibration subsystem.
      :ZERO                          Short-circuit calibration step.
      :V2 <NRf>                      +2V DC calibration step.
      :V20 <NRf>                     +20V DC calibration step.
      :OHM1M <NRf>                   1MΩ calibration step.
      :OHM200K <NRf>                 200kΩ calibration step.
      :OHM20K <NRf>                  20kΩ calibration step.
      :OHM2K <NRf>                   2kΩ calibration step.
      :OHM200 <NRf>                  200Ω calibration step.
      :OHM20 <NRf>                   20Ω calibration step.
      :A200U <NRf>                   200µA DC calibration step.
      :A2M <NRf>                     2mA DC calibration step.
      :A20M <NRf>                    20mA DC calibration step.
      :A200M <NRf>                   200mA DC calibration step.
      :A2 <NRf>                      2A DC calibration step.
      :OPEN                          Open circuit calibration step.
    :LLEVel                          Low-level calibration subsystem.
      :SWITch?                       Request low-level CAL switch state. (0 = locked; 1 = unlocked)
      :STEP <Step #>                 Low-level calibration step:
                                       1: 20V AC at 1kHz step.
                                       2: 20V AC at 30kHz step.
                                       3: 200V AC at 1kHz step.
                                       4: 200V AC at 30kHz step.
                                       5: 1.5V AC at 1kHz step.
                                       6: 200mV AC at 1kHz step.
                                       7: 5mV AC at 100kHz step.
                                       8: 0.5mV AC at 1kHz step.
                                       9: +100V DC step.
                                       10: -20V DC step.
                                       11: Rear inputs short-circuit step.
                                       12: 20mA AC at 1kHz step.
                                       13: 2V AC at 1Hz step.
  :UNPRotected                       Commands in this subsystem not protected by CAL switch.
    :ACCompensation                  Perform user AC calibration (disconnect all cables).
NOTE: Upper-case letters indicate the short form of each command. For example, instead of sending ":CALibration:PROTected:INITiate", send ":CAL:PROT:INIT".
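As a rough illustration of how the short-form commands in Table 2-1 fit together, the Python sketch below initiates comprehensive calibration, runs the short-circuit and +2V DC steps, and saves the constants. It is not a complete calibration program: the prompts, GPIB address, applied values, and use of pyvisa are assumptions, the CAL switch must already be unlocked, and the full step sequence plus any required delays and error checks are covered in paragraphs 2.7 and 2.8.

    import pyvisa

    rm = pyvisa.ResourceManager()
    dmm = rm.open_resource("GPIB0::16::INSTR")  # assumed GPIB address

    dmm.write(":CAL:PROT:INIT")               # initiate calibration
    input("Short the inputs, then press Enter...")
    dmm.write(":CAL:PROT:DC:ZERO")            # short-circuit calibration step
    input("Apply +2V DC, then press Enter...")
    dmm.write(":CAL:PROT:DC:V2 2.0")          # +2V DC step; <NRf> is the applied value
    # ...remaining :DC steps from Table 2-1 go here...
    dmm.write(":CAL:PROT:DATE 2022, 9, 1")    # example calibration date
    dmm.write(":CAL:PROT:SAVE")               # save cal constants to EEROM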