Keithley 2420 Service Manual

Model 2420 3A SourceMeter
Service Manual
A GREATER MEASURE OF CONFIDENCE
WARRANTY
Keithley Instruments, Inc. warrants this product to be free from defects in material and workmanship for a period of 1 year from date of shipment.
Keithley Instruments, Inc. warrants the following items for 90 days from the date of shipment: probes, cables, rechargeable batteries, diskettes, and documentation.
During the warranty period, we will, at our option, either repair or replace any product that proves to be defective. To exercise this warranty, write or call your local Keithley representative, or contact Keithley headquarters in Cleveland, Ohio. You will be given prompt assistance and return instructions. Send the product, transportation prepaid, to the indicated service facility. Repairs will be made and the product returned, transportation prepaid. Repaired or replaced products are warranted for the balance of the original warranty period, or at least 90 days.
LIMITATION OF WARRANTY
This warranty does not apply to defects resulting from product modification without Keithley’s express written consent, or misuse of any product or part. This warranty also does not apply to fuses, software, non-rechargeable batteries, damage from battery leakage, or problems arising from normal wear or failure to follow instructions.
THIS WARRANTY IS IN LIEU OF ALL OTHER WARRANTIES, EXPRESSED OR IMPLIED, INCLUDING ANY IMPLIED WARRANTY OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR USE. THE REMEDIES PROVIDED HEREIN ARE BUYER’S SOLE AND EXCLUSIVE REMEDIES.
NEITHER KEITHLEY INSTRUMENTS, INC. NOR ANY OF ITS EMPLOYEES SHALL BE LIABLE FOR ANY DIRECT, INDIRECT, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OF ITS INSTRUMENTS AND SOFTWARE EVEN IF KEITHLEY INSTRUMENTS, INC., HAS BEEN ADVISED IN ADVANCE OF THE POSSIBILITY OF SUCH DAMAGES. SUCH EXCLUDED DAMAGES SHALL INCLUDE, BUT ARE NOT LIMITED TO: COSTS OF REMOVAL AND INSTALLATION, LOSSES SUSTAINED AS THE RESULT OF INJURY TO ANY PERSON, OR DAMAGE TO PROPERTY.
Keithley Instruments, Inc.
Sales Offices: BELGIUM: Bergensesteenweg 709 • B-1600 Sint-Pieters-Leeuw • 02-363 00 40 • Fax: 02/363 00 64
CHINA: Yuan Chen Xin Building, Room 705 • 12 Yumin Road, Dewai, Madian • Beijing 100029 • 8610-6202-2886 • Fax: 8610-6202-2892
FINLAND: Tietäjäntie 2 • 02130 Espoo • Phone: 09-54 75 08 10 • Fax: 09-25 10 51 00
FRANCE: 3, allée des Garays • 91127 Palaiseau Cédex • 01-64 53 20 20 • Fax: 01-60 11 77 26
GERMANY: Landsberger Strasse 65 • 82110 Germering • 089/84 93 07-40 • Fax: 089/84 93 07-34
GREAT BRITAIN: Unit 2 Commerce Park, Brunel Road • Theale • Berkshire RG7 4AB • 0118 929 7500 • Fax: 0118 929 7519
INDIA: Flat 2B, Willocrissa • 14, Rest House Crescent • Bangalore 560 001 • 91-80-509-1320/21 • Fax: 91-80-509-1322
ITALY: Viale San Gimignano, 38 • 20146 Milano • 02-48 39 16 01 • Fax: 02-48 30 22 74
KOREA: FL., URI Building • 2-14 Yangjae-Dong • Seocho-Gu, Seoul 137-130 • 82-2-574-7778 • Fax: 82-2-574-7838
NETHERLANDS: Postbus 559 • 4200 AN Gorinchem • 0183-635333 • Fax: 0183-630821
SWEDEN: c/o Regus Business Centre • Frosundaviks Allé 15, 4tr • 169 70 Solna • 08-509 04 679 • Fax: 08-655 26 10
SWITZERLAND: Kriesbachstrasse 4 • 8600 Dübendorf • 01-821 94 44 • Fax: 01-820 30 81
TAIWAN: 1FL., 85 Po Ai Street • Hsinchu, Taiwan, R.O.C. • 886-3-572-9077 • Fax: 886-3-572-9031
28775 Aurora Road • Cleveland, Ohio 44139 • 440-248-0400 • Fax: 440-248-6168
1-888-KEITHLEY (534-8453) • www.keithley.com
© Copyright 2001 Keithley Instruments, Inc.
Printed in the U.S.A.
11/01
Model 2420 3A SourceMeter
Service Manual
©1997, Keithley Instruments, Inc.
All rights reserved.
Cleveland, Ohio, U.S.A.
Fifth Printing, October 2001
Document Number: 2420-902-01 Rev. E
Manual Print History
The print history shown below lists the printing dates of all Revisions and Addenda created for this manual. The Revision Level letter increases alphabetically as the manual undergoes subsequent updates. Addenda, which are released between Revisions, contain important change information that the user should incorporate immediately into the manual. Addenda are numbered sequentially. When a new Revision is created, all Addenda associated with the previous Revision of the manual are incorporated into the new Revision of the manual. Each new Revision includes a revised copy of this print history page.
Revision A (Document Number 2420-902-01)................................................................April 1997
Addendum A (Document Number 2420-902-02).............................................................April 1997
Revision B (Document Number 2420-902-01)..................................................................July 1998
Revision C (Document Number 2420-902-01)............................................................ January 1999
Revision D (Document Number 2420-902-01).................................................................June 2000
Revision E (Document Number 2420-902-01) ............................................................October 2001
All Keithley product names are trademarks or registered trademarks of Keithley Instruments, Inc. Other brand names are trademarks or registered trademarks of their respective holders.
Safety Precautions

The following safety precautions should be observed before using this product and any associated instrumentation. Although some instruments and accessories would normally be used with non-hazardous voltages, there are situations where hazardous conditions may be present.

This product is intended for use by qualified personnel who recognize shock hazards and are familiar with the safety precautions required to avoid possible injury. Read and follow all installation, operation, and maintenance information carefully before using the product. Refer to the manual for complete product specifications.

If the product is used in a manner not specified, the protection provided by the product may be impaired. The types of product users are:

Responsible body is the individual or group responsible for the use and maintenance of equipment, for ensuring that the equipment is operated within its specifications and operating limits, and for ensuring that operators are adequately trained.

Operators use the product for its intended function. They must be trained in electrical safety procedures and proper use of the instrument. They must be protected from electric shock and contact with hazardous live circuits.

Maintenance personnel perform routine procedures on the product to keep it operating properly, for example, setting the line voltage or replacing consumable materials. Maintenance procedures are described in the manual. The procedures explicitly state if the operator may perform them. Otherwise, they should be performed only by service personnel.

Service personnel are trained to work on live circuits, and perform safe installations and repairs of products. Only properly trained service personnel may perform installation and service procedures.

Keithley products are designed for use with electrical signals that are rated Installation Category I and Installation Category II, as described in the International Electrotechnical Commission (IEC) Standard IEC 60664. Most measurement, control, and data I/O signals are Installation Category I and must not be directly connected to mains voltage or to voltage sources with high transient over-voltages. Installation Category II connections require protection for high transient over-voltages often associated with local AC mains connections. Assume all measurement, control, and data I/O connections are for connection to Category I sources unless otherwise marked or described in the Manual.

Exercise extreme caution when a shock hazard is present. Lethal voltage may be present on cable connector jacks or test fixtures. The American National Standards Institute (ANSI) states that a shock hazard exists when voltage levels greater than 30V RMS, 42.4V peak, or 60VDC are present. A good safety practice is to expect that hazardous voltage is present in any unknown circuit before measuring.

Operators of this product must be protected from electric shock at all times. The responsible body must ensure that operators are prevented access and/or insulated from every connection point. In some cases, connections must be exposed to potential human contact. Product operators in these circumstances must be trained to protect themselves from the risk of electric shock. If the circuit is capable of operating at or above 1000 volts, no conductive part of the circuit may be exposed.

Do not connect switching cards directly to unlimited power circuits. They are intended to be used with impedance limited sources. NEVER connect switching cards directly to AC mains. When connecting sources to switching cards, install protective devices to limit fault current and voltage to the card.

Before operating an instrument, make sure the line cord is connected to a properly grounded power receptacle. Inspect the connecting cables, test leads, and jumpers for possible wear, cracks, or breaks before each use.

When installing equipment where access to the main power cord is restricted, such as rack mounting, a separate main input power disconnect device must be provided, in close proximity to the equipment and within easy reach of the operator.

For maximum safety, do not touch the product, test cables, or any other instruments while power is applied to the circuit under test. ALWAYS remove power from the entire test system and discharge any capacitors before: connecting or disconnecting cables or jumpers, installing or removing switching cards, or making internal changes, such as installing or removing jumpers.

Do not touch any object that could provide a current path to the common side of the circuit under test or power line (earth) ground. Always make measurements with dry hands while standing on a dry, insulated surface capable of withstanding the voltage being measured.

The instrument and accessories must be used in accordance with its specifications and operating instructions or the safety of the equipment may be impaired.

Do not exceed the maximum signal levels of the instruments and accessories, as defined in the specifications and operating information, and as shown on the instrument or test fixture panels, or switching card.

When fuses are used in a product, replace with same type and rating for continued protection against fire hazard.

Chassis connections must only be used as shield connections for measuring circuits, NOT as safety earth ground connections.

If you are using a test fixture, keep the lid closed while power is applied to the device under test. Safe operation requires the use of a lid interlock.

If a screw is present, connect it to safety earth ground using the wire recommended in the user documentation.

The ! symbol on an instrument indicates that the user should refer to the operating instructions located in the manual.

The symbol on an instrument shows that it can source or measure 1000 volts or more, including the combined effect of normal and common mode voltages. Use standard safety precautions to avoid personal contact with these voltages.

The WARNING heading in a manual explains dangers that might result in personal injury or death. Always read the associated information very carefully before performing the indicated procedure.

The CAUTION heading in a manual explains hazards that could damage the instrument. Such damage may invalidate the warranty.

Instrumentation and accessories shall not be connected to humans.

Before performing any maintenance, disconnect the line cord and all test cables.

To maintain protection from electric shock and fire, replacement components in mains circuits, including the power transformer, test leads, and input jacks, must be purchased from Keithley Instruments. Standard fuses, with applicable national safety approvals, may be used if the rating and type are the same. Other components that are not safety related may be purchased from other suppliers as long as they are equivalent to the original component. (Note that selected parts should be purchased only through Keithley Instruments to maintain accuracy and functionality of the product.) If you are unsure about the applicability of a replacement component, call a Keithley Instruments office for information.

To clean an instrument, use a damp cloth or mild, water based cleaner. Clean the exterior of the instrument only. Do not apply cleaner directly to the instrument or allow liquids to enter or spill on the instrument. Products that consist of a circuit board with no case or chassis (e.g., data acquisition board for installation into a computer) should never require cleaning if handled according to instructions. If the board becomes contaminated and operation is affected, the board should be returned to the factory for proper cleaning/servicing.
Table of Contents
1
Performance Verification
Introduction.........................................................................................1-2
Verification test requirements ..............................................................1-2
Environmental conditions.............................................................1-2
Warm-up period............................................................................1-2
Line power....................................................................................1-3
Recommended test equipment .........................................................1-3
1Ω resistor characterization .........................................................1-3
Verification limits .............................................................................1-4
Example limits calculation............................................................1-4
Resistance limits calculation.........................................................1-4
Restoring factory defaults ................................................................1-5
Performing the verification procedures............................................1-5
Test summary................................................................................1-5
Test considerations .......................................................................1-6
Setting the source range and output value....................................1-6
Setting the measurement range.....................................................1-7
Compliance considerations ..............................................................1-7
Compliance limits.........................................................................1-7
Type of compliance.......................................................................1-7
Maximum compliance values.......................................................1-8
Determining compliance limit......................................................1-8
Taking the SourceMeter out of compliance..................................1-8
Output voltage accuracy...................................................................1-9
Voltage measurement accuracy...................................................1-10
Output current accuracy..............................................................1-11
Current measurement accuracy......................................................1-13
Resistance measurement accuracy .................................................1-15
2
Calibration
Introduction......................................................................................2-2
Environmental conditions.................................................................2-2
Temperature and relative humidity...............................................2-2
Warm-up period ............................................................................2-2
Line power....................................................................................2-2
Calibration considerations................................................................2-3
Calibration cycle...........................................................................2-3
Recommended calibration equipment...........................................2-3
1Ω resistor characterization..........................................................2-4
Unlocking calibration.......................................................................2-4
Unlocking calibration from the front panel..................................2-4
Unlocking calibration by remote..................................................2-5
Changing the password .......................................................................2-6
Changing the password from the front panel...............................2-6
Changing the password by remote...............................................2-6
Resetting the calibration password......................................................2-6
Viewing calibration dates and calibration count .................................2-7
Calibration errors.............................................................................2-7
Front panel error reporting ...........................................................2-7
Remote error reporting .................................................................2-7
Front panel calibration........................................................................2-7
Remote calibration ............................................................................2-14
Remote calibration commands...................................................2-14
Recommended calibration parameters.......................................2-15
Remote calibration procedure ....................................................2-17
Single-range calibration....................................................................2-22
3
Routine Maintenance
Introduction.........................................................................................3-2
Line fuse replacement .........................................................................3-2
4
Troubleshooting
Introduction.........................................................................................4-2
Repair considerations..........................................................................4-2
Power-on self-test................................................................................4-2
Front panel tests..................................................................................4-3
KEYS test.....................................................................................4-3
DISPLAY PATTERNS test ..........................................................4-3
CHAR SET test............................................................................4-4
Principles of operation ........................................................................4-4
Overall block diagram..................................................................4-4
Analog circuits .............................................................................4-4
Power supply................................................................................4-6
Output stage..................................................................................4-7
A/D converter...............................................................................4-8
Active guard.................................................................................4-8
Digital circuitry............................................................................4-8
Display board circuit theory.........................................................4-9
Troubleshooting................................................................................4-10
Display board checks .................................................................4-10
Power supply checks..................................................................4-11
Digital circuitry checks ..............................................................4-11
Analog circuitry checks..............................................................4-12
Battery replacement..........................................................................4-12
Battery replacement precautions................................................4-12
Battery replacement procedure...................................................4-13
No comm link error...........................................................................4-13
5
Disassembly
Introduction.........................................................................................5-2
Handling and cleaning.........................................................................5-2
Handling PC boards......................................................................5-2
Solder repairs................................................................................5-2
Static sensitive devices........................................................................5-3
Assembly drawings .............................................................................5-3
Case cover removal..............................................................................5-3
Analog board removal .........................................................................5-4
Digital board removal..........................................................................5-5
Front panel disassembly ......................................................................5-6
Removing power components..............................................................5-6
Power module removal.................................................................5-6
Instrument re-assembly........................................................................5-7
6
Replaceable Parts
Introduction..........................................................................................6-2
Parts lists..............................................................................................6-2
Ordering information...........................................................................6-2
Factory service.....................................................................................6-2
Component layouts..............................................................................6-2
A
Specifications
Accuracy calculations.......................................................................A-10
Measurement accuracy..............................................................A-10
Source accuracy........................................................................A-10
B
Command Reference
Introduction.........................................................................................B-2
Command summary............................................................................ B-2
Miscellaneous commands...................................................................B-3
Detecting calibration errors ................................................................ B-8
Reading the error queue............................................................... B-8
Error summary.............................................................................B-8
Status byte EAV (Error Available) bit.......................................... B-9
Generating an SRQ on error........................................................ B-9
Detecting calibration step completion.............................................. B-10
Using the *OPC? query............................................................. B-10
Using the *OPC command........................................................B-10
Generating an SRQ on calibration complete.............................B-11
C
Calibration Programs
Introduction........................................................................................ C-2
Computer hardware requirements...................................................... C-2
Software requirements........................................................................ C-2
Calibration equipment........................................................................ C-2
General program instructions............................................................. C-2
Program C-1 Model 2420 calibration program ........................... C-4
Requesting calibration constants........................................................ C-7
Program C-2 Requesting calibration constants ........................... C-7
List of Illustrations
1
Performance Verification
Connections for voltage verification tests....................................1-9
Connections for10µA to 1A range current verification tests .....1-11
Connections for 3A range current verification tests ..................1-12
Connections for resistance accuracy verification.......................1-15
2
Calibration
Voltage calibration test connections.............................................2-8
10µA to 1A range current calibration test connections..............2-10
3A range current calibration test connections............................2-12
3
Routine Maintenance
Rear panel ....................................................................................3-2
4
Troubleshooting
Overall block diagram..................................................................4-5
Analog circuitry block diagram...................................................4-5
Power supply block diagram........................................................4-6
Output state simplified schematic................................................4-7
Digital circuitry block diagram....................................................4-9
List of Tables
1
Performance Verification
Recommended verification equipment ............................................1-3
Maximum compliance values...........................................................1-8
Output voltage accuracy limits.......................................................1-10
Voltage measurement accuracy limits ............................................1-11
Output current accuracy limits.......................................................1-13
Current measurement accuracy limits............................................1-14
Ohms measurement accuracy limits...............................................1-16
2
Calibration
Recommended calibration equipment............................................. 2-4
Calibration unlocked states ..............................................................2-5
Front panel voltage calibration.......................................................2-10
Front panel current calibration.......................................................2-13
Remote calibration command summary.........................................2-15
Recommended :CAL:PROT:SENS parameter ranges....................2-16
Recommended :CAL:PROT:SOUR parameter ranges...................2-16
Voltage calibration initialization commands ..................................2-18
Voltage range calibration commands .............................................2-19
Current calibration initialization commands..................................2-20
Current range calibration commands .............................................2-21
Routine Maintenance
3
Power line fuse.................................................................................3-3
4
Troubleshooting
Display board checks .....................................................................4-10
Power supply checks......................................................................4-11
Digital circuitry checks ..................................................................4-11
Analog circuitry checks..................................................................4-12
6
Replaceable Parts
Analog board parts list .....................................................................6-3
Digital board parts list....................................................................6-12
Display board parts list...................................................................6-17
Mechanical parts list.......................................................................6-18
B
Command Reference
Remote calibration command summary ......................................... B-2
Recommended :CAL:PROT:SENS parameter ranges.................... B-6
Recommended :CAL:PROT:SOUR parameter ranges................... B-7
Calibration errors............................................................................ B-9
1
Performance Verification
Introduction
Use the procedures in this section to verify that Model 2420 accuracy is within the limits stated
in the instrument’s one-year accuracy specifications. You can perform these verification procedures:
When you first receive the instrument to make sure that it was not damaged during shipment.
To verify that the unit meets factory specifications.
To determine if calibration is required.
Following calibration to make sure it was performed properly.
WARNING  The information in this section is intended for qualified service personnel only. Do not attempt these procedures unless you are qualified to do so. Some of these procedures may expose you to hazardous voltages, which could cause personal injury or death if contacted. Use standard safety precautions when working with hazardous voltages.

NOTE  If the instrument is still under warranty and its performance is outside specified limits, contact your Keithley representative or the factory to determine the correct course of action.
Verification test requirements
Be sure that you perform the verification tests:
Under the proper environmental conditions.
After the specified warm-up period.
Using the correct line voltage.
Using the proper test equipment.
Using the specified output signals and reading limits.
Environmental conditions
Conduct your performance verification procedures in a test environment with:
An ambient temperature of 18-28°C (65-82°F).
A relative humidity of less than 70% unless otherwise noted.
Warm-up period
Allow the Model 2420 to warm up for at least one hour before conducting the verification
procedures.
If the instrument has been subjected to temperature extremes (those outside the ranges stated above), allow additional time for the instrument’s internal temperature to stabilize. Typically, allow one extra hour to stabilize a unit that is 10°C (18°F) outside the specified temperature range.
Also, allow the test equipment to warm up for the minimum time specified by the manufacturer.
Line power
The Model 2420 requires a line voltage of 90 to 250V and a line frequency of 50 to 60Hz.
Verification tests must be performed within this range.
Recommended test equipment
Table 1-1 summarizes recommended verification equipment. You can use alternate equipment as long as that equipment has specifications at least as good as those listed in Table 1-1. Keep in mind, however, that test equipment uncertainty will add to the uncertainty of each measurement. Generally, test equipment uncertainty should be at least four times better than corresponding Model 2420 specifications. Table 1-1 lists the specifications of the recommended test equipment, including maximum allowable uncertainty for alternate test equipment, which is shown in parentheses.
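For example, if a given verification point depends on a Model 2420 specification of ±0.02%, alternate test equipment used at that point should contribute roughly 0.02% / 4 = 0.005% of uncertainty or less. (This 4:1 figure is only the rule of thumb stated above, not an additional requirement from the specifications.)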
Table 1-1
Recommended verification equipment
Description             Manufacturer/Model        Specifications
Digital Multimeter      Hewlett Packard HP3458A   DC Voltage*:   1V:    ±5.6ppm
                                                                 10V:   ±4.3ppm
                                                                 100V:  ±6.3ppm
                                                  DC Current*:   10µA:  ±25ppm
                                                                 100µA: ±23ppm
                                                                 1mA:   ±20ppm
                                                                 10mA:  ±20ppm
                                                                 100mA: ±35ppm
                                                                 1A:    ±110ppm
Resistance Calibrator   Fluke 5450A               Resistance**:  1.9Ω:  ±65ppm (±460ppm)
                                                                 19Ω:   ±23ppm (±280ppm)
                                                                 190Ω:  ±10.5ppm (±230ppm)
                                                                 1.9kΩ: ±8ppm (±200ppm)
                                                                 19kΩ:  ±7.5ppm (±195ppm)
                                                                 190kΩ: ±8.5ppm (±200ppm)
                                                                 1.9MΩ: ±11.5ppm (±180ppm)
                                                                 19MΩ:  ±30ppm (±635ppm)
Precision Resistor***   Isotec RUG-Z-1R00-0.1     1Ω, ±0.1%, 100W

* 90-day, full-range accuracy specifications of ranges required for various measurement points.
** 90-day, ±5°C specifications of nominal resistance values shown. Use actual values for tests. Maximum uncertainty of alternate test equipment shown in parentheses.
*** Required for verification of 3A current range. Characterize resistor to ±300ppm or better using recommended DMM before verifying 3A current measurement range.
1Ω resistor characterization
The recommended 1Ω resistor should be characterized to ±300ppm or better before verifying the 3A current measurement range. (You need not characterize the resistor if you are checking only the 3A current source range.) Use the 4-wire ohms function of the DMM recommended in Table 1-1 to measure the resistance value, and then use that measured value to calculate the current during the 3A current measurement range test procedure.
Verification limits
The verification limits stated in this section have been calculated using only the Model 2420 one-year accuracy specifications, and they do not include test equipment uncertainty. If a par­ticular measurement falls outside the allowable range, recalculate new limits based on Model 2420 specifications and corresponding test equipment specifications.
Example limits calculation
As an example of how verification limits are calculated, assume you are testing the 20V DC output range using a 20V output value. Using the Model 2420 20V range one-year accuracy specification of ±(0.02% of output + 2.4mV offset), the calculated output limits are:

Output limits = 20V ± [(20V × 0.02%) + 2.4mV]
Output limits = 20V ± (0.004V + 0.0024V)
Output limits = 20V ± 0.0064V
Output limits = 19.9936V to 20.0064V

Resistance limits calculation

When verifying the resistance measurement accuracy, it will probably be necessary to recalculate resistance limits based on the actual calibrator resistance values. You can calculate resistance reading limits in the same manner described above, but be sure to use the actual calibrator resistance values and the Model 2420 normal accuracy specifications for your calculations.

As an example, assume you are testing the 20kΩ range, and that the actual value of the nominal 19kΩ calibrator resistor is 19.01kΩ. Using the Model 2420 20kΩ range one-year normal accuracy specifications of ±(0.063% of reading + 3Ω), the recalculated reading limits are:

Reading limits = 19.01kΩ ± [(19.01kΩ × 0.063%) + 3Ω]
Reading limits = 19.01kΩ ± 15Ω
Reading limits = 18.9950kΩ to 19.0250kΩ
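The same arithmetic applies at every verification point. As an illustration only (this sketch is not part of the manual or of the calibration programs in Appendix C; the function name and constants below are invented for the example), the limit calculation can be expressed in a few lines of Python:

    # Sketch: recompute verification limits for a specification of the form
    # +/-(gain% of value + offset), with offset in the same unit as the value.
    def verification_limits(nominal, gain_percent, offset):
        tolerance = nominal * (gain_percent / 100.0) + offset
        return nominal - tolerance, nominal + tolerance

    # 20V source range example above: +/-(0.02% of output + 2.4mV)
    print(verification_limits(20.0, 0.02, 2.4e-3))    # approx (19.9936, 20.0064)

    # 20kΩ range example above with a 19.01kΩ calibrator value: +/-(0.063% + 3Ω)
    print(verification_limits(19.01e3, 0.063, 3.0))   # approx (18995.0, 19025.0)

Substitute the actual one-year specification and the actual test equipment value for the point being verified.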
Restoring factory defaults
Before performing the verification procedures, restore the instrument to its factory front panel
(bench) defaults as follows:
1. Press the MENU key. The instrument will display the following prompt:
MAIN MENU
SAVESETUP COMMUNICATION CAL
2. Select SAVESETUP, and then press ENTER. The unit then displays:
SAVESETUP MENU
GLOBAL SOURCE-MEMORY
3. Select GLOBAL, and then press ENTER. The unit then displays:
GLOBAL SETUP MENU
SAVE RESTORE POWERON RESET
4. Select RESET, and then press ENTER. The unit displays:
RESET ORIGINAL DFLTS
BENCH GPIB
5. Select BENCH, and then press ENTER. The unit then displays:
RESETTING INSTRUMENT
ENTER to confirm; EXIT to abort
6. Press ENTER to restore bench defaults, and note the unit displays the following:
RESET COMPLETE
BENCH defaults are now restored
Press ENTER to continue
7. Press ENTER then EXIT as necessary to return to normal display.
Performing the verification test procedures
Test summary
DC voltage output accuracy
DC voltage measurement accuracy
DC current output accuracy
DC current measurement accuracy
Resistance measurement accuracy
If the Model 2420 is not within specifications and not under warranty, see the calibration procedures in Section 2 for information on calibrating the unit.
Test considerations
When performing the verification procedures:
Be sure to restore factory front panel defaults as previously outlined.
Make sure that the test equipment is properly warmed up and connected to the Model 2420 INPUT/OUTPUT jacks. Also be sure that the front panel jacks are selected with the TERMINALS key.
Make sure the Model 2420 is set to the correct source range (see below).
Be sure that the Model 2420 output is turned on before making measurements.
Be sure the test equipment is set up for the proper function and range.
Allow the Model 2420 output signal to settle before making a measurement.
Do not connect test equipment to the Model 2420 through a scanner, multiplexer, or other switching equipment.
WARNING  The maximum common-mode voltage (voltage between LO and chassis ground) is 250V peak. Exceeding this value may cause a breakdown in insulation, creating a shock hazard.

CAUTION  The maximum voltage between INPUT/OUTPUT HI and LO or 4-WIRE SENSE HI and LO is 75V peak. The maximum voltage between INPUT/OUTPUT HI and 4-WIRE SENSE HI or between INPUT/OUTPUT LO and 4-WIRE SENSE LO is 5V. Exceeding these voltage values may result in instrument damage.
Setting the source range and output value
Before testing each verification point, you must properly set the source range and output value as outlined below.
1. Press either the SOURCE V or SOURCE I key to select the appropriate source function.
2. Press the EDIT key as required to select the source display field. Note that the cursor will flash in the source field while its value is being edited.
3. With the cursor in the source display field flashing, set the source range to the lowest possible range for the value to be sourced using the up or down RANGE key. For example, you should use the 20V source range to output a 20V source value. With a 20V source value and the 20V range selected, the source field display will appear as follows:
Vsrc:+20.0000 V
4. With the source field cursor flashing, set the source output to the required value using either:
• The SOURCE adjustment and left and right arrow keys.
• The numeric keys.
5. Note that the source output value will be updated immediately; you need not press ENTER when setting the source value.
Setting the measurement range
When simultaneously sourcing and measuring either voltage or current, the measure range is coupled to the source range, and you cannot independently control the measure range. Thus, it is not necessary for you to set the measure range when testing voltage or current measurement accuracy.
Compliance considerations
Compliance limits
When sourcing voltage, you can set the SourceMeter to limit current from 10nA to 3.15A. Conversely, when sourcing current, you can set the SourceMeter to limit voltage from 0.2mV to 63V. The SourceMeter output will not exceed the programmed compliance limit.
Types of compliance

There are two types of compliance that can occur: “real” and “range.” Depending on which value is lower, the output will clamp at either the displayed compliance setting (“real”) or at the maximum measurement range reading (“range”).

The “real” compliance condition can occur when the compliance setting is less than the highest possible reading of the measurement range. When in compliance, the source output clamps at the displayed compliance value. For example, if the compliance voltage is set to 1V and the measurement range is 2V, the output voltage will clamp (limit) at 1V.

“Range” compliance can occur when the compliance setting is higher than the possible reading of the selected measurement range. When in compliance, the source output clamps at the maximum measurement range reading (not the compliance value). For example, if the compliance voltage is set to 1V and the measurement range is 200mV, the output voltage will clamp (limit) at 210mV.
Maximum compliance values
The maximum compliance values for the measurement ranges are summarized in Table 1-2.

Table 1-2
Maximum compliance values

Measurement range    Maximum compliance value
200mV                210mV
2V                   2.1V
20V                  21V
60V                  63V
10µA                 10.5µA
100µA                105µA
1mA                  1.05mA
10mA                 10.5mA
100mA                105mA
1A                   1.05A
3A                   3.15A

When the SourceMeter goes into compliance, the “Cmpl” label or the units label (i.e., “mA”) for the compliance display will flash.
Determining compliance limit
The relationships to determine which compliance is in effect are summarized as follows.
They assume that the measurement function is the same as the compliance function.
Compliance Setting < Measurement Range = Real Compliance
Measurement Range < Compliance Setting = Range Compliance
You can determine the compliance that is in effect by comparing the displayed compliance setting to the present measurement range. If the compliance setting is lower than the maximum possible reading on the present measurement range, the compliance setting is the compliance limit. If the compliance setting is higher than the measurement range, the maximum reading on that measurement range is the compliance limit.
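As an informal illustration only (not taken from the manual; the function and variable names are hypothetical), the comparison described above can be written as:

    # Sketch: return the effective output limit and which type of compliance
    # applies, given the compliance setting and the maximum reading of the
    # present measurement range (both in the same unit).
    def compliance_limit(compliance_setting, max_range_reading):
        if compliance_setting < max_range_reading:
            return compliance_setting, "real compliance"
        return max_range_reading, "range compliance"

    # Example from the text: 1V compliance on the 200mV measurement range
    # (maximum reading 210mV) clamps the output at 210mV (range compliance).
    print(compliance_limit(1.0, 0.210))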
Taking the SourceMeter out of compliance
Verification measurements should not be made when the SourceMeter is in compliance. For
purposes of the verification tests, the SourceMeter can be taken out of compliance by going into the edit mode and increasing the compliance limit.
NOTE  Do not take the unit out of compliance by decreasing the source value or changing the range. Always use the recommended range and source settings when performing the verification tests.
Output voltage accuracy
Follow the steps below to verify that Model 2420 output voltage accuracy is within specified limits. This test involves setting the output voltage to each full-range value and measuring the voltages with a precision digital multimeter.
1. With the power off, connect the digital multimeter to the Model 2420 INPUT/OUTPUT jacks, as shown in Figure 1-1.
Figure 1-1
Connections for voltage verification tests
[Figure: Model 2420 front panel INPUT/OUTPUT HI and LO jacks connected to the digital multimeter Input HI and Input LO terminals.]
2. Select the multimeter DC volts measuring function.
3. Set the voltage source protection to NONE. To do so, press CONFIG then SOURCE V to access the CONFIGURE V-SOURCE menu. Select PROTECTION, and set the voltage source protection limit to NONE.
4. Press the Model 2420 SOURCE V key to source voltage, and make sure the source output is turned on.
5. Verify output voltage accuracy for each of the voltages listed in Table 1-3. For each test point:
• Select the correct source range.
• Set the Model 2420 output voltage to the indicated value.
• Verify that the multimeter reading is within the limits given in the table.
6. Repeat the procedure for negative output voltages with the same magnitudes as those listed in Table 1-3.
7. Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to select the rear panel jacks with the front panel TERMINALS key.
Table 1-3
Output voltage accuracy limits

Model 2420      Model 2420 output    Output voltage limits
source range    voltage setting      (1 year, 18°C-28°C)
200mV           200.000mV            199.360 to 200.640mV
2V              2.00000V             1.99900 to 2.00100V
20V             20.0000V             19.9936 to 20.0064V
60V             60.0000V             59.9808 to 60.0192V
Voltage measurement accuracy
Follow the steps below to verify that Model 2420 voltage measurement accuracy is within specified limits. The test involves setting the source voltage to full-range values, as measured by a precision digital multimeter, and then verifying that the Model 2420 voltage readings are within required limits.
1. With the power off, connect the digital multimeter to the Model 2420 INPUT/OUTPUT jacks, as shown in Figure 1-1.
2. Select the multimeter DC volts function.
3. Set the voltage source protection to NONE. To do so, press CONFIG then SOURCE V to access the CONFIGURE V-SOURCE menu. Select PROTECTION, and set the voltage source protection limit to NONE.
4. Set the Model 2420 to both source and measure voltage by pressing the SOURCE V and MEAS V keys, and make sure the source output is turned on.
5. Verify output voltage accuracy for each of the voltages listed in Table 1-4. For each test point:
• Select the correct source range.
• Set the Model 2420 output voltage to the indicated value as measured by the digital multimeter.
• Verify that the Model 2420 voltage reading is within the limits given in the table.
NOTE  It may not be possible to set the voltage source to the specified value. Use the closest possible setting, and modify reading limits accordingly.
6. Repeat the procedure for negative source voltages with the same magnitudes as those listed in Table 1-4.
7. Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to select the rear panel jacks with the front panel TERMINALS key.
Table 1-4
Voltage measurement accuracy limits

Model 2420 source      Source voltage**    Model 2420 voltage reading
and measure range*                         limits (1 year, 18°C-28°C)
200mV                  200.000mV           199.676 to 200.324mV
2V                     2.00000V            1.99946 to 2.00054V
20V                    20.0000V            19.9960 to 20.0040V
60V                    60.0000V            59.9880 to 60.0120V

*Measure range coupled to source range when simultaneously sourcing and measuring voltage.
**As measured by precision digital multimeter. Use closest possible value, and modify reading limits accordingly if necessary.

Output current accuracy

Follow the steps below to verify that Model 2420 output current accuracy is within specified limits. The test involves setting the output current to each full-range value and measuring the currents with a precision digital multimeter.

10µA to 1A range accuracy

1. With the power off, connect the digital multimeter to the Model 2420 INPUT/OUTPUT jacks, as shown in Figure 1-2.

Figure 1-2
Connections for 10µA to 1A range current verification tests
[Figure: Model 2420 front panel INPUT/OUTPUT HI and LO jacks connected to the digital multimeter Amps and Input LO terminals.]
2. Select the multimeter DC current measuring function.
3. Press the Model 2420 SOURCE I key to source current, and make sure the source output is turned on.
4. Verify output current accuracy for the 10µA-1A range currents listed in Table 1-5. For each test point:
• Select the correct source range.
• Set the Model 2420 output current to the correct value.
• Verify that the multimeter reading is within the limits given in the table.
5. Repeat the procedure for negative output currents with the same magnitudes as those listed in Table 1-5.
6. Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to select the rear panel jacks with the front panel TERMINALS key.
3A range accuracy
1. With the power off, connect the digital multimeter and the 1Ω resistor to the Model 2420 INPUT/OUTPUT jacks, as shown in Figure 1-3.
Figure 1-3
Connections for 3A range current verification tests
[Figure: Model 2420 front panel INPUT/OUTPUT HI and LO jacks connected across the 1Ω resistor, with the digital multimeter Input HI and Input LO terminals connected across the resistor to measure its voltage drop.]
2. Select the multimeter DC volts measuring function.
3. Press the Model 2420 SOURCE I key to source current, and make sure the source output is turned on.
4. Verify output current accuracy for the 3A range. Be sure to:
• Select the 3A source range.
• Set the Model 2420 output current to the correct 3A output value.
• Verify that the multimeter reading is within the 3A range limits given in Table 1-5. (Since the value of the 1Ω resistor is assumed to be the same as its nominal value, the DMM voltage reading is numerically the same as the sourced current.)
5. Repeat the procedure for a negative 3A current output value.
6. Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to select the rear panel jacks with the front panel TERMINALS key.
Table 1-5
Output current accuracy limits

Model 2420       Model 2420 output     Output current limits
source range     current setting       (1 year, 18°C-28°C)
10µA             10.0000µA             9.9947 to 10.0053µA
100µA            100.000µA             99.949 to 100.051µA
1mA              1.00000mA             0.99946 to 1.00054mA
10mA             10.0000mA             9.9935 to 10.0065mA
100mA            100.000mA             99.914 to 100.086mA
1A               1.00000A              0.99843 to 1.00157A (Note 1)
3A*              3.00000A              2.99553 to 3.00447A (Note 1)

* See separate procedure for 3A range. DMM voltage reading is same as sourced current.
Note 1: Specifications valid for continuous output currents below 105mA. For operating above 105mA on the 1A range for >1 minute, derate accuracy 10%/100mA above 105mA. For operating above 105mA on the 3A range for >1 minute, derate accuracy 10%/300mA above 105mA.
Current measurement accuracy
Follow the steps below to verify that Model 2420 current measurement accuracy is within specified limits. The procedure involves applying accurate currents from the Model 2420 current source and then verifying that Model 2420 current measurements are within required limits.
10µA to 1A range accuracy
1. With the power off, connect the digital multimeter to the Model 2420 INPUT/OUTPUT
2. Select the multimeter DC current function.
3. Set the Model 2420 to both source and measure current by pressing the SOURCE I and MEAS I keys, and make sure the source output is turned on.
4. Verify measure current accuracy for the 10µA-1A range currents listed in Table 1-6. For each measurement:
• Select the correct source range.
• Set the Model 2420 source output to the correct value as measured by the digital multimeter.
• Verify that the Model 2420 current reading is within the limits given in the table.
NOTE  It may not be possible to set the current source to the specified value. Use the closest possible setting, and modify reading limits accordingly.
5. Repeat the procedure for negative calibrator currents with the same magnitudes as those listed in Table 1-6.
6. Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to select the rear panel jacks with the front panel TERMINALS key.
NOTE  Test currents above 105mA cannot be maintained longer than 1 minute without affecting accuracy. See derating information in Note 1 under Table 1-5.
3A range accuracy
NOTE  The 1Ω resistor should be characterized to within ±300ppm before verifying the 3A current measurement range. Use the 4-wire ohms function of the DMM to measure the resistance value, and then use that measured value to calculate the current during the measurement procedure.
1. With the power off, connect the 1Ω resistor and digital multimeter to the Model 2420
INPUT/OUTPUT jacks, as shown in Figure 1-3.
2. Select the multimeter DC volts function.
3. Set the Model 2420 to both source and measure current by pressing the SOURCE I and MEAS I keys, and make sure the source output is turned on.
4. Verify measurement current accuracy for the 3A range as follows:
• Select the 3A source range.
• Set the Model 2420 source output to the correct 3A value as measured by the digital
multimeter.
• Note the DMM voltage reading, and then calculate the current from the voltage reading
and the characterized 1Ω resistance value as I = V/R, where V is the DMM voltage reading and R is the characterized resistance value (a worked example follows this procedure).
• Verify that the Model 2420 current reading is within the 3A limits given in Table 1-6.
NOTE  It may not be possible to set the current source to the specified 3A value. Use the closest possible setting, and modify reading limits accordingly.
5. Repeat the procedure for a negative 3A current.
6. Repeat the procedure using the rear panel INPUT/OUTPUT jacks. Be sure to select the rear panel jacks with the front panel TERMINALS key.
NOTE  Test currents above 105mA cannot be maintained longer than 1 minute without affecting accuracy. See derating information in Note 1 under Table 1-5.
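As a worked example with made-up numbers (these are not measured or specified values), suppose the 1Ω resistor was characterized at 1.00020Ω and the DMM reads 3.00120V during the 3A test. The current is then:

I = V/R = 3.00120V / 1.00020Ω ≈ 3.00060A

Compare this calculated current, not the raw DMM voltage, with the Model 2420 3A reading limits in Table 1-6.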
Table 1-6
Current measurement accuracy limits

Model 2420 source      Source current**    Model 2420 current reading limits
and measure range*                         (1 year, 18°C-28°C)
10µA                   10.00000µA          9.9966 to 10.0034µA
100µA                  100.000µA           99.969 to 100.031µA
1mA                    1.00000mA           0.99967 to 1.00033mA
10mA                   10.0000mA           9.9959 to 10.0041mA
100mA                  100.000mA           99.939 to 100.061mA
1A                     1.00000A            0.99883 to 1.00117A
3A                     3.00000A            2.99673 to 3.00327A***

*Measure range coupled to source range when simultaneously sourcing and measuring current.
**As measured by precision digital multimeter. Use closest possible value, and modify reading limits accordingly if necessary.
***Current calculated as follows: I = V/R, where V is the DMM voltage reading, and R is the characterized value of the 1Ω resistor.
Resistance measurement accuracy
Use the following steps to verify that Model 2420 resistance measurement accuracy is within specified limits. This procedure involves applying accurate resistances from a resistance calibrator and then verifying that Model 2420 resistance measurements are within required limits.
CAUTION  Before testing the 2Ω and 20Ω ranges, make sure your resistance calibrator can safely handle the default test currents for those ranges (see Model 2420 and calibrator specifications). If not, use the CONFIG OHMS menu to select the MANUAL source mode, then set the source current to an appropriate safe value. When using the manual source mode, total resistance reading uncertainty includes both Source I and Measure V uncertainty (see specifications), and calculated reading limits should take the additional uncertainty into account.

If using the Fluke 5450A resistance calibrator, you cannot use the Auto Ohms mode of the Model 2420 to verify the 2Ω range. The 1A test current for the 2Ω range of the Model 2420 will damage the calibrator. On the Model 2420, use the CONFIG OHMS menu to select the MANUAL source mode, and then set the source (test) current to 100mA.
1. With the power off, connect the resistance calibrator to the Model 2420 INPUT/OUTPUT and 4-WIRE SENSE jacks, as shown in Figure 1-4. Be sure to use the 4-wire connections as shown.

Figure 1-4
Connections for resistance accuracy verification
[Figure: Model 2420 INPUT/OUTPUT HI and LO jacks connected to the resistance calibrator Output HI and Output LO terminals, and 4-WIRE SENSE HI and LO jacks connected to the calibrator Sense HI and Sense LO terminals.]
2. Select the resistance calibrator external sense mode.
3. Configure the Model 2420 ohms function for the 4-wire sense mode as follows:
• Press CONFIG then MEAS Ω. The instrument will display the following:
CONFIG OHMS
SOURCE SENSE-MODE GUARD
• Select SENSE-MODE, and then press ENTER. The following will be displayed:
SENSE-MODE
2-WIRE 4-WIRE
• Select 4-WIRE, and then press ENTER.
• Press EXIT to return to normal display.
4. Press MEAS Ω to select the ohms measurement function, and make sure the source output is turned on.
5. Verify ohms measurement accuracy for each of the resistance values listed in Table 1-7. For each measurement:
• Set the resistance calibrator output to the nominal resistance or closest available value.
NOTE  It may not be possible to set the resistance calibrator to the specified value. Use the closest possible setting, and modify reading limits accordingly.
• Select the appropriate ohms measurement range with the RANGE keys.
• Verify that the Model 2420 resistance reading is within the limits given in the table.
6. Repeat the entire procedure using the rear panel INPUT/OUTPUT and 4-WIRE SENSE jacks. Be sure to select the rear panel jacks with the front panel TERMINALS key.
Table 1-7
Ohms measurement accuracy limits

Model 2420 range   Calibrator resistance*   Model 2420 resistance reading limits**
                                            (1 year, 18°C-28°C)
2Ω                 1.9Ω                     1.89649 to 1.90351Ω
20Ω                19Ω                      18.9784 to 19.0216Ω
200Ω               190Ω                     189.824 to 190.176Ω
2kΩ                1.9kΩ                    1.89845 to 1.90155kΩ
20kΩ               19kΩ                     18.9850 to 19.0150kΩ
200kΩ              190kΩ                    189.847 to 190.153kΩ
2MΩ                1.9MΩ                    1.89861 to 1.90139MΩ
20MΩ               19MΩ                     18.9517 to 19.0483MΩ

*Nominal resistance value.
**Reading limits based on Model 2420 normal accuracy specifications and nominal resistance values. If actual resistance values differ from nominal values shown, recalculate reading limits using actual calibrator resistance values and Model 2420 normal accuracy specifications. See Verification limits earlier in this section for details.
2
Calibration