Keithley Instruments, Inc. warrants this product to be free from defects in material and workmanship for a
period of 1 year from date of shipment.
Keithley Instruments, Inc. warrants the following items for 90 days from the date of shipment: probes, cables,
rechargeable batteries, diskettes, and documentation.
During the warranty period, we will, at our option, either repair or replace any product that proves to be defective.
To exercise this warranty, write or call your local Keithley representative, or contact Keithley headquarters in
Cleveland, Ohio. You will be given prompt assistance and return instructions. Send the product, transportation
prepaid, to the indicated service facility. Repairs will be made and the product returned, transportation prepaid.
Repaired or replaced products are warranted for the balance of the original warranty period, or at least 90 days.
LIMITATION OF WARRANTY
This warranty does not apply to defects resulting from product modification without Keithley’s express written
consent, or misuse of any product or part. This warranty also does not apply to fuses, software, non-rechargeable batteries, damage from battery leakage, or problems arising from normal wear or failure to follow instructions.
THIS WARRANTY IS IN LIEU OF ALL OTHER WARRANTIES, EXPRESSED OR IMPLIED, INCLUDING ANY IMPLIED WARRANTY OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR USE.
THE REMEDIES PROVIDED HEREIN ARE BUYER’S SOLE AND EXCLUSIVE REMEDIES.
NEITHER KEITHLEY INSTRUMENTS, INC. NOR ANY OF ITS EMPLOYEES SHALL BE LIABLE FOR
ANY DIRECT, INDIRECT, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF
THE USE OF ITS INSTRUMENTS AND SOFTWARE EVEN IF KEITHLEY INSTRUMENTS, INC., HAS
BEEN ADVISED IN ADVANCE OF THE POSSIBILITY OF SUCH DAMAGES. SUCH EXCLUDED DAMAGES SHALL INCLUDE, BUT ARE NOT LIMITED TO: COSTS OF REMOVAL AND INSTALLATION,
LOSSES SUSTAINED AS THE RESULT OF INJURY TO ANY PERSON, OR DAMAGE TO PROPERTY.
The print history shown below lists the printing dates of all Revisions and Addenda created
for this manual. The Revision Level letter increases alphabetically as the manual undergoes
subsequent updates. Addenda, which are released between Revisions, contain important change
information that the user should incorporate immediately into the manual. Addenda are numbered
sequentially. When a new Revision is created, all Addenda associated with the previous Revision
of the manual are incorporated into the new Revision of the manual. Each new Revision includes
a revised copy of this print history page.
Revision A (Document Number 2400-902-01)............................................................January 1996
Revision B (Document Number 2400-902-01).......................................................... February 1996
Addendum B (Document Number 2400-902-02).................................................... September 1996
Revision C (Document Number 2400-902-01)..................................................................July 2000
Revision D (Document Number 2400-902-01)........................................................November 2000
All Keithley product names are trademarks or registered trademarks of Keithley Instruments, Inc.
Other brand names are trademarks or registered trademarks of their respective holders.
Safety Precautions
The following safety precautions should be observed before using this product and any associated instrumentation. Although some instruments and accessories would normally be used with non-hazardous
voltages, there are situations where hazardous conditions may be present.
This product is intended for use by qualified personnel who recognize shock hazards and are familiar
with the safety precautions required to avoid possible injury. Read the operating information carefully
before using the product.
The types of product users are:
Responsible body is the individual or group responsible for the use and maintenance of equipment, for
ensuring that the equipment is operated within its specifications and operating limits, and for ensuring
that operators are adequately trained.
Operators use the product for its intended function. They must be trained in electrical safety procedures
and proper use of the instrument. They must be protected from electric shock and contact with hazardous
live circuits.
Maintenance personnel perform routine procedures on the product to keep it operating, for example,
setting the line voltage or replacing consumable materials. Maintenance procedures are described in the
manual. The procedures explicitly state if the operator may perform them. Otherwise, they should be
performed only by service personnel.
Service personnel are trained to work on live circuits, and perform safe installations and repairs of prod-
ucts. Only properly trained service personnel may perform installation and service procedures.
Exercise extreme caution when a shock hazard is present. Lethal voltage may be present on cable connector jacks or test fixtures. The American National Standards Institute (ANSI) states that a shock hazard exists when voltage levels greater than 30V RMS, 42.4V peak, or 60VDC are present. A good safety practice is to expect that hazardous voltage is present in any unknown circuit before measuring.

Users of this product must be protected from electric shock at all times. The responsible body must ensure that users are prevented access and/or insulated from every connection point. In some cases, connections must be exposed to potential human contact. Product users in these circumstances must be trained to protect themselves from the risk of electric shock. If the circuit is capable of operating at or above 1000 volts, no conductive part of the circuit may be exposed.
As described in the International Electrotechnical Commission (IEC) Standard IEC 664, digital multimeter measuring circuits (e.g., Keithley Models 175A, 199, 2000, 2001, 2002, and 2010) are Installation
Category II. All other instruments’ signal terminals are Installation Category I and must not be connected to mains.
Do not connect switching cards directly to unlimited power circuits. They are intended to be used with
impedance limited sources. NEVER connect switching cards directly to AC mains. When connecting
sources to switching cards, install protective devices to limit fault current and voltage to the card.
Before operating an instrument, make sure the line cord is connected to a properly grounded power receptacle. Inspect the connecting cables, test leads, and jumpers for possible wear, cracks, or breaks before each use.
For maximum safety, do not touch the product, test cables, or any other instruments while power is applied to the circuit under test. ALWAYS remove power from the entire test system and discharge any
capacitors before: connecting or disconnecting cables or jumpers, installing or removing switching
cards, or making internal changes, such as installing or removing jumpers.
Do not touch any object that could provide a current path to the common side of the circuit under test or power
line (earth) ground. Always make measurements with dry hands while standing on a dry, insulated surface capable of withstanding the voltage being measured.
The instrument and accessories must be used in accordance with their specifications and operating instructions,
or the safety of the equipment may be impaired.
Do not exceed the maximum signal levels of the instruments and accessories, as defined in the specifications
and operating information, and as shown on the instrument or test fixture panels, or switching card.
When fuses are used in a product, replace with same type and rating for continued protection against fire hazard.
Chassis connections must only be used as shield connections for measuring circuits, NOT as safety earth
ground connections.
If you are using a test fixture, keep the lid closed while power is applied to the device under test. Safe operation
requires the use of a lid interlock.
If a screw is present, connect it to safety earth ground using the wire recommended in the user documentation.
The ! symbol on an instrument indicates that the user should refer to the operating instructions located in
the manual.
The symbol on an instrument shows that it can source or measure 1000 volts or more, including the combined effect of normal and common mode voltages. Use standard safety precautions to avoid personal contact
with these voltages.
The WARNING heading in a manual explains dangers that might result in personal injury or death. Always
read the associated information very carefully before performing the indicated procedure.
The CAUTION heading in a manual explains hazards that could damage the instrument. Such damage may
invalidate the warranty.
Instrumentation and accessories shall not be connected to humans.
Before performing any maintenance, disconnect the line cord and all test cables.
To maintain protection from electric shock and fire, replacement components in mains circuits, including the
power transformer, test leads, and input jacks, must be purchased from Keithley Instruments. Standard fuses,
with applicable national safety approvals, may be used if the rating and type are the same. Other components
that are not safety related may be purchased from other suppliers as long as they are equivalent to the original
component. (Note that selected parts should be purchased only through Keithley Instruments to maintain accuracy and functionality of the product.) If you are unsure about the applicability of a replacement component,
call a Keithley Instruments office for information.
To clean an instrument, use a damp cloth or mild, water based cleaner. Clean the exterior of the instrument
only. Do not apply cleaner directly to the instrument or allow liquids to enter or spill on the instrument. Products that consist of a circuit board with no case or chassis (e.g., data acquisition board for installation into a
computer) should never require cleaning if handled according to instructions. If the board becomes contaminated and operation is affected, the board should be returned to the factory for proper cleaning/servicing.
Use the procedures in this section to verify that Model 2400 accuracy is within the limits stated
in the instrument’s one-year accuracy specifications. You can perform these verification
procedures:
• When you first receive the instrument to make sure that it was not damaged during shipment.
• To verify that the unit meets factory specifications.
• To determine if calibration is required.
• Following calibration to make sure it was performed properly.
WARNING The information in this section is intended for qualified service personnel
only. Do not attempt these procedures unless you are qualified to do so.
Some of these procedures may expose you to hazardous voltages, which
could cause personal injury or death if contacted. Use standard safety precautions when working with hazardous voltages.
NOTE If the instrument is still under warranty and its performance is outside specified
limits, contact your Keithley representative or the factory to determine the correct
course of action.
Verification test requirements
Be sure that you perform the verification tests:
• Under the proper environmental conditions.
• After the specified warm-up period.
• Using the correct line voltage.
• Using the proper test equipment.
• Using the specified output signal and reading limits.
Environmental conditions
Conduct your performance verification procedures in a test environment with:
• An ambient temperature of 18-28°C (65-82°F).
• A relative humidity of less than 70% unless otherwise noted.
Warm-up period
Allow the Model 2400 to warm up for at least one hour before conducting the verification
procedures. If the instrument has been subjected to temperature extremes (those outside the
ranges stated above), allow additional time for the instrument’s internal temperature to
stabilize. Typically, allow one extra hour to stabilize a unit that is 10°C (18°F) outside the
specified temperature range.
Also, allow the test equipment to warm up for the minimum time specified by the
manufacturer.
Performance Verification 1-3
Line power
The Model 2400 requires a line voltage of 88 to 264V and a line frequency of 50 or 60Hz.
Verification tests should be performed within this range.
Recommended test equipment
Table 1-1 summarizes recommended verification equipment. You can use alternate equipment
as long as that equipment has specifications at least as good as those listed in Table 1-1. Keep
in mind, however, that test equipment uncertainty will add to the uncertainty of each measurement. Generally, test equipment uncertainty should be at least four times better than corresponding Model 2400 specifications. Table 1-1 lists the uncertainties of the recommended test
equipment.
Table 1-1
Recommended verification equipment

Description             Manufacturer/Model    Accuracy*
Digital Multimeter
Resistance calibrator

*90-day specifications show accuracy at specified measurement point.
The verification limits stated in this section have been calculated using only the Model 2400
one-year accuracy specifications, and they do not include test equipment uncertainty. If a
particular measurement falls outside the allowable range, recalculate new limits based both on
Model 2400 specifications and corresponding test equipment specifications.
Example limits calculation
As an example of how verification limits are calculated, assume you are testing the 20V DC
output range using a 20V output value. Using the Model 2400 one-year accuracy specification
for 20V DC output of ±(0.02% of output + 2.4mV offset), the calculated output limits are:

Output limits = 20V ± (20V × 0.02% + 2.4mV)
             = 20V ± 6.4mV
             = 19.9936V to 20.0064V
When verifying the ohms function, you may find it necessary to recalculate resistance limits
based on the actual calibrator resistance values. You can calculate resistance reading limits in
the same manner described above, but be sure to use the actual calibrator resistance values and
the Model 2400 normal accuracy specifications for your calculations.

As an example, assume that you are testing the 20kΩ range, and the actual value of the nominal
19kΩ calibrator resistor is 19.025kΩ. Using the Model 2400 one-year normal accuracy
specifications of ±(0.063% of reading + 3Ω), the recalculated reading limits are:

Reading limits = 19.025kΩ ± (19.025kΩ × 0.063% + 3Ω)
              = 19.025kΩ ± 14.99Ω
              = 19.0100kΩ to 19.0400kΩ
Before performing the verification procedures, restore the instrument to its factory front panel
(bench) defaults as follows:
1. Press the MENU key. The instrument will display the following prompt:
MAIN MENU
SAVESETUP COMMUNICATION CAL
2. Select SAVESETUP, and then press ENTER. The unit then displays:
SETUP MENU
SAVE RESTORE POWERON RESET
3. Select RESET, and then press ENTER. The unit displays:
RESET ORIGINAL DFLTS
BENCH GPIB
4. Select BENCH, and then press ENTER. The unit then displays:
RESETTING INSTRUMENT
ENTER to confirm; EXIT to abort
5. Press ENTER to restore bench defaults, and note the unit displays the following:
RESET COMPLETE
BENCH defaults are now restored
Press ENTER to continue
6. Press ENTER and then EXIT to return to normal display.
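When the verification is scripted over the IEEE-488 bus or RS-232 port, an equivalent reset can be issued remotely. The sketch below only builds the command strings rather than opening a real VISA session; `:SYST:PRES` is the SCPI command for bench defaults (with `*RST` restoring GPIB defaults), but confirm the exact syntax against the instrument's SCPI command reference before relying on it.

```python
def bench_reset_commands():
    """Build the remote command sequence for a bench-default reset (syntax assumed)."""
    return [
        "*CLS",        # clear status and error registers first
        ":SYST:PRES",  # restore bench (front panel) defaults; *RST restores GPIB defaults
    ]

cmds = bench_reset_commands()
```

Building the strings separately lets them be logged or reviewed before being written to the instrument.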
Performing the verification test procedures
Test summary
• DC voltage output accuracy
• DC voltage measurement accuracy
• DC current output accuracy
• DC current measurement accuracy
• Resistance measurement accuracy
If the Model 2400 is not within specifications and not under warranty, see the calibration
procedures in Section 2 for information on calibrating the unit.
Test considerations
When performing the verification procedures:
• Be sure to restore factory front panel defaults as outlined above.
• Make sure that the test equipment is properly warmed up and connected to the Model 2400 INPUT/OUTPUT jacks. Also ensure that the front panel jacks are selected with the TERMINALS key.
• Make sure the Model 2400 is set to the correct source range.
• Be sure the Model 2400 output is turned on before making measurements.
• Be sure the test equipment is set up for the proper function and range.
• Allow the Model 2400 output signal to settle before making a measurement.
• Do not connect test equipment to the Model 2400 through a scanner, multiplexer, or other switching equipment.
WARNING The maximum common-mode voltage (voltage between LO and chassis
ground) is 250V peak. Exceeding this value may cause a breakdown in
insulation, creating a shock hazard.
CAUTION The maximum voltage between INPUT/OUTPUT HI and LO or 4-WIRE
SENSE HI and LO is 250V peak. The maximum voltage between INPUT/
OUTPUT HI and 4-WIRE SENSE HI or between INPUT/OUTPUT LO
and 4-WIRE SENSE LO is 5V. Exceeding these voltages may result in
instrument damage.
Setting the source range and output value
Before testing each verification point, you must properly set the source range and output value
as outlined below.
1. Press either the SOURCE V or SOURCE I key to select the appropriate source function.
2. Press the EDIT key as required to select the source display field. Note that the cursor will flash in the source field while its value is being edited.
3. With the cursor in the source display field flashing, set the source range to the lowest possible range for the value to be sourced using the up or down RANGE key. For example, you should use the 20V source range to output a 19V or 20V source value. With a 20V source value and the 20V range selected, the source field display will appear as follows:
Vsrc:+20.0000 V
4. With the source field cursor flashing, set the source output to the required value using either:
• The SOURCE adjustment and left and right arrow keys.
• The numeric keys.
5. Note that the source output value will be updated immediately; you need not press ENTER when setting the source value.
Setting the measurement range
When simultaneously sourcing and measuring either voltage or current, the measure range is
coupled to the source range, and you cannot independently control the measure range. Thus, it
is not necessary for you to set the range when testing voltage or current measurement accuracy.
Compliance considerations
Compliance limits
When sourcing voltage, you can set the SourceMeter to limit current from 1nA to 1.05A.
Conversely, when sourcing current, you can set the SourceMeter to limit voltage from 200µV
to 210V. The SourceMeter output will not exceed the programmed compliance limit.
Types of compliance
There are two types of compliance that can occur: “real” and “range.” Depending upon which
value is lower, the output will clamp at either the displayed compliance setting (“real”) or at the
maximum measurement range reading (“range”).
The “real” compliance condition can occur when the compliance setting is less than the highest
possible reading of the measurement range. When in compliance, the source output clamps at
the displayed compliance value. For example, if the compliance voltage is set to 1V and the
measurement range is 2V, the output voltage will clamp (limit) at 1V.
“Range” compliance can occur when the compliance setting is higher than the possible reading
of the selected measurement range. When in compliance, the source output clamps at the
maximum measurement range reading (not the compliance value). For example, if the
compliance voltage is set to 1V and the measurement range is 200mV, the output voltage will
clamp (limit) at 210mV.
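The clamping rule in the two paragraphs above can be sketched as a small helper: the limit in effect is whichever is lower, the programmed compliance setting or the maximum reading of the selected measure range (1.05 times the nominal range, matching the compliance values tabulated below). Function and variable names are illustrative.

```python
def effective_limit(compliance_setting, measure_range):
    """Return (limit, kind): the value the output clamps at, and whether that is
    the 'real' compliance (programmed setting) or 'range' compliance (1.05 * range)."""
    range_max = 1.05 * measure_range  # maximum reading of the measurement range
    if compliance_setting < range_max:
        return compliance_setting, "real"
    return range_max, "range"

effective_limit(1.0, 2.0)  # 1V compliance on the 2V range: clamps at 1V ("real")
effective_limit(1.0, 0.2)  # 1V compliance on the 200mV range: clamps at about 210mV ("range")
```

Both example calls reproduce the two cases described in the text above.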
Maximum compliance values
The maximum compliance values for the measurement ranges are summarized as follows:

Measurement range    Maximum compliance value
200mV                210mV
2V                   2.1V
20V                  21V
200V                 210V
1µA                  1.05µA
10µA                 10.5µA
100µA                105µA
1mA                  1.05mA
10mA                 10.5mA
100mA                105mA
1A                   1.05A

When the SourceMeter goes into compliance, the “Cmpl” label or the units label (i.e., “mA”)
for the compliance display will flash.
Determining compliance limit
The relationships to determine which compliance is in effect are summarized as follows. They
assume the measurement function is the same as the compliance function.
•Compliance Setting < Measurement Range = Real Compliance
•Measurement Range < Compliance Setting = Range Compliance
You can determine the compliance that is in effect by comparing the displayed compliance
setting to the present measurement range. If the compliance setting is lower than the maximum
possible reading on the present measurement range, the compliance setting is the compliance
limit. If the compliance setting is higher than the measurement range, the maximum reading on
that measurement range is the compliance limit.
Taking the SourceMeter out of compliance
Verification measurements should not be made when the SourceMeter is in compliance. For
purposes of the verification tests, the SourceMeter can be taken out of compliance by going
into the edit mode and increasing the compliance limit.
NOTE Do not take the unit out of compliance by decreasing the source value or changing
the range. Always use the recommended range and source settings when performing
the verification tests.
Output voltage accuracy
Follow the steps below to verify that Model 2400 output voltage accuracy is within specified
limits. This test involves setting the output voltage to each full-range value and measuring the
voltages with a precision digital multimeter.
1. With the power off, connect the digital multimeter to the Model 2400 INPUT/OUTPUT jacks, as shown in Figure 1-1.
2. Select the multimeter DC volts measuring function.

NOTE The default voltage source protection value is 40V. Before testing the 200V range, set
the voltage source protection value to >200V. To do so, press CONFIG then
SOURCE V to access the CONFIGURE V-SOURCE menu, then select PROTECTION and set the limit value to >200V.

3. Press the Model 2400 SOURCE V key to source voltage, and make sure the source output is turned on.
4. Verify output voltage accuracy for each of the voltages listed in Table 1-2. For each test point:
• Select the correct source range.
• Set the Model 2400 output voltage to the indicated value.
• Verify that the multimeter reading is within the limits given in the table.
Figure 1-1
Voltage verification front panel connections
(The figure shows the digital multimeter Input HI and Input LO terminals connected to the Model 2400 front panel INPUT/OUTPUT HI and LO jacks.)
5. Repeat the procedure for negative output voltages with the same magnitudes as those listed in Table 1-2.
6. Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to select the rear panel jacks with the front panel TERMINALS key.
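For scripted testing over the bus, the sweep in steps 3 through 5 can be expressed as a command plan built from the Table 1-2 points. The SCPI names (`:SOUR:FUNC`, `:SOUR:VOLT:RANG`, `:SOUR:VOLT`, `:OUTP`) follow the 2400's SCPI command set, but verify them against the command reference before use; the helper only builds strings and is a sketch, not the manual's procedure.

```python
# Positive output-voltage verification points from Table 1-2:
# (source range, output setting, low limit, high limit), all in volts.
POINTS = [
    (0.2, 0.2, 0.199360, 0.200640),
    (2.0, 2.0, 1.99900, 2.00100),
    (20.0, 20.0, 19.9936, 20.0064),
    (200.0, 200.0, 199.936, 200.064),
]

def sweep_commands(points):
    """Build the SCPI sequence for the verification sweep (syntax assumed)."""
    cmds = [":SOUR:FUNC VOLT", ":OUTP ON"]
    for rng, level, _lo, _hi in points:
        cmds.append(f":SOUR:VOLT:RANG {rng}")  # select the source range first
        cmds.append(f":SOUR:VOLT {level}")     # then set the output level
    cmds.append(":OUTP OFF")
    return cmds
```

A real script would read the external DMM after each point settles and compare against the stored limits.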
Table 1-2
Output voltage accuracy limits

Model 2400      Model 2400                Output voltage limits
source range    output voltage setting    (1 year, 18°C-28°C)
200mV           200.000mV                 199.360 to 200.640mV
2V              2.00000V                  1.99900 to 2.00100V
20V             20.0000V                  19.9936 to 20.0064V
200V            200.000V                  199.936 to 200.064V
Voltage measurement accuracy
Follow the steps below to verify that Model 2400 voltage measurement accuracy is within
specified limits. The test involves setting the source voltage to 95% of full-range values, as
measured by a precision digital multimeter, and then verifying that the Model 2400 voltage
readings are within required limits.
1. With the power off, connect the digital multimeter to the Model 2400 INPUT/OUTPUT jacks, as shown in Figure 1-1.
2. Select the multimeter DC volts function.

NOTE The default voltage source protection value is 40V. Before testing the 200V range, set
the voltage source protection value to >200V. To do so, press CONFIG then
SOURCE V to access the CONFIGURE V-SOURCE menu, then select PROTECTION and set the limit value to >200V.

3. Set the Model 2400 to both source and measure voltage by pressing the SOURCE V and MEAS V keys, and make sure the source output is turned on.
4. Verify output voltage accuracy for each of the voltages listed in Table 1-3. For each test point:
• Select the correct source range.
• Set the Model 2400 output voltage to the indicated value as measured by the digital multimeter.
• Verify that the Model 2400 voltage reading is within the limits given in the table.

NOTE It may not be possible to set the voltage source to the specified value. Use the closest
possible setting, and modify reading limits accordingly.

5. Repeat the procedure for negative source voltages with the same magnitudes as those listed in Table 1-3.
6. Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to select the rear panel jacks with the front panel TERMINALS key.
Table 1-3
Voltage measurement accuracy limits

Model 2400 source                      Model 2400 voltage reading limits
and measure range*   Source voltage**  (1 year, 18°C-28°C)
200mV                190.000mV         189.677 to 190.323mV
2V                   1.90000V          1.89947 to 1.90053V
20V                  19.0000V          18.9962 to 19.0038V
200V                 190.000V          189.962 to 190.038V

*Measure range coupled to source range when simultaneously sourcing and measuring voltage.
**As measured by precision digital multimeter. Use closest possible value, and modify reading limits accordingly if necessary.
Output current accuracy

Follow the steps below to verify that Model 2400 output current accuracy is within specified
limits. The test involves setting the output current to each full-range value and measuring the
currents with a precision digital multimeter.

1. With the power off, connect the digital multimeter to the Model 2400 INPUT/OUTPUT jacks, as shown in Figure 1-2.
2. Select the multimeter DC current measuring function.
3. Press the Model 2400 SOURCE I key to source current, and make sure the source output is turned on.
Figure 1-2
Current verification connections
(The figure shows the digital multimeter Amps and Input LO terminals connected to the Model 2400 front panel INPUT/OUTPUT HI and LO jacks.)
4. Verify output current accuracy for each of the currents listed in Table 1-4. For each test point:
• Select the correct source range.
• Set the Model 2400 output current to the correct value.
• Verify that the multimeter reading is within the limits given in the table.
5. Repeat the procedure for negative output currents with the same magnitudes as those listed in Table 1-4.
6. Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to select the rear panel jacks with the front panel TERMINALS key.
Table 1-4
Output current accuracy limits

Model 2400      Model 2400 output    Output current limits
source range    current setting      (1 year, 18°C-28°C)
1µA             1.00000µA            0.99905 to 1.00095µA
10µA            10.0000µA            9.9947 to 10.0053µA
100µA           100.000µA            99.949 to 100.051µA
1mA             1.00000mA            0.99946 to 1.00054mA
10mA            10.0000mA            9.9935 to 10.0065mA
100mA           100.000mA            99.914 to 100.086mA
1A              1.00000A             0.99640 to 1.00360A
Current measurement accuracy
Follow the steps below to verify that Model 2400 current measurement accuracy is within
specified limits. The procedure involves applying accurate currents from the Model 2400
current source and then verifying that Model 2400 current measurements are within required
limits.
1. With the power off, connect the digital multimeter to the Model 2400 INPUT/OUTPUT jacks as shown in Figure 1-2.
2. Select the multimeter DC current function.
3. Set the Model 2400 to both source and measure current by pressing the SOURCE I and MEAS I keys, and make sure the source output is turned on.
4. Verify measure current accuracy for each of the currents listed in Table 1-5. For each measurement:
• Select the correct source range.
• Set the Model 2400 source output to the correct value as measured by the digital multimeter.
• Verify that the Model 2400 current reading is within the limits given in the table.
NOTE It may not be possible to set the current source to the specified value. Use the closest
possible setting, and modify reading limits accordingly.

5. Repeat the procedure for negative calibrator currents with the same magnitudes as those listed in Table 1-5.
6. Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to select the rear panel jacks with the front panel TERMINALS key.
Table 1-5
Current measurement accuracy limits

Model 2400 source                      Model 2400 current reading limits
and measure range*   Source current**  (1 year, 18°C-28°C)
1µA                  0.95000µA         0.94942 to 0.95058µA
10µA                 9.5000µA          9.4967 to 9.5033µA
100µA                95.000µA          94.970 to 95.030µA
1mA                  0.95000mA         0.94968 to 0.95032mA
10mA                 9.5000mA          9.4961 to 9.5039mA
100mA                95.000mA          94.942 to 95.058mA
1A                   0.95000A          0.94734 to 0.95266A

*Measure range coupled to source range when simultaneously sourcing and measuring current.
**As measured by precision digital multimeter. Use closest possible value, and modify reading limits accordingly if necessary.
Resistance measurement accuracy
Follow the steps below to verify that Model 2400 resistance measurement accuracy is within
specified limits. This procedure involves applying accurate resistances from a resistance
calibrator and then verifying that Model 2400 resistance measurements are within required
limits.
1. With the power off, connect the resistance calibrator to the Model 2400 INPUT/OUTPUT and 4-WIRE SENSE jacks as shown in Figure 1-3. Be sure to use the four-wire connections as shown.
2. Select the resistance calibrator external sense mode.
3. Configure the Model 2400 ohms function for the 4-wire sense mode as follows:
• Press CONFIG then MEAS Ω. The instrument will display the following:
CONFIG OHMS
SOURCE SENSE-MODE GUARD
• Select SENSE-MODE, and then press ENTER. The following will be displayed:
SENSE-MODE
2-WIRE 4-WIRE
• Select 4-WIRE, and then press ENTER.
• Press EXIT to return to normal display.
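The front panel sequence above maps to a short remote-command sequence when the test is automated. `:SYST:RSEN` is the SCPI command associated with remote (4-wire) sensing on the 2400; treat both command strings as assumptions to confirm against the SCPI command reference. The helper only builds the strings.

```python
def four_wire_commands(enable=True):
    """Build commands selecting 2-wire or 4-wire ohms sensing (SCPI syntax assumed)."""
    return [
        ':SENS:FUNC "RES"',                         # select the ohms measurement function
        f":SYST:RSEN {'ON' if enable else 'OFF'}",  # ON = 4-wire (remote) sensing
    ]
```

Calling `four_wire_commands(False)` builds the 2-wire equivalent for comparison.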
4. Press MEAS Ω to select the ohms measurement function, and make sure the source output is turned on.
5. Verify ohms measurement accuracy for each of the resistance values listed in Table 1-6. For each measurement:
• Set the resistance calibrator output to the nominal resistance or closest available value.

NOTE It may not be possible to set the resistance calibrator to the specified value. Use the
closest possible setting, and modify reading limits accordingly.

• Select the appropriate ohms measurement range with the RANGE keys.
• Verify that the Model 2400 resistance reading is within the limits given in the table.
6. Repeat the entire procedure using the rear panel INPUT/OUTPUT and 4-WIRE SENSE jacks. Be sure to select the rear panel jacks with the front panel TERMINALS key.
Table 1-6
Ohms measurement accuracy limits

Model 2400   Calibrator    Model 2400 resistance reading limits**
range        resistance*   (1 year, 18°C-28°C)
20Ω          19Ω           18.9784 to 19.0216Ω
200Ω         190Ω          189.824 to 190.176Ω
2kΩ          1.9kΩ         1.89845 to 1.90155kΩ
20kΩ         19kΩ          18.9850 to 19.0150kΩ
200kΩ        190kΩ         189.847 to 190.153kΩ
2MΩ          1.9MΩ         1.89761 to 1.90239MΩ
20MΩ         19MΩ          18.9781 to 19.0219MΩ
200MΩ        100MΩ         99.020 to 100.980MΩ

*Nominal resistance value.
**Reading limits based on Model 2400 normal accuracy specifications and nominal resistance values. If actual resistance values differ from nominal values shown, recalculate reading limits using actual calibrator resistance values and Model 2400 normal accuracy specifications. See “Verification limits” earlier in this section for details.
Figure 1-3
Resistance verification connections
(The figure shows the resistance calibrator Output HI, Sense HI, Output LO, and Sense LO terminals connected to the Model 2400 front panel INPUT/OUTPUT HI, 4-WIRE SENSE HI, INPUT/OUTPUT LO, and 4-WIRE SENSE LO jacks, respectively.)
2
Calibration
2-2 Calibration
Introduction
Use the procedures in this section to calibrate the Model 2400. These procedures require
accurate test equipment to measure precise DC voltages and currents. Calibration can be
performed either from the front panel or by sending SCPI calibration commands over the
IEEE-488 bus or RS-232 port with the aid of a computer.
WARNING The information in this section is intended for qualified service personnel only. Do not attempt these procedures unless you are qualified to do so. Some of these procedures may expose you to hazardous voltages.
Environmental conditions
Temperature and relative humidity
Conduct the calibration procedures at an ambient temperature of 18-28°C (65-82°F) with
relative humidity of less than 70% unless otherwise noted.
Warm-up period
Allow the Model 2400 to warm up for at least one hour before performing calibration.
If the instrument has been subjected to temperature extremes (those outside the ranges stated
above), allow additional time for the instrument’s internal temperature to stabilize. Typically,
allow one extra hour to stabilize a unit that is 10°C (18°F) outside the specified temperature
range.
Also, allow the test equipment to warm up for the minimum time specified by the
manufacturer.
Line power
The Model 2400 requires a line voltage of 88 to 264V at a line frequency of 50 or 60Hz. The instrument must be calibrated while operating within this range.
Calibration considerations
When performing the calibration procedures:
• Make sure that the test equipment is properly warmed up and connected to the Model 2400 front panel INPUT/OUTPUT jacks. Also be certain that the front panel jacks are selected with the TERMINALS switch.
•Always allow the source signal to settle before calibrating each point.
•Do not connect test equipment to the Model 2400 through a scanner or other switching
equipment.
•If an error occurs during calibration, the Model 2400 will generate an appropriate error
message. See Appendix B for more information.
WARNING The maximum common-mode voltage (voltage between LO and chassis ground) is 250V peak. Exceeding this value may cause a breakdown in insulation, creating a shock hazard.

CAUTION The maximum voltage between INPUT/OUTPUT HI and LO or 4-WIRE SENSE HI and LO is 250V peak. The maximum voltage between INPUT/OUTPUT HI and 4-WIRE SENSE HI or between INPUT/OUTPUT LO and 4-WIRE SENSE LO is 5V. Exceeding these voltage values may result in instrument damage.

Calibration cycle

Perform calibration at least once a year to ensure the unit meets or exceeds its specifications.
Recommended calibration equipment
Table 2-1 lists the recommended equipment for the calibration procedures. You can use
alternate equipment as long as that equipment has specifications at least as good as those listed
in the table. When possible, test equipment specifications should be at least four times better
than corresponding Model 2400 specifications.
Table 2-1
Recommended calibration equipment
Description          Manufacturer/Model   Accuracy*
Digital Multimeter   Hewlett Packard

* 90-day specifications show accuracy at specified measurement point.
Unlocking calibration
Before performing calibration, you must first unlock calibration by entering or sending the
calibration password as follows:
Front panel calibration password
1. Press the MENU key, then choose CAL, and press ENTER. The instrument will display the following:

CALIBRATION
UNLOCK EXECUTE VIEW-DATES
SAVE LOCK CHANGE-PASSWORD

2. Select UNLOCK, and then press ENTER. The instrument will display the following:

PASSWORD:
Use ▲, ▼, ◄, ►, ENTER, or EXIT.

3. Use the up and down RANGE keys to select the letter or number, and use the left and right arrow keys to choose the position. (Press down RANGE for letters; up RANGE for numbers.) Enter the current password on the display. (Front panel default: 002400.)
4. Once the correct password is displayed, press the ENTER key. If the password was entered correctly, the following message will be displayed:

CALIBRATION UNLOCKED
Calibration can now be executed

5. Press EXIT to return to normal display. Calibration will be unlocked and assume the states summarized in Table 2-2. Attempts to change any of the settings listed below with calibration unlocked will result in error +510, "Not permitted with cal unlocked."
NOTE With calibration unlocked, the sense function and range track the source function and range. That is, when :SOUR:FUNC is set to VOLT, the :SENS:FUNC setting will be 'VOLT:DC'. When :SOUR:FUNC is set to CURR, the :SENS:FUNC setting will be 'CURR:DC'. A similar command coupling exists for :SOUR:VOLT:RANG/:SENS:VOLT:RANG and :SOUR:CURR:RANG/:SENS:CURR:RANG.
Table 2-2
Calibration unlocked states

Mode                   State       Equivalent remote command
Concurrent Functions   OFF         :SENS:FUNC:CONC OFF
Sense Function         Source      :SENS:FUNC <source_function>
Sense Volts NPLC       1.0         :SENS:VOLT:NPLC 1.0
Sense Volts Range      Source V    :SENS:VOLT:RANG <source_V_range>
Sense Current NPLC     1.0         :SENS:CURR:NPLC 1.0
Sense Current Range    Source I    :SENS:CURR:RANG <source_I_range>
Filter Count           10          :SENS:AVER:COUN 10
Filter Control         REPEAT      :SENS:AVER:TCON REPeat
Filter Averaging       ON          :SENS:AVER:STAT ON
Source V Mode          FIXED       :SOUR:VOLT:MODE FIXED
Volts Autorange        OFF         :SOUR:VOLT:RANGE:AUTO OFF
Source I Mode          FIXED       :SOUR:CURR:MODE FIXED
Current Autorange      OFF         :SOUR:CURR:RANGE:AUTO OFF
Autozero               ON          :SYST:AZERO ON
Trigger Arm Count      1           :ARM:COUNT 1
Trigger Arm Source     Immediate   :ARM:SOUR IMMediate
Trigger Count          1           :TRIG:COUNT 1
Trigger Source         Immediate   :TRIG:SOUR IMMediate
Remote calibration password
To unlock calibration via remote, send the following command:
:CAL:PROT:CODE '<password>'
For example, the following command uses the default password:
:CAL:PROT:CODE 'KI002400'
Changing the password
The default password may be changed from the front panel or via remote as discussed in the
following paragraphs.
Front panel password
Follow the steps below to change the password from the front panel:
1. Press the MENU key, then choose CAL, and press ENTER. The instrument will display the following:

CALIBRATION
UNLOCK EXECUTE VIEW-DATES
SAVE LOCK CHANGE-PASSWORD

2. Select CHANGE-PASSWORD, and then press ENTER. The instrument will display the following:

NEW PWD: 002400
Use ▲, ▼, ◄, ►, ENTER, or EXIT.

3. Using the RANGE keys and the left and right arrow keys, enter the new password on the display.
4. Once the desired password is displayed, press the ENTER key to store the new password.
Remote password
To change the calibration password via remote, first send the present password, and then send
the new password. For example, the following command sequence changes the password from
the 'KI002400' remote default to 'KI_CAL':
:CAL:PROT:CODE 'KI002400'
:CAL:PROT:CODE 'KI_CAL'
You can use any combination of letters and numbers up to a maximum of eight characters.
NOTE If you change the first two characters of the password to something other than "KI", you will not be able to unlock calibration from the front panel.
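The rules above (a maximum of eight characters, and a "KI" prefix if front panel unlocking must remain possible) can be checked in software before the new password is sent. A sketch; the helper name is an assumption of this example, not part of any Keithley software:

```python
def valid_cal_password(password, front_panel=True):
    """Pre-check a candidate calibration password against the documented rules.

    The instrument enforces the eight-character limit itself; the 'KI'
    prefix matters only if front panel unlocking must remain possible.
    """
    if not 1 <= len(password) <= 8:
        return False  # maximum of eight characters
    if front_panel and not password.startswith("KI"):
        return False  # front panel unlock expects the 'KI' prefix
    return True
```

For example, 'KI_CAL' passes both checks, while a nine-character string is rejected regardless of its prefix.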
Resetting the calibration password
If you lose the calibration password, you can unlock calibration by shorting together the CAL
pads, which are located on the display board. Doing so will also reset the password to the
factory default (KI002400).
See Section 5 for details on disassembling the unit to access the CAL pads. Refer to the
display board component layout drawing at the end of Section 6 for the location of the CAL
pads.
Viewing calibration dates and calibration count
When calibration is locked, only the UNLOCK and VIEW-DATES selections will be
accessible in the calibration menu. To view calibration dates and calibration count at any time:
1. From normal display, press MENU, select CAL, and then press ENTER. The unit will display the following:

CALIBRATION
UNLOCK EXECUTE VIEW-DATES

2. Select VIEW-DATES, and then press ENTER. The Model 2400 will display the next and last calibration dates and the calibration count, as in the following example:

NEXT CAL: 12/15/96
Last calibration: 12/15/95 Count: 00001
Calibration errors
The Model 2400 checks for errors after each calibration step, minimizing the possibility that
improper calibration may occur due to operator error.
Front panel error reporting
If an error is detected during comprehensive calibration, the instrument will display an
appropriate error message (see Appendix B). The unit will then prompt you to repeat the
calibration step that caused the error.
Remote error reporting
You can detect errors while in remote by testing the state of the EAV (Error Available) bit (bit 2) in
the status byte. (Use the *STB? query to request the status byte.) Query the instrument for the
type of error by using the appropriate :SYST:ERR? query. The Model 2400 will respond with
the error number and a text message describing the nature of the error. See Appendix B for
details.
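The EAV test described above reduces to masking bit 2 of the *STB? response. A minimal sketch, with the bus transport itself omitted and only the bit test shown:

```python
EAV = 0b00000100  # bit 2 of the status byte: Error Available

def has_error(status_byte):
    """Return True if the EAV bit is set in a value returned by *STB?."""
    return bool(status_byte & EAV)
```

When has_error() reports True, issue :SYST:ERR? repeatedly until the queue is empty.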
Front panel calibration
The front panel calibration procedure described in the following paragraphs calibrates all
ranges of both the current and voltage source and measure functions. Note that each function is
separately calibrated by repeating the entire procedure for each range.
Step 1. Prepare the Model 2400 for calibration
1. Turn on the Model 2400 and the digital multimeter, and allow them to warm up for at least one hour before performing calibration.
2. Press the MENU key, then choose CAL, and press ENTER. Select UNLOCK, and then press ENTER. The instrument will display the following:

PASSWORD:
Use ▲, ▼, ◄, ►, ENTER, or EXIT.

3. Use the up and down keys to select the letter or number, and use the left and right arrow keys to choose the position. Enter the present password on the display. (Front panel default: 002400.) Press ENTER to complete the process.
4. Press EXIT to return to normal display. Instrument operating states will be set as summarized in Table 2-2.
Step 2. Voltage calibration
Perform the steps below for each voltage range, using Table 2-3 as a guide.
1. Connect the Model 2400 to the digital multimeter, as shown in Figure 2-1. Select the multimeter DC volts measurement function.

NOTE The 2-wire connections shown assume that remote sensing is not used. Remote sensing may be used, if desired, but it is not essential when using the recommended digital multimeter.

2. From normal display, press the SOURCE V key.
3. Press the EDIT key to select the source field (cursor flashing in source display field), and then use the down RANGE key to select the 200mV source range.
4. From normal display, press MENU.
5. Select CAL, and then press ENTER. The unit will display the following:

CALIBRATION
UNLOCK EXECUTE VIEW-DATES
SAVE LOCK CHANGE-PASSWORD

6. Select EXECUTE, and then press ENTER. The instrument will display the following message:

V-CAL
Press ENTER to Output +200.00mV

7. Press ENTER. The Model 2400 will source +200mV and simultaneously display the following:

DMM RDG: +200.0000mV
Use ▲, ▼, ◄, ►, ENTER, or EXIT.
Figure 2-1
Voltage calibration connections
(Model 2400 front panel INPUT/OUTPUT jacks connected to the digital multimeter: INPUT/OUTPUT HI to multimeter Input HI, INPUT/OUTPUT LO to multimeter Input LO.)
8. Note and record the DMM reading, and then adjust the Model 2400 display to agree exactly with the actual DMM reading. (Use the up and down arrow keys to select the digit value, and use the left and right arrow keys to choose the digit position.) Note that the display adjustment range is within ±10% of the present range.
9. After adjusting the display to agree with the DMM reading, press ENTER. The instrument will then display the following:

V-CAL
Press ENTER to Output +000.00mV

10. Press ENTER. The Model 2400 will source 0mV and at the same time display the following:

DMM RDG: +000.0000mV
Use ▲, ▼, ◄, ►, ENTER, or EXIT.

11. Note and record the DMM reading, and then adjust the Model 2400 display to agree with the actual DMM reading. Note that the display value adjustment limits are within ±1% of the present range.
12. After adjusting the display value to agree with the DMM reading, press ENTER. The unit will then display the following:

V-CAL
Press ENTER to Output -200.00mV
13. Press ENTER. The Model 2400 will source -200mV and display the following:

DMM RDG: -200.0000mV
Use ▲, ▼, ◄, ►, ENTER, or EXIT.

14. Note and record the DMM reading, and then adjust the Model 2400 display to agree with the DMM reading. Again, the maximum display adjustment is within ±10% of the present range.
15. After adjusting the display value to agree with the DMM reading, press ENTER, and note that the instrument displays:

V-CAL
Press ENTER to Output -000.00mV

16. Press ENTER. The Model 2400 will source -0mV and simultaneously display the following:

DMM RDG: -000.0000mV
Use ▲, ▼, ◄, ►, ENTER, or EXIT.

17. Note and record the DMM reading, and then adjust the display to agree with the DMM reading. Once again, the maximum adjustment is within ±1% of the present range.
18. After adjusting the display to agree with the DMM reading, press ENTER to complete calibration of the present range.
19. Press EXIT to return to normal display, and then select the 2V source range. Repeat steps 2 through 18 for the 2V range.
20. After calibrating the 2V range, repeat the entire procedure for the 20V and 200V ranges using Table 2-3 as a guide. Be sure to select the appropriate source range with the EDIT and RANGE keys before calibrating each range.
21. Press EXIT as necessary to return to normal display.
Table 2-3
Front panel voltage calibration

Source range*   Source voltage   Multimeter voltage reading**

Step 3. Current calibration

Perform the following steps for each current range using Table 2-4 as a guide.
1. Connect the Model 2400 to the digital multimeter as shown in Figure 2-2. Select the multimeter DC current measurement function.
2. From normal display, press the SOURCE I key.
3. Press the EDIT key to select the source display field, and then use the down RANGE key to select the 1µA source range.
4. From normal display, press MENU.
5. Select CAL, and then press ENTER. The unit will display the following:

CALIBRATION
UNLOCK EXECUTE VIEW-DATES
SAVE LOCK CHANGE-PASSWORD

6. Select EXECUTE, and then press ENTER. The instrument will display the following message:

I-CAL
Press ENTER to Output +1.0000µA
Figure 2-2
Current calibration connections
(Model 2400 front panel INPUT/OUTPUT jacks connected to the digital multimeter: INPUT/OUTPUT HI to the multimeter Amps input, INPUT/OUTPUT LO to multimeter Input LO.)
7. Press ENTER. The Model 2400 will source +1µA and simultaneously display the following:

DMM RDG: +1.000000µA
Use ▲, ▼, ◄, ►, ENTER, or EXIT.

8. Note and record the DMM reading, and then adjust the Model 2400 display to agree exactly with the actual DMM reading. (Use the up and down arrow keys to select the digit value; use the left and right arrow keys to choose the digit position.) Note that the display adjustment range is within ±10% of the present range.
9. After adjusting the display to agree with the DMM reading, press ENTER. The instrument will then display the following:

I-CAL
Press ENTER to Output +0.0000µA

10. Press ENTER. The Model 2400 will source 0µA and at the same time display the following:

DMM RDG: +0.000000µA
Use ▲, ▼, ◄, ►, ENTER, or EXIT.

11. Note and record the DMM reading, and then adjust the Model 2400 display to agree with the actual DMM reading. Note that the display value adjustment limits are within ±1% of the present range.
12. After adjusting the display value to agree with the DMM reading, press ENTER. The unit will then display the following:

I-CAL
Press ENTER to Output -1.0000µA

13. Press ENTER. The Model 2400 will source -1µA and display the following:

DMM RDG: -1.000000µA
Use ▲, ▼, ◄, ►, ENTER, or EXIT.

14. Note and record the DMM reading, then adjust the Model 2400 display to agree with the DMM reading. Again, the maximum display adjustment is within ±10% of the present range.
15. After adjusting the display value to agree with the DMM reading, press ENTER, and note that the instrument displays:

I-CAL
Press ENTER to Output -0.0000µA

16. Press ENTER. The Model 2400 will source -0µA and simultaneously display the following:

DMM RDG: -0.000000µA
Use ▲, ▼, ◄, ►, ENTER, or EXIT.

17. Note and record the DMM reading, and then adjust the display to agree with the DMM reading. Once again, the maximum adjustment is within ±1% of the present range.
18. After adjusting the display to agree with the DMM reading, press ENTER to complete calibration of the present range.
19. Press EXIT to return to normal display, then select the 10µA source range using the EDIT and up RANGE keys. Repeat steps 2 through 18 for the 10µA range.
20. After calibrating the 10µA range, repeat the entire procedure for the 100µA through 1A ranges using Table 2-4 as a guide. Be sure to select the appropriate source range with the EDIT and up RANGE keys before calibrating each range.

Table 2-4
Front panel current calibration

Source range*   Source current   Multimeter current reading**
Step 4. Enter calibration dates and save calibration
NOTE For temporary calibration without saving new calibration constants, proceed to Step 5: Lock out calibration.

1. From normal display, press MENU.
2. Select CAL, and then press ENTER. The Model 2400 will display the following:

CALIBRATION
UNLOCK EXECUTE VIEW-DATES
SAVE LOCK CHANGE-PASSWORD

3. Select SAVE, and then press ENTER. The instrument will display the following:

SAVE CAL
Press ENTER to continue; EXIT to abort calibration sequence.

4. Press ENTER. The unit will prompt you for the calibration date:

CAL DATE: 12/15/95
Use ▲, ▼, ◄, ►, ENTER, or EXIT.

5. Change the displayed date to today's date, and then press the ENTER key. Press ENTER again to confirm the date.
6. The unit will then prompt for the calibration due date:

NEXT CAL: 12/15/96
Use ▲, ▼, ◄, ►, ENTER, or EXIT.

7. Set the calibration due date to the desired value, and then press ENTER. Press ENTER again to confirm the date.
8. Once the calibration dates are entered, calibration is complete. The following message will be displayed:

CALIBRATION COMPLETE
Press ENTER to confirm; EXIT to abort

9. Press ENTER to save the calibration data (or press EXIT to abort without saving calibration data). The following message will be displayed:

CALIBRATION SUCCESS
Press ENTER or EXIT to continue.

10. Press ENTER or EXIT to complete the process.

Step 5. Lock out calibration

1. From normal display, press MENU.
2. Select CAL, and then press ENTER. The Model 2400 will display the following:

CALIBRATION
UNLOCK EXECUTE VIEW-DATES
SAVE LOCK CHANGE-PASSWORD

3. Select LOCK, and then press ENTER. The instrument will display the following:

CALIBRATION LOCKED.
Press ENTER or EXIT to continue

4. Press ENTER or EXIT to return to normal display.
Remote calibration
Use the following procedure to perform remote calibration by sending SCPI commands over
the IEEE-488 bus or RS-232 port. The remote commands and appropriate parameters are
separately summarized for each step.
Remote calibration commands
Table 2-5 summarizes remote calibration commands, while Table 2-6 and Table 2-7 list
command parameter limits. Note that each sense range requires three parameters: zero,
negative full scale, and positive full scale. Similarly, each source range requires four
parameters: two zero parameters, a positive full-scale parameter, and a negative full-scale
parameter.
For a more complete description of these commands, refer to Appendix B.
Table 2-5
Remote calibration commands

Command                            Description
:CALibration                       Calibration subsystem.
 :PROTected                        Cal commands protected by password.
  :CODE '<password>'               Unlock cal: changes password if cal is already unlocked. (Default password: KI002400.)
  :COUNt?                          Query number of times 2400 has been calibrated.
  :SAVE                            Save calibration data to EEPROM.*
  :LOCK                            Lock calibration, inhibit SAVE command operation.
  :LOCK?                           Request cal lock status.
  :DATE <year>, <month>, <day>     Program calibration year, month, day.
  :DATE?                           Query calibration year, month, day.
  :NDUE <year>, <month>, <day>     Program calibration due year, month, day.
  :NDUE?                           Query calibration due year, month, day.
  :SENSe <NRf>                     Calibrate active measure range. (See Table 2-6 parameters.)
  :SENSe:DATA?                     Query measurement cal constants for active range.
  :SOURce <NRf>                    Calibrate active source range. (See Table 2-7 parameters.)
  :SOURce:DATA?                    Query source cal constants for active range.

*Calibration data will not be saved if:
1. Calibration was not unlocked with :CODE command.
2. Invalid data exists. (For example, cal step failed or was aborted.)
3. Incomplete number of cal steps were performed. (For example, omitting a negative full-scale step.)
Table 2-6
:CALibration:PROTected:SENSe parameter ranges

Sense range   First parameter (zero)   Second parameter (negative full scale)   Third parameter (positive full scale)
0.2V          -0.002 to +0.002         -0.18 to -0.22                           +0.18 to +0.22
2V            -0.02 to +0.02           -1.8 to -2.2                             +1.8 to +2.2
20V           -0.2 to +0.2             -18 to -22                               +18 to +22
200V          -2 to +2                 -180 to -220                             +180 to +220
1µA           -1E-8 to +1E-8           -0.9E-6 to -1.1E-6                       +0.9E-6 to +1.1E-6
10µA          -1E-7 to +1E-7           -9E-6 to -11E-6                          +9E-6 to +11E-6
100µA         -1E-6 to +1E-6           -90E-6 to -110E-6                        +90E-6 to +110E-6
1mA           -1E-5 to +1E-5           -0.9E-3 to -1.1E-3                       +0.9E-3 to +1.1E-3
10mA          -1E-4 to +1E-4           -9E-3 to -11E-3                          +9E-3 to +11E-3
100mA         -1E-3 to +1E-3           -90E-3 to -110E-3                        +90E-3 to +110E-3
1A            -1E-2 to +1E-2           -0.9 to -1.1                             +0.9 to +1.1

NOTE: Parameter steps for each range may be performed in any order, but all three parameter steps for each range must be completed.

Table 2-7
:CALibration:PROTected:SOURce parameter ranges

Source range   First parameter (negative full scale)   Second parameter (negative zero)   Third parameter (positive full scale)   Fourth parameter (positive zero)
0.2V           -0.18 to -0.22                          -0.002 to +0.002                   +0.18 to +0.22                          -0.002 to +0.002
2V             -1.8 to -2.2                            -0.02 to +0.02                     +1.8 to +2.2                            -0.02 to +0.02
20V            -18 to -22                              -0.2 to +0.2                       +18 to +22                              -0.2 to +0.2
200V           -180 to -220                            -2 to +2                           +180 to +220                            -2 to +2
1µA            -0.9E-6 to -1.1E-6                      -1E-8 to +1E-8                     +0.9E-6 to +1.1E-6                      -1E-8 to +1E-8
10µA           -9E-6 to -11E-6                         -1E-7 to +1E-7                     +9E-6 to +11E-6                         -1E-7 to +1E-7
100µA          -90E-6 to -110E-6                       -1E-6 to +1E-6                     +90E-6 to +110E-6                       -1E-6 to +1E-6
1mA            -0.9E-3 to -1.1E-3                      -1E-5 to +1E-5                     +0.9E-3 to +1.1E-3                      -1E-5 to +1E-5
10mA           -9E-3 to -11E-3                         -1E-4 to +1E-4                     +9E-3 to +11E-3                         -1E-4 to +1E-4
100mA          -90E-3 to -110E-3                       -1E-3 to +1E-3                     +90E-3 to +110E-3                       -1E-3 to +1E-3
1A             -0.9 to -1.1                            -1E-2 to +1E-2                     +0.9 to +1.1                            -1E-2 to +1E-2

NOTE: Parameter steps for each range may be performed in any order, but all four parameter steps for each range must be completed.
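Both tables follow a single pattern: zero parameters must fall within ±1% of the range, and full-scale parameters within ±10% of the nominal full-scale value, with the appropriate sign. A sketch of a client-side pre-check along those lines (a hypothetical helper, not an instrument command; the tables remain the authoritative limits):

```python
def param_in_range(range_value, kind, value):
    """Check a calibration parameter against the Table 2-6/2-7 pattern.

    kind is 'zero', 'neg_fs', or 'pos_fs'. Zero parameters must lie
    within ±1% of range; full-scale parameters within ±10% of the
    signed full-scale value.
    """
    if kind == "zero":
        return -0.01 * range_value <= value <= 0.01 * range_value
    if kind == "neg_fs":
        return -1.1 * range_value <= value <= -0.9 * range_value
    if kind == "pos_fs":
        return 0.9 * range_value <= value <= 1.1 * range_value
    raise ValueError(f"unknown parameter kind: {kind}")
```

Rejecting an out-of-limits parameter before sending it avoids a +510-class calibration error and an aborted step.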
Remote calibration procedure
Step 1. Prepare the Model 2400 for calibration
1. Connect the Model 2400 to the controller IEEE-488 interface or RS-232 port using a shielded interface cable.
2. Turn on the Model 2400 and the test equipment, and allow them to warm up for at least one hour before performing calibration.
3. If you are using the IEEE-488 interface, make sure the primary address of the Model 2400 is the same as the address specified in the program you will be using to send commands. (Use the MENU key and the COMMUNICATION menu to access the IEEE-488 address.)
Step 2. Voltage Calibration
1. Connect the Model 2400 to the digital multimeter (see Figure 2-1), and select the multimeter DC volts function.
2. Send the commands summarized in Table 2-8 in the order listed to initialize voltage calibration. (When the :CAL:PROT:CODE command is sent, the instrument will assume the operating states listed in Table 2-2.)
Table 2-8
Voltage calibration initialization commands

Command                      Description
*RST                         Restore GPIB defaults.
:SOUR:FUNC VOLT              Activate voltage source.
:SENS:CURR:PROT 0.1          Current limit when voltage source is active.
:SENS:CURR:RANG 0.1          Make sure 1A range is not active.
:SOUR:VOLT:PROT:LEV MAX      Maximum allowable source voltage.
:SYST:RSEN OFF               Disable remote sensing.*
:CAL:PROT:CODE 'KI002400'    Unlock cal.
:OUTP:STAT ON                Turn source on.

*Remote sensing may be used if desired, but is not essential when using the recommended digital multimeter.

3. Perform the range calibration steps listed in Table 2-9 for each range. For each range:
• Send the :SOUR:VOLT:RANG command to select the source and sense range being calibrated. For example, for the 2V range, the following command would be sent:
:SOUR:VOLT:RANG 2
• Program the source to output the negative full-range value using the :SOUR:VOLT command. For example:
:SOUR:VOLT -2
• Note and record the multimeter reading.
Table 2-9
Voltage range calibration commands

Step   Command/procedure*              Description
1      :SOUR:VOLT:RANGE <Range>        Select source range.
2      :SOUR:VOLT -<Range>             Establish negative polarity.
3      Take DMM reading.               Read actual output value.
4      :CAL:PROT:SOUR <DMM_Reading>    Calibrate source function negative full scale.
5      Check 2400 for errors.
6      :CAL:PROT:SENS <DMM_Reading>    Calibrate sense function negative full scale.
7      Check 2400 for errors.
8      :SOUR:VOLT 0.0                  Set output to 0V.
9      Take DMM reading.               Read actual output value.
10     :CAL:PROT:SOUR <DMM_Reading>    Calibrate source function negative zero.
11     Check 2400 for errors.
12     :CAL:PROT:SENS <DMM_Reading>    Calibrate sense function negative zero.
13     Check 2400 for errors.
14     :SOUR:VOLT +<Range>             Establish positive polarity.
15     Take DMM reading.               Read actual output value.
16     :CAL:PROT:SOUR <DMM_Reading>    Calibrate source function positive full scale.
17     Check 2400 for errors.
18     :CAL:PROT:SENS <DMM_Reading>    Calibrate sense function positive full scale.
19     Check 2400 for errors.
20     :SOUR:VOLT 0.0                  Set output to 0V.
21     Take DMM reading.               Read actual output value.
22     :CAL:PROT:SOUR <DMM_Reading>    Calibrate source positive zero.

*1. Perform complete procedure for each range, where <Range> = 0.2, 2, 20, and 200.
*2. <DMM_Reading> parameter is multimeter reading from previous step.
*3. Use :SYST:ERR? query to check for errors.
• Use the multimeter reading as the parameter for the :CAL:PROT:SOUR and :CAL:PROT:SENS commands. For example, a typical value for the 2V range would be:
:CAL:PROT:SOUR -1.998
:CAL:PROT:SENS -1.998
• Program the voltage source for 0V output using the :SOUR:VOLT 0.0 command.
• Note the multimeter reading.
• Send the source and sense calibration commands using the multimeter reading for the parameter. For example:
:CAL:PROT:SOUR 1E-3
:CAL:PROT:SENS 1E-3
• Set the source to the positive full-range value using the :SOUR:VOLT command. For example:
:SOUR:VOLT 2
• Note and record the multimeter reading.
• Send the source and sense commands using the multimeter reading as the parameter. For example:
:CAL:PROT:SOUR 1.997
:CAL:PROT:SENS 1.997
• Send the :SOUR:VOLT 0.0 command to set the source voltage to 0V.
• Note and record the multimeter reading.
• Send the :CAL:PROT:SOUR command using the multimeter reading as the command parameter. For example:
:CAL:PROT:SOUR -1.02E-3
Step 3. Current Calibration
1. Connect the Model 2400 to the digital multimeter (see Figure 2-2), and select the multimeter DC current function.
2. Send the commands summarized in Table 2-10 in the order listed to initialize current calibration.
Table 2-10
Current calibration initialization commands

Command               Description
:SOUR:FUNC CURR       Select source current mode.
:SENS:VOLT:PROT 20    Voltage limit when current source is active.
:SENS:VOLT:RANG 20    Make sure 200V range is not active.
:OUTP:STAT ON         Turn source on.

3. Calibrate each current range using the procedure summarized in Table 2-11. For each range:
• Send the :SOUR:CURR:RANG command to select the source and sense range being calibrated. For example, for the 1mA range, the command is:
:SOUR:CURR:RANG 1E-3
• Program the source to output the negative full-range value using the :SOUR:CURR command. For example:
:SOUR:CURR -1E-3
• Note and record the multimeter reading.
• Use the multimeter reading as the parameter for the :CAL:PROT:SOUR and :CAL:PROT:SENS commands. For example, a typical value for the 1mA range would be:
:CAL:PROT:SOUR -1.025E-3
:CAL:PROT:SENS -1.025E-3
• Program the current source for 0A output using the :SOUR:CURR 0.0 command.
• Note the multimeter reading.
Table 2-11
Current range calibration commands

Step   Command/procedure*              Description
1      :SOUR:CURR:RANGE <Range>        Select source range.
2      :SOUR:CURR -<Range>             Establish negative polarity.
3      Take DMM reading.               Read actual output value.
4      :CAL:PROT:SOUR <DMM_Reading>    Calibrate source function negative full scale.
5      Check 2400 for errors.
6      :CAL:PROT:SENS <DMM_Reading>    Calibrate sense function negative full scale.
7      Check 2400 for errors.
8      :SOUR:CURR 0.0                  Set output to 0A.
9      Take DMM reading.               Read actual output value.
10     :CAL:PROT:SOUR <DMM_Reading>    Calibrate source function negative zero.
11     Check 2400 for errors.
12     :CAL:PROT:SENS <DMM_Reading>    Calibrate sense function negative zero.
13     Check 2400 for errors.
14     :SOUR:CURR +<Range>             Establish positive polarity.
15     Take DMM reading.               Read actual output value.
16     :CAL:PROT:SOUR <DMM_Reading>    Calibrate source function positive full scale.
17     Check 2400 for errors.
18     :CAL:PROT:SENS <DMM_Reading>    Calibrate sense function positive full scale.
19     Check 2400 for errors.
20     :SOUR:CURR 0.0                  Set output to 0A.
21     Take DMM reading.               Read actual output value.
22     :CAL:PROT:SOUR <DMM_Reading>    Calibrate source positive zero.

*1. Perform complete procedure for each range, where <Range> = 1E-6, 10E-6, 100E-6, 1E-3, 10E-3, 100E-3, or 1.
*2. <DMM_Reading> parameter is multimeter reading from previous step.
*3. Use :SYST:ERR? query to check for errors.
• Send the source and sense calibration commands using the multimeter reading for the parameter. For example:
:CAL:PROT:SOUR 1E-6
:CAL:PROT:SENS 1E-6
• Set the source to the positive full-range value using the :SOUR:CURR command. For example, for the 1mA range:
:SOUR:CURR 1E-3
• Note and record the multimeter reading.
• Send the source and sense commands using the multimeter reading as the parameter. For example:
:CAL:PROT:SOUR 1.03E-3
:CAL:PROT:SENS 1.03E-3
• Send the :SOUR:CURR 0.0 command to set the source current to 0A.
• Note and record the multimeter reading.
• Send the :CAL:PROT:SOUR command using the multimeter reading as the command parameter. For example:
:CAL:PROT:SOUR -1.02E-3
Step 4. Program calibration dates
Use the calibration date commands listed in Table 2-5 to set the calibration date and the calibration due date. Note that the year, month, and day must be separated by commas. The allowable range for the year is from 1995 to 2094, the month is from 1 to 12, and the day is from 1 to 31.
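As a sketch, the documented field limits can be enforced before a date command is sent. The :CAL:PROT:DATE name in the example is assumed from the calibration subsystem naming used elsewhere in this section; confirm the exact syntax against Appendix B:

```python
def cal_date_command(command, year, month, day):
    """Format a calibration-date command, enforcing the documented limits.

    The command name is supplied by the caller (e.g. the hypothetical
    ":CAL:PROT:DATE"); this helper only checks and formats the fields.
    """
    if not 1995 <= year <= 2094:
        raise ValueError("year must be 1995 to 2094")
    if not 1 <= month <= 12:
        raise ValueError("month must be 1 to 12")
    if not 1 <= day <= 31:
        raise ValueError("day must be 1 to 31")
    return f"{command} {year}, {month}, {day}"
```

For example, cal_date_command(":CAL:PROT:DATE", 1995, 12, 15) yields the comma-separated form the instrument expects.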
Step 5. Save calibration constants
Calibration is now complete, so you can store the calibration constants in EEPROM by sending the following command:
:CAL:PROT:SAVE
NOTE Calibration will be temporary unless you send the SAVE command. Also, calibration data will not be saved if (1) calibration is locked, (2) invalid data exists, or (3) all steps were not completed.
Step 6. Lock out calibration
To lock out further calibration, send the following command after completing the calibration
procedure:
:CAL:PROT:LOCK
Single-range calibration
Normally, the complete calibration procedure should be performed to ensure that the entire
instrument is properly calibrated. In some instances, however, you may want to calibrate only
certain ranges. To do so, simply complete the entire procedure only for the range(s) to be
calibrated. Keep in mind, however, that you must complete all parameter steps for each source
or sense range. Also, be sure to set calibration dates and save calibration after calibrating the
desired range(s).
3
Routine Maintenance
3-2 Routine Maintenance
Introduction
The information in this section covers routine maintenance that can be performed by the operator.
Line fuse replacement
WARNING Disconnect the line cord at the rear panel, and remove all test leads connected to the instrument (front and rear) before replacing the line fuse.
The power line fuse is accessible from the rear panel, just above the AC power receptacle (see
Figure 3-1).
Perform the following steps to replace the line fuse:
1. Carefully grasp and squeeze together the locking tabs that secure the fuse carrier to the fuse holder.
2. Pull out the fuse carrier, and replace the fuse with the type specified in Table 3-1.

CAUTION To prevent instrument damage, use only the fuse type specified in Table 3-1.

3. Reinstall the fuse carrier.

NOTE If the power line fuse continues to blow, a circuit malfunction exists and must be corrected. Refer to the troubleshooting section of this manual for additional information.
Table 3-1
Power line fuse
Line voltage   Rating                            Keithley part no.
88-264V        250V, 1A, slow blow, 5 × 20mm     FU-72
Figure 3-1
Rear panel

(The rear panel silkscreen includes: WARNING — no internal operator serviceable parts, service by qualified personnel only; CAUTION — for continued protection against fire hazard, replace fuse with same type and rating; LINE FUSE: slow blow, 2.5A, 250V; LINE RATING: 85-264VAC, 50/60Hz, 70VA max.)
4
Troubleshooting
Introduction
This section of the manual will assist you in troubleshooting and repairing the Model 2400.
Included are self-tests, test procedures, troubleshooting tables, and circuit descriptions. The
repair technician must select the appropriate tests and documentation needed to troubleshoot
the instrument. Note that disassembly instructions are located in Section 5, while component
layout drawings are at the end of Section 6.
WARNING The information in this section is intended for qualified service personnel only. Do not perform these procedures unless you are qualified to do so. Some of these procedures may expose you to hazardous voltages that could cause personal injury or death. Use caution when working with hazardous voltages.
Repair considerations
Before making any repairs to the Model 2400, be sure to read the following considerations.
CAUTION The PC boards are built using surface mount techniques and require specialized equipment and skills for repair. If you are not equipped and/or qualified, it is strongly recommended that you send the unit back to the factory for repairs or limit repairs to the PC board replacement level. Without proper equipment and training, you could damage a PC board beyond repair.
• Repairs will require various degrees of disassembly. However, it is recommended that the Front Panel Tests be performed prior to any disassembly. The disassembly instructions for the Model 2400 are contained in Section 5 of this manual.
• Do not make repairs to surface mount PC boards unless equipped and qualified to do so (see previous CAUTION).
• When working inside the unit and replacing parts, be sure to adhere to the handling precautions and cleaning procedures explained in Section 5.
• Many CMOS devices are installed in the Model 2400. These static-sensitive devices require special handling as explained in Section 5.
• Whenever a circuit board is removed or a component is replaced, the Model 2400 must be recalibrated. See Section 2 for details on calibrating the unit.
Power-on self-test
During the power-on sequence, the Model 2400 performs a checksum test on its ROM and tests its RAM. If the RAM test fails, the instrument will lock up. If the ROM checksum test fails, the firmware upgrade mode is automatically enabled. See Firmware upgrades at the end of this section.
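The manual does not specify the checksum or RAM-test algorithms. The sketch below illustrates the general idea with a simple additive 16-bit checksum and a two-pattern RAM check; both are assumptions for illustration, not the 2400's actual firmware routines.

```python
def rom_checksum_ok(rom: bytes, stored: int) -> bool:
    """Compare a simple 16-bit additive checksum against the stored value."""
    return sum(rom) & 0xFFFF == stored

def ram_ok(ram: bytearray) -> bool:
    """Write and read back complementary bit patterns over the whole array."""
    for pattern in (0x55, 0xAA):
        for i in range(len(ram)):
            ram[i] = pattern
        if any(cell != pattern for cell in ram):
            return False
    return True
```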
Front panel tests
There are three front panel tests: one to test the functionality of the front panel keys and two to test the display. In the event of a test failure, refer to Display board checks for details on troubleshooting the display board.
KEYS test
The KEYS test allows you to check the functionality of each front panel key. Perform the following steps to run the KEYS test:
1. Display the MAIN MENU by pressing the MENU key.
2. Select TEST, and press ENTER to display the SELF-TEST MENU.
3. Select DISPLAY-TESTS, and press ENTER to display the following menu:
FRONT PANEL TESTS
KEYS DISPLAY-PATTERNS CHAR-SET
4. Select KEYS, and press ENTER to start the test. When a key is pressed, the label name for that key will be displayed to indicate that it is functioning properly. When the key is released, the “No keys pressed” message is displayed.
5. Pressing EXIT tests the EXIT key. However, the second consecutive press of EXIT aborts the test and returns the instrument to the SELF-TEST MENU. Continue pressing EXIT to back out of the menu structure.
DISPLAY PATTERNS test
The display test allows you to verify that each pixel and annunciator in the vacuum fluorescent display is working properly. Perform the following steps to run the display test:
1. Display the MAIN MENU by pressing the MENU key.
2. Select TEST, and press ENTER to display the SELF-TEST MENU.
3. Select DISPLAY-TESTS, and press ENTER to display the following menu:
FRONT PANEL TESTS
KEYS DISPLAY-PATTERNS CHAR-SET
4. Select DISPLAY-PATTERNS, and press ENTER to start the display test. There are five parts to the display test. Each time a front panel key (except EXIT) is pressed, the next part of the test sequence is selected. The five parts of the test sequence are:
• Checkerboard pattern (alternate pixels on) and all annunciators.
• Checkerboard pattern and the annunciators that are on during normal operation.
• Horizontal lines (pixels) of the first digit are sequenced.
• Vertical lines (pixels) of the first digit are sequenced.
• Each digit (and adjacent annunciator) is sequenced. All the pixels of the selected digit are on.
5. When finished, abort the display test by pressing EXIT. The instrument returns to the SELF-TEST MENU. Continue pressing EXIT to back out of the menu structure.
CHAR SET test
The character set test lets you display all characters. Perform the following steps to run the character set test:
1. Display the MAIN MENU by pressing the MENU key.
2. Select TEST, and press ENTER to display the SELF-TEST MENU.
3. Select DISPLAY-TESTS, and press ENTER to display the following menu:
FRONT PANEL TESTS
KEYS DISPLAY-PATTERNS CHAR-SET
4. Select CHAR-SET, and press ENTER to start the character set test. Press any key except EXIT to cycle through all displayable characters.
5. When finished, abort the character set test by pressing EXIT. The instrument returns to the SELF-TEST MENU. Continue pressing EXIT to back out of the menu structure.
Principles of operation
The following information is provided to support the troubleshooting tests and procedures covered in this section of the manual. Refer to the following drawings:
Figure 4-1 — Analog circuitry overall block diagram
Figure 4-2 — Power supply block diagram
Figure 4-3 — Output stage simplified schematic
Figure 4-4 — Digital circuitry block diagram
Analog circuits
Figure 4-1 shows the overall block diagram for the Model 2400.
D/A converters control the programmed voltage and current, or voltage compliance and current compliance. Each DAC has two ranges, a 10V output or a 1V output. The DAC outputs are fed to the summing node, FB. Either the V DAC or the I DAC has the ability to control the main loop. If the unit is set for SV (source voltage), it will source voltage until the compliance current is reached (as determined by the I DAC setting), at which point the current loop overrides the voltage loop. If, however, the unit is set for SI (source current), it will source current until the compliance voltage is reached (as determined by the V DAC setting), at which point the voltage loop overrides the current loop. A priority bit in the V clamp/I clamp circuit controls these functions.
The error amplifier adds open-loop gain and slew-rate control to the system to assure accuracy and provide a controllable signal for the output stage, which provides the necessary voltage and current gain to drive the output. Sense resistors in the HI output lead provide output current sensing, and a separate sense resistor is used for each current range. The 1A range uses 0.2V full-scale for a full-range 1A output, while all other ranges use 2V output for full-scale current. Voltage feedback is routed either internally or externally.
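The full-scale sense voltages quoted above imply a nominal sense resistance of V_fs/I_fs for each range. The sketch below derives those nominal values; the resulting resistances are inferred from the text, not taken from the instrument's parts list.

```python
def nominal_sense_resistance(range_amps: float) -> float:
    """Nominal sense resistance implied by the full-scale sense voltage."""
    # The 1A range develops 0.2V at full scale; all other ranges develop 2V.
    v_full_scale = 0.2 if range_amps >= 1.0 else 2.0
    return v_full_scale / range_amps
```

For instance, this gives 0.2Ω on the 1A range, 20Ω on the 100mA range, and 2MΩ on the 1µA range.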
There are four voltage ranges: 0.2V, 2V, 20V, and 200V. The feedback gain changes for only
the 20V and 200V ranges, resulting in three unique feedback gain values. A multiplexer directs
the voltage feedback, current feedback, reference, or ground signal to the A/D converter. An
opto-isolated interface provides control signals for both DACs, analog circuit control, and A/D
converter communication to the digital section.
Figure 4-1
Analog circuit block diagram
Power supply
Figure 4-2 shows a block diagram of the Model 2400 power delivery system.
The offline flyback switching power supply provides all power for the instrument while providing universal inputs for the 110/120V line. The digital board runs directly from the switcher, including the +12VD supply. (See Digital circuitry.)
A constant-frequency switching supply runs off the +12VD supply and generates all the floating supply voltages for the analog board: +5VF, ±15VF, and ±30VF. An AC output (low voltage) supplies the analog board with the power it uses to derive the output stage supply voltages, ±36VO and ±220VO.
Figure 4-2
Power supply block diagram
Output stage
Figure 4-3 shows a simplified schematic of the output stage.
The Model 2400 output stage serves two purposes: (1) it converts signals from floating common to output common, and (2) it provides both voltage and current amplification. The output stage drive transistors are biased in class B configuration to prevent the possibility of thermal runaway with high-current output values. High-current taps for the ±20V outputs are provided to reduce power dissipation on the 20V and lower ranges.
Output transistors Q518 and Q521 are cascoded with output MOSFETs Q516 and Q523. All other MOSFETs and transistors are slaves, and the voltages across these devices are determined by the resistor-capacitor ladder circuits shown. Coarse current limits are built into the output stage.
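The benefit of the high-current taps can be seen from the dissipation across the pass devices, roughly (V_rail − V_out) × I_out while sourcing into a resistive load. The sketch below is illustrative only: the ±36V and ±220V rail values are taken from the block diagram labels, and the tap-selection rule is an assumption.

```python
def pass_device_dissipation(v_out: float, i_out: float, low_range: bool) -> float:
    """Approximate power dissipated in the output stage while sourcing.

    low_range=True models running from the 36V tap (20V range and below);
    otherwise the full 220V rail is assumed.
    """
    v_rail = 36.0 if low_range else 220.0
    return (v_rail - abs(v_out)) * abs(i_out)
```

Sourcing 20V at 1A, the stage dissipates about 16W from the 36V tap versus 200W if it ran from the 220V rail, which is why the taps exist.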
Figure 4-3
Output stage simplified schematic
A/D converter
The SourceMeter unit uses a multi-slope charge balance A/D converter with a single-slope rundown. The converter is controlled by gate array U610. Commands are issued by the MPU on the digital board through communications opto-isolators to U610, and U610 sends A/D reading data back through opto-isolators to the digital board for calibration and processing.
Active guard
The Model 2400 has an active guard or “six-wire ohms” circuit used to measure complex devices. This circuitry provides a low-current (50mA) equivalent of the voltage on output HI. If the unit is in the SV mode, the low-current equivalent of the source voltage will appear on the guard terminal. If the unit is in the SI mode, the voltage on output HI is equal to the source current multiplied by the external resistance value. An equivalent voltage will be generated by the guard circuit, and a guard sense terminal is provided to sense around the voltage drop in the guard leads since significant current can flow (50mA).
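In SI mode, the guard potential described above is simply the programmed current times the external resistance. A one-line sketch of that arithmetic, with hypothetical example values:

```python
def guard_voltage_si_mode(source_current: float, external_resistance: float) -> float:
    """Voltage on output HI (and hence the guard) in source-current mode."""
    return source_current * external_resistance
```

For example, sourcing 1mA into a 10kΩ device places about 10V on output HI and the guard terminal.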
Digital circuitry
Refer to Figure 4-4 for the following discussion on digital circuitry.
The core digital circuitry uses a Motorola 68332 microcontroller running at 16.78MHz. The memory configuration includes two 256K × 8-bit EEPROMs and two 128K × 8-bit RAMs used in parallel to utilize the 16-bit data bus of the MPU. The RAM is battery backed-up, providing continued storage of data buffer information during power-down cycles. All calibration constants and system setups are stored in a separate serial EEPROM.
External communication is provided via GPIB and serial interfaces. A 9914 GPIA IEEE-488 standard interface IC is used for the GPIB, and the 68332 Queued Serial Module (QSM) provides the serial UART. For internal communications, the Time Processing Unit (TPU) is used for serial communications with the front panel display module, and both the TPU and QSM handle digital-to-analog interfacing.
Display board
Display board components are shown in the digital circuitry block diagram in Figure 4-4.
U902 is the display microcontroller that controls the VFD (vacuum fluorescent display) and interprets key data. The microcontroller has four peripheral I/O ports that are used for the various control and read functions.
Display data is serially transmitted to the microcontroller from the digital board via the TXB line to the microcontroller PD0 terminal. In a similar manner, key data is serially sent back to the digital board through the RXB line via PD1. The 4MHz clock for the microcontroller is generated on the digital board.
Figure 4-4
Digital board block diagram
DS901 is the VFD (vacuum fluorescent display) module, which can display up to 49 characters. Each character is organized as a 5 × 7 matrix of dots or pixels and includes a long under-bar segment to act as a cursor.
The display uses a common multiplexing scheme with each character refreshed in sequence.
U903 and U904 are the grid drivers, and U901 and U905 are the dot drivers. Note that dot
driver and grid driver data is serially transmitted from the microcontroller (PD3 and PC1).
The VFD requires both +60VDC and 5VAC for the filaments. These VFD voltages are supplied
by U625, which is located on the digital board.
The front panel keys (S901-S931) are organized into a row-column matrix to minimize the
number of microcontroller peripheral lines required to read the keyboard. A key is read by
strobing the columns and reading all rows for each strobed column. Key down data is interpreted by the display microcontroller and sent back to the main microprocessor using proprietary encoding schemes.
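The row-column scan described above can be sketched as follows. The 4 × 8 matrix size and the scan order are assumptions for illustration (the manual states only that the 31 keys, S901-S931, form a matrix), and key_down stands in for the hardware row read while one column is strobed.

```python
ROWS, COLS = 4, 8  # assumed matrix dimensions for the 31 keys

def scan_keypad(key_down):
    """Strobe each column in turn and read every row line.

    key_down(row, col) models the hardware row read while column `col`
    is strobed; returns the (row, col) positions of all pressed keys.
    """
    pressed = []
    for col in range(COLS):
        for row in range(ROWS):
            if key_down(row, col):
                pressed.append((row, col))
    return pressed
```

This is why a 31-key panel needs only 4 + 8 = 12 microcontroller lines instead of 31.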
Troubleshooting
Troubleshooting information for the various circuits is summarized in the following
paragraphs.
Display board checks
If the front panel display tests indicate that there is a problem on the display board, use
Table 4-1. See “Principles of operation” for display circuit theory.
Table 4-1
Display board checks

Step  Item/component    Required condition                                  Remarks
1     Front panel test  Verify that all segments operate.                   Use front panel display test.
2     P1005, pin 5      +5V ±5%                                             Digital +5V supply.
3     P1005, pin 9      +37V ±5%                                            Display +37V supply.
4     U902, pin 1       Goes low briefly on power up, then goes high.       Microcontroller RESET.
5     U902, pin 43      4MHz square wave.                                   Controller 4MHz clock.
6     U902, pin 32      Pulse train every 1 ms.                             Control from main processor.
7     U902, pin 33      Brief pulse train when front panel key is pressed.  Key down data sent to main processor.

Power supply checks
Power supply problems can be checked using Table 4-2. See “Principles of operation” for circuit theory on the power supply. Note that the power supply circuits are located on the digital board.

Table 4-2
Power supply checks

Step  Item/component  Required condition                       Remarks
1     Line fuse       Check continuity.                        Remove to check.
2     Line power      Plugged into live receptacle, power on.  Check for correct power-up sequence.
3     TP5             +5V ±5%                                  +5VF, referenced to Common F3 (note 1)
4     TP6             +15V ±5%                                 +15VF, referenced to Common F2 (note 2)
5     TP7             -15V ±5%                                 -15VF, referenced to Common F2.
6     TP8             ~-35V                                    -30VF, referenced to Common F2.
7     TP9             ~+35V                                    +30VF, referenced to Common F2.

Notes:
1. U18, pin 2.
2. U8, pin 1.
Digital circuitry checks
Digital circuit problems can be checked using Table 4-3. See “Principles of operation” for a digital circuit description.

Table 4-3
Digital circuitry checks

Step  Item/component    Required condition                    Remarks
1     Power-on test     RAM OK, ROM OK.                       Verify that RAM and ROM are functional.
2     U3, pin 19        Digital common.                       All signals referenced to digital common.
3     U3, pin 7         +5V                                   Digital logic supply.
4     U3, pin 68        Low on power-up, then goes high.      MPU RESET line.
5     U3, lines A0-A19  Check for stuck bits.                 MPU address bus.
6     U3, lines D0-D15  Check for stuck bits.                 MPU data bus.
7     U3, pin 66        16.78MHz.                             MPU clock.
8     U4, pin 7         Pulse train during RS-232 I/O.        RS-232 RX line.
9     U4, pin 8         Pulse train during RS-232 I/O.        RS-232 TX line.
10    U13, pins 34-42   Pulse train during IEEE-488 I/O.      IEEE-488 data bus.
11    U13, pins 26-31   Pulses during IEEE-488 I/O.           IEEE-488 command lines.
12    U13, pin 24       Low with remote enabled.              IEEE-488 REN line.
13    U13, pin 25       Low during interface clear.           IEEE-488 IFC line.
14    U3, pin 43        Pulse train.                          D_ADDATA
15    U3, pin 44        Pulse train.                          D_DATA
16    U3, pin 45        Pulse train.                          D_CLK
17    U3, pin 47        Pulse train.                          D_STB

Analog circuitry checks
Table 4-4 summarizes analog circuitry checks.

Table 4-4
Analog circuitry checks
>200V voltage protection
SOURCE +10V
SOURCE + 10V (SVMI)
SOURCE +10V
SOURCE +10V
OUTPUT COM
OUTPUT COM
SVMI, OUTPUT ON, 20V, on 20V RANGE
Bench defaults
-13 ±1V
-5V ±.5V
-10V ±1V
-10.5 ±1V
0V ±.1V
7V ±.7V
7V ±.7V
20V ±.5V
6.4V ±6V
Battery replacement
WARNING Disconnect the instrument from the power line and all other equipment before changing the battery.
The volatile memories of the Model 2400 are protected by a replaceable battery when power is off. Typical battery life is approximately ten years. The battery should be suspected if the instrument no longer retains buffer data or user-defined operating parameters, such as instrument setups, source memory, and math expressions. If the battery is absent or totally exhausted, the display will show the “Reading buffer data lost” message shortly after the Model 2400 is switched on.
The battery is a 3V wafer-type lithium cell, Duracell type DL2450 or equivalent (Keithley part
number BA-44), which is located on the digital board. Replacement of the battery requires
removal of the case cover, analog shield, and analog board assembly. (See Section 5.)
WARNING There is a danger of explosion if the battery is incorrectly replaced. Replace only with the same or equivalent type recommended by the manufacturer. Dispose of used batteries according to the manufacturer’s instructions.

WARNING The precautions below must be followed to avoid personal injury.
• Wear safety glasses or goggles when working with lithium batteries.
• Do not short the battery terminals together.
• Keep lithium batteries away from all liquids.
• Do not attempt to recharge lithium batteries.
• Observe proper polarity when inserting the battery in its holder.
• Do not incinerate or otherwise expose the battery to excessive heat (>60°C).
• Bulk quantities of lithium batteries should be disposed of as hazardous waste.
To replace the battery, first locate its holder. Use a small, non-metallic tool to lift the battery so that it can be slid out from under the retainer spring clip.
The new battery should be installed with the “+” terminal facing up. Lift up on the retaining clip, place the edge of the battery under the clip, and slide the battery fully into the holder.
Re-assemble the instrument and turn it on. The “Reading buffer data lost” error message will be displayed. Send the :syst:mem:init command to perform the following:
• Clear the reading buffer.
• Initialize instrument setups 1-4 to the present instrument settings.
• Initialize all 100 source memory locations to the present instrument settings.
• Delete user math expressions.
No comm link error
A “No Comm Link” error indicates that the front panel processor has ceased communication with the main processor, which is located on the digital board. This error indicates that one of the main processor ROMs may require re-seating in its socket. ROMs may be re-seated as follows:
1. Turn off the power, and disconnect the line cord and all other test leads and cables from the instrument.
2. Remove the case cover as outlined in Section 5.
3. Remove the analog shield and analog board assembly as outlined in Section 5.
4. Locate the two firmware ROMs, U15 and U16, on the digital board. These are the only ICs installed in sockets. (Refer to the component layout drawing at the end of Section 6 for exact locations.)
5. Carefully push down on each ROM IC to make sure it is properly seated in its socket.
CAUTION Be careful not to push down excessively, or you might crack the digital board.

6. Connect the line cord and turn on the power. If the problem persists, additional troubleshooting will be required.
5
Disassembly
Introduction
This section explains how to handle, clean, and disassemble the Model 2400. Disassembly
drawings are located at the end of this section.
Handling and cleaning
To avoid contaminating PC board traces with body oil or other foreign matter, avoid touching the PC board traces while you are repairing the instrument. Analog circuits have high-impedance devices or sensitive circuitry where contamination could cause degraded performance.
Handling PC boards
Observe the following precautions when handling PC boards:
• Wear cotton gloves.
• Only handle PC boards by the edges and shields.
• Do not touch any board traces or components not associated with repair.
• Do not touch areas adjacent to electrical contacts.
• Use dry nitrogen gas to clean dust off PC boards.
Solder repairs
Observe the following precautions when you must solder a circuit board:
• Use an OA-based (organic activated) flux, and take care not to spread the flux to other areas of the circuit board.
• Remove the flux from the work area when you have finished the repair by using pure water with clean, foam-tipped swabs or a clean, soft brush.
• Once you have removed the flux, swab only the repair area with methanol, then blow-dry the board with dry nitrogen gas.
• After cleaning, allow the board to dry in a 50°C, low-humidity environment for several hours.
Static sensitive devices
CMOS devices operate at very high impedance levels. Therefore, any static that builds up on you or your clothing may be sufficient to destroy these devices if they are not handled properly. Use the following precautions to avoid damaging them:

CAUTION Many CMOS devices are installed in the Model 2400. Handle all semiconductor devices as being static sensitive.
• Transport and handle ICs only in containers specially designed to prevent static build-up. Typically, you will receive these parts in anti-static containers made of plastic or foam. Keep these devices in their original containers until ready for installation.
• Remove the devices from their protective containers only at a properly grounded work station. Ground yourself with a suitable wrist strap.
• Handle the devices only by the body; do not touch the pins.
• Ground any printed circuit board into which a semiconductor device is to be inserted to the bench or table.
• Use only anti-static type desoldering tools.
• Use only grounded-tip solder irons.
• Once the device is installed in the PC board, it is normally adequately protected, and you can handle the boards normally.
Assembly drawings
Use the assembly drawings located at the end of this section to assist you as you disassemble
and reassemble the Model 2400. Also, refer to these drawings for information about the Keithley part numbers of most mechanical parts in the unit.
Follow the steps below to remove the case cover to gain access to internal parts.

WARNING Before removing the case cover, disconnect the line cord and any test leads from the instrument.

1. Remove Handle — The handle serves as an adjustable tilt-bail. Adjust its position by gently pulling it away from the sides of the instrument case and swinging it up or down. To remove the handle, swing the handle below the bottom surface of the case and back until the orientation arrows on the handles line up with the orientation arrows on the mounting ears. With the arrows lined up, pull the ends of the handle away from the case.
2. Remove Mounting Ears — Remove the screw that secures each mounting ear. Pull down and out on each mounting ear.

NOTE When reinstalling the mounting ears, be sure to mount the right ear to the right side of the chassis and the left ear to the left side of the chassis. Each ear is marked “RIGHT” or “LEFT” on its inside surface.

3. Remove Rear Bezel — To remove the rear bezel, loosen the two screws that secure the rear bezel to the chassis, and then pull the bezel away from the case.
4. Remove Grounding Screws — Remove the two grounding screws that secure the case to the chassis. They are located on the bottom of the case at the back.
5. Remove Chassis — To remove the case, grasp the front bezel of the instrument, and carefully slide the chassis forward. Slide the chassis out of the metal case.
NOTE To gain access to the components under the analog board shield, remove the shield, which is secured to the analog board by a single screw.
Analog board removal
Perform the following steps to remove the analog board. This procedure assumes that the case cover is already removed.
1. Remove analog board shield.
Remove the screw that secures the shield to the analog board, then remove the shield.
2. Disconnect the front and rear input terminals.
You must disconnect these input terminal connections for both the front and rear inputs:
• INPUT/OUTPUT HI and LO
• 4-WIRE SENSE HI and LO
• V, Ω, GUARD, and GUARD SENSE (rear panel only)
Remove all the connections by pulling the wires off the pin connectors. During reassembly, use the following table to identify input terminals:

Terminal           Front wire color   Rear wire color
INPUT/OUTPUT HI    Red                White/Red
INPUT/OUTPUT LO    Black              White/Black
4-WIRE SENSE HI    Yellow             White/Yellow
4-WIRE SENSE LO    Gray               White/Gray
V, Ω, GUARD        —                  White
GUARD SENSE        —                  Blue/White

3. Unplug cables.
• Carefully unplug the ribbon cables at J1001, J1002, and J1003.
• Unplug the ON/OFF cable at J1034.
4. Remove screws.
• Remove the two fastening screws that secure the analog board assembly to the chassis. These screws are located on the side of the board opposite from the heat sink.
• Remove the two screws that secure the heat sink to the chassis.
5. Remove analog board assembly.
• After all screws have been removed, carefully lift the analog board assembly free of the main chassis.
6. Disassemble analog board assembly.
• Remove the screws that secure the analog board and heat sink to the analog board subchassis.
• Carefully remove the heat sink by sliding the clips off the power transistors.

CAUTION Be careful not to damage the heat sink insulation layer.

• Remove the analog board from the subchassis.
• Remove the four screws that secure the bottom cover, and then remove the cover from the bottom of the PC board.

NOTE When reinstalling the heat sink, make sure that all clips are properly installed and centered on each pair of output transistors.
Digital board removal
Perform the following steps to remove the digital board. This procedure assumes that the analog board assembly is already removed.
1. Remove the IEEE-488, Digital I/O, and RS-232 fasteners.
The IEEE-488, Digital I/O, and RS-232 connectors each have two nuts that secure the connectors to the rear panel. Remove these nuts.
2. Remove the POWER switch rod.
At the switch, place the edge of a flat-blade screwdriver in the notch on the pushrod. Gently twist the screwdriver while pulling the rod from the shaft.
3. Unplug cables:
• Unplug the display board ribbon cable.
• Unplug the cables going to the power supply.
• Unplug the rear panel power module cable.
4. Remove digital board.
Slide the digital board forward until it is free of the guide pins, then remove the board. During reassembly, replace the board, and start the IEEE-488, Digital I/O, and RS-232 connector nuts and the mounting screw. Tighten all the fasteners once they are all in place and the board is correctly aligned.
Front panel disassembly
Use the following steps to remove the display board and/or the pushbutton switch pad.
1. Unplug the display board ribbon cable.
2. Remove the front panel assembly.
This assembly has four retaining clips that snap onto the chassis over four pem nut studs. Two retaining clips are located on each side of the front panel. Pull the retaining clips outward and, at the same time, pull the front panel assembly forward until it separates from the chassis.
3. Using a thin-bladed screwdriver, pry the plastic PC board stop (located at the bottom of the display board) until the bar separates from the casing. Pull the display board from the front panel.
4. Remove the switch pad by pulling it from the front panel.
Removing power components
The following procedures for removing the power supply and/or power module require that the case cover and analog board be removed, as previously explained.
Power supply removal
Perform the following steps to remove the power supply:
1. Remove the analog board.
2. Unplug the two cables coming from the digital board.
3. Remove the four screws that secure the power supply to the bottom of the chassis.
4. Remove the power supply from the chassis.
Power module removal
Perform the following steps to remove the rear panel power module:
1. Remove the analog board.
2. Unplug the cable connecting the power module to the digital board.
3. Disconnect the power module's ground wire. This green and yellow wire connects to a threaded stud on the chassis with a kep nut.
4. Squeeze the latches on either side of the power module while pushing the module from the access hole.

WARNING To avoid electrical shock, which could result in injury or death, the ground wire of the power module must be connected to chassis ground. When installing the power module, be sure to reconnect the green and yellow ground wire to the threaded stud on the chassis.
Instrument reassembly
Reassemble the instrument by reversing the previous disassembly procedures. Make sure that
all parts are properly seated and secured and that all connections are properly made. To ensure
proper operation, replace and securely fasten the shield.
WARNING To ensure continued protection against electrical shock, verify that power line ground (the green and yellow wire attached to the power module) is connected to the chassis. Also make certain that the two bottom case screws are properly installed to secure and ground the case cover to the chassis.
6
Replaceable Parts
Introduction
This section contains replacement parts information and component layout drawings for the
Model 2400.
Parts lists
The electrical parts lists for the Model 2400 are shown in the tables at the end of this section. For part numbers of the various mechanical parts and assemblies, use the Miscellaneous parts list and the assembly drawings provided at the end of Section 5.
Ordering information
To place an order or to obtain information concerning replacement parts, contact your Keithley representative or the factory (see inside front cover for addresses). When ordering parts, be sure to include the following information:
• Instrument model number (Model 2400)
• Instrument serial number
• Part description
• Component designation (if applicable)
• Keithley part number
Factory service
If the instrument is to be returned to Keithley Instruments for repair, perform the following:
•Call the Repair Department at 1-800-552-1115 for a Return Material Authorization
(RMA) number.
•Complete the service form at the back of this manual, and include it with the
instrument.
•Carefully pack the instrument in the original packing carton.
•Write ATTENTION REPAIR DEPARTMENT and the RMA number on the shipping
label.
Component layouts
The component layouts for the various circuit boards are provided on the following pages.
Circuit desig.  Description                        Keithley part no.
R901            RES NET, 15K, 2%, 1.875W           TF-219-15K
R902            RES, 13K, 5%, 125mW, METAL FILM    R-375-13K
R903, 904       RES, 4.7K, 5%, 250mW, METAL FILM   R-376-4.7K
R905            RES, 1M, 5%, 125mW, METAL FILM     R-375-1M
R906            RES, 1K, 5%, 250mW, METAL FILM     R-376-1K
R907            RES, 240, 5%, 250mW, METAL FILM    R-376-240
R908            RES, 10M, 5%, 250mW, METAL FILM    R-375-10M
TEMPERATURE COEFFICIENT (0°–18°C & 28°–50°C): ±(0.15 × accuracy specification)/°C.
MAX. OUTPUT POWER: 22W (four-quadrant source or sink operation).
SOURCE/SINK LIMITS: ±21V @ ±1.05A, ±210V @ ±105mA.
VOLTAGE REGULATION: Line: 0.01% of range. Load: 0.01% of range + 100µV.
NOISE 10Hz–1MHz (p-p): 10mV, typical. Resistive load.
OVER VOLTAGE PROTECTION: User-selectable values, 5% tolerance. Factory default = 40 volts.
CURRENT LIMIT: Bipolar current limit (compliance) set with single value. Min. 0.1% of range.
OVERSHOOT: <0.1% typical (full-scale step, resistive load, 10mA range).
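The source/sink limits above define two overlapping operating envelopes (21V/1.05A and 210V/105mA). As an illustrative sketch only (the helper name is ours, not a Keithley API), a requested operating point can be validated like this:

```python
# Illustrative helper (not a Keithley API): check a requested source level
# and current limit against the two Model 2400 source/sink envelopes,
# +/-21V @ +/-1.05A and +/-210V @ +/-105mA.
def within_source_envelope(volts: float, amps: float) -> bool:
    """Return True if |volts|, |amps| fit either quadrant envelope."""
    v, i = abs(volts), abs(amps)
    low_v_range = v <= 21.0 and i <= 1.05      # 21V / 1.05A envelope
    high_v_range = v <= 210.0 and i <= 0.105   # 210V / 105mA envelope
    return low_v_range or high_v_range

print(within_source_envelope(20.0, 1.0))    # True
print(within_source_envelope(100.0, 0.5))   # False: 0.5A not allowed above 21V
```

Because the envelopes overlap, a point only needs to satisfy one of them; the instrument itself selects the range.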
Current Programming Accuracy (local or remote sense)

               Programming    Accuracy (1 Year) [3]         Noise (peak-peak)
Range          Resolution     23°C ±5°C, ±(% rdg. + amps)   0.1Hz – 10Hz
1.00000 µA     50 pA          0.035% + 600 pA               5 pA
10.0000 µA     500 pA         0.033% + 2 nA                 50 pA
100.000 µA     5 nA           0.031% + 20 nA                500 pA
1.00000 mA     50 nA          0.034% + 200 nA               5 nA
10.0000 mA     500 nA         0.045% + 2 µA
100.000 mA     5 µA           0.066% + 20 µA
1.00000 A [2]  50 µA          0.27% + 900 µA
TEMPERATURE COEFFICIENT (0°–18°C & 28°–50°C): ±(0.15 × accuracy specification)/°C.
MAX. OUTPUT POWER: 22W (four-quadrant source or sink operation).
SOURCE/SINK LIMITS: ±105mA @ 210V, ±1.05A @ 21V.
CURRENT REGULATION: Line: 0.01% of range. Load: 0.01% of range + 100pA.
VOLTAGE LIMIT: Bipolar voltage limit (compliance) set with single value. Min. 0.1% of range.
OVERSHOOT: <0.1% typical (1mA step, RL = 10kΩ, 20V range).
Notes:
1. Specifications valid for continuous output currents below 105mA. For operation above 105mA continuous for >1 minute, derate accuracy 10%/35mA above 105mA.
2. Full operation (1A) regardless of load to 30°C. Above 30°C ambient, derate 35mA/°C and prorate 35mA/Ω load. 4-wire mode. For current sinking, up to 10W (external power) at 23°C. Above 23°C, derate 1W/°C.
3. For sink mode, 1µA to 100mA range, accuracy is ±(0.15% + offset×4). For 1A range, accuracy is ±(1.5% + offset×8).
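The accuracy rows above combine a gain term (% of reading) with a fixed offset, and the temperature coefficient adds ±(0.15 × accuracy spec)/°C outside the 18–28°C band. A hedged sketch of the arithmetic (the function name, and the linear per-degree combination, are our reading of the spec, not a Keithley formula):

```python
# Sketch of worst-case source programming uncertainty. Assumes the stated
# temperature coefficient, +/-(0.15 x accuracy spec)/degC outside 18-28 degC,
# adds linearly per degree to the base specification.
def source_accuracy(level, gain_pct, offset, ambient_c=23.0):
    """Worst-case uncertainty in the same units as `level`."""
    base = (gain_pct / 100.0) * abs(level) + offset
    if ambient_c < 18.0:
        excess = 18.0 - ambient_c     # degrees below the specified band
    elif ambient_c > 28.0:
        excess = ambient_c - 28.0     # degrees above the specified band
    else:
        excess = 0.0
    return base * (1.0 + 0.15 * excess)

# 10mA range row (0.045% + 2uA), sourcing 10mA at 23 degC: ~6.5 uA
print(source_accuracy(10e-3, 0.045, 2e-6))
# Same point at 40 degC (12 degrees above the band): 2.8x the base spec
print(source_accuracy(10e-3, 0.045, 2e-6, ambient_c=40.0))
```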
ADDITIONAL SOURCE SPECIFICATIONS
TRANSIENT RESPONSE TIME: 30µs typical for the output to recover to its spec. following a step change in load. Resistive load.
COMMAND PROCESSING TIME: Maximum time required for the output to begin to change following the receipt of
:SOURce:VOLTage|CURRent <nrf> command.
Autorange On: 10ms. Autorange Off: 7ms.
OUTPUT SETTLING TIME: Time required to reach 0.1% of final
value after command is processed. 100µs typical. Resistive load.
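The command-processing and settling figures above apply after the instrument receives a `:SOURce:VOLTage|CURRent <nrf>` message. As a small illustration (string formatting only; the helper name and any instrument I/O are ours, not from the manual):

```python
# String formatting only; no instrument I/O is performed here.
def source_level_command(function: str, level: float) -> str:
    """Build a :SOURce:VOLTage|CURRent <nrf> SCPI message."""
    if function not in ("VOLTage", "CURRent"):
        raise ValueError("function must be 'VOLTage' or 'CURRent'")
    # %g-style formatting yields a compact <nrf> numeric field
    return f":SOURce:{function} {level:g}"

print(source_level_command("VOLTage", 10.0))   # :SOURce:VOLTage 10
print(source_level_command("CURRent", 1e-3))   # :SOURce:CURRent 0.001
```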
Specifications A-3
2400 SPECIFICATIONS (cont.)
OUTPUT SLEW RATE: 0.5V/µs, 200V range, 100mA compliance. 0.08V/µs, 2V and 20V ranges, 100mA compliance.
DC FLOATING VOLTAGE: Output can be floated up to ±250VDC from chassis ground.
REMOTE SENSE: Up to 1V drop per load lead.
COMPLIANCE ACCURACY: Add 0.1% of range to base specification.
OVER TEMPERATURE PROTECTION: Internally sensed temperature overload puts unit in standby mode.
RANGE CHANGE OVERSHOOT: Overshoot into a fully resistive 100kΩ load, 10Hz to 1MHz BW, adjacent ranges, Smooth Mode:
(100mV) typical, except 20V/200V range boundary.
MINIMUM COMPLIANCE VALUE: 0.1% of range.
MEASURE SPECIFICATIONS [1, 2]

Voltage Measurement Accuracy (remote sense)

             Max. Input     Accuracy (23°C ±5°C),
Range        Resolution     Resistance    1 Year, ±(% rdg + volts)
200.000 mV   1 µV           >10GΩ         0.012% + 300 µV
2.00000 V    10 µV          >10GΩ         0.012% + 300 µV
20.0000 V    100 µV         >10GΩ         0.015% + 1.5 mV
200.000 V    1 mV           >10GΩ         0.015% + 10 mV
TEMPERATURE COEFFICIENT (0°–18°C & 28°–50°C): ±(0.15 × accuracy specification)/°C.
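Each measurement-table row combines a gain term and a fixed offset; a minimal sketch of the resulting reading uncertainty (the helper name is ours):

```python
# Illustrative only: a measure-table row gives
# uncertainty = (% of reading) + fixed offset.
def measure_uncertainty(reading, gain_pct, offset):
    """Worst-case reading uncertainty, in the units of `reading`."""
    return (gain_pct / 100.0) * abs(reading) + offset

# 10V measured on the 20V range (0.015% + 1.5mV): ~3 mV total
print(measure_uncertainty(10.0, 0.015, 1.5e-3))
```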
Current Measurement Accuracy (local or remote sense)

             Max. Voltage   Accuracy (23°C ±5°C),
Range        Resolution*    Burden [3]    1 Year, ±(% rdg + amps)
1.00000 µA   10 pA          <1 mV         0.029% + 300 pA
10.0000 µA   100 pA         <1 mV         0.027% + 700 pA
100.000 µA   1 nA           <1 mV         0.025% + 6 nA
1.00000 mA   10 nA          <1 mV         0.027% + 60 nA
10.0000 mA   100 nA         <1 mV         0.035% + 600 nA
100.000 mA   1 µA           <1 mV         0.055% + 6 µA
1.00000 A    10 µA          <1 mV         0.22% + 570 µA
TEMPERATURE COEFFICIENT (0°–18°C & 28°–50°C): ±(0.1 × accuracy specification)/°C.
TEMPERATURE COEFFICIENT (0°–18°C & 28°–50°C): ±(0.15 × accuracy specification)/°C.
SOURCE I MODE, MANUAL OHMS: Total uncertainty = I source accuracy + V measure accuracy (4-wire remote sense).
SOURCE V MODE: Total uncertainty = V source accuracy + I measure accuracy (4-wire remote sense).
6-WIRE OHMS MODE: Available using active ohms guard and guard sense. Max. Guard Output Current: 50 mA (except 1A range). Accuracy is load dependent. Refer to manual for calculation formula.
GUARD OUTPUT IMPEDANCE: 0.1Ω in ohms mode.
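For source-I/measure-V ohms, total uncertainty is stated as I-source accuracy plus V-measure accuracy. One first-order way to read that (our interpretation, with a hypothetical helper name) is to add the two fractional uncertainties of R = V/I:

```python
# First-order sketch (our interpretation, not Keithley's formula): express
# both accuracy terms as fractions of the computed resistance R = V/I and
# add them to get a worst-case percentage.
def ohms_uncertainty_pct(v_meas, i_src,
                         v_gain_pct, v_offset,
                         i_gain_pct, i_offset):
    """Approximate worst-case ohms uncertainty in percent."""
    dv = (v_gain_pct / 100.0) * abs(v_meas) + v_offset
    di = (i_gain_pct / 100.0) * abs(i_src) + i_offset
    return 100.0 * (dv / abs(v_meas) + di / abs(i_src))

# Source 1mA (0.034% + 200nA), measure 10V (0.015% + 1.5mV): ~0.084%
print(ohms_uncertainty_pct(10.0, 1e-3, 0.015, 1.5e-3, 0.034, 200e-9))
```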
Notes:
1. Speed = Normal (1 PLC).
2. Accuracies apply to 2- or 4-wire mode when properly zeroed.
3. 4-wire mode.
4. Manual ohms mode only.
5. Source readback enabled, offset compensation ON.
2400 SPECIFICATIONS (cont.)
SYSTEM SPEEDS
MEASUREMENT [1]
MAXIMUM RANGE CHANGE RATE: 75/second.
MAXIMUM MEASURE AUTORANGE TIME: 40ms (fixed source) [2].
SWEEP OPERATION [3] READING RATES (rdg/second) FOR 60Hz (50Hz):
Speed | NPLC / Trigger Origin | Measure (To Mem. / To GPIB) | Source-Measure (To Mem. / To GPIB) | Source-Measure Pass/Fail Test [4] (To Mem. / To GPIB) | Source-Memory [4] (To Mem. / To GPIB)

Speed    NPLC / Trigger Origin   Measure Pass/Fail Test   Source Pass/Fail Test   Source-Measure Pass/Fail Test [7]
Fast     0.01 / external         1.04 ms (1.08 ms)        0.5 ms (0.5 ms)         4.82 ms (5.3 ms)
Medium   0.10 / external         2.55 ms (2.9 ms)         0.5 ms (0.5 ms)         6.27 ms (7.1 ms)
Normal   1.00 / external         17.53 ms (20.9 ms)       0.5 ms (0.5 ms)         21.31 ms (25.0 ms)
Notes:
1. Reading rates applicable for voltage or current measurements. Auto zero off, autorange off, filter off, display off, trigger delay = 0, source auto clear off, and binary reading format.
2. Purely resistive load. 1µA and 10µA ranges <65ms.
3. 1000-point sweep was characterized with the source on a fixed range.
4. Pass/Fail test performed using one high limit and one low math limit.
5. Includes time to re-program source to a new level before making measurement.
6. Time from falling edge of START OF TEST signal to falling edge of END OF TEST signal.
7. Command processing time of :SOURce:VOLTage|CURRent:TRIGgered <nrf> command not included.
2400 SPECIFICATIONS (cont.)
GENERAL
NOISE REJECTION:
Speed    NPLC   NMRR    CMRR
Fast     0.01   —       80 dB
Medium   0.1    —       80 dB
Normal   1      60 dB   120 dB [1]
LOAD IMPEDANCE: Stable into 20,000pF typical.
COMMON MODE VOLTAGE: 250VDC.
COMMON MODE ISOLATION: >10⁹Ω, <1000pF.
OVERRANGE: 105% of range, source and measure.
MAX. VOLTAGE DROP BETWEEN INPUT/OUTPUT AND SENSE TERMINALS: 5 volts.
MAX. SENSE LEAD RESISTANCE: 1MΩ for rated accuracy.
SENSE INPUT IMPEDANCE: >10¹⁰Ω.
GUARD OFFSET VOLTAGE: 300µV, typical.
SOURCE OUTPUT MODES:
Fixed DC level
Memory List (mixed function)
Stair (linear and log)
SOURCE MEMORY LIST: 100 points max.
MEMORY BUFFER: 5,000 readings @ 5½ digits (two 2,500-point buffers). Includes selected measured value(s) and time stamp.
Lithium battery backup (3 yr+ battery life).
PROGRAMMABILITY: IEEE-488 (SCPI-1995.0), RS-232, 5 user-definable power-up states plus factory default and *RST.
DIGITAL INTERFACE:
Safety Interlock: Active low input.
Handler Interface: Start of test, end of test, 3 category bits. +5V @ 300mA supply.
Digital I/O: 1 trigger input, 4 TTL/Relay Drive outputs (33V @ 500mA sink, diode clamped).
POWER SUPPLY: 88V to 264V rms, 50–60Hz (automatically detected at power up).
WARRANTY: 1 year.
EMC: Conforms with European Union Directive 89/336/EEC, EN 55011, EN 50082-1, EN 61000-3-2 and 61000-3-3, FCC part 15 class B.
SAFETY: Conforms with European Union Directive 73/23/EEC, EN 61010-1, UL 3111-1.
VIBRATION: MIL-T-28800E Type III, Class 5.
WARM-UP: 1 hour to rated accuracies.
DIMENSIONS: 89mm high × 213mm wide × 370mm deep.
ENVIRONMENT: Operating: 0°–50°C, 70% R.H. up to 35°C. Derate 3% R.H./°C, 35°–50°C.
Storage: –25°C to 65°C.
1. Except lowest 2 current ranges = 90dB.
Specifications subject to change without notice.
2400-C SPECIFICATIONS
SOURCE SPECIFICATIONS [1]

Voltage Programming Accuracy (remote sense)

             Programming    Accuracy (1 Year)              Noise (peak-peak)
Range        Resolution     23°C ±5°C, ±(% rdg. + volts)   0.1Hz – 10Hz
200.000 mV   5 µV           0.02% + 600 µV                 5 µV
2.00000 V    50 µV          0.02% + 600 µV                 50 µV
20.0000 V    500 µV         0.02% + 2.4 mV                 500 µV
200.000 V    5 mV           0.02% + 24 mV                  5 mV
TEMPERATURE COEFFICIENT (0°–18°C & 28°–50°C): ±(0.15 × accuracy specification)/°C.
MAX. OUTPUT POWER: 22W (four-quadrant source or sink operation).
SOURCE/SINK LIMITS: ±21V @ ±1.05A, ±210V @ ±105mA.
VOLTAGE REGULATION: Line: 0.01% of range. Load: 0.01% of range + 100µV.
NOISE 10Hz–1MHz (p-p): 10mV, typical. Resistive load.
OVER VOLTAGE PROTECTION: User-selectable values, 5% tolerance. Factory default = 40 volts.
CURRENT LIMIT: Bipolar current limit (compliance) set with single value. Min. 0.1% of range.
OVERSHOOT: <0.1% typical (full-scale step, resistive load, 10mA range).
Current Programming Accuracy (local or remote sense)

               Programming    Accuracy (1 Year) [3]         Noise (peak-peak)
Range          Resolution     23°C ±5°C, ±(% rdg. + amps)   0.1Hz – 10Hz
1.00000 µA     50 pA          0.035% + 600 pA               5 pA
10.0000 µA     500 pA         0.033% + 2 nA                 50 pA
100.000 µA     5 nA           0.031% + 20 nA                500 pA
1.00000 mA     50 nA          0.034% + 200 nA               5 nA
10.0000 mA     500 nA         0.045% + 2 µA
100.000 mA     5 µA           0.066% + 20 µA
1.00000 A [2]  50 µA          0.27% + 900 µA
TEMPERATURE COEFFICIENT (0°–18°C & 28°–50°C): ±(0.15 × accuracy specification)/°C.
MAX. OUTPUT POWER: 22W (four-quadrant source or sink operation).
SOURCE/SINK LIMITS: ±105mA @ 210V, ±1.05A @ 21V.
CURRENT REGULATION: Line: 0.01% of range. Load: 0.01% of range + 100pA.
VOLTAGE LIMIT: Bipolar voltage limit (compliance) set with single value. Min. 0.1% of range.
OVERSHOOT: <0.1% typical (1mA step, RL = 10kΩ, 20V range).
Notes:
1. Specifications valid for continuous output currents below 105mA. For operation above 105mA continuous for >1 minute, derate accuracy 10%/35mA above 105mA.
2. Full operation (1A) regardless of load to 30°C. Above 30°C ambient, derate 35mA/°C and prorate 35mA/Ω load. 4-wire mode. For current sinking, up to 10W (external power) at 23°C. Above 23°C, derate 1W/°C.
3. For sink mode, 1µA to 100mA range, accuracy is ±(0.15% + offset×4). For 1A range, accuracy is ±(1.5% + offset×8).
ADDITIONAL SOURCE SPECIFICATIONS
TRANSIENT RESPONSE TIME: 30µs typical for the output to recover to its spec. following a step change in load. Resistive load.
COMMAND PROCESSING TIME: Maximum time required for the output to begin to change following the receipt of
:SOURce:VOLTage|CURRent <nrf> command.
Autorange On: 10ms. Autorange Off: 7ms.
OUTPUT SETTLING TIME: Time required to reach 0.1% of final
value after command is processed. 100µs typical. Resistive load.
DC FLOATING VOLTAGE: Output can be floated up to ±250VDC from chassis ground.
REMOTE SENSE: Up to 1V drop per load lead.
COMPLIANCE ACCURACY: Add 0.1% of range to base specification.
OVER TEMPERATURE PROTECTION: Internally sensed temperature overload puts unit in standby mode.
RANGE CHANGE OVERSHOOT: Overshoot into a fully resistive 100kΩ load, 10Hz to 1MHz BW, adjacent ranges, Smooth Mode:
(100mV) typical, except 20V/200V range boundary.
MINIMUM COMPLIANCE VALUE: 0.1% of range.
CONTACT CHECK:
Threshold:                      2Ω        15Ω       50Ω
No contact check failure:       <1.00Ω    <13.5Ω    <47.5Ω
Always contact check failure:   >3.00Ω    >16.5Ω    >52.5Ω
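The contact-check thresholds above leave a guard band between the guaranteed-pass and guaranteed-fail resistances. A hypothetical helper (the name and the three-way outcome labels are ours) that mirrors the table:

```python
# Hypothetical helper mirroring the contact-check table: below the lower
# bound a contact-check failure is never reported, above the upper bound it
# always is, and between the two bounds the outcome is not guaranteed.
CONTACT_CHECK_BOUNDS = {2: (1.00, 3.00), 15: (13.5, 16.5), 50: (47.5, 52.5)}

def contact_check_outcome(threshold_ohms: int, r_ohms: float) -> str:
    low, high = CONTACT_CHECK_BOUNDS[threshold_ohms]
    if r_ohms < low:
        return "pass"            # never flagged as a failure
    if r_ohms > high:
        return "fail"            # always flagged as a failure
    return "indeterminate"       # between the guaranteed bounds

print(contact_check_outcome(2, 0.5))     # pass
print(contact_check_outcome(15, 20.0))   # fail
print(contact_check_outcome(50, 50.0))   # indeterminate
```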
MEASURE SPECIFICATIONS [1, 2]

Voltage Measurement Accuracy (remote sense)

             Max. Input     Accuracy (23°C ±5°C),
Range        Resolution     Resistance    1 Year, ±(% rdg + volts)
200.000 mV   1 µV           >10GΩ         0.012% + 300 µV
2.00000 V    10 µV          >10GΩ         0.012% + 300 µV
20.0000 V    100 µV         >10GΩ         0.015% + 1.5 mV
200.000 V    1 mV           >10GΩ         0.015% + 10 mV
TEMPERATURE COEFFICIENT (0°–18°C & 28°–50°C): ±(0.15 × accuracy specification)/°C.
Current Measurement Accuracy (local or remote sense)

             Max. Voltage   Accuracy (23°C ±5°C),
Range        Resolution*    Burden [3]    1 Year, ±(% rdg + amps)
1.00000 µA   10 pA          <1 mV         0.029% + 300 pA
10.0000 µA   100 pA         <1 mV         0.027% + 700 pA
100.000 µA   1 nA           <1 mV         0.025% + 6 nA
1.00000 mA   10 nA          <1 mV         0.027% + 60 nA
10.0000 mA   100 nA         <1 mV         0.035% + 600 nA
100.000 mA   1 µA           <1 mV         0.055% + 6 µA
1.00000 A    10 µA          <1 mV         0.22% + 570 µA
TEMPERATURE COEFFICIENT (0°–18°C & 28°–50°C): ±(0.1 × accuracy specification)/°C.
Speed    NPLC / Trigger Origin   Measure Pass/Fail Test   Source Pass/Fail Test   Source-Measure Pass/Fail Test [7]
Fast     0.01 / external         0.96 ms (1.07 ms)        0.5 ms (0.5 ms)         4.0 ms (4.0 ms)
Medium   0.10 / external         2.5 ms (2.8 ms)          0.5 ms (0.5 ms)         5.5 ms (5.75 ms)
Normal   1.00 / external         17.5 ms (20.85 ms)       0.5 ms (0.5 ms)         20.5 ms (24 ms)
Notes:
1. Reading rates applicable for voltage or current measurements. Auto zero off, autorange off, filter off, display off, trigger delay = 0, source auto clear off, and binary reading format.
2. Purely resistive load. 1µA and 10µA ranges <65ms.
3. 1000-point sweep was characterized with the source on a fixed range.
4. Pass/Fail test performed using one high limit and one low math limit.
5. Includes time to re-program source to a new level before making measurement.
6. Time from falling edge of START OF TEST signal to falling edge of END OF TEST signal.
7. Command processing time of :SOURce:VOLTage|CURRent:TRIGgered <nrf> command not included.