Keithley Instruments, Inc. warrants this product to be free from defects in material and workmanship for a
period of 1 year from date of shipment.
Keithley Instruments, Inc. warrants the following items for 90 days from the date of shipment: probes, cables,
rechargeable batteries, diskettes, and documentation.
During the warranty period, we will, at our option, either repair or replace any product that proves to be defective.
To exercise this warranty, write or call your local Keithley representative, or contact Keithley headquarters in
Cleveland, Ohio. You will be given prompt assistance and return instructions. Send the product, transportation
prepaid, to the indicated service facility. Repairs will be made and the product returned, transportation prepaid.
Repaired or replaced products are warranted for the balance of the original warranty period, or at least 90 days.
LIMITATION OF WARRANTY
This warranty does not apply to defects resulting from product modification without Keithley’s express written
consent, or misuse of any product or part. This warranty also does not apply to fuses, software, non-rechargeable batteries, damage from battery leakage, or problems arising from normal wear or failure to follow instructions.
THIS WARRANTY IS IN LIEU OF ALL OTHER WARRANTIES, EXPRESSED OR IMPLIED, INCLUDING ANY IMPLIED WARRANTY OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR USE.
THE REMEDIES PROVIDED HEREIN ARE BUYER’S SOLE AND EXCLUSIVE REMEDIES.
NEITHER KEITHLEY INSTRUMENTS, INC. NOR ANY OF ITS EMPLOYEES SHALL BE LIABLE FOR
ANY DIRECT, INDIRECT, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT
OF THE USE OF ITS INSTRUMENTS AND SOFTWARE EVEN IF KEITHLEY INSTRUMENTS, INC.,
HAS BEEN ADVISED IN ADVANCE OF THE POSSIBILITY OF SUCH DAMAGES. SUCH EXCLUDED
DAMAGES SHALL INCLUDE, BUT ARE NOT LIMITED TO: COSTS OF REMOVAL AND INSTALLATION, LOSSES SUSTAINED AS THE RESULT OF INJURY TO ANY PERSON, OR DAMAGE TO PROPERTY.
The print history shown below lists the printing dates of all Revisions and Addenda created
for this manual. The Revision Level letter increases alphabetically as the manual undergoes subsequent updates. Addenda, which are released between Revisions, contain important change information that the user should incorporate immediately into the manual. Addenda are numbered sequentially. When a new Revision is created, all Addenda associated with the previous Revision of the manual are incorporated into the new Revision of the manual. Each new Revision
includes a revised copy of this print history page.
Revision A (Document Number 2430-902-01) ........................................................ December 1998
Revision B (Document Number 2430-902-01) ................................................................. June 2000
Revision C (Document Number 2430-902-01) ............................................................October 2004
All Keithley product names are trademarks or registered trademarks of Keithley Instruments, Inc.
Other brand names are trademarks or registered trademarks of their respective holders.
Safety Precautions
The following safety precautions should be observed before using this product and any associated instrumentation. Although
some instruments and accessories would normally be used with non-hazardous voltages, there are situations where hazardous
conditions may be present.
This product is intended for use by qualified personnel who recognize shock hazards and are familiar with the safety precautions
required to avoid possible injury. Read and follow all installation, operation, and maintenance information carefully before using the product. Refer to the manual for complete product specifications.
If the product is used in a manner not specified, the protection provided by the product may be impaired.
The types of product users are:
Responsible body is the individual or group responsible for the use and maintenance of equipment, for ensuring that the equipment is operated within its specifications and operating limits, and for ensuring that operators are adequately trained.
Operators use the product for its intended function. They must be trained in electrical safety procedures and proper use of the
instrument. They must be protected from electric shock and contact with hazardous live circuits.
Maintenance personnel perform routine procedures on the product to keep it operating properly, for example, setting the line
voltage or replacing consumable materials. Maintenance procedures are described in the manual. The procedures explicitly state
if the operator may perform them. Otherwise, they should be performed only by service personnel.
Service personnel are trained to work on live circuits, and perform safe installations and repairs of products. Only properly
trained service personnel may perform installation and service procedures.
Keithley products are designed for use with electrical signals that are rated Measurement Category I and Measurement Category
II, as described in the International Electrotechnical Commission (IEC) Standard IEC 60664. Most measurement, control, and
data I/O signals are Measurement Category I and must not be directly connected to mains voltage or to voltage sources with
high transient over-voltages. Measurement Category II connections require protection for high transient over-voltages often associated with local AC mains connections. Assume all measurement, control, and data I/O connections are for connection to
Category I sources unless otherwise marked or described in the Manual.
Exercise extreme caution when a shock hazard is present. Lethal voltage may be present on cable connector jacks or test fixtures.
The American National Standards Institute (ANSI) states that a shock hazard exists when voltage levels greater than 30V RMS,
42.4V peak, or 60VDC are present. A good safety practice is to expect that hazardous voltage is present in any unknown
circuit before measuring.
Operators of this product must be protected from electric shock at all times. The responsible body must ensure that operators
are prevented access and/or insulated from every connection point. In some cases, connections must be exposed to potential
human contact. Product operators in these circumstances must be trained to protect themselves from the risk of electric shock.
If the circuit is capable of operating at or above 1000 volts, no conductive part of the circuit may be exposed.
Do not connect switching cards directly to unlimited power circuits. They are intended to be used with impedance limited sources. NEVER connect switching cards directly to AC mains. When connecting sources to switching cards, install protective devices to limit fault current and voltage to the card.
Before operating an instrument, make sure the line cord is connected to a properly grounded power receptacle. Inspect the connecting cables, test leads, and jumpers for possible wear, cracks, or breaks before each use.
When installing equipment where access to the main power cord is restricted, such as rack mounting, a separate main input power disconnect device must be provided, in close proximity to the equipment and within easy reach of the operator.
For maximum safety, do not touch the product, test cables, or any other instruments while power is applied to the circuit under
test. ALWAYS remove power from the entire test system and discharge any capacitors before: connecting or disconnecting cables or jumpers, installing or removing switching cards, or making internal changes, such as installing or removing jumpers.
Do not touch any object that could provide a current path to the common side of the circuit under test or power line (earth) ground. Always make measurements with dry hands while standing on a dry, insulated surface capable of withstanding the voltage being measured.
The instrument and accessories must be used in accordance with their specifications and operating instructions, or the safety of the equipment may be impaired.
Do not exceed the maximum signal levels of the instruments and accessories, as defined in the specifications and operating information, and as shown on the instrument or test fixture panels, or switching card.
When fuses are used in a product, replace with same type and rating for continued protection against fire hazard.
Chassis connections must only be used as shield connections for measuring circuits, NOT as safety earth ground connections.
If you are using a test fixture, keep the lid closed while power is applied to the device under test. Safe operation requires the use
of a lid interlock.
If a screw is present, connect it to safety earth ground using the wire recommended in the user documentation.
The ! symbol on an instrument indicates that the user should refer to the operating instructions located in the manual.
The symbol on an instrument shows that it can source or measure 1000 volts or more, including the combined effect of
normal and common mode voltages. Use standard safety precautions to avoid personal contact with these voltages.
The symbol indicates a connection terminal to the equipment frame.
The WARNING heading in a manual explains dangers that might result in personal injury or death. Always read the associated
information very carefully before performing the indicated procedure.
The CAUTION heading in a manual explains hazards that could damage the instrument. Such damage may invalidate the warranty.
Instrumentation and accessories shall not be connected to humans.
Before performing any maintenance, disconnect the line cord and all test cables.
To maintain protection from electric shock and fire, replacement components in mains circuits, including the power transformer,
test leads, and input jacks, must be purchased from Keithley Instruments. Standard fuses, with applicable national safety approvals, may be used if the rating and type are the same. Other components that are not safety related may be purchased from
other suppliers as long as they are equivalent to the original component. (Note that selected parts should be purchased only
through Keithley Instruments to maintain accuracy and functionality of the product.) If you are unsure about the applicability
of a replacement component, call a Keithley Instruments office for information.
To clean an instrument, use a damp cloth or mild, water based cleaner. Clean the exterior of the instrument only. Do not apply
cleaner directly to the instrument or allow liquids to enter or spill on the instrument. Products that consist of a circuit board with
no case or chassis (e.g., data acquisition board for installation into a computer) should never require cleaning if handled according to instructions. If the board becomes contaminated and operation is affected, the board should be returned to the factory for
proper cleaning/servicing.
Use the procedures in this section to verify that Model 2430 accuracy is within the limits stated in the instrument’s one-year accuracy specifications. You can perform these verification procedures:
• When you first receive the instrument to make sure that it was not damaged during shipment.
• To verify that the unit meets factory specifications.
• To determine if calibration is required.
• Following calibration to make sure it was performed properly.
WARNING The information in this section is intended for qualified service personnel only. Do not attempt these procedures unless you are qualified to do so. Some of these procedures may expose you to hazardous voltages, which could cause personal injury or death if contacted. Use standard safety precautions when working with hazardous voltages.

NOTE If the instrument is still under warranty and its performance is outside specified limits, contact your Keithley representative or the factory to determine the correct course of action.
Verification test requirements
Be sure that you perform the verification tests:
•Under the proper environmental conditions.
•After the specified warm-up period.
•Using the correct line voltage.
•Using the proper test equipment.
•Using the specified output signals and reading limits.
Environmental conditions
Conduct your performance verification procedures in a test environment with:
•An ambient temperature of 18 to 28°C (65 to 82°F).
•A relative humidity of less than 70% unless otherwise noted.
Warm-up period
Allow the Model 2430 to warm up for at least one hour before conducting the verification
procedures.
If the instrument has been subjected to temperature extremes (those outside the ranges stated
above), allow additional time for the instrument’s internal temperature to stabilize. Typically,
allow one extra hour to stabilize a unit that is 10°C (18°F) outside the specified temperature
range.
Also, allow the test equipment to warm up for the minimum time specified by the manufacturer.
Line power
The Model 2430 requires a line voltage of 100 to 240V and a line frequency of 50 or 60Hz.
Verification tests must be performed within this range.
Recommended test equipment
Table 1-1 summarizes recommended verification equipment. You can use alternate equipment
as long as that equipment has specifications comparable to those listed in Table 1-1. Keep in mind,
however, that test equipment uncertainty will add to the uncertainty of each measurement. Generally, test equipment uncertainty should be at least four times better than corresponding Model 2430
specifications. Table 1-1 lists the specifications of the recommended test equipment, including
maximum allowable uncertainty for alternate test equipment (shown in parentheses).
Table 1-1
Recommended verification equipment

Description             Manufacturer/Model         Specifications
Digital Multimeter      Hewlett Packard HP3458A    See note 1.
Resistance Calibrator   Fluke 5450A                Resistance; see note 2.
Precision Resistor                                 See note 3.

1. 90-day, full-range accuracy specifications of ranges required for various measurement points.
2. 90-day, ±5°C specifications of nominal resistance values shown. Use actual values for tests. Maximum uncertainty of alternate test equipment shown in parentheses.
3. Required for verification of 3A current range. Characterize resistor to ±300ppm or better using recommended DMM before verifying 3A current measurement range.
The recommended 1Ω resistor should be characterized to ±300ppm or better before verifying
the 3A current measurement range. (You need not characterize the resistor if you are checking
only the 3A current source range.) Use the 4-wire ohms function of the DMM recommended in
Table 1-1 to measure the resistance value. Then use that measured value to calculate the current
during the 3A current measurement range test procedure.
Verification limits
The verification limits stated in this section have been calculated using only the Model 2430
one-year accuracy specifications, and they do not include test equipment uncertainty. If a particular measurement falls outside the allowable range, recalculate new limits based on Model
2430 specifications and corresponding test equipment specifications.
Example limits calculation
As an example of how verification limits are calculated, assume you are testing the 20V DC
output range using a 20V output value. Using the Model 2430 20V range one-year accuracy
specification of ±(0.02% of output + 2.4mV offset), the calculated output limits are:
Output limits = 20V ± [(20V × 0.02%) + 2.4mV]
Output limits = 20V ± (0.004V + 0.0024V)
Output limits = 20V ± 0.0064V
Output limits = 19.9936V to 20.0064V
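When many test points must be checked, the same arithmetic is easy to script. The Python sketch below reproduces the 20V example; the function name and argument layout are mine, not part of the specification.

```python
def verification_limits(output, gain_error_percent, offset):
    """Return (low, high) limits for output +/- (gain_error_percent of output + offset)."""
    tolerance = output * gain_error_percent / 100.0 + offset
    return output - tolerance, output + tolerance

# 20V range, one-year accuracy of +/-(0.02% of output + 2.4mV offset):
low, high = verification_limits(20.0, 0.02, 2.4e-3)
print(low, high)   # approximately 19.9936 and 20.0064
```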
Resistance limits calculation
When verifying the resistance measurement accuracy, it will probably be necessary to recalculate resistance limits based on the actual calibrator resistance values. You can calculate resistance reading limits in the same manner described above, but be sure to use the actual calibrator
resistance values and the Model 2430 normal accuracy specifications for your calculations.
As an example, assume you are testing the 20kΩ range, and that the actual value of the nominal 19kΩ calibrator resistor is 19.01kΩ. Using the Model 2430 20kΩ range one-year normal accuracy specifications of ±(0.006% of reading + 3Ω), the recalculated reading limits are:

Reading limits = 19.01kΩ ± [(19.01kΩ × 0.006%) + 3Ω]
Reading limits = 19.01kΩ ± 4.14Ω
Reading limits = 19.00586kΩ to 19.01414kΩ
If the Model 2430 is not within specifications and not under warranty, see the calibration pro-
cedures in Section 2 for information on calibrating the unit.
Test considerations
When performing the verification procedures:
•Be sure to restore factory front panel defaults as previously outlined.
•Make sure that the test equipment is properly warmed up and connected to the Model
2430 INPUT/OUTPUT jacks. Also be sure that the front panel jacks are selected with
the TERMINALS key.
•Make sure the Model 2430 is set to the correct source range (see below).
•Ensure that the Model 2430 output is turned on before making measurements.
•Ensure the test equipment is set up for the proper function and range.
•Allow the Model 2430 output signal to settle before making a measurement.
•Do not connect test equipment to the Model 2430 through a scanner, multiplexer, or
other switching equipment.
WARNING The maximum common-mode voltage (voltage between LO and chassis ground) is 250V peak. Exceeding this value may cause a breakdown in insulation, creating a shock hazard.

CAUTION The maximum voltage between INPUT/OUTPUT HI and LO or 4-WIRE SENSE HI and LO is 125V peak. The maximum voltage between INPUT/OUTPUT HI and 4-WIRE SENSE HI or between INPUT/OUTPUT LO and 4-WIRE SENSE LO is 5V. Exceeding these voltage values may result in instrument damage.

Setting the source range and output value
Before testing each verification point, you must properly set the source range and output value as outlined below.
1. Press either the SOURCE V or SOURCE I key to select the appropriate source function.
2. Press the EDIT key as required to select the source display field. Note that the cursor will flash in the source field while its value is being edited.
3. With the cursor in the source display field flashing, set the source range to the lowest possible range for the value to be sourced using the up or down RANGE key. For example, you should use the 20V source range to output a 20V source value. With a 20V source value and the 20V range selected, the source field display will appear as follows:
Vsrc:+20.0000 V
4. With the source field cursor flashing, set the source output to the required value using either:
• The SOURCE adjustment and left and right arrow keys.
• The numeric keys.
5. Note that the source output value will be updated immediately; you need not press ENTER when setting the source value.
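The same source range and level settings can also be made over the bus with the :SOUR:VOLT:RANG and :SOUR:VOLT commands used later in the remote calibration procedure. The sketch below is only an illustration: the pyvisa library, the GPIB resource string, and the :OUTP ON output-enable command are assumptions not taken from this section.

```python
import pyvisa

rm = pyvisa.ResourceManager()
smu = rm.open_resource("GPIB0::24::INSTR")   # assumed VISA resource string

smu.write(":SOUR:FUNC VOLT")      # voltage source function
smu.write(":SOUR:VOLT:RANG 20")   # lowest range that accommodates a 20V output
smu.write(":SOUR:VOLT 20")        # program the 20V output value
smu.write(":OUTP ON")             # turn the output on before making measurements
```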
Setting the measurement range
When simultaneously sourcing and measuring either voltage or current, the measure range is
coupled to the source range, and you cannot independently control the measure range. Thus, it
is not necessary for you to set the measure range when testing voltage or current measurement
accuracy.
Compliance considerations
Compliance limits
When sourcing voltage, you can set the SourceMeter to limit current from 10nA to 3.15A.
Conversely, when sourcing current, you can set the SourceMeter to limit voltage from 0.2mV
to 105V. The SourceMeter output will not exceed the programmed compliance limit.
Types of compliance
There are two types of compliance that can occur: “real” and “range.” Depending on which
value is lower, the output will clamp at either the displayed compliance setting (“real”) or at the
maximum measurement range reading (“range”).
The “real” compliance condition can occur when the compliance setting is less than the highest possible reading of the measurement range. When in compliance, the source output clamps
at the displayed compliance value. For example, if the compliance voltage is set to 1V and the
measurement range is 2V, the output voltage will clamp (limit) at 1V.
“Range” compliance can occur when the compliance setting is higher than the possible reading of the selected measurement range. When in compliance, the source output clamps at the
maximum measurement range reading (not the compliance value). For example, if the compliance voltage is set to 1V and the measurement range is 200mV, the output voltage will clamp
(limit) at 210mV.
Maximum compliance values
The maximum compliance values for the measurement ranges are summarized in Table 1-2.
Table 1-2
Maximum compliance values
Measurement range    Maximum compliance value
200mV                210mV
2V                   2.1V
20V                  21V
100V                 105V
10μA                 10.5μA
100μA                105μA
1mA                  1.05mA
10mA                 10.5mA
100mA                105mA
1A                   1.05A
3A                   3.15A
When the SourceMeter goes into compliance, the “Cmpl” label or the units label (e.g., “mA”)
for the compliance display will flash.
Determining compliance limit
The relationships to determine which compliance is in effect are summarized as follows.
They assume that the measurement function is the same as the compliance function.
•Compliance Setting < Measurement Range = Real Compliance
•Measurement Range < Compliance Setting = Range Compliance
You can determine the compliance that is in effect by comparing the displayed compliance
setting to the present measurement range. If the compliance setting is lower than the maximum
possible reading on the present measurement range, the compliance setting is the compliance
limit. If the compliance setting is higher than the measurement range, the maximum reading on
that measurement range is the compliance limit.
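A minimal sketch of this comparison, assuming you already know the programmed compliance setting and the maximum reading of the present measurement range (the function name is illustrative):

```python
def compliance_limit(compliance_setting, max_range_reading):
    """Return the level at which the output clamps and which compliance type applies."""
    if compliance_setting < max_range_reading:
        return compliance_setting, "real"
    return max_range_reading, "range"

# 1V compliance on the 2V measurement range (maximum reading 2.1V): real compliance.
print(compliance_limit(1.0, 2.1))    # (1.0, 'real')
# 1V compliance on the 200mV range (maximum reading 210mV): range compliance at 210mV.
print(compliance_limit(1.0, 0.21))   # (0.21, 'range')
```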
Taking the SourceMeter out of compliance
Verification measurements should not be made when the SourceMeter is in compliance. For
purposes of the verification tests, the SourceMeter can be taken out of compliance by going into
the edit mode and increasing the compliance limit.
NOTE Do not take the unit out of compliance by decreasing the source value or changing
the range. Always use the recommended range and source settings when performing
the verification tests.
Output voltage accuracy
Follow the steps below to verify that Model 2430 output voltage accuracy is within specified
limits. This test involves setting the output voltage to each full-range value and measuring the
voltages with a precision digital multimeter.
1.With the power off, connect the digital multimeter to the Model 2430 INPUT/OUTPUT
jacks, as shown in Figure 1-1.
2.Select the multimeter DC volts measuring function.
3.Press the Model 2430 SOURCE V key to source voltage, and make sure the source output is turned on.
Figure 1-1
Connections for voltage verification tests
(Model 2430 INPUT/OUTPUT HI and LO connected to the digital multimeter Input HI and Input LO)
4.Verify output voltage accuracy for each of the voltages listed in Table 1-3. For each test
point:
•Select the correct source range.
•Set the Model 2430 output voltage to the indicated value.
•Verify that the multimeter reading is within the limits given in the table.
5.Repeat the procedure for negative output voltages with the same magnitudes as those
listed in Table 1-3.
6.Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to
select the rear panel jacks with the front panel TERMINALS key.
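If you log the multimeter readings, a few lines of Python can check them against the Table 1-3 limits. The readings shown below are hypothetical values used only to illustrate the comparison.

```python
# Table 1-3: source range -> (output setting, low limit, high limit), all in volts.
LIMITS = {
    "200mV": (0.200000, 0.199360, 0.200640),
    "2V":    (2.00000,  1.99900,  2.00100),
    "20V":   (20.0000,  19.9936,  20.0064),
    "100V":  (100.000,  99.968,   100.032),
}

# Hypothetical DMM readings recorded during the test, in volts.
readings = {"200mV": 0.200012, "2V": 2.00004, "20V": 20.0011, "100V": 100.005}

for rng, (setting, low, high) in LIMITS.items():
    ok = low <= readings[rng] <= high
    print(f"{rng}: set {setting}, read {readings[rng]}, {'PASS' if ok else 'FAIL'}")
```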
Table 1-3
Output voltage accuracy limits

Model 2430      Model 2430 output    Output voltage limits
source range    voltage setting      (1 year, 18°C to 28°C)
200mV           200.000mV            199.360 to 200.640mV
2V              2.00000V             1.99900 to 2.00100V
20V             20.0000V             19.9936 to 20.0064V
100V            100.000V             99.968 to 100.032V
Voltage measurement accuracy
Follow the steps below to verify that Model 2430 voltage measurement accuracy is within
specified limits. The test involves setting the source voltage to full-range values, as measured
by a precision digital multimeter, and then verifying that the Model 2430 voltage readings are
within required limits.
1.With the power off, connect the digital multimeter to the Model 2430 INPUT/OUTPUT
jacks, as shown in Figure 1-1.
2.Select the multimeter DC volts function.
3.Set the Model 2430 to both source and measure voltage by pressing the SOURCE V and
MEAS V keys, and make sure the source output is turned on.
4.Verify output voltage accuracy for each of the voltages listed in Table 1-4. For each test
point:
•Select the correct source range.
•Set the Model 2430 output voltage to the indicated value as measured by the digital
multimeter.
•Verify that the Model 2430 voltage reading is within the limits given in the table.
NOTE It may not be possible to set the voltage source to the specified value. Use the closest
possible setting, and modify reading limits accordingly.
5.Repeat the procedure for negative source voltages with the same magnitudes as those
listed in Table 1-4.
6.Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to
select the rear panel jacks with the front panel TERMINALS key.
Table 1-4
Voltage measurement accuracy limits

Model 2430 source       Source        Model 2430 voltage reading
and measure range¹      voltage²      limits (1 year, 18°C to 28°C)
200mV                   200.000mV     199.676 to 200.324mV
2V                      2.00000V      1.99946 to 2.00054V
20V                     20.0000V      19.9960 to 20.0040V
100V                    100.000V      99.980 to 100.020V

1. Measure range coupled to source range when simultaneously sourcing and measuring voltage.
2. As measured by multimeter. Use closest possible value and modify reading limits accordingly.

Output current accuracy
Follow the steps below to verify that Model 2430 output current accuracy is within specified limits. The test involves setting the output current to each full-range value and measuring the currents with a precision digital multimeter.

10μA to 1A range accuracy
1. With the power off, connect the digital multimeter to the Model 2430 INPUT/OUTPUT jacks, as shown in Figure 1-2.
Figure 1-2
Connections for 10μA to 1A range current verification tests
(Model 2430 INPUT/OUTPUT jacks connected to the digital multimeter Amps and Input LO terminals)
2.Select the multimeter DC current measuring function.
3.Press the Model 2430 SOURCE I key to source current, and make sure the source output
is turned on.
4.Verify output current accuracy for the 10μA-1A range currents listed in Table 1-5. For
each test point:
•Select the correct source range.
•Set the Model 2430 output current to the correct value.
•Verify that the multimeter reading is within the limits given in the table.
5.Repeat the procedure for negative output currents with the same magnitudes as those
listed in Table 1-5.
6.Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to
select the rear panel jacks with the front panel TERMINALS key.
3A and 10A pulse range accuracy
NOTE Verification of the 3A range confirms 10A pulse range accuracy.
1.With the power off, connect the digital multimeter and the 1Ω resistor to the Model 2430
INPUT/OUTPUT jacks, as shown in Figure 1-3.
2.Select the multimeter DC volts measuring function.
3.Press the Model 2430 SOURCE I key to source current, and make sure the source output
is turned on.
Figure 1-3
Connections for 3A range current verification tests
(Model 2430 INPUT/OUTPUT jacks connected to the 1Ω resistor, with the digital multimeter Input HI and Input LO measuring the voltage across the resistor)
4. Verify output current accuracy for the 3A range. Be sure to:
• Select the 3A source range.
• Set the Model 2430 output current to the correct 3A output value.
• Verify that the multimeter reading is within the 3A range limits given in Table 1-5. (Since the value of the 1Ω resistor is assumed to be the same as its nominal value, the DMM voltage reading is the same as the sourced current.)
5.Repeat the procedure for a negative 3A current output value.
6.Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to
select the rear panel jacks with the front panel TERMINALS key.
Table 1-5
Output current accuracy limits

Model 2430      Model 2430 output    Output current limits
source range    current setting      (1 year, 18°C to 28°C)
10μA            10.0000μA            9.9947 to 10.0053μA
100μA           100.000μA            99.949 to 100.051μA
1mA             1.00000mA            0.99946 to 1.00054mA
10mA            10.0000mA            9.9935 to 10.0065mA
100mA           100.000mA            99.914 to 100.086mA
1A              1.00000A             0.99843 to 1.00157A
3A¹             3.00000A             2.99543 to 3.00457A

1. See separate procedure for 3A range. DMM voltage reading is same as sourced current.
Current measurement accuracy
Follow the steps below to verify that Model 2430 current measurement accuracy is within
specified limits. The procedure involves applying accurate currents from the Model 2430 current source and then verifying that Model 2430 current measurements are within required limits.
10μA to 1A range accuracy
1.With the power off, connect the digital multimeter to the Model 2430 INPUT/OUTPUT
jacks, as shown in Figure 1-2.
2.Select the multimeter DC current function.
3.Set the Model 2430 to both source and measure current by pressing the SOURCE I and
MEAS I keys, and make sure the source output is turned on.
4.Verify measure current accuracy for the 10μA-1A range currents listed in Table 1-6. For
each measurement:
•Select the correct source range.
•Set the Model 2430 source output to the correct value as measured by the digital mul-
timeter.
• Verify that the Model 2430 current reading is within the limits given in the table.
NOTE It may not be possible to set the current source to the specified value. Use the closest
possible setting, and modify reading limits accordingly.
5.Repeat the procedure for negative calibrator currents with the same magnitudes as those
listed in Table 1-6.
6.Repeat the entire procedure using the rear panel INPUT/OUTPUT jacks. Be sure to
select the rear panel jacks with the front panel TERMINALS key.
3A and 10A pulse range accuracy
NOTE The 1Ω resistor should be characterized to within ±300ppm before verifying the 3A
current measurement range. Use the 4-wire ohms function of the DMM to measure
the resistance value, and then use that measured value to calculate the current during
the measurement procedure. Also note that verification of the 3A range confirms 10A
pulse range accuracy.
1.With the power off, connect the 1Ω resistor and digital multimeter to the Model 2430
INPUT/OUTPUT jacks, as shown in Figure 1-3.
2.Select the multimeter DC volts function.
3.Set the Model 2430 to both source and measure current by pressing the SOURCE I and
MEAS I keys, and make sure the source output is turned on.
4.Verify measurement current accuracy for the 3A range as follows:
•Select the 3A source range.
•Set the Model 2430 source output to the correct 3A value as measured by the digital
multimeter.
•Note the DMM voltage reading, and then calculate the current from the voltage reading and characterized 1Ω resistance value as I = V/R, where V is the DMM voltage
reading and R is the characterized resistance value.
• Verify that the Model 2430 current reading is within the 3A limits given in Table 1-6.
NOTE It may not be possible to set the current source to the specified 3A value. Use the closest possible setting, and modify reading limits accordingly.
5.Repeat the procedure for a negative 3A current.
6.Repeat the procedure using the rear panel INPUT/OUTPUT jacks. Be sure to select the
rear panel jacks with the front panel TERMINALS key.
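Because the 3A point is verified through the characterized 1Ω resistor, the current is derived from the DMM voltage reading as I = V/R. A short worked sketch in Python, with an assumed characterized resistance and assumed readings:

```python
r_characterized = 1.00021           # ohms, measured with the DMM 4-wire ohms function (assumed)
v_dmm = 3.00055                     # DMM voltage reading across the resistor (assumed)

i_actual = v_dmm / r_characterized  # current actually flowing, I = V/R
print(round(i_actual, 5))           # approximately 2.99992 A

# Compare the Model 2430 current reading against the 3A limits of Table 1-6.
low, high = 2.99673, 3.00327
reading_2430 = 3.0001               # assumed instrument reading
print(low <= reading_2430 <= high)  # True
```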
Table 1-6
Current measurement accuracy limits

Model 2430 source       Source         Model 2430 current reading
and measure range¹      current²       limits (1 year, 18°C to 28°C)
10μA                    10.00000μA     9.9966 to 10.0034μA
100μA                   100.000μA      99.969 to 100.031μA
1mA                     1.00000mA      0.99967 to 1.00033mA
10mA                    10.0000mA      9.9959 to 10.0041mA
100mA                   100.000mA      99.939 to 100.061mA
1A                      1.00000A       0.99883 to 1.00117A
3A³                     3.00000A       2.99673 to 3.00327A

1. Measure range coupled to source range when simultaneously sourcing and measuring current.
2. As measured by precision digital multimeter. Use closest possible value, and modify reading limits accordingly if necessary.
3. Current calculated as follows: I = V/R, where V is the DMM voltage reading, and R is the characterized value of the 1Ω resistor.
Resistance measurement accuracy
Use the following steps to verify that Model 2430 resistance measurement accuracy is within
specified limits. This procedure involves applying accurate resistances from a resistance calibrator and then verifying that Model 2430 resistance measurements are within required limits.
CAUTION Before testing the 2Ω and 20Ω ranges, make sure your resistance calibrator can safely handle the default test currents for those ranges (see Model 2430 and calibrator specifications). If not, use the CONFIG OHMS menu to select the MANUAL source mode, then set the source current to an appropriate safe value. When using the manual source mode, total resistance reading uncertainty includes both Source I and Measure V uncertainty (see specifications), and calculated reading limits should take the additional uncertainty into account.

If using the Fluke 5450A resistance calibrator, you cannot use the Auto Ohms mode of the Model 2430 to verify the 2Ω range. The 1A test current for the 2Ω range of the Model 2430 will damage the calibrator. On the Model 2430, use the CONFIG OHMS menu to select the MANUAL source mode, and then set the source (test) current to 100mA.
1. With the power off, connect the resistance calibrator to the Model 2430 INPUT/OUTPUT and 4-WIRE SENSE jacks, as shown in Figure 1-4. Be sure to use the 4-wire connections as shown.

Figure 1-4
Connections for resistance accuracy verification
(Model 2430 INPUT/OUTPUT HI and LO connected to the resistance calibrator Output HI and Output LO; Model 2430 4-WIRE SENSE HI and LO connected to the calibrator Sense HI and Sense LO)

2. Select the resistance calibrator external sense mode.
3.Configure the Model 2430 ohms function for the 4-wire sense mode as follows:
•Press CONFIG then MEAS Ω. The instrument will display the following:
CONFIG OHMS
SOURCE SENSE-MODE GUARD
•Select SENSE-MODE, then press ENTER. The following will be displayed:
SENSE-MODE
2-WIRE 4-WIRE
•Select 4-WIRE, then press ENTER.
•Press EXIT to return to normal display.
4.Press MEAS Ω to select the ohms measurement function, and make sure the source out-
put is turned on.
5.Verify ohms measurement accuracy for each of the resistance values listed in Table 1-7.
For each measurement:
•Set the resistance calibrator output to the nominal resistance or closest available
value.
NOTE It may not be possible to set the resistance calibrator to the specified value. Use the
closest possible setting, and modify reading limits accordingly.
•Select the appropriate ohms measurement range with the RANGE keys.
•Verify that the Model 2430 resistance reading is within the limits given in the table.
6.Repeat the entire procedure using the rear panel INPUT/OUTPUT and 4-WIRE SENSE
jacks. Be sure to select the rear panel jacks with the front panel TERMINALS key.
Table 1-7
Ohms measurement accuracy limits

Model 2430    Calibrator       Model 2430 resistance reading limits²
range         resistance¹      (1 year, 18°C to 28°C)
2Ω            1.9Ω             1.89647 to 1.90353Ω
20Ω           19Ω              18.9780 to 19.0220Ω
200Ω          190Ω             189.818 to 190.182Ω
2kΩ           1.9kΩ            1.89837 to 1.90163kΩ
20kΩ          19kΩ             18.9856 to 19.0144kΩ
200kΩ         190kΩ            189.837 to 190.163kΩ
2MΩ           1.9MΩ            1.89761 to 1.90239MΩ
20MΩ          19MΩ             18.9781 to 19.0219MΩ

1. Nominal resistance values.
2. Reading limits based on Model 2430 normal accuracy specifications and nominal resistance values. If actual resistance values differ from nominal values shown, recalculate reading limits using actual calibrator resistance values and Model 2430 normal accuracy specifications. See Verification limits earlier in this section for details.
Calibration
Introduction
Use the procedures in this section to calibrate the Model 2430. These procedures require accurate
test equipment to measure precise DC voltages and currents. Calibration can be performed either
from the front panel or by sending SCPI calibration commands over the IEEE-488 bus or RS-232
port with the aid of a computer.
WARNING The information in this section is intended for qualified service personnel
only. Do not attempt these procedures unless you are qualified to do so.
Some of these procedures may expose you to hazardous voltages.
Environmental conditions
Temperature and relative humidity
Conduct the calibration procedures at an ambient temperature of 18 to 28°C (65 to 82°F) with
relative humidity of less than 70% unless otherwise noted.
Warm-up period
Allow the Model 2430 to warm up for at least one hour before performing calibration.
If the instrument has been subjected to temperature extremes (those outside the ranges stated
above), allow additional time for the instrument’s internal temperature to stabilize. Typically,
allow one extra hour to stabilize a unit that is 10°C (18°F) outside the specified temperature
range.
Also, allow the test equipment to warm up for the minimum time specified by the manufacturer.
Line power
The Model 2430 requires a line voltage of 100 to 240V at a line frequency of 50 or 60Hz. The instrument must be calibrated while operating from a line voltage within this range.
Calibration considerations
When performing the calibration procedures:
•Ensure that the test equipment is properly warmed up and connected to the Model 2430
front panel INPUT/OUTPUT jacks. Also be certain that the front panel jacks are
selected with the TERMINALS switch.
•Always allow the source signal to settle before calibrating each point.
•Do not connect test equipment to the Model 2430 through a scanner or other switching
equipment.
•If an error occurs during calibration, the Model 2430 will generate an appropriate error
message. See Appendix B for more information.
WARNING The maximum common-mode voltage (voltage between LO and chassis ground) is 250V peak. Exceeding this value may cause a breakdown in insulation, creating a shock hazard.

CAUTION The maximum voltage between INPUT/OUTPUT HI and LO or 4-WIRE SENSE HI and LO is 125V peak. The maximum voltage between INPUT/OUTPUT HI and 4-WIRE SENSE HI or between INPUT/OUTPUT LO and 4-WIRE SENSE LO is 5V. Exceeding these voltage values may result in instrument damage.

Calibration cycle
Perform calibration at least once a year to ensure the unit meets or exceeds its specifications.

NOTE Calibration constants are stored in volatile memories of the Model 2430, which are protected by a replaceable battery when power is off. Typical life for the battery is approximately 10 years, but the battery should be replaced if the voltage drops below 2.5V regardless of age. See Section 4 for battery replacement details.
Recommended calibration equipment
Table 2-1 lists the recommended equipment for the calibration procedures. You can use alternate equipment as long as that equipment has specifications comparable to those listed in the
table. For optimum calibration accuracy, test equipment specifications should be at least four
times better than corresponding Model 2430 specifications.
Table 2-1
Recommended calibration equipment

Description           Manufacturer/Model        Specifications
Digital Multimeter    Hewlett Packard HP3458A   DC Voltage:¹  1V: ±5.6ppm; 10V: ±4.3ppm; 100V: ±6.3ppm
                                                DC Current:¹  10μA: ±25ppm; 100μA: ±23ppm; 1mA: ±20ppm;
                                                              10mA: ±20ppm; 100mA: ±35ppm; 1A: ±110ppm
Precision Resistor²   Isotec RUG-Z-1R00-0.1     1Ω, ±0.1%, 100W

1. 90-day, full-range accuracy specifications of ranges required for various measurement points.
2. Necessary for calibration of 3A current range. Resistor must be characterized to ±300ppm or better using recommended DMM before calibrating 3A range.
1Ω resistor characterization
The 1Ω resistor must be characterized to ±300ppm or better before calibrating the 3A current
range. Use the 4-wire ohms function of the DMM recommended in Table 1-1 to measure the
resistance value, and then use that measured value to calculate the current during the 3A current
range calibration procedure.
Unlocking calibration
Before performing calibration, you must first unlock calibration by entering or sending the
calibration password as explained in the following paragraphs.
Unlocking calibration from the front panel
1.Press the MENU key, then choose CAL, and press ENTER. The instrument will display
the following:
CALIBRATION
UNLOCK EXECUTE VIEW-DATES
SAVE LOCK CHANGE-PASSWORD
2.Select UNLOCK, then press ENTER. The instrument will display the following:
3.Use the up and down RANGE keys to select the letter or number, and use the left and
right arrow keys to choose the position. (Press down RANGE for letters; up RANGE for
numbers.) Enter the present password on the display. (Front panel default: 002430.)
4.Once the correct password is displayed, press the ENTER key. If the password was correctly entered, the following message will be displayed:
CALIBRATION UNLOCKED
Calibration can now be executed.
5. Press EXIT to return to normal display. Calibration will be unlocked and assume the states summarized in Table 2-2. Attempts to change any of the settings listed with calibration unlocked will result in an error +510, “Not permitted with cal un-locked.”
NOTE With calibration unlocked, the sense function and range track the source function and range. That is, when :SOUR:FUNC is set to VOLT, the :SENS:FUNC setting will be 'VOLT:DC'. When :SOUR:FUNC is set to CURR, the :SENS:FUNC setting will be 'CURR:DC'. A similar command coupling exists for :SOUR:VOLT:RANG/:SENS:VOLT:RANG and :SOUR:CURR:RANG/:SENS:CURR:RANG.
Table 2-2
Calibration unlocked states

Mode                    State        Equivalent remote command
Concurrent Functions    OFF          :SENS:FUNC:CONC OFF
Sense Function          Source       :SENS:FUNC <source_function>
Sense Volts NPLC        1.0          :SENS:VOLT:NPLC 1.0
Sense Volts Range       Source V     :SENS:VOLT:RANG <range>
Sense Current NPLC      1.0          :SENS:CURR:NPLC 1.0
Sense Current Range     Source I     :SENS:CURR:RANG <range>
Filter Count            10           :SENS:AVER:COUN 10
Filter Control          REPEAT       :SENS:AVER:TCON REPeat
Filter Averaging        ON           :SENS:AVER:STAT ON
Source V Mode           FIXED        :SOUR:VOLT:MODE FIXED
Volts Autorange         OFF          :SOUR:VOLT:RANG:AUTO OFF
Source I Mode           FIXED        :SOUR:CURR:MODE FIXED
Current Autorange       OFF          :SOUR:CURR:RANG:AUTO OFF
Autozero                ON           :SYST:AZERO ON
Trigger Arm Count       1            :ARM:COUNT 1
Trigger Arm Source      Immediate    :ARM:SOUR IMMediate
Trigger Count           1            :TRIG:COUNT 1
Trigger Source          Immediate    :TRIG:SOUR IMMediate

Unlocking calibration by remote
To unlock calibration via remote, send the following command:
:CAL:PROT:CODE '<password>'
For example, the following command uses the default password:
:CAL:PROT:CODE 'KI002430'
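As a hedged illustration of how this might be scripted from a controller, the short Python sketch below sends the unlock command and then checks the error queue. The pyvisa library and the GPIB resource string are assumptions; only the SCPI strings themselves come from this section.

```python
import pyvisa

rm = pyvisa.ResourceManager()
smu = rm.open_resource("GPIB0::24::INSTR")   # assumed VISA resource string

# Unlock calibration with the default remote password.
smu.write(":CAL:PROT:CODE 'KI002430'")

# Read back the error queue; an error code of 0 means the password was accepted.
print(smu.query(":SYST:ERR?"))
```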
Changing the password
The default password may be changed from the front panel or via remote as discussed next.
Changing the password from the front panel
Follow the steps below to change the password from the front panel:
1. Press the MENU key, then choose CAL and press ENTER. The instrument will display the following:
CALIBRATION
UNLOCK EXECUTE VIEW-DATES
SAVE LOCK CHANGE-PASSWORD
2. Select UNLOCK, then enter the password. (Default: 002430.)
3. Select CHANGE-PASSWORD, then press ENTER. The instrument will display the following:

Changing the password by remote
To change the calibration password by remote, first send the present password, and then send the new password. For example, the following command sequence changes the password from the 'KI002430' remote default to 'KI_CAL':
:CAL:PROT:CODE 'KI002430'
:CAL:PROT:CODE 'KI_CAL'
You can use any combination of letters and numbers up to a maximum of eight characters.
NOTE If you change the first two characters of the password to something other than
“KI”, you will not be able to unlock calibration from the front panel.
Resetting the calibration password
If you lose the calibration password, you can unlock calibration by shorting together the CAL
pads, which are located on the display board. Doing so will also reset the password to the factory
default (KI002430).
See Section 5 for details on disassembling the unit to access the CAL pads. Refer to the display board component layout drawing at the end of Section 6 for the location of the CAL pads.
Viewing calibration dates and calibration count
When calibration is locked, only the UNLOCK and VIEW-DATES selections will be accessible in the calibration menu. To view calibration dates and calibration count at any time:
1.From normal display, press MENU, select CAL, and then press ENTER. The unit will
display the following:
CALIBRATION
UNLOCK EXECUTE VIEW-DATES
2.Select VIEW-DATES, then press ENTER. The Model 2430 will display the next and
last calibration dates and the calibration count as in the following example:
Calibration errors
The Model 2430 checks for errors after each calibration step, minimizing the possibility that improper calibration may occur due to operator error.
Front panel error reporting
If an error is detected during comprehensive calibration, the instrument will display an appropriate error message (see Appendix B). The unit will then prompt you to repeat the calibration step that caused the error.
Remote error reporting
You can detect errors while in remote by testing the state of EAV (Error Available) bit (bit 2)
in the status byte. (Use the *STB? query to request the status byte.) Query the instrument for the
type of error by using the appropriate :SYST:ERR? query. The Model 2430 will respond with the
error number and a text message describing the nature of the error. See Appendix B for details.
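A minimal sketch of this check, assuming a pyvisa connection to the instrument (the resource string is illustrative):

```python
import pyvisa

rm = pyvisa.ResourceManager()
smu = rm.open_resource("GPIB0::24::INSTR")   # assumed VISA resource string

# Bit 2 (weight 4) of the status byte is EAV, the Error Available bit.
status_byte = int(smu.query("*STB?"))
if status_byte & 0x04:
    # Read queued errors until the queue returns error code 0 ("No error").
    while True:
        error = smu.query(":SYST:ERR?")
        print(error.strip())
        if int(error.split(",")[0]) == 0:
            break
```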
Front panel calibration
The following front panel calibration procedure calibrates all ranges of both the current and
voltage source and measure functions. Note that each range is separately calibrated by repeating
the entire procedure for each range.
Step 1: Prepare the Model 2430 for calibration
1.Turn on the Model 2430 and the digital multimeter, and allow them to warm up for at
least one hour before performing calibration.
2.Press the MENU key, then choose CAL and press ENTER. Select UNLOCK, and then
press ENTER. The instrument will display the following:
3.Use the up and down range keys to select the letter or number, and use the left and right
arrow keys to choose the position. Enter the present password on the display. (Front
panel default: 002430.) Press ENTER to complete the process.
4.Press EXIT to return to normal display. Instrument operating states will be set as summarized in Table 2-2.
Step 2: Voltage calibration
Perform the steps below for each voltage range, using Table 2-3 as a guide.
1.Connect the Model 2430 to the digital multimeter, as shown in Figure 2-1. Select the
multimeter DC volts measurement function.
Figure 2-1
Voltage calibration test connections
(Model 2430 INPUT/OUTPUT HI and LO connected to the digital multimeter Input HI and Input LO)
2.From normal display, press the SOURCE V key.
3.Press the EDIT key to select the source field (cursor flashing in source display field), and
then use the down RANGE key to select the 200mV source range.
4.From normal display, press MENU.
5.Select CAL, then press ENTER. The unit will display the following:
8. Note and record the DMM reading, and then adjust the Model 2430 display to agree exactly with the actual DMM reading. Use the up and down arrow keys to select the digit value, and use the left and right arrow keys to choose the digit position (or use the number keys, 0-9, +/-). Note that the display adjustment range is within ±10% of the present range.
9.After adjusting the display to agree with the DMM reading, press ENTER. The instrument will then display the following:
V-CAL
Press ENTER to Output +000.00mV
10.Press ENTER. The Model 2430 will source 0mV and at the same time display the following:
11.Note and record the DMM reading, and then adjust the Model 2430 display to agree with
the actual DMM reading. Note that the display value adjustment limits are within ±1%
of the present range.
12.After adjusting the display value to agree with the DMM reading, press ENTER. The
unit will then display the following:
V-CAL
Press ENTER to Output -200.00mV
13.Press ENTER. The Model 2430 will source -200mV and display the following:
14.Note and record the DMM reading, and then adjust the Model 2430 display to agree with
the DMM reading. Again, the maximum display adjustment is within ±10% of the
present range.
15.After adjusting the display value to agree with the DMM reading, press ENTER and note
that the instrument displays:
V-CAL
Press ENTER to Output +000.00mV
16.Press ENTER. The Model 2430 will source 0mV and simultaneously display the
following:
17.Note and record the DMM reading, and then adjust the display to agree with the DMM
reading. Once again, the maximum adjustment is within ±1% of the present range.
18.After adjusting the display to agree with the DMM reading, press ENTER to complete
calibration of the present range.
19.Press EXIT to return to normal display, then select the 2V source range. Repeat steps 2
through 18 for the 2V range.
20.After calibrating the 2V range, repeat the entire procedure for the 20V and 100V ranges
using Table 2-3 as a guide. Be sure to select the appropriate source range with the EDIT
and RANGE keys before calibrating each range.
21.Press EXIT as necessary to return to normal display.
Table 2-3
Front panel voltage calibration

Source range¹   Source voltage   Multimeter voltage reading²
0.2V            +200.00mV
                +000.00mV
                -200.00mV
                +000.00mV
2V              +2.0000V
                +0.0000V
                -2.0000V
                +0.0000V
20V             +20.000V
                +00.000V
                -20.000V
                +00.000V
100V            +100.00V
                +000.00V
                -100.00V
                +000.00V

1. Use EDIT and RANGE keys to select source range.
2. Multimeter reading used in corresponding calibration step. See procedure.
8. Note and record the DMM reading, and then adjust the Model 2430 display to agree exactly with the actual DMM reading. Use the up and down arrow keys to select the digit value, and use the left and right arrow keys to choose the digit position (or use the number keys, 0-9, +/-). Note that the display adjustment range is within ±10% of the present range.
9.After adjusting the display to agree with the DMM reading, press ENTER. The instrument will then display the following:
I-CAL
Press ENTER to Output +00.000μA
10. Press ENTER. The Model 2430 will source 0μA and at the same time display the following:
11.Note and record the DMM reading, and then adjust the Model 2430 display to agree with
the actual DMM reading. Note that the display value adjustment limits are within ±1%
of the present range.
12.After adjusting the display value to agree with the DMM reading, press ENTER. The
unit will then display the following:
I-CAL
Press ENTER to Output -10.000μA
13.Press ENTER. The Model 2430 will source -10μA and display the following:
14.Note and record the DMM reading, and then adjust the Model 2430 display to agree with
the DMM reading. Again, the maximum display adjustment is within ±10% of the
present range.
15.After adjusting the display value to agree with the DMM reading, press ENTER and note
that the instrument displays:
I-CAL
Press ENTER to Output +00.000μA
16. Press ENTER. The Model 2430 will source 0μA and simultaneously display the following:
17.Note and record the DMM reading, and then adjust the display to agree with the DMM
reading. Once again, the maximum adjustment is within ±1% of the present range.
18.After adjusting the display to agree with the DMM reading, press ENTER to complete
calibration of the present range.
19.Press EXIT to return to the normal display, and then select the 100μA source range using
the EDIT and up RANGE keys. Repeat steps 2 through 18 for the 100μA range.
20.After calibrating the 100μA range, repeat the entire procedure for the 1mA through 1A
ranges using Table 2-4 as a guide. Be sure to select the appropriate source range with the
EDIT and up RANGE keys before calibrating each range.
21.After calibrating the 1A range, connect the 1Ω characterized resistor and DMM to the
Model 2430 INPUT/OUTPUT jacks, as shown in Figure 2-3.
22.Select the DMM DC volts function.
23.Repeat steps 2 through 18 for the 3A range using Table 2-4 as a guide. When entering
the DMM reading, use the calculated current as follows: I = V/R, where V is the DMM
voltage reading, and R is the characterized value of the 1Ω resistor.
Figure 2-3
3A range current calibration test connections
(1Ω resistor connected across the Model 2430 INPUT/OUTPUT jacks, with the digital multimeter Input HI and Input LO measuring the voltage across the resistor)
Table 2-4
Front panel current calibration

Source range¹   Source current   Multimeter current reading²
10μA            +10.000μA
                +00.000μA
                -10.000μA
                +00.000μA
100μA           +100.00μA
                +000.00μA
                -100.00μA
                +000.00μA
1mA             +1.0000mA
                +0.0000mA
                -1.0000mA
                +0.0000mA
10mA            +10.000mA
                +00.000mA
                -10.000mA
                +00.000mA
100mA           +100.00mA
                +000.00mA
                -100.00mA
                +000.00mA
1A              +1.0000A
                +0.0000A
                -1.0000A
                +0.0000A
3A³             +3.0000A
                +0.0000A
                -3.0000A
                +0.0000A

1. Use EDIT and RANGE keys to select source range.
2. Multimeter reading used in corresponding calibration step. See procedure.
3. Current calculated as follows: I = V/R, where V is the DMM reading, and R is the characterized value of the 1Ω resistor. 10A pulse range calibrated simultaneously.
7.Set the calibration due date to the desired value, then press ENTER. Press ENTER again
to confirm the date.
8.Once the calibration dates are entered, calibration is complete, and the following message will be displayed:
CALIBRATION COMPLETE
Press ENTER to confirm; EXIT to abort
9.Press ENTER to save the calibration data (or press EXIT to abort without saving calibration data.) The following message will be displayed:
CALIBRATION SUCCESS
Press ENTER or EXIT to continue.
10.Press ENTER or EXIT to complete process.
Step 5: Lock out calibration
1.From normal display, press MENU.
2.Select CAL, then press ENTER. The Model 2430 will display the following:
CALIBRATION
UNLOCK EXECUTE VIEW-DATES
SAVE LOCK CHANGE-PASSWORD
3.Select LOCK, then press ENTER to lock out calibration.
Remote calibration
Use the following procedure to perform remote calibration by sending SCPI commands over
the IEEE-488 bus or RS-232 port. The remote commands and appropriate parameters are separately summarized for each step.
Remote calibration commands
Table 2-5 summarizes remote calibration commands. For a more complete description of
these commands, refer to Appendix B.
Table 2-5
Remote calibration commands

• Query number of times the 2430 has been calibrated.
• Save calibration data to EEPROM.*
• Lock calibration, inhibit SAVE command operation.
• Request cal lock status.
• Program calibration year, month, day.
• Query calibration year, month, day.
• Program calibration due year, month, day.
• Query calibration due year, month, day.
• Calibrate active measure range. (See Table 2-6 parameters.)
• Query measurement cal constants for active range.
• Calibrate active source range. (See Table 2-7 parameters.)
• Query source cal constants for active range.

* Calibration data will not be saved if: (1) calibration was not unlocked with the :CODE command; (2) invalid data exists (for example, a cal step failed or was aborted); or (3) an incomplete number of cal steps was performed (for example, omitting a negative full-scale step). Ranges that calibrated successfully will be saved if calibration is unlocked. Ranges that failed will not be saved.
Recommended calibration parameters
The maximum calibration command parameter ranges are: 75% to 150% of full scale for positive and negative full scale calibration points; ± zero calibration steps have ±50% of full scale
for valid entry ranges. However, for optimum calibration, it is recommended that you use calibration points within the ranges listed in Table 2-6 and Table 2-7. Note that each sense range
requires three parameters: zero, negative full scale, and positive full scale. Similarly, each
source range requires four parameters: two zero parameters, a positive full-scale parameter, and
a negative full-scale parameter.
Note: Parameter steps for each range may be performed in any order, but all parameter steps for each range
must be completed. For optimum calibration, use parameters within recommended limits.
Table 2-7
Recommended source calibration parameters

Source range   First parameter         Second parameter    Third parameter         Fourth parameter
               (negative full scale)   (negative zero)     (positive full scale)   (positive zero)
0.2V           -0.18 to -0.22          -0.002 to +0.002    +0.18 to +0.22          -0.002 to +0.002
2V             -1.8 to -2.2            -0.02 to +0.02      +1.8 to +2.2            -0.02 to +0.02
20V            -18 to -22              -0.2 to +0.2        +18 to +22              -0.2 to +0.2
100V           -90 to -110             -1 to +1            +90 to +110             -1 to +1
10μA           -9E-6 to -11E-6         -1E-7 to +1E-7      +9E-6 to +11E-6         -1E-7 to +1E-7
100μA          -90E-6 to -110E-6       -1E-6 to +1E-6      +90E-6 to +110E-6       -1E-6 to +1E-6
1mA            -0.9E-3 to -1.1E-3      -1E-5 to +1E-5      +0.9E-3 to +1.1E-3      -1E-5 to +1E-5
10mA           -9E-3 to -11E-3         -1E-4 to +1E-4      +9E-3 to +11E-3         -1E-4 to +1E-4
100mA          -90E-3 to -110E-3       -1E-3 to +1E-3      +90E-3 to +110E-3       -1E-3 to +1E-3
1A             -0.9 to -1.1            -1E-2 to +1E-2      +0.9 to +1.1            -1E-2 to +1E-2
3A             -2.7 to -3.15           -3E-2 to +3E-2      +2.7 to +3.15           -3E-2 to +3E-2
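Because a parameter outside the recommended window degrades calibration accuracy, a script can sanity-check each DMM reading before sending it. The fragment below is a hypothetical helper, not part of the manual; the table of windows simply restates the full-scale limits listed above.

# Recommended full-scale windows (magnitude) per source range, restated from the table above.
RECOMMENDED_FS = {
    0.2: (0.18, 0.22),        2.0: (1.8, 2.2),           20.0: (18.0, 22.0),     100.0: (90.0, 110.0),
    10e-6: (9e-6, 11e-6),     100e-6: (90e-6, 110e-6),   1e-3: (0.9e-3, 1.1e-3), 10e-3: (9e-3, 11e-3),
    100e-3: (90e-3, 110e-3),  1.0: (0.9, 1.1),           3.0: (2.7, 3.15),
}

def full_scale_ok(rng, reading):
    # True if the reading magnitude lies inside the recommended full-scale window for the range.
    lo, hi = RECOMMENDED_FS[rng]
    return lo <= abs(reading) <= hi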
Remote calibration procedure
Step 1: Prepare the Model 2430 for calibration
1. Connect the Model 2430 to the controller IEEE-488 interface or RS-232 port using a shielded interface cable.
2. Turn on the Model 2430 and the test equipment, and allow them to warm up for at least one hour before performing calibration.
3. If you are using the IEEE-488 interface, make sure the primary address of the Model 2430 is the same as the address specified in the program you will be using to send commands. (Use the MENU key and the COMMUNICATION menu to access the IEEE-488 address.)
Step 2: Voltage calibration
1. Connect the Model 2430 to the digital multimeter, and select the multimeter DC volts function. (See Figure 1-2.)
2. Send the commands summarized in Table 2-8 in the order listed to initialize voltage calibration. (When the :CAL:PROT:CODE command is sent, the instrument will assume the operating states listed in Table 2-2.)
3. Perform the range calibration steps listed in Table 2-9 for all ranges. For each range:
• Send the :SOUR:VOLT:RANG command to select the source and sense range being calibrated. For example, for the 2V range, the following command would be sent:
:SOUR:VOLT:RANG 2
• Program the source to output the negative full-range value using the :SOUR:VOLT
command. For example:
:SOUR:VOLT -2
• Note and record the multimeter reading.
• Use the multimeter reading as the parameter for the :CAL:PROT:SOUR and
:CAL:PROT:SENS commands. For example, a typical value for the 2V range would
be:
:CAL:PROT:SOUR -1.998
:CAL:PROT:SENS -1.998
• Program the voltage source for 0V output using the :SOUR:VOLT 0.0 command.
• Note the multimeter reading.
• Send the source and sense calibration commands using the multimeter reading for the
parameter. For example:
:CAL:PROT:SOUR 1E-3
:CAL:PROT:SENS 1E-3
• Set the source to the positive full-range value using the :SOUR:VOLT command. For
example:
:SOUR:VOLT 2
• Note and record the multimeter reading.
• Send the source and sense commands using the multimeter reading as the parameter.
For example:
:CAL:PROT:SOUR 1.997
:CAL:PROT:SENS 1.997
• Send the :SOUR:VOLT 0.0 command to set the source voltage to 0V.
• Note and record the multimeter reading.
• Send the :CAL:PROT:SOUR command using the multimeter reading as the command parameter. For example:
:CAL:PROT:SOUR -1.02E-3
Table 2-8
Voltage calibration initialization commands

Command                     Description
*RST                        Restore GPIB defaults.
:SOUR:FUNC VOLT             Activate voltage source.
:SENS:CURR:PROT 0.1         Current limit when voltage source is active.
:SENS:CURR:RANG 0.1         Make sure 1A range is not active.
:SOUR:VOLT:PROT MAX         Maximum allowable source voltage.
:SYST:RSEN OFF              Disable remote sensing.
:CAL:PROT:CODE 'KI002430'   Unlock cal.
:OUTP:STAT ON               Turn source on.
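The initialization commands of Table 2-8 and the per-range sequence of Table 2-9 (below) lend themselves to a simple loop. The following sketch is illustrative only: it assumes the pyvisa session and cal_send() helper from the earlier example, plus a read_dmm() routine standing in for whatever query your reference multimeter uses; it is not a substitute for the documented procedure.

def read_dmm():
    # Placeholder for a model-specific DMM query; here the reading is typed in by the operator.
    return float(input("DMM reading: "))

VOLT_INIT = [
    "*RST", ":SOUR:FUNC VOLT", ":SENS:CURR:PROT 0.1", ":SENS:CURR:RANG 0.1",
    ":SOUR:VOLT:PROT MAX", ":SYST:RSEN OFF", ":CAL:PROT:CODE 'KI002430'", ":OUTP:STAT ON",
]
for cmd in VOLT_INIT:                          # Table 2-8, in the order listed
    smu.write(cmd)

def cal_point(source_cmd, level, source_only=False):
    # Source one level, read the external DMM, and calibrate source (and sense) with that reading.
    smu.write(f"{source_cmd} {level}")
    reading = read_dmm()
    cal_send(f":CAL:PROT:SOUR {reading}")
    if not source_only:
        cal_send(f":CAL:PROT:SENS {reading}")

for rng in (0.2, 2, 20, 100):                  # voltage ranges (Table 2-9, note 1)
    smu.write(f":SOUR:VOLT:RANG {rng}")
    cal_point(":SOUR:VOLT", -rng)              # negative full scale
    cal_point(":SOUR:VOLT", 0.0)               # negative zero
    cal_point(":SOUR:VOLT", rng)               # positive full scale
    cal_point(":SOUR:VOLT", 0.0, source_only=True)   # positive zero (source only)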
Table 2-9
Voltage range calibration commands

Step  Command/procedure               Description
1     :SOUR:VOLT:RANGE <Range>¹       Select source range.
2     :SOUR:VOLT -<Source_value>      Establish negative full-range polarity.
3     Take DMM reading.               Read actual output value.
4     :CAL:PROT:SOUR <DMM_Reading>²   Calibrate source function negative full scale.
5     Check 2430 for errors.³
6     :CAL:PROT:SENS <DMM_Reading>    Calibrate sense function negative full scale.
7     Check 2430 for errors.
8     :SOUR:VOLT 0.0                  Set output to 0V.
9     Take DMM reading.               Read actual output value.
10    :CAL:PROT:SOUR <DMM_Reading>    Calibrate source function negative zero.
11    Check 2430 for errors.
12    :CAL:PROT:SENS <DMM_Reading>    Calibrate sense function negative zero.
13    Check 2430 for errors.
14    :SOUR:VOLT +<Source_value>      Establish positive full-range polarity.
15    Take DMM reading.               Read actual output value.
16    :CAL:PROT:SOUR <DMM_Reading>    Calibrate source function positive full scale.
17    Check 2430 for errors.
18    :CAL:PROT:SENS <DMM_Reading>    Calibrate sense function positive full scale.
19    Check 2430 for errors.
20    :SOUR:VOLT 0.0                  Set output to 0V.
21    Take DMM reading.               Read actual output value.
22    :CAL:PROT:SOUR <DMM_Reading>    Calibrate source positive zero.

1. Perform complete procedure for each range, where <Range> = 0.2, 2, 20, and 100, and <Source_value> = 0.2, 2, 20, and 100.
2. <DMM_Reading> parameter is multimeter reading from previous step.
3. Use :SYST:ERR? query to check for errors.

Step 3: Current calibration
1. Connect the Model 2430 to the digital multimeter (Figure 2-2), and select the multimeter DC current function.
2. Send the commands summarized in Table 2-10 in the order listed to initialize current calibration.
3. Calibrate the 10μA to 1A current ranges using the procedure summarized in Table 2-11. For each range:
• Send the :SOUR:CURR:RANG command to select the source and sense range being calibrated. For example, for the 1mA range, the command is:
:SOUR:CURR:RANG 1E-3
• Program the source to output the negative full-range value using the :SOUR:CURR command. For example:
:SOUR:CURR -1E-3
• Note and record the multimeter reading.
• Use the multimeter reading as the parameter for the :CAL:PROT:SOUR and :CAL:PROT:SENS commands. For example, a typical value for the 1mA range would be:
:CAL:PROT:SOUR -1.025E-3
:CAL:PROT:SENS -1.025E-3
• Program the current source for 0A output using the :SOUR:CURR 0.0 command.
• Note the multimeter reading.
• Send the source and sense calibration commands using the multimeter reading for the parameter. For example:
:CAL:PROT:SOUR 1E-6
:CAL:PROT:SENS 1E-6
• Set the source to the positive full-range value using the :SOUR:CURR command. For example, for the 1mA range:
:SOUR:CURR 1E-3
• Note and record the multimeter reading.
• Send the source and sense commands using the multimeter reading as the parameter. For example:
:CAL:PROT:SOUR 1.03E-3
:CAL:PROT:SENS 1.03E-3
• Send the :SOUR:CURR 0.0 command to set the source current to 0A.
• Note and record the multimeter reading.
• Send the :CAL:PROT:SOUR command using the multimeter reading as the command parameter. For example:
:CAL:PROT:SOUR 1E-6
4. Connect the 1Ω resistor and DMM to the Model 2430 INPUT/OUTPUT jacks, as shown in Figure 2-3. Select the DMM DC volts function.
5. Repeat step 3 for the 3A range using the calculated current as follows: I = V/R, where V is the DMM voltage reading, and R is the characterized value of the 1Ω resistor.
Table 2-10
Current calibration initialization commands

Command              Description
:SOUR:FUNC CURR      Select source current mode.
:SENS:VOLT:PROT 20   Voltage limit when current source is active.
:SENS:VOLT:RANG 20   Make sure 100V range is not active.
:OUTP:STAT ON        Turn source on.
Table 2-11
Current range calibration commands

Step  Command/procedure               Description
1     :SOUR:CURR:RANGE <Range>¹       Select source range.
2     :SOUR:CURR -<Source_value>      Establish negative full-range polarity.
3     Take DMM reading.               Read actual output value.
4     :CAL:PROT:SOUR <DMM_Reading>²   Calibrate source function negative full scale.
5     Check 2430 for errors.³
6     :CAL:PROT:SENS <DMM_Reading>    Calibrate sense function negative full scale.
7     Check 2430 for errors.
8     :SOUR:CURR 0.0                  Set output to 0A.
9     Take DMM reading.               Read actual output value.
10    :CAL:PROT:SOUR <DMM_Reading>    Calibrate source function negative zero.
11    Check 2430 for errors.
12    :CAL:PROT:SENS <DMM_Reading>    Calibrate sense function negative zero.
13    Check 2430 for errors.
14    :SOUR:CURR +<Source_value>      Establish positive full-range polarity.
15    Take DMM reading.               Read actual output value.
16    :CAL:PROT:SOUR <DMM_Reading>    Calibrate source function positive full scale.
17    Check 2430 for errors.
18    :CAL:PROT:SENS <DMM_Reading>    Calibrate sense function positive full scale.
19    Check 2430 for errors.
20    :SOUR:CURR 0.0                  Set output to 0A.
21    Take DMM reading.               Read actual output value.
22    :CAL:PROT:SOUR <DMM_Reading>    Calibrate source positive zero.

1. Perform complete procedure for each range, where <Range> and <Source_value> = 10E-6, 100E-6, 1E-3, 10E-3, 100E-3, 1, or 3.
2. <DMM_Reading> parameter is multimeter reading from previous step.
3. Use :SYST:ERR? query to check for errors.
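Scripted current calibration follows the same pattern as the voltage loop, using the Table 2-10 initialization and the Table 2-11 sequence. The sketch below again assumes the smu session, cal_send(), and read_dmm() helpers introduced earlier and is only an illustration: on the 3A range the DMM measures volts across the characterized 1Ω resistor, so the reading is converted with I = V/R before being used as the command parameter.

for cmd in (":SOUR:FUNC CURR", ":SENS:VOLT:PROT 20", ":SENS:VOLT:RANG 20", ":OUTP:STAT ON"):
    smu.write(cmd)                             # Table 2-10, in the order listed

R_SHUNT = 1.000                                # characterized value of the 1 ohm resistor (example value)

def cal_current_point(level, use_shunt, source_only=False):
    smu.write(f":SOUR:CURR {level}")
    reading = read_dmm()                       # amps directly, or volts across the shunt on the 3A range
    if use_shunt:
        reading = reading / R_SHUNT            # I = V/R for the 3A range
    cal_send(f":CAL:PROT:SOUR {reading}")
    if not source_only:
        cal_send(f":CAL:PROT:SENS {reading}")

for rng in (10e-6, 100e-6, 1e-3, 10e-3, 100e-3, 1, 3):
    shunt = (rng == 3)
    if shunt:
        input("Connect the 1 ohm resistor per Figure 2-3, set the DMM to DC volts, then press Enter")
    smu.write(f":SOUR:CURR:RANG {rng}")
    cal_current_point(-rng, shunt)             # negative full scale
    cal_current_point(0.0, shunt)              # negative zero
    cal_current_point(rng, shunt)              # positive full scale
    cal_current_point(0.0, shunt, source_only=True)   # positive zero (source only)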
Step 4: Program calibration dates
Use the calibration date and calibration due date commands (summarized in Table 2-5) to set the calibration date and the calibration due date. Note that the year, month, and day must be separated by commas. The allowable range for the year is from 1998 to 2097, the month is from 1 to 12, and the day is from 1 to 31.
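Table 2-5 describes these date commands only briefly; their exact names are documented in Appendix B. As a hedged illustration, the usual 2400-series forms are shown below with example dates sent through the cal_send() helper; treat the command names as assumptions to be verified against Appendix B.

cal_send(":CAL:PROT:DATE 2004, 10, 15")        # calibration date (year, month, day) - command name assumed
cal_send(":CAL:PROT:NDUE 2005, 10, 15")        # calibration due date (year, month, day) - command name assumed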
Step 5: Save calibration constants
Calibration is now complete, so you can store the calibration constants in EEPROM by sending the following command:
:CAL:PROT:SAVE
NOTE  Calibration will be temporary unless you send the SAVE command. Also, calibration data will not be saved if (1) calibration is locked, (2) invalid data exists, or (3) all steps were not completed.
Step 6: Lock out calibration
To lock out further calibration, send the following command after completing the calibration
procedure:
:CAL:PROT:LOCK
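In a script, the save and lock steps reduce to two commands; checking the error queue after SAVE confirms that none of the conditions listed in the NOTE above prevented the data from being stored. A minimal sketch, assuming the session and helper from the earlier examples:

cal_send(":CAL:PROT:SAVE")                     # store calibration constants in EEPROM; reports an error if not saved
smu.write(":CAL:PROT:LOCK")                    # lock out further calibration and inhibit SAVE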
Single-range calibration
Normally, the complete calibration procedure should be performed to ensure that the entire
instrument is properly calibrated. In some instances, however, you may want to calibrate only
certain ranges. To do so, complete the entire procedure only for the range(s) to be calibrated.
Keep in mind, however, that you must complete all parameter steps for each source or sense
range. Also be sure to set calibration dates and save calibration after calibrating the desired
range(s).
3
Routine Maintenance
CAUTION: FOR CONTINUED PROTECTION AGAINST FIRE HAZARD, REPLACE FUSE WITH SAME TYPE AND RATING.
WARNING: NO INTERNAL OPERATOR SERVICEABLE PARTS, SERVICE BY QUALIFIED PERSONNEL ONLY.
Introduction
The information in this section deals with routine maintenance that can be performed by the operator.
Line fuse replacement
WARNING  Disconnect the line cord at the rear panel, and remove all test leads connected to the instrument (front and rear) before replacing the line fuse.
The power line fuse is accessible from the rear panel, just above the AC power receptacle (Figure 3-1).
Figure 3-1
Rear panel
(The line fuse holder is located above the AC power receptacle and is marked LINE FUSE, SLOW BLOW, 3.15A, 250V. Line rating: 100-240VAC, 50/60Hz, 250VA maximum.)
Perform the following steps to replace the line fuse:
1. Carefully grasp and squeeze together the locking tabs that secure the fuse carrier to the fuse holder.
2. Pull out the fuse carrier, and replace the fuse with the type specified in Table 3-1.
CAUTION  To prevent instrument damage, use only the fuse type specified in Table 3-1.
3. Re-install the fuse carrier.
NOTE  If the power line fuse continues to blow, a circuit malfunction exists and must be corrected. Refer to the troubleshooting section of this manual for additional information.
Table 3-1
Power line fuse

Line voltage   Rating                             Keithley part no.
100-240V       250V, 3.15A, Slow Blow, 5 × 20mm   FU-106-3.15
4
Troubleshooting
Introduction
This section of the manual will assist you in troubleshooting and repairing the Model 2430.
Included are self-tests, test procedures, troubleshooting tables, and circuit descriptions. Note
that disassembly instructions are located in Section 5, and component layout drawings are at the
end of Section 6.
Safety considerations
WARNING  The information in this section is intended for qualified service personnel
only. Do not perform these procedures unless you are qualified to do so.
Some of these procedures may expose you to hazardous voltages that could
cause personal injury or death. Use caution when working with hazardous
voltages.
Be sure to observe the following precautions regarding heat sink cool-down
time and capacitor voltage bleed-down time:
Heat sink cool-down time: five minutes to 70°C, 15 minutes to 50°C.
Capacitor voltage bleed-off time: two minutes to 50V, five minutes to 5V.
Repair considerations
Before making any repairs to the Model 2430, be sure to read the following considerations.
CAUTION  The PC-boards are built using surface mount techniques and require specialized equipment and skills for repair. If you are not equipped and/or qualified, it is strongly recommended that you send the unit back to the factory for repairs or limit repairs to the PC-board replacement level. Without proper equipment and training, you could damage a PC-board beyond repair.
• Repairs will require various degrees of disassembly. However, it is recommended that the Front Panel Tests be performed prior to any disassembly. The disassembly instructions for the Model 2430 are contained in Section 5 of this manual.
• Do not make repairs to surface mount PC-boards unless equipped and qualified to do so. (See previous CAUTION.)
• When working inside the unit and replacing parts, adhere to the handling precautions and cleaning procedures explained in Section 5.
• Many CMOS devices are installed in the Model 2430. These static-sensitive devices require special handling as explained in Section 5.
• Whenever a circuit board is removed or a component is replaced, the Model 2430 must be recalibrated. See Section 2 for details on calibrating the unit.
Power-on self-test
During the power-on sequence, the Model 2430 will perform a checksum test on its EPROM
and test its RAM. If one of these tests fails, the instrument will lock up.
Front panel tests
There are three front panel tests: one to test the functionality of the front panel keys and two to test the display. In the event of a test failure, refer to Display board checks for details on troubleshooting the display board.
KEYS test
The KEYS test lets you check the functionality of each front panel key. Perform the following steps to run the KEYS test.
1. Display the MAIN MENU by pressing the MENU key.
2. Select TEST, and press ENTER to display the SELF-TEST MENU.
3. Select DISPLAY-TESTS, and press ENTER to display the following menu:
FRONT PANEL TESTS
KEYS DISPLAY-PATTERNS CHAR-SET
4. Select KEYS, and press ENTER to start the test. When a key is pressed, the label name for that key will be displayed to indicate that it is functioning properly. When the key is released, the message “No keys pressed” is displayed.
5. Pressing EXIT tests the EXIT key. However, the second consecutive press of EXIT aborts the test and returns the instrument to the SELF-TEST MENU. Continue pressing EXIT to back out of the menu structure.
DISPLAY PATTERNS test
The display test lets you verify that each pixel and annunciator in the vacuum fluorescent display is working properly. Perform the following steps to run the display test:
1. Display the MAIN MENU by pressing the MENU key.
2. Select TEST, and press ENTER to display the SELF-TEST MENU.
3. Select DISPLAY-TESTS, and press ENTER to display the following menu:
FRONT PANEL TESTS
KEYS DISPLAY-PATTERNS CHAR-SET
4. Select DISPLAY-PATTERNS, and press ENTER to start the display test. There are five parts to the display test. Each time a front panel key (except EXIT) is pressed, the next part of the test sequence is selected. The five parts of the test sequence are as follows:
• Checkerboard pattern (alternate pixels on) and all annunciators.
• Checkerboard pattern and the annunciators that are on during normal operation.
• Horizontal lines (pixels) of the first digit are sequenced.
• Vertical lines (pixels) of the first digit are sequenced.
• Each digit (and adjacent annunciator) is sequenced. All the pixels of the selected digit are on.
5. When finished, abort the display test by pressing EXIT. The instrument returns to the FRONT PANEL TESTS MENU. Continue pressing EXIT to back out of the menu structure.
CHAR SET test
The character set test lets you display all characters. Perform the following steps to run the character set test:
1. Display the MAIN MENU by pressing the MENU key.
2. Select TEST, and press ENTER to display the SELF-TEST MENU.
3. Select DISPLAY-TESTS, and press ENTER to display the following menu:
FRONT PANEL TESTS
KEYS DISPLAY-PATTERNS CHAR-SET
4. Select CHAR-SET, and press ENTER to start the character set test. Press any key except EXIT to cycle through all displayable characters.
5. When finished, abort the character set test by pressing EXIT. The instrument returns to the FRONT PANEL TESTS MENU. Continue pressing EXIT to back out of the menu structure.
Principles of operation
The following information is provided to support the troubleshooting tests and procedures
covered in this section of the manual. Refer to the following drawings:
Figure 4-1 — Overall block diagram
Figure 4-2 — Analog circuitry block diagram
Figure 4-3 — Power supply block diagram
Figure 4-4 — Output stage simplified schematic
Figure 4-5 — Digital circuitry block diagram
Overall block diagram
Figure 4-1 shows an overall block diagram of the Model 2430. Circuitry may be divided into
three general areas:
• Analog circuits — includes sourcing circuits such as the DACs, clamps, output stage, and feedback circuits, as well as measurement circuits such as the A/D converter.
• Digital circuits — includes the microcomputer that controls the analog section, front panel, and GPIB and RS-232 ports, as well as associated interfacing circuits.
• Power supplies — converts the AC line voltage into DC voltages that supply the power for the digital and analog circuits, and the output stage.
Figure 4-1
Overall block diagram
(Digital section: microcomputer, front panel controller, GPIB and RS-232 interfaces, trigger and digital I/O. Analog section: DACs, clamps, feedback, output stage, guard buffer, A/D converter. Power supplies for the analog circuits, output stage, and digital circuits, derived from the AC line input.)
Analog circuits
Figure 4-2 shows a block diagram of the analog circuits.
D/A converters control the programmed voltage and current, or voltage compliance and current compliance. Each DAC has two ranges, a 10V full-scale output or a 1V full-scale output. The DAC outputs are fed to the summing node, FB. Either the V DAC or the I DAC has the ability to control the main loop. If the unit is set for SV (source voltage), it will source voltage until the compliance current is reached (as determined by the I DAC setting), and the current loop will override the voltage loop. If, however, the unit is set for SI (source current), it will source current until the compliance voltage is reached (as determined by the V DAC setting), and the voltage loop will override the current loop. A priority bit in the V clamp/I clamp circuit controls these functions.
The error amplifier adds open-loop gain and slew-rate control to the system to assure accuracy and provide a controllable signal for the output stage, which provides the necessary voltage and current gain to drive the output. Sense resistors in the HI output lead provide output current sensing, and a separate sense resistor is used for each current range. The 1A and 3A ranges use 0.2V full scale for a full-range output, while all other ranges use 2V output for full-scale current. Voltage feedback is routed either internally or externally.
Figure 4-2
Analog circuitry block diagram
(V DAC and I DAC feeding the FB summing node, V clamp/I clamp control, error amplifier, output stage with ±42V/±150V rails, range sense resistors, protection circuits, and a MUX routing VFB and IFB to the A/D converter; output HI/LO, sense, and guard/guard sense terminals.)
There are four voltage ranges: 0.2V, 2V, 20V, and 100V. The feedback gain changes only for
the 20V and 100V ranges, resulting in three unique feedback gain values. A multiplexer directs
the voltage feedback, current feedback, reference, or ground signal to the A/D converter. An
opto-isolated interface provides control signals for both DACs, analog circuit control, and A/D
converter communication to the digital section.
Power supply
Figure 4-3 shows a block diagram of the Model 2430 power delivery system.
The offline switching power supply provides all power for the instrument while providing
universal inputs and power factor correction for the 120/240V line. The digital board runs off of
5V and 12V supplies derived from the switcher. The +12VD supply is set to program the flash
ROM. (See Digital circuitry below.)
A constant-frequency switching supply runs off the +24VD supplies and generates all the floating and output supply voltages for the analog board: +5V and ±15V, ±42V and ±150V.
Figure 4-3
Power supply block diagram
(PFC front end and DC/DC converter produce +24VDC; a constant-frequency, low-noise floating switching supply generates the ±15V and +5V floating analog supplies and the ±42V and ±150V output stage supplies; +5VDC and +12VDC supplies feed the digital circuits.)
Output stage
Figure 4-4 shows a simplified schematic of the output stage.
The Model 2430 output stage serves two purposes: (1) it converts signals from floating common to output common, and (2) it provides both voltage and current amplification. The output
stage drive transistors are biased in class B configuration to prevent the possibility of thermal
runaway with high-current output values.
Output transistors Q518 and Q521 are cascoded with output MOSFETs Q516 and Q523. All other MOSFETs and transistors are slaves, and the voltages across these devices are determined by the resistor-capacitor ladder circuits shown. High-current drive capability is provided by Q500-Q511. Coarse current limits are built into the output stage.
Figure 4-4
Output stage simplified schematic
(Main drive from the ±15VF floating supplies; output transistors Q518 and Q521 cascoded with output MOSFETs Q516 and Q523; high-current drive devices Q500-Q511 on the ±42V/±85V supply rails.)
A/D converter
The SourceMeter unit uses a multi-slope charge balance A/D converter with a single-slope run-down. The converter is controlled by gate array U610. Commands are issued by the MPU on the digital board through communications opto-isolators to U610, and U610 sends A/D reading data back through opto-isolators to the digital board for calibration and processing.
Active guard
The Model 2430 has an active guard or “six-wire ohms” circuit used to measure complex devices. This circuitry provides a low-current (50mA) equivalent of the voltage on output HI. If the unit is in the SV mode, the low-current equivalent of the source voltage will appear on the guard terminal. If the unit is in the SI mode, the voltage on output HI is equal to the source current multiplied by the external resistance value. An equivalent voltage will be generated by the guard circuit, and a guard sense terminal is provided to sense around the voltage drop in the guard leads since significant current can flow (50mA).
Digital circuitry
Refer to Figure 4-5 for the following discussion on digital circuitry.
The core digital circuitry uses a Motorola 68332 microcontroller running at 16.78MHz. The
memory configuration includes two 256K × 8-bit flash EEPROMs and two 128K × 8-bit RAMs
used in parallel to utilize the 16-bit data bus of the MPU. The RAM is battery backed-up,
providing continued storage of data buffer information during power-down cycles, and flash
ROM support allows internal firmware upgrades using either the serial or GPIB port for
downloading new firmware. All calibration constants and the save 0 setup are stored in a
separate serial EEPROM. Setups 1 through 4 are stored in battery backed-up RAM.
External communication is provided via GPIB and serial interfaces. A 9914 GPIB IEEE-488 standard interface IC is used for the GPIB, and a 68332 Queued Serial Module (QSM) provides the serial UART. For internal communications, the Time Processing Unit (TPU) is used for serial communications with the front panel display module, and both the TPU and QSM handle digital-to-analog interfacing.
Figure 4-5
Digital circuitry block diagram
(Microprocessor U3 with 16.78MHz clock, ROM U15/U16, RAM U12/U14, E²PROM U17, A/D interface U9/U25, serial (RS-232) interface U4, IEEE-488 (GPIB) interface U6/U13, trigger and digital I/O circuits U7/U20/U23, and the serial link to the display board controller.)
Display board circuit theory
Display board components are shown in the digital circuitry block diagram in Figure 4-5.
U902 is the display microcontroller that controls the VFD (vacuum fluorescent display) and interprets key data. The microcontroller has four peripheral I/O ports that are used for the various control and read functions.
Display data is serially transmitted to the microcontroller from the digital board via the TXB line to the microcontroller PD0 terminal. In a similar manner, key data is serially sent back to the digital board through the RXB line via PD1. The 4MHz clock for the microcontroller is generated on the digital board.
DS901 is the VFD (vacuum fluorescent display) module, which can display up to 49 characters. Each character is organized as a 5 × 7 matrix of dots or pixels and includes a long underbar segment to act as a cursor.
The display uses a common multiplexing scheme with each character refreshed in sequence. U903 and U904 are the grid drivers, and U901 and U905 are the dot drivers. Note that dot driver and grid driver data is serially transmitted from the microcontroller (PD3 and PC1).
The front panel keys (S901-S931) are organized into a row-column matrix to minimize the number of microcontroller peripheral lines required to read the keyboard. A key is read by strobing the columns and reading all rows for each strobed column. Key down data is interpreted by the display microcontroller and sent back to the main microprocessor using proprietary encoding schemes.
Troubleshooting
Troubleshooting information for the various circuits is summarized below.
Display board checks
If the front panel display tests indicate that there is a problem on the display board, use Table 4-1. See “Principles of operation” for display circuit theory.
Table 4-1
Display board checks
Step  Item/component     Required condition                                  Remarks
1     Front panel test   Verify that all segments operate.                   Use front panel display test.
2     J1033              +5V, ±5%                                            Digital +5V supply.
3     U902, pin 1        Goes low briefly on power up, and then goes high.   Microcontroller RESET.
4     U902, pin 43       4MHz square wave.                                   Controller 4MHz clock.
5     U902, pin 32       Pulse train every 1ms.                              Control from main processor.
6     U902, pin 33       Brief pulse train when front panel key is pressed.  Key down data sent to main processor.
Power supply checks
Power supply problems can be checked using Table 4-2. See “Principles of operation” for circuit theory on the power supply.
Table 4-2
Power supply checks

Step  Item/component   Required condition                        Remarks
1     Line fuse        Check continuity.                         Remove to check.
2     Line power       Plugged into live receptacle, power on.   Check for correct power-up sequence.
3     TP502            +150V, ±5%                                Referenced to TP501.
4     TP503            -150V, ±5%                                Referenced to TP501.
5     TP504            +38V, ±10%                                Referenced to TP501.
6     TP505            -38V, ±10%                                Referenced to TP501.
7     TP507            +15V, ±5%                                 +15VF, referenced to TP500.
8     TP508            -15V, ±5%                                 -15VF, referenced to TP500.
9     TP510            +5V, ±5%                                  +5VF, referenced to TP500.

Digital circuitry checks
Digital circuit problems can be checked out using Table 4-3. See “Principles of operation”
for a digital circuit description.
Table 4-3
Digital circuitry checks

Step  Item/component     Required condition                 Remarks
1     Power-on test      RAM OK, ROM OK.                    Verify that RAM and ROM are functional.
2     U3 pin 19          Digital common.                    All signals referenced to digital common.
3     U3 pin 7           +5V                                Digital logic supply.
4     U3 pin 68          Low on power-up, then goes high.   MPU RESET line.
5     U3, lines A0-A19   Check for stuck bits.              MPU address bus.
6     U3, lines D0-D15   Check for stuck bits.              MPU data bus.
7     U3 pin 66          16.78MHz.                          MPU clock.
8     U4 pin 7           Pulse train during RS-232 I/O.     RS-232 RX line.
9     U4 pin 8           Pulse train during RS-232 I/O.     RS-232 TX line.
10    U13 pins 34-42     Pulse train during IEEE-488 I/O.   IEEE-488 data bus.
11    U13 pins 26-31     Pulses during IEEE-488 I/O.        IEEE-488 command lines.
12    U13 pin 24         Low with remote enabled.           IEEE-488 REN line.
13    U13 pin 25         Low during interface clear.        IEEE-488 IFC line.
14    U3 pin 43          Pulse train.                       D_ADDATA
15    U3 pin 44          Pulse train.                       D_DATA
16    U3 pin 45          Pulse train.                       D_CLK
17    U3 pin 47          Pulse train.                       D_STB
NOTE  The test points in Table 4-4 are located under the pulse board and may be difficult to access without using a jumper cable for J1024.
>100V voltage protection
SOURCE +10V
SOURCE +10V (SVMI)
SOURCE +10V
SOURCE +10V
OUTPUT COM
OUTPUT COM
SVMI, OUTPUT ON, 20V
Bench defaults
-13V ±1V
-5V ±0.5V
-10V ±1V
-10.5V ±1V
0V ±0.1V
7V ±0.7V
7V ±0.7V
20V ±0.5V
6.4V ±0.6V
Battery replacement
WARNING  Disconnect the instrument from the power line and all other equipment before changing the battery.
The volatile memories of the Model 2430 are protected by a replaceable battery when power is off. Typical battery life is approximately 10 years, but the battery should be replaced if the voltage drops below 2.5V regardless of age. The battery should be suspected if the instrument no longer retains buffer data or user-defined operating parameters such as instrument setups, source memory, and math expressions. If the battery is absent or totally exhausted, the display will show the “Reading buffer data lost” message shortly after the Model 2430 is switched on.
The battery is a 3V wafer-type lithium cell (Keithley part number BA-46), which is located on the digital board. Replacement of the battery requires removal of the case cover and analog board assembly. (See Section 5.) Use only the recommended battery.
NOTE  Calibration constants and user-defined parameters will be lost when the battery is replaced. The Model 2430 must be re-calibrated (Section 2) after the battery is replaced.
Battery replacement precautions
WARNING  The following precautions must be followed to avoid personal injury.
1. Wear safety glasses or goggles when working with lithium batteries.
2. Do not short the battery terminals together.
3. Keep lithium batteries away from all liquids.
4. Do not attempt to recharge lithium batteries.
5. Observe proper polarity when installing the battery.
6. Do not incinerate or otherwise expose the battery to excessive heat (>60°C).
7. Bulk quantities of lithium batteries should be disposed of as hazardous waste.
Battery replacement procedure
1. Remove the case cover and analog board assembly as covered in Section 5.
2. Locate the battery on the digital board.
3. Carefully unsolder and remove the old battery.
4. Install and solder the new battery in place.
5. Re-assemble the instrument, and turn it on. The “Reading buffer data lost” error message will be displayed.
6. Send the :SYST:MEM:INIT command via remote to perform the following:
• Clear the reading buffer.
• Initialize instrument setups 1 through 4 to present instrument settings.
• Initialize all 100 source memory locations to present instrument settings.
• Delete user math expressions.
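The :SYST:MEM:INIT command can be sent over either the GPIB or RS-232 interface. A minimal pyvisa sketch is shown below; the resource string is an assumption and should match your setup.

import pyvisa

rm = pyvisa.ResourceManager()
smu = rm.open_resource("GPIB0::24::INSTR")     # assumed GPIB primary address
smu.write(":SYST:MEM:INIT")                    # clear buffer, initialize setups/source memory, delete math expressions
print(smu.query(":SYST:ERR?").strip())         # expect 0,"No error"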
No comm link error
A “No Comm Link” error indicates that the front panel processor has stopped communicating with the main processor, which is located on the digital board. This error indicates that one of the main processor ROMs may require re-seating in its socket. ROMs may be reseated as follows:
1. Turn off the power, and disconnect the line cord and all other test leads and cables from the instrument.
2. Remove the case cover as outlined in Section 5.
3. Remove the analog board assembly as outlined in Section 5.
4. Locate the two firmware ROMs, U15 and U16, located on the digital board. These are the only ICs installed in sockets. (Refer to the component layout drawing at the end of Section 6 for exact locations.)
5. Carefully push down on each ROM IC to make sure it is properly seated in its socket.
CAUTION  Be careful not to push down excessively; the digital board could crack.
6. Connect the line cord, and turn on the power. If the problem persists, additional troubleshooting will be required.
5
Disassembly
Introduction
This section explains how to handle, clean, and disassemble the Model 2430. Disassembly
drawings are located at the end of this section.
Handling and cleaning
To avoid contaminating PC board traces with body oil or other foreign matter, avoid touching the PC board traces while you are repairing the instrument. Motherboard areas covered by the shield have high-impedance devices or sensitive circuitry where contamination could cause degraded performance.
Handling PC boards
Observe the following precautions when handling PC boards:
• Wear cotton gloves.
• Only handle PC boards by the edges and shields.
• Do not touch any board traces or components not associated with repair.
• Do not touch areas adjacent to electrical contacts.
• Use dry nitrogen gas to clean dust off PC boards.
Solder repairs
Observe the following precautions when you must solder a circuit board:
• Use an OA-based (organic activated) flux, and take care not to spread the flux to other areas of the circuit board.
• Remove the flux from the work area when you have finished the repair by using pure water with clean, foam-tipped swabs or a clean, soft brush.
• Once you have removed the flux, swab only the repair area with methanol, then blow-dry the board with dry nitrogen gas.
• After cleaning, allow the board to dry in a 50°C, low-humidity environment for several hours.
Static sensitive devices
CMOS devices operate at very high impedance levels. Therefore, any static that builds up on you or your clothing may be sufficient to destroy these devices if they are not handled properly. Use the following precautions to avoid damaging them:
CAUTION  Many CMOS devices are installed in the Model 2430. Handle all semiconductor devices as being static sensitive.
• Transport and handle ICs only in containers specially designed to prevent static build-up. Typically, you will receive these parts in anti-static containers made of plastic or foam. Keep these devices in their original containers until ready for installation.
• Remove the devices from their protective containers only at a properly grounded workstation. Also, ground yourself with a suitable wrist strap.
• Handle the devices only by the body; do not touch the pins.
• Ground any printed circuit board into which a semiconductor device is to be inserted to the bench or table.
• Use only anti-static type desoldering tools.
• Use only grounded-tip solder irons.
• Once the device is installed in the PC board, it is normally adequately protected, and you can handle the boards normally.
Assembly drawings
Use the assembly drawings located at the end of this section to assist you as you disassemble and re-assemble the Model 2430. Also, refer to these drawings for information about the Keithley part numbers of most mechanical parts in the unit.
Case cover removal
Follow the steps below to remove the case cover to gain access to internal parts.
WARNING  Before removing the case cover, disconnect the line cord and any test leads from the instrument.
1. Remove handle — The handle serves as an adjustable tilt-bail. Adjust its position by gently pulling it away from the sides of the instrument case and swinging it up or down. To remove the handle, swing the handle below the bottom surface of the case and back until the orientation arrows on the handles line up with the orientation arrows on the mounting ears. With the arrows lined up, pull the ends of the handle away from the case.
2. Remove mounting ears — Remove the screw that secures each mounting ear. Pull down and out on each mounting ear.
NOTE  When re-installing the mounting ears, make sure to mount the right ear to the right side of the chassis, and the left ear to the left side of the chassis. Each ear is marked “RIGHT” or “LEFT” on its inside surface.
3. Remove rear bezel — To remove the rear bezel, loosen the two screws that secure the rear bezel to the chassis, then pull the bezel away from the case.
4. Remove grounding screws — Remove the two grounding screws that secure the case to the chassis. They are located on the bottom of the case at the back.
5. Remove chassis — To remove the case, grasp the front bezel of the instrument, and carefully slide the chassis forward. Slide the chassis out of the metal case.
NOTE  To gain access to the components under the analog board shield, remove the shield, which is secured to the analog board by a single screw.
Analog board removal
Perform the following steps to remove the analog board. This procedure assumes that the case cover is already removed.
1. Remove the small pulse board before removing the analog board.
2. Disconnect the front and rear input terminals.
You must disconnect these input terminal connections for both the front and rear inputs:
• INPUT/OUTPUT HI and LO
• 4-WIRE SENSE HI and LO
• V, Ω, GUARD and GUARD SENSE (rear panel only)
Remove all the connections by pulling the wires off the pin connectors, then remove the ferrite noise filters from the chassis. During re-assembly, use the following table to identify input terminals:

Input terminals     Front wire color   Rear wire color
INPUT/OUTPUT HI     Red                White/Red
INPUT/OUTPUT LO     Black              White/Black
4-WIRE SENSE HI     Yellow             White/Yellow
4-WIRE SENSE LO     Gray               White/Gray
V, Ω, GUARD         -                  White
GUARD SENSE         -                  Blue/White

3. Unplug cables.
• Carefully unplug the ribbon cables at J1027, J1028, and J1029.
• Unplug the ON/OFF cable at J1034.
4. Remove screws.
• Remove two fastening screws that secure the analog board assembly to the chassis. These screws are located on the side of the board opposite from the heat sink.
• Remove two screws that secure the heat sink to the chassis.
5. Remove analog board assembly.
After all screws have been removed, carefully lift the analog board assembly free of the main chassis.
6. Disassemble analog board assembly.
• Remove the screws that secure the analog board and heat sink to the analog board subchassis.
• Carefully remove the heat sink by sliding the clips off the power transistors.
CAUTION  Be careful not to damage the heat sink insulation layer.
• Remove the analog board from the subchassis.
• Remove four screws that secure the bottom cover, then remove the cover from the bottom of the PC board.
NOTE  When re-installing the heat sink, make sure that all clips are properly installed and centered on each pair of output transistors.
Digital board removal
Perform the following steps to remove the digital board. This procedure assumes that the analog board assembly is already removed.
NOTE  In order to remove the digital board, the display board must first be removed.
1. Remove IEEE-488, Digital I/O, and RS-232 fasteners.
The IEEE-488, Digital I/O, and RS-232 connectors each have two nuts that secure the connectors to the rear panel. Remove these nuts.
2. Remove POWER switch rod.
At the switch, place the edge of a flat-blade screwdriver in the notch on the pushrod. Gently twist the screwdriver while pulling the rod from the shaft.
3. Unplug cables:
• Unplug the display board ribbon cable.
• Unplug the cables going to the power supply.
• Unplug the rear panel power module cable.
• The fan may need to be removed.
4. Remove digital board.
Slide the digital board forward until it is free of the guide pins, then remove the board.
During re-assembly, replace the board, and start the IEEE-488, Digital I/O, and RS-232 connector nuts and the mounting screw. Tighten all the fasteners once they are all in place and the board is correctly aligned.
Front panel disassembly
Use the following procedures to remove the display board and/or the pushbutton switch pad.
1. Unplug the display board ribbon cables.
2. Remove front panel assembly.
This assembly has four retaining clips that snap onto the chassis over four pem nut studs. Two retaining clips are located on each side of the front panel. Pull the retaining clips outward and, at the same time, pull the front panel assembly forward until it separates from the chassis.
3. Using a thin-bladed screwdriver, pry the plastic PC board stop (located at the bottom of the display board) until the bar separates from the casing. Pull the display board from the front panel.
4. Remove the switch pad by pulling it from the front panel.
Removing power components
The following procedures to remove the power supply and/or power module require that the case cover and motherboard be removed, as previously explained.
Power module removal
Perform the following steps to remove the rear panel power module:
1. Remove the analog board.
2. Unplug the cable connecting the power module to the digital board.
3. Disconnect the power module’s ground wire. This green and yellow wire connects to a threaded stud on the chassis with a kep nut.
4. Squeeze the latches on either side of the power module while pushing the module from the access hole.
WARNING  To avoid electrical shock, which could result in injury or death, the ground wire of the power module must be connected to chassis ground. When installing the power module, be sure to re-connect the green and yellow ground wire to the threaded stud on the chassis.
Instrument re-assembly
Re-assemble the instrument by reversing the previous disassembly procedures. Make sure that all parts are properly seated and secured, and that all connections are properly made. To ensure proper operation, replace the analog signal wire ferrite noise filters, and securely fasten the shield.
WARNING  To ensure continued protection against electrical shock, verify that power line ground (the green and yellow wire attached to the power module) is connected to the chassis.
Also make sure the two bottom case screws are properly installed to secure and ground the case cover to the chassis.
6
Replaceable Parts
Introduction
This section contains replacement parts information and component layout drawings for the
Model 2430.
Parts lists
The electrical parts lists for the Model 2430 are shown in tables at the end of this section. For part numbers of the various mechanical parts and assemblies, use the Miscellaneous parts list and the assembly drawings provided at the end of Section 5.
Ordering information
To place an order, or to obtain information concerning replacement parts, contact your Keithley representative or the factory (see inside front cover for addresses). When ordering parts, be sure to include the following information:
• Instrument model number (Model 2430)
• Instrument serial number
• Part description
• Component designation (if applicable)
• Keithley part number
Factory service
If the instrument is to be returned to Keithley Instruments for repair, perform the following:
• Call the Repair Department at 1-800-552-1115 for a Return Material Authorization (RMA) number.
• Complete the service form at the back of this manual, and include it with the instrument.
• Carefully pack the instrument in the original packing carton.
• Write ATTENTION REPAIR DEPARTMENT and the RMA number on the shipping label.
Component layouts
The component layouts for the various circuit boards are provided on the following pages.