
www.tek.com/keithley
Model DMM7510 7½ Digit Graphical Sampling Multimeter
Calibration and Adjustment Manual
DMM7510-905-01 Rev. C / October 2018
A Tektronix Company
7½ Digit Graphical Sampling Multimeter
Model DMM7510
Calibration and Adjustment Manual
© 2018, Keithley Instruments, LLC
Cleveland, Ohio, U.S.A.
All rights reserved.
Any unauthorized reproduction, photocopy, or use of the information herein, in whole or in part,
without the prior written approval of Keithley Instruments, LLC, is strictly prohibited.
These are the original instructions in English.
TSP®, TSP-Link®, and TSP-Net® are trademarks of Keithley Instruments, LLC. All Keithley
Instruments product names are trademarks or registered trademarks of Keithley Instruments, LLC.
Other brand names are trademarks or registered trademarks of their respective holders.
The Lua 5.0 software and associated documentation files are copyright © 1994-2008, Tecgraf,
PUC-Rio. Terms of license for the Lua software and associated documentation can be accessed at
the Lua licensing site (http://www.lua.org/license.html).
Microsoft, Visual C++, Excel, and Windows are either registered trademarks or trademarks of
Microsoft Corporation in the United States and/or other countries.
Document number: DMM7510-905-01 Rev. C / October 2018

Safety precautions

The following safety precautions should be observed before using this product and any associated instrumentation. Although some instruments and accessories would normally be used with nonhazardous voltages, there are situations where hazardous conditions may be present.
This product is intended for use by personnel who recognize shock hazards and are familiar with the safety precautions required to avoid possible injury. Read and follow all installation, operation, and maintenance information carefully before using the product. Refer to the user documentation for complete product specifications.
If the product is used in a manner not specified, the protection provided by the product warranty may be impaired.
The types of product users are:
Responsible body is the individual or group responsible for the use and maintenance of equipment, for ensuring that the equipment is operated within its specifications and operating limits, and for ensuring that operators are adequately trained.
Operators use the product for its intended function. They must be trained in electrical safety procedures and proper use of the instrument. They must be protected from electric shock and contact with hazardous live circuits.
Maintenance personnel perform routine procedures on the product to keep it operating properly, for example, setting the line voltage or replacing consumable materials. Maintenance procedures are described in the user documentation. The procedures explicitly state if the operator may perform them. Otherwise, they should be performed only by service personnel.
Service personnel are trained to work on live circuits, perform safe installations, and repair products. Only properly trained service personnel may perform installation and service procedures.
Keithley products are designed for use with electrical signals that are measurement, control, and data I/O connections, with low transient overvoltages, and must not be directly connected to mains voltage or to voltage sources with high transient overvoltages. Measurement Category II (as referenced in IEC 60664) connections require protection for high transient overvoltages often associated with local AC mains connections. Certain Keithley measuring instruments may be connected to mains. These instruments will be marked as category II or higher.
Unless explicitly allowed in the specifications, operating manual, and instrument labels, do not connect any instrument to mains.
Exercise extreme caution when a shock hazard is present. Lethal voltage may be present on cable connector jacks or test fixtures. The American National Standards Institute (ANSI) states that a shock hazard exists when voltage levels greater than 30 V RMS, 42.4 V peak, or 60 VDC are present. A good safety practice is to expect that hazardous voltage is present in any unknown circuit before measuring.
Operators of this product must be protected from electric shock at all times. The responsible body must ensure that operators are prevented access and/or insulated from every connection point. In some cases, connections must be exposed to potential human contact. Product operators in these circumstances must be trained to protect themselves from the risk of electric shock. If the circuit is capable of operating at or above 1000 V, no conductive part of the circuit may be exposed.
Do not connect switching cards directly to unlimited power circuits. They are intended to be used with impedance-limited sources. NEVER connect switching cards directly to AC mains. When connecting sources to switching cards, install protective devices to limit fault current and voltage to the card.
Before operating an instrument, ensure that the line cord is connected to a properly-grounded power receptacle. Inspect the connecting cables, test leads, and jumpers for possible wear, cracks, or breaks before each use.
When installing equipment where access to the main power cord is restricted, such as rack mounting, a separate main input power disconnect device must be provided in close proximity to the equipment and within easy reach of the operator.
For maximum safety, do not touch the product, test cables, or any other instruments while power is applied to the circuit under test. ALWAYS remove power from the entire test system and discharge any capacitors before: connecting or disconnecting cables or jumpers, installing or removing switching cards, or making internal changes, such as installing or removing jumpers.
Do not touch any object that could provide a current path to the common side of the circuit under test or power line (earth) ground. Always make measurements with dry hands while standing on a dry, insulated surface capable of withstanding the voltage being measured.
For safety, instruments and accessories must be used in accordance with the operating instructions. If the instruments or accessories are used in a manner not specified in the operating instructions, the protection provided by the equipment may be impaired.
Do not exceed the maximum signal levels of the instruments and accessories. Maximum signal levels are defined in the specifications and operating information and shown on the instrument panels, test fixture panels, and switching cards.
When fuses are used in a product, replace with the same type and rating for continued protection against fire hazard.
Chassis connections must only be used as shield connections for measuring circuits, NOT as protective earth (safety ground) connections.
If you are using a test fixture, keep the lid closed while power is applied to the device under test. Safe operation requires the use of a lid interlock.
If a screw is present, connect it to protective earth (safety ground) using the wire recommended in the user documentation.
The symbol on an instrument means caution, risk of hazard. The user must refer to the operating instructions located in the user documentation in all cases where the symbol is marked on the instrument.
The symbol on an instrument means warning, risk of electric shock. Use standard safety precautions to avoid personal contact with these voltages.
The symbol on an instrument shows that the surface may be hot. Avoid personal contact to prevent burns.
The symbol indicates a connection terminal to the equipment frame.
If this symbol is on a product, it indicates that mercury is present in the display lamp. Please note that the lamp must be properly disposed of according to federal, state, and local laws.
The WARNING heading in the user documentation explains hazards that might result in personal injury or death. Always read the associated information very carefully before performing the indicated procedure.
The CAUTION heading in the user documentation explains hazards that could damage the instrument. Such damage may invalidate the warranty.
The CAUTION heading with the symbol in the user documentation explains hazards that could result in moderate or minor injury or damage the instrument. Always read the associated information very carefully before performing the indicated procedure. Damage to the instrument may invalidate the warranty.
Instrumentation and accessories shall not be connected to humans.
Before performing any maintenance, disconnect the line cord and all test cables.
To maintain protection from electric shock and fire, replacement components in mains circuits — including the power transformer, test leads, and input jacks — must be purchased from Keithley. Standard fuses with applicable national safety approvals may be used if the rating and type are the same. The detachable mains power cord provided with the instrument may only be replaced with a similarly rated power cord. Other components that are not safety-related may be purchased from other suppliers as long as they are equivalent to the original component (note that selected parts should be purchased only through Keithley to maintain accuracy and functionality of the product). If you are unsure about the applicability of a replacement component, call a Keithley office for information.
Unless otherwise noted in product-specific literature, Keithley instruments are designed to operate indoors only, in the following environment: Altitude at or below 2,000 m (6,562 ft); temperature 0 °C to 50 °C (32 °F to 122 °F); and pollution degree 1 or 2.
To clean an instrument, use a cloth dampened with deionized water or mild, water-based cleaner. Clean the exterior of the instrument only. Do not apply cleaner directly to the instrument or allow liquids to enter or spill on the instrument. Products that consist of a circuit board with no case or chassis (e.g., a data acquisition board for installation into a computer) should never require cleaning if handled according to instructions. If the board becomes contaminated and operation is affected, the board should be returned to the factory for proper cleaning/servicing.
Safety precaution revision as of June 2017.
Table of contents

Introduction ................................................................................................................ 1-1
Welcome .............................................................................................................................. 1-1
Introduction to this manual ................................................................................................... 1-1
Extended warranty ............................................................................................................... 1-2
Contact information .............................................................................................................. 1-2
Performance verification ........................................................................................... 2-1
Introduction .......................................................................................................................... 2-1
Factory service ..................................................................................................................... 2-2
Verification test requirements .............................................................................................. 2-2
Environmental conditions .......................................................................................................... 2-2
Warmup period .......................................................................................................................... 2-2
Line power................................................................................................................................. 2-2
Recommended test equipment ................................................................................................. 2-3
Autocalibration ..................................................................................................................... 2-3
Running autocalibration ............................................................................................................ 2-4
Scheduling autocalibration ........................................................................................................ 2-5
Reviewing calibration information .............................................................................................. 2-6
Monitoring internal temperature ................................................................................................ 2-6
Calibration verification limits ................................................................................................ 2-7
Example reading limit calculation .............................................................................................. 2-7
Calculating resistance reading limits ......................................................................................... 2-7
Performing the verification test procedures ......................................................................... 2-8
Test summary ........................................................................................................................... 2-8
Test considerations ................................................................................................................... 2-9
Front-panel calibration verification ....................................................................................... 2-9
DC voltage verification ............................................................................................................ 2-10
AC voltage verification ............................................................................................................ 2-12
Digitize voltage verification ...................................................................................................... 2-16
Frequency verification ............................................................................................................. 2-19
Simulated thermocouple Type J temperature verification ....................................................... 2-20
Simulated RTD temperature verification ................................................................................. 2-22
Resistance verification ............................................................................................................ 2-26
DC current verification ............................................................................................................. 2-34
Digitize current verification ...................................................................................................... 2-39
AC current verification ............................................................................................................. 2-41
Capacitance verification .......................................................................................................... 2-44
Verifying zero values using a 4-wire short ............................................................................... 2-46
Rear-panel verification ....................................................................................................... 2-48
DC current 10 A range verification .......................................................................................... 2-48
Digitize current 10 A range verification .................................................................................... 2-50
AC current 10 A verification..................................................................................................... 2-52
Adjustment ................................................................................................................. 3-1
Introduction .......................................................................................................................... 3-1
Environmental conditions ..................................................................................................... 3-2
Temperature and relative humidity ............................................................................................ 3-2
Line power................................................................................................................................. 3-2
Warmup period ..................................................................................................................... 3-2
Adjustment overview ............................................................................................................ 3-2
Recommended test equipment ............................................................................................ 3-3
General adjustment considerations ..................................................................................... 3-3
Initial instrument setup ......................................................................................................... 3-4
Select the correct terminals ....................................................................................................... 3-4
Select the TSP command set .................................................................................................... 3-4
Verify instrument date and time ................................................................................................. 3-5
Set up remote connections ....................................................................................................... 3-5
Unlock calibration ...................................................................................................................... 3-6
Remote calibration adjustment procedures ......................................................................... 3-6
Rear-terminal adjustment steps ................................................................................................ 3-6
Front-terminal adjustment steps .............................................................................................. 3-10
Save calibration and set the adjustment dates ........................................................................ 3-20
Setting time, adjustment, and verification dates ...................................................................... 3-20
Adjustment command timing and error checking .................................................................... 3-21
Example calibration adjustment code ...................................................................................... 3-23
TSP command reference ........................................................................................... 4-1
TSP commands .................................................................................................................... 4-1
Introduction ............................................................................................................................... 4-1
acal.count .................................................................................................................................. 4-1
acal.lastrun.internaltemp ........................................................................................................... 4-2
acal.lastrun.tempdiff .................................................................................................................. 4-3
acal.lastrun.time ........................................................................................................................ 4-4
acal.nextrun.time ....................................................................................................................... 4-5
acal.revert() ............................................................................................................................... 4-6
acal.run() ................................................................................................................................... 4-6
acal.schedule() .......................................................................................................................... 4-7
cal.adjust.ac() ............................................................................................................................ 4-8
cal.adjust.count ....................................................................................................................... 4-10
cal.adjust.date ......................................................................................................................... 4-11
cal.adjust.dc() .......................................................................................................................... 4-12
cal.adjust.internaltemp ............................................................................................................ 4-13
cal.adjust.rear.ac() .................................................................................................................. 4-13
cal.adjust.rear.dc() .................................................................................................................. 4-14
cal.adjust.tempdiff ................................................................................................................... 4-15
cal.lock() .................................................................................................................................. 4-16
cal.password ........................................................................................................................... 4-17
cal.save()................................................................................................................................. 4-17
cal.unlock() .............................................................................................................................. 4-18
cal.verify.date .......................................................................................................................... 4-19
Section 1

Introduction

In this section:
Welcome .................................................................................. 1-1
Introduction to this manual ....................................................... 1-1
Extended warranty ................................................................... 1-2
Contact information .................................................................. 1-2

Welcome

Thank you for choosing a Keithley Instruments product. The Keithley Instruments Model DMM7510 is a 7½ digit graphical sampling multimeter that expands standard DMM functions with high-speed digitizing and a large graphical color touchscreen display. This DMM offers a broad range of measurement capabilities, including 17 measurement functions. In addition to industry-leading DC accuracies, functions such as capacitance, 10 A current, and 18-bit current and voltage digitizing are included. Tying all these features together is a large 5-inch color touchscreen display that brings users an unprecedented combination of data visualization and interaction, enabling users to gain deeper insight into their measurements.
The DMM7510 provides superior measurement accuracy and the speed necessary for a broad range of applications, from system applications and production testing to benchtop applications. The DMM7510 meets application requirements for production engineers, research and development engineers, test engineers, and scientists.

Introduction to this manual

This manual provides instructions to help you calibrate and adjust your DMM7510. In this manual, the term "calibration" refers to the process of verifying that the accuracy of the instrument is within its one-year accuracy specifications. The term "adjustment" refers to the process of changing the calibration constants so that the accuracy of the instrument is within its one-year accuracy specifications.
This manual presents calibration information, adjustment information, and command descriptions for the calibration and adjustment commands.
For additional command descriptions, refer to the DMM7510 Reference Manual (part number DMM7510-901-01). This manual is on the Product Information CD-ROM that came with your instrument. It is also available on the Product Support web page (tek.com/product-support).

Extended warranty

Additional years of warranty coverage are available on many products. These valuable contracts protect you from unbudgeted service expenses and provide additional years of protection at a fraction of the price of a repair. Extended warranties are available on new and existing products. Contact your local Keithley Instruments office, sales partner, or distributor for details.

Contact information

If you have any questions after you review the information in this documentation, please contact your local Keithley Instruments office, sales partner, or distributor. You can also call the corporate headquarters of Keithley Instruments (toll-free inside the U.S. and Canada only) at 1-800-935-5595, or from outside the U.S. at +1-440-248-0400. For worldwide contact numbers, visit the Keithley Instruments website (tek.com/keithley).
Section 2

Performance verification

In this section:
Introduction .............................................................................. 2-1
Factory service ......................................................................... 2-2
Verification test requirements ................................................... 2-2
Autocalibration ......................................................................... 2-3
Calibration verification limits ..................................................... 2-7
Performing the verification test procedures .............................. 2-8
Front-panel calibration verification ........................................... 2-9
Rear-panel verification ............................................................2-48

Introduction

Use the procedures in this section to verify that DMM7510 accuracy is within the limits stated in the instrument's one-year accuracy specifications. Specifications and characteristics are subject to change without notice; refer to the Product Support web page (tek.com/product-support) for the most recent specifications.
You can use these verification procedures to:
Make sure that the instrument was not damaged during shipment.
Verify that the instrument meets factory specifications.
Determine if adjustment is required.
Verify that adjustment was done properly.
The information in this section is intended for qualified service personnel only, as described by the types of product users in the Safety precautions (on page 1-1). Do not attempt these procedures unless you are qualified to do so.
Some of these procedures may expose you to hazardous voltages that, if contacted, could cause personal injury or death. Use appropriate safety precautions when working with hazardous voltages.
If the instrument is still under warranty and its performance is outside specified limits, please contact your local Keithley Instruments office, sales partner, or distributor. You can also call the corporate headquarters of Keithley Instruments (toll-free inside the U.S. and Canada only) at 1-800-935-5595, or from outside the U.S. at +1-440-248-0400. For worldwide contact numbers, visit the Keithley Instruments website (tek.com/keithley).

Factory service

To return your instrument to Keithley Instruments for repair:
1. Call the Repair Department at 1-800-833-9200 or send an email to RMAREQUEST@tektronix.com for a Return Material Authorization (RMA) number.
2. Carefully pack the instrument in the original packing carton.
3. Write ATTENTION REPAIR DEPARTMENT and the RMA number on the shipping label.

Verification test requirements

Be sure that you perform these verification tests:
Under the proper environmental conditions.
After the specified warmup period.
Using the correct line voltage.
Using the proper test equipment.
Using the specified output signal and reading limits.

Environmental conditions

Conduct the calibration verification procedures in a test environment with:
An ambient temperature of 18 °C to 28 °C.
A relative humidity of less than or equal to 80 percent, unless otherwise noted.
No direct airflow on the input terminals.

Warmup period

Allow the DMM7510 to warm up for at least 90 minutes before conducting the calibration verification procedures.
If the instrument has been subjected to temperature extremes (more than 5 °C above or below TCAL), allow additional time for the internal temperature of the instrument to stabilize. Typically, allow an additional hour to stabilize an instrument that is 10 °C outside the specified temperature range.
Also, allow the test equipment to warm up for the time recommended by the manufacturer.

Line power

The DMM7510 requires a line voltage of 100 V to 240 V and a line frequency of 50 Hz or 60 Hz. Calibration verification tests should be performed within this range. The instrument automatically senses the line frequency at power up.
Recommended test equipment

The following table summarizes the recommended calibration verification equipment. You can use alternate equipment if that equipment has specifications that meet or exceed those listed in the table below. Test equipment uncertainty adds to the uncertainty of each measurement. Generally, test equipment uncertainty should be at least four times more accurate than corresponding DMM7510 specifications.
In this manual, the Model 8610 shorting plug is shown in the figures. However, you can use either the Model 8610 or the Model 8620 shorting plug.

Manufacturer  Model  Description  Used for  Uncertainty
Fluke  5720A or 5730A  High-Performance Multifunction Calibrator  DCV, ACV, ACI, and resistance  See following note.
Fluke  5725A  Amplifier  DCI and ACI  See following note.
Fluke  8508A  8.5-Digit Reference Multimeter  DCV and resistance  See following note.
Keithley Instruments  3390  Function/Arbitrary Waveform Generator  Frequency  See following note.
IET Labs, Inc.  1423-A  Precision Decade Capacitor  Capacitance, 1 nF to 1 µF  See following note.
IET Labs, Inc.  HACS-Z-A-2E-1uF  Series HACS-Z High Accuracy Decade Capacitance Box  Capacitance, 1 µF to 100 µF  See following note.
Keithley Instruments  8610 or 8620  4-Wire DMM Shorting Plug  DCV, digitize DCV, and resistance  See following note.
HYMEG Corporation  FA-65-1G and FA-65-10G  1 GΩ and 10 GΩ, 2% tolerance, 25 PPM/°C resistor  910 MΩ, parallel 1 GΩ and 10 GΩ resistance  See following note.

Refer to the manufacturer's specifications to calculate the uncertainty, which varies for each function and range test point.

Autocalibration

Autocalibration removes measurement errors that are caused by performance drift of the components in this DMM as a result of temperature and time. Autocalibration improves the short-term accuracy of the DMM7510. However, you must still perform regular full calibration adjustment with metrology equipment to maintain overall accuracy. To maintain accuracy, run autocalibration when the instrument temperature changes by more than ±5 °C or when one week has elapsed since the last autocalibration. To check the temperature difference, you can view the temperature change on the Calibration menu. You can also use remote commands to retrieve the temperature difference.
The instrument regularly monitors the internal temperature for the voltage, current, 2-wire resistance, 4-wire resistance, diode, temperature, and DC voltage ratio functions when autozero is enabled. Temperature checking begins after the warm-up time completes. If you are using digitize functions, periodically check the temperature drift by using the front-panel calibration screen or remote commands. If there is more than a ±5 °C difference between this temperature and the temperature when the last autocalibration was run, the instrument generates an event in the event log and a warning message.
The autocalibration constants are stored through a power cycle. You do not need to run autocalibration if the power has been cycled.
You can run autocalibration with input cables connected. At the start of the autocalibration process, the front terminals are monitored. If more than 30 V DC or 1 V AC is detected on the front-panel inputs, autocalibration is not run and an event message is displayed.
Autocalibration also monitors the temperature at the start and end of autocalibration. If the start and end temperature differs by more than ±1 °C, the autocalibration values are not stored and a warning message is generated.

Running autocalibration

After the instrument has completed its warm-up period, you can run autocalibration as needed using the front panel or over a remote interface. Autocalibration takes about six minutes to run. During autocalibration, you cannot use the instrument.
A status message is displayed on the front panel of the instrument while autocalibration is running. At completion, a status message is generated.
To prevent instrument damage, verify that no test voltages are connected to the input terminals when performing autocalibration.
Do not cycle power during the autocalibration routine. Doing so could affect the accuracy of the instrument.
To prepare for autocalibration:
1. Disable voltage sources on any test cables that are connected to the front-panel or rear-panel terminals.
2. Place the DMM7510 in a temperature-stable location.
3. Turn on instrument power and allow the instrument to warm up for at least 90 minutes. When the instrument has completed the warm-up period, a message is displayed and an information event is generated in the event log.
To run autocalibration from the front panel:
1. Press the MENU key.
2. Under System, select Calibration.
3. Select Start ACAL. A prompt is displayed.
4. Select Yes. A progress bar is displayed while the calibration runs.
To run autocalibration using TSP commands:
Send:
acal.run()
Once autocalibration has started, you cannot stop it. After completion, however, you can use remote commands to revert to the previous autocalibration settings. Refer to acal.revert() (on page 4-5).
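For reference, a minimal TSP sequence that runs autocalibration and confirms the result uses only commands documented in the TSP command reference (Section 4):
-- Run autocalibration; this blocks for about six minutes.
acal.run()
-- Confirm the run by reading back the run count and the last run time.
print(acal.count)
print(acal.lastrun.time)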

Scheduling autocalibration

You can set up your instrument to run autocalibration automatically. You can also set up the instrument to prompt you to run autocalibration at regular intervals. To determine the best schedule for your application, see the DMM7510 specifications for detail on the accuracy with and without autocalibration.
Autocalibration does not start until all actions that are active on the instrument are complete. When the scheduled time occurs, the autocalibration run command is placed in the command queue and will be executed after any previously sent commands or actions have executed. For example, if a trigger model is running when autocalibration is scheduled to run, autocalibration does not start until the trigger model stops.
If there is a command or action that is waiting a long time for an event, the autocalibration will not run until the event occurs, the action is aborted, or the instrument power is cycled.
If the scheduled time for autocalibration occurs before the warm-up period completes, the instrument will not start autocalibration. The instrument waits until the warmup period is complete before starting a scheduled autocalibration. A message is displayed when warmup is complete and autocalibration is going to run.
If the instrument is powered off when an autocalibration was scheduled, autocalibration is run as soon as the warmup period is complete when the instrument is powered on.
You can run autocalibration manually even if a scheduled autocalibration is set.
If autocalibration runs at a time other than the scheduled interval, subsequent scheduled intervals are adjusted according to the actual autocalibration start time.
From the front panel:
1. Press the MENU key.
2. Under System, select Calibration.
3. Select Scheduling Action. To have the instrument:
Prompt you to run autocalibration: Select Notify.
Run autocalibration at a specific time: Select Run.
Stop scheduling: Select None. If you select None, you do not need to make additional settings.
4. Select Scheduling Interval.
5. Select the interval.
6. Select Scheduled Time to select the time when the autocalibration will run or when you will be prompted to run it.
To review the next schedule time and date, see the information listed next to Next Run.
Using TSP commands:
Refer to acal.schedule() (on page 4-7).
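For example, the following TSP sketch schedules autocalibration to run automatically every seven days at 1 AM. The acal.ACTION_RUN and acal.INTERVAL_7DAY enumeration names follow the acal.schedule() description in the TSP command reference; confirm them against your firmware before use.
-- Schedule autocalibration to run every 7 days at hour 1 (1 AM).
-- Enumeration names are taken from the acal.schedule() documentation.
acal.schedule(acal.ACTION_RUN, acal.INTERVAL_7DAY, 1)
-- Read back the next scheduled run time.
print(acal.nextrun.time)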

Reviewing calibration information

The Calibration screen displays information about the last autocalibration and factory calibrations that were run and the present status. For detail on this screen, refer to "System Calibration menu" in the DMM7510 Reference Manual.
For autocalibration, you can also access this information from the commands in the SCPI ACAL subsystem or the TSP acal.* commands.

Monitoring internal temperature

You can monitor the temperature difference between the actual internal temperature and the temperature when autocalibration ran through the front panel or by using remote commands. With remote commands, you can also check the present internal temperature and the internal temperature when autocalibration was last run. Temperature is returned in Celsius (°C).
The internal temperature is not updated on the Calibration screen until the warmup period is complete. The remote commands always return the present temperature.
From the front panel:
1. Press the MENU key.
2. Under System, select Calibration.
3. The Temperature Difference is displayed.
Using SCPI commands:
For the present internal temperature, send:
:SYSTem:TEMPerature:INTernal?
For the temperature difference, send:
:ACAL:LASTrun:TEMPerature:DIFFerence?
For the temperature when autocalibration was last run, send:
:ACAL:LASTrun:TEMPerature:INTernal?
Using TSP commands:
For the present internal temperature, send:
print(localnode.internaltemp)
For the temperature difference, send:
print(acal.lastrun.tempdiff)
For the temperature when autocalibration was last run, send:
print(acal.lastrun.internaltemp)
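Putting these commands together, a short TSP sketch can apply the ±5 °C guideline from earlier in this section and rerun autocalibration only when the drift warrants it:
-- Read the temperature drift since the last autocalibration (in °C).
local drift = acal.lastrun.tempdiff
print("Internal temperature now:", localnode.internaltemp)
print("Drift since last autocalibration:", drift)
-- Rerun autocalibration if the drift exceeds the ±5 °C guideline.
if math.abs(drift) > 5 then
    acal.run()
end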

Calibration verification limits

The calibration verification limits stated in this section have been calculated using only the DMM7510 one-year accuracy specifications, within 30 days of autocalibration and with the operating temperature (TOPER) within ±5 °C of the autocalibration temperature (TACAL). They do not include test equipment uncertainty. If a particular measurement falls outside the allowable range, recalculate new limits based on both the DMM7510 specifications and corresponding test equipment specifications.
Specifications and characteristics are subject to change without notice; please refer to the Keithley Instruments website (tek.com/keithley) for the most recent specifications.

Example reading limit calculation

Assume you are testing the 10 VDC range using a 10 V input value. Using the DMM7510 one-year accuracy specification for 10 VDC of ± (14 ppm of reading + 1.2 ppm of range), the calculated limits are:
Reading limits = 10 V ± [(10 V × 14 ppm) + (10 V × 1.2 ppm)]
Reading limits = 10 V ± (0.00014 + 0.000012) V
Reading limits = 10 V ± 0.000152 V
Reading limits = 9.999848 V to 10.000152 V
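The same arithmetic is easy to script. The following TSP (Lua) sketch defines an illustrative helper (readingLimits() is not an instrument command) that computes the limits from a nominal input value, the range, and the one-year specification:
-- Illustrative helper: compute lower and upper reading limits from a
-- published specification given as ppm of reading and ppm of range.
function readingLimits(nominal, range, reading_ppm, range_ppm)
    local tolerance = math.abs(nominal) * reading_ppm * 1e-6
        + range * range_ppm * 1e-6
    return nominal - tolerance, nominal + tolerance
end
-- 10 V input on the 10 VDC range: ±(14 ppm of reading + 1.2 ppm of range)
print(readingLimits(10, 10, 14, 1.2))  -- 9.999848   10.000152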

Calculating resistance reading limits

Resistance reading limits must be recalculated based on the actual calibration resistance values supplied by the equipment manufacturer. Calculations are performed in the same manner as shown in the preceding example. Use the actual calibration resistance values instead of the nominal values in the example when performing your calculations.
For example, assume that you are testing the 10 kΩ range using an actual 10.03 kΩ calibration resistance value. Using DMM7510 one-year 10 kΩ range accuracy of ± (30 ppm of reading + 3 ppm of range), the calculated reading limits are:
Reading limits = 10.03 kΩ ± [(10.03 kΩ × 30 ppm) + (10 kΩ × 3 ppm)]
Reading limits = 10.03 kΩ ± [(0.3009) + (0.03)] Ω
Reading limits = 10.03 kΩ ± 0.3309 Ω
Reading limits = 10.029669 kΩ to 10.030331 kΩ
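With the illustrative readingLimits() helper shown earlier, the same limits follow from readingLimits(10.03e3, 10e3, 30, 3), which returns 10029.6691 Ω and 10030.3309 Ω.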

Performing the verification test procedures

The following topics provide a summary of calibration verification test procedures and items to consider before performing any calibration verification test.

Test summary

Front-panel tests:
DC voltage verification (on page 2-10)
AC voltage verification (on page 2-12)
Digitize voltage verification (on page 2-16)
Frequency verification (on page 2-18)
Simulated thermocouple type J temperature verification (on page 2-20)
Simulated RTD temperature verification (on page 2-22)
Resistance verification (on page 2-26)
DC current verification (on page 2-34)
Digitize current verification (on page 2-39)
AC current verification (on page 2-41)
Capacitance verification (on page 2-43)
Verifying zero values using a 4-wire short (on page 2-46)
Rear-panel tests:
DC current 10 A range verification (on page 2-48)
Digitize current 10 A range verification (on page 2-50)
AC current 10 A verification (on page 2-51)
If the DMM7510 is not within specifications and is not under warranty, see the adjustment procedures in Adjustment (on page 3-1) for information about adjusting the instrument.

Test considerations

When performing the calibration verification procedures:
Be sure to restore factory front-panel defaults. From the front panel, select the MENU key, select
Info/Manage, and select System Reset.
Make sure that the test equipment is warmed up for the time recommended by the manufacturer
and is connected to the DMM7510 input/output terminals.
Make sure that the correct DMM7510 terminals are selected with the TERMINALS FRONT/REAR
switch.
Make sure the test equipment is set up for the proper function and range.
Do not connect test equipment to the DMM7510 through a scanner, multiplexer, or other
switching equipment.
Make sure that autocalibration has been performed within 30 days and that the temperature difference is less than ±5 °C. To check autocalibration, press the MENU key and select Calibration. If the elapsed time is more than 30 days or the temperature difference is more than ±5 °C, run autocalibration before verifying the DMM7510.
The front and rear terminals of the instrument are rated for connection to circuits rated Measurement Category II up to 300 V, as described in International Electrotechnical Commission (IEC) Standard IEC 60664. This range must not be exceeded. Do not connect the instrument terminals to CAT III or CAT IV circuits. Connection of the instrument terminals to circuits higher than CAT II can cause damage to the equipment and severe personal injury.
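If you control the instrument remotely between tests, the front-panel System Reset step listed above has a broadly equivalent one-line TSP command:
-- Restore instrument default settings over the remote interface.
reset()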

Front-panel calibration verification

The following topics describe verification procedures that are done with connections attached to the terminals on the DMM7510 front panel.

DC voltage verification

The maximum input voltage between INPUT HI and INPUT LO is 750 V DC and 750 V AC. Exceeding this value may create a shock hazard.
The maximum common-mode voltage (the voltage between INPUT LO and chassis ground) is 500 V peak. Exceeding this value may cause a breakdown in insulation that can create a shock hazard.

Verify DC voltage accuracy for the 100 mV to 1000 V ranges

To verify 100 mV to 1000 V DC voltage accuracies, you will:
Apply accurate DC voltages from the calibrator to the DMM7510 front-panel terminals.
Verify that the displayed readings are within specified limits.
Use the values in the tables following the steps below to verify the performance of the DMM7510. Actual values depend on the published specifications (see Example reading limit calculation (on page 2-7)).

Use shielded low-thermal connections when testing the 100 mV and 1 V ranges to avoid errors caused by noise or thermal effects. Connect the shield to the output LO terminal of the calibrator.
To verify DC voltage accuracy:
1. Use a low-thermal cable to connect the DMM7510 HI and LO INPUT terminals to the calibrator HI and LO terminals as shown in the following figure.
Figure 1: DC voltage 100 mV to 1000 V ranges verification connections
2. On the DMM7510, press the FUNCTION key and select DC voltage.
3. On the Home screen, select the button next to Range and select 100 mV.
4. Press the MENU key.
5. Under Measure, select Settings.
6. Set Input Impedance to Auto.
7. Set the calibrator output to 0 V.
8. Set the calibrator to OPERATE.
9. Allow 5 minutes of settling time.
10. Press the MENU key.
11. Select Calculations.
12. Select Rel Acquire.
13. Source positive and negative full-scale and half-scale voltages and allow for proper settling, as listed in Verify the DC voltage 100 mV range (on page 2-11).
14. Select each range on the DMM7510, allow for proper settling, and verify the remaining ranges according to the following tables.
Verify the DC voltage 100 mV range
Description  Verification point  Lower limit  Upper limit
Perform relative offset  0  n/a  n/a
Full scale (+)  1.0000000E-01  9.9997300E-02  1.0000270E-01
Half scale (+)  5.0000000E-02  4.9998200E-02  5.0001800E-02
Half scale (–)  –5.0000000E-02  –5.0001800E-02  –4.9998200E-02
Full scale (–)  –1.0000000E-01  –1.0000270E-01  –9.9997300E-02
Verify the DC voltage 1 V range
Description  Verification point  Lower limit  Upper limit
Full scale (+)  1.0000000E+00  9.9998300E-01  1.0000170E+00
Half scale (+)  5.0000000E-01  4.9999050E-01  5.0000950E-01
Half scale (–)  –5.0000000E-01  –5.0000950E-01  –4.9999050E-01
Full scale (–)  –1.0000000E+00  –1.0000170E+00  –9.9998300E-01

Verify the DC voltage 10 V range
Description  Verification point  Lower limit  Upper limit
Full scale (+)  1.0000000E+01  9.9998480E+00  1.0000152E+01
Half scale (+)  5.0000000E+00  4.9999180E+00  5.0000820E+00
Half scale (–)  –5.0000000E+00  –5.0000820E+00  –4.9999180E+00
Full scale (–)  –1.0000000E+01  –1.0000152E+01  –9.9998480E+00

Verify the DC voltage 100 V range

The information in this section is intended for qualified service personnel only, as described by the types of product users in the Safety precautions (on page 1-1). Do not attempt these procedures unless you are qualified to do so.
Some of these procedures may expose you to hazardous voltages that, if contacted, could cause personal injury or death. Use appropriate safety precautions when working with hazardous voltages.

Description  Verification point  Lower limit  Upper limit
Full scale (+)  1.0000000E+02  9.9997300E+01  1.0000270E+02
Half scale (+)  5.0000000E+01  4.9998400E+01  5.0001600E+01
Half scale (–)  –5.0000000E+01  –5.0001600E+01  –4.9998400E+01
Full scale (–)  –1.0000000E+02  –1.0000270E+02  –9.9997300E+01

Verify the DC voltage 1000 V range
Description  Verification point  Lower limit  Upper limit
Full scale (+)  1.0000000E+03  9.9997200E+02  1.0000280E+03
Half scale (+)  5.0000000E+02  4.9998350E+02  5.0001650E+02
Half scale (–)  –5.0000000E+02  –5.0001650E+02  –4.9998350E+02
Full scale (–)  –1.0000000E+03  –1.0000280E+03  –9.9997200E+02
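If you prefer to drive this verification remotely, the front-panel setup in the steps above maps to a short TSP sequence. This is a sketch only; the attribute names follow the DMM7510 Reference Manual, and the 100 mV range shown here is one example:
-- Select DC voltage, 100 mV range, automatic input impedance.
dmm.measure.func = dmm.FUNC_DC_VOLTAGE
dmm.measure.range = 0.1
dmm.measure.inputimpedance = dmm.IMPEDANCE_AUTO
-- With the calibrator output at 0 V and settled, acquire a relative offset.
dmm.measure.rel.acquire()
dmm.measure.rel.enable = dmm.ON
-- For each sourced verification point, take a reading and compare it
-- with the table limits.
print(dmm.measure.read())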

AC voltage verification

To verify AC voltage accuracy:
For the 100 mV to 100 V ranges, apply accurate voltages from the calibrator to the DMM7510
front-panel terminals.
For the 700 V range, connect the Fluke 5725A Amplifier to the calibrator. Apply accurate voltages
from the calibrator terminals to the terminals on the front panel of the DMM7510.
Verify that the displayed readings are within specified limits.
Use the values in the tables following the steps below to verify the performance of the DMM7510. Actual values depend on the published specifications (see Example reading limit calculation (on page 2-7)).
The maximum input voltage between INPUT HI and INPUT LO is 1000 V DC. Exceeding this value may create a shock hazard.
The maximum common-mode voltage (the voltage between INPUT LO and chassis ground) is 500 V peak. Exceeding this value may cause a breakdown in insulation that can create a shock hazard.
Verify AC voltage accuracy for the 100 mV to 100 V ranges
Use shielded, low-capacitance cabling. For the 100 mV to 100 V ranges, avoid loading that exceeds 1000 pF.
Excessive capacitance may result in additional load regulation uncertainties and could cause the calibrator output to open (go into standby).
To verify AC voltage accuracy:
1. Connect the DMM7510 HI and LO INPUT connectors to the calibrator as shown in the following
figure.
Figure 2: Connections for AC voltage verification 100 mV to 100 V ranges
2. On the DMM7510, press the FUNCTION key and select AC voltage.
3. On the Home screen, select the button next to Range and select 100 mV.
4. Press the MENU key.
5. Under Measure, select Settings.
6. Make sure that detector bandwidth is set to 30 Hz.

AC voltage is specified for the detector bandwidth setting of 3 Hz. Three Hz measures accurately for input signals from 3 Hz to 300 kHz, with reading rates ≈ 0.5 readings/s. To improve verification throughput to ≈ 3.3 readings/s, set detector bandwidth to 30 Hz for frequencies of 30 Hz to 300 kHz. To verify frequencies 1 kHz and higher, set the detector bandwidth to 300 Hz for faster ≈ 55 readings/s throughput.

7. Source AC voltages for each of the frequencies listed in the Verify the AC voltage 100 mV range (on page 2-14) table.
8. Repeat these steps for each range and frequency listed in the tables below. For each voltage setting, be sure that the reading is within low and high limits.

Verify the AC voltage 100 mV range
Input  Frequency  Lower limit  Upper limit
0.1  3.0E+01  9.991000E-02  1.000900E-01
0.1  1.0E+03  9.991000E-02  1.000900E-01
0.1  5.0E+04  9.981000E-02  1.001900E-01
0.1  1.0E+05  9.932000E-02  1.006800E-01

Verify the AC voltage 1 V range
Input  Frequency  Lower limit  Upper limit
1  3.0E+01  9.991000E-01  1.000900E+00
1  1.0E+03  9.991000E-01  1.000900E+00
1  5.0E+04  9.981000E-01  1.001900E+00
1  1.0E+05  9.932000E-01  1.006800E+00

Verify the AC voltage 100 V range
Input  Frequency  Lower limit  Upper limit
100  3.0E+01  9.991000E+01  1.000900E+02
100  1.0E+03  9.991000E+01  1.000900E+02
100  5.0E+04  9.981000E+01  1.001900E+02
100  1.0E+05  9.932000E+01  1.006800E+02
Verify AC voltage accuracy for the 700 V range
Use shielded, low-capacitance cabling. For the 700 V range, avoid cable capacitances of more than 150 pF.
Excessive capacitance may result in additional load regulation uncertainties and could cause the calibrator output to open (go into standby).
To verify AC voltage accuracy for the 700 V range:
1. Put the calibrator in Standby.
2. Connect the DMM7510 HI and LO INPUT connectors to the calibrator as shown in the following
figure.
3. For 700 V at 50 kHz and 100 kHz outputs, connect the calibrator to the Fluke 5725A amplifier.
Figure 3: Connections for AC voltage accuracy verification 700 V range
4. On the DMM7510, press the FUNCTION key and select AC voltage.
5. On the Home screen, select the button next to Range and select 700 V.
6. Press the MENU key.
7. Select Settings.
8. Ensure that detector bandwidth is set to 30 Hz.
AC voltage is specified for the detector bandwidth setting of 3 Hz. Three Hz measures accurately for input signals from 3 Hz to 300 kHz, with reading rates ≈ 0.5 readings/s. To improve verification throughput to ≈ 3.3 readings/s, set detector bandwidth to 30 Hz for frequencies of 30 Hz to 300 kHz. To verify frequencies 1 kHz and higher, set the detector bandwidth to 300 Hz for faster ≈ 55 readings/s throughput.
9. Set the calibrator to OPERATE.
10. Source AC voltages for each of the frequencies listed in the "Verify the AC voltage 700 V range" table, below. Be sure that the readings are within low and high limits.
Verify the AC voltage 700 V range
Input  Frequency  Lower limit  Upper limit
700  5.0E+01  6.993700E+02  7.006300E+02
700  1.0E+03  6.993700E+02  7.006300E+02
700  5.0E+04  6.986700E+02  7.013300E+02
700  1.0E+05  6.952400E+02  7.047600E+02
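A remote (TSP) equivalent of the AC voltage setup is sketched below. The dmm.measure.detectorbandwidth attribute and its numeric argument are assumptions based on the DMM7510 Reference Manual; confirm them before use.
-- Select AC voltage and set the detector bandwidth to 30 Hz
-- (assumed attribute name and value; confirm in the Reference Manual).
dmm.measure.func = dmm.FUNC_AC_VOLTAGE
dmm.measure.range = 700
dmm.measure.detectorbandwidth = 30
-- Take a reading for each sourced frequency and compare with the limits.
print(dmm.measure.read())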

Digitize voltage verification

To verify digitize DC voltage accuracy, you will:
Apply accurate voltages from the calibrator to the terminals on the front panel of the DMM7510.
Verify that the displayed readings are within specified limits.
Use the values in the tables following the steps below to verify the performance of the DMM7510. Actual values depend on the published specifications (see Example reading limit calculation (on page 2-7)).

The maximum input voltage between INPUT HI and INPUT LO is 750 V DC and 750 V AC. Exceeding this value may create a shock hazard.
The maximum common-mode voltage (the voltage between INPUT LO and chassis ground) is 500 V peak. Exceeding this value may cause a breakdown in insulation that can create a shock hazard.
Verify the digitize voltage 100 mV to 1000 V ranges
Use shielded low-thermal connections when testing the 100 mV and 1 V ranges to avoid errors caused by noise or thermal effects. Connect the shield to the output LO terminal of the calibrator.
To verify digitize voltage accuracy:
1. Connect the DMM7510 HI and LO INPUT connectors to the calibrator as shown in the following
figure.
Figure 4: Connections for digitize voltage verification 100 mV to 1000 V ranges
2. On the DMM7510, press the FUNCTION key, select the Digitize Functions tab, and select
Digitize Voltage.
3. On the Home screen, select the button next to Range and select 100 mV.
4. Press the MENU key.
5. Under Measure, select Settings.
6. Set the Sample Rate to 1000.
7. Set the Aperture to Auto.
8. Set the Count to 100.
9. Set the calibrator output to 0.00000 mV DC and allow the reading to settle.
10. Press the MENU key.
11. Under Measure, select Calculations.
12. Select Rel Acquire.
13. Source positive and negative full-scale and half-scale voltages, as listed in the following table.
14. Verify the 100 mV to 100 V range settings listed in the tables below. For each voltage setting, verify that the STATISTICS swipe screen reading for Average is within low and high limits.
The Fluke 5720A or 5730A calibrator 1000 V range 0.0 V setting is not verified.
Verify the digitize voltage 100 mV range
Description  Input  Lower limit  Upper limit
Perform relative offset  0  n/a  n/a
Full scale (+)  0.1  9.99680E-02  1.00032E-01
Half scale (+)  0.05  4.99790E-02  5.00210E-02
Half scale (–)  –0.05  –5.00210E-02  –4.99790E-02
Full scale (–)  –0.1  –1.00032E-01  –9.99680E-02

Verify the digitize voltage 1 V range
Description  Input  Lower limit  Upper limit
Verify zero  0  –7.50E-05  7.50E-05
Full scale (+)  1  9.99805E-01  1.00020E+00
Half scale (+)  0.5  4.99865E-01  5.00135E-01
Half scale (–)  –0.5  –5.00135E-01  –4.99865E-01
Full scale (–)  –1  –1.00020E+00  –9.99805E-01

Verify the digitize voltage 10 V range
Description  Input  Lower limit  Upper limit
Verify zero  0  –7.50E-04  7.50E-04
Full scale (+)  10  9.99805E+00  1.00020E+01
Half scale (+)  5  4.99865E+00  5.00135E+00
Half scale (–)  –5  –5.00135E+00  –4.99865E+00
Full scale (–)  –10  –1.00020E+01  –9.99805E+00

Verify the digitize voltage 100 V range
Description  Input  Lower limit  Upper limit
Verify zero  0  –7.50E-03  7.50E-03
Full scale (+)  100  9.99805E+01  1.00020E+02
Half scale (+)  50  4.99865E+01  5.00135E+01
Half scale (–)  –50  –5.00135E+01  –4.99865E+01
Full scale (–)  –100  –1.00020E+02  –9.99805E+01

Verify the digitize voltage 1000 V range
Description  Input  Lower limit  Upper limit
Full scale (+)  1000  9.99805E+02  1.00020E+03
Half scale (+)  500  4.99865E+02  5.00135E+02
Half scale (–)  –500  –5.00135E+02  –4.99865E+02
Full scale (–)  –1000  –1.00020E+03  –9.99805E+02

Frequency verification

To verify frequency accuracy, you will:
Apply accurate frequencies from the function generator to the terminals on the front panel of the
DMM7510.
Verify that the displayed readings are within specified limits.
Use the values in the table following the steps below to verify the performance of the DMM7510. Actual values depend on the published specifications (see Example reading limit calculation (on page 2-7)).
1. Connect the Keithley Instruments Model 3390 function generator to the DMM7510 INPUT HI and LO terminals as shown in the following figure.
Figure 5: Connections for frequency verification and adjustment
2. On the DMM7510, press the FUNCTION key, select the Measure Functions tab, and select
Frequency.
3. Press the MENU key.
4. Under Measure, select Settings.
5. Set the Aperture to 250 ms.
6. Set the Threshold Range to 10 V.
7. Set the Threshold Level to 0 V.
8. Press the HOME key.
9. Source the voltage and frequency values as listed in Verify the frequency (on page 2-20). For each setting, be sure that the reading is within low and high limits.
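For reference, a minimal TSP (Lua) sketch of the same configuration follows. The dmm.measure.* names are taken from the DMM7510 TSP command set; dmm.measure.threshold.level in particular is an assumption to confirm against the Reference Manual.

-- Configure the frequency function as in steps 2 through 7.
dmm.measure.func = dmm.FUNC_ACV_FREQUENCY
dmm.measure.aperture = 250e-3         -- 250 ms aperture
dmm.measure.threshold.range = 10      -- 10 V threshold range
dmm.measure.threshold.level = 0       -- 0 V threshold level (confirm command name)
-- Read one frequency value in Hz.
print(dmm.measure.read())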
Verify the frequency

Use the following values to verify the performance of the DMM7510. Actual values depend on published specifications (see Example reading limit calculation (on page 2-7)).

Description       Frequency (Hz)    Lower limit (Hz)    Upper limit (Hz)
10 Hz at 5 V      1.00E+01          9.999197E+00        1.000080E+01
1 kHz at 5 V      1.00E+03          9.999197E+02        1.000080E+03
10 kHz at 5 V     1.00E+04          9.999197E+03        1.000080E+04
100 kHz at 5 V    1.00E+05          9.999197E+04        1.000080E+05
250 kHz at 5 V    2.50E+05          2.499799E+05        2.500201E+05
500 kHz at 5 V    5.00E+05          4.999598E+05        5.000402E+05

Simulated thermocouple Type J temperature verification

To verify thermocouple accuracy, you will:
Apply accurate voltages from the calibrator to the terminals on the front panel of the DMM7510.
Verify that the displayed readings are within specified limits.
Thermocouple accuracy is verified by using a DC voltage calibrator to output values from standard thermocouple tables available from the National Institute of Standards and Technology (NIST) or other sources.
In the table following the steps below, three representative values are listed from a Type J thermocouple table for temperatures –190 °C, 0 °C, and 750 °C, with their respective thermocouple voltages listed in the “Uncompensated calibrator source value” column. The calibrator source values are based on NIST Monograph 175, reference data 60, version 2.0.
Verify thermocouple accuracy
Because the cable connecting the calibrator to the DMM7510 can have non-trivial thermal offset voltages, you must first correct for these to verify the DMM7510 specifications.
To verify the simulated thermocouple Type J temperature:
1. Connect the DMM7510 HI and LO INPUT terminals to the calibrator HI and LO terminals as
shown in the following figure.
Figure 6: Connections for thermocouple verification
2. On the DMM7510, press the FUNCTION key and select DC voltage.
3. Press the MENU key.
4. Under Measure, select Settings.
5. Set the range to 100 mV.
6. Set Input Impedance to Auto.
7. Set autozero to On.
8. Select Integration Rate. The Integration Rate dialog box opens.
9. Set the unit to NPLC.
10. Set NPLC to 1 PLC.
11. Select OK and press the HOME key to return to the Home Screen.
12. Set the calibrator to output 0 V and enable the output.
13. Allow five minutes for settling of the thermal voltage.
14. Record the measured offset voltage to 1 µV precision. If necessary, use the DMM7510 filter
settings to reduce the noise of this measurement (for filter settings, go to MENU > Measure Calculations).
15. Press the DMM7510 FUNCTION key and select Temperature.
16. Press the MENU key.
17. Under Measure, select Settings.
18. On the Measure Settings screen, set the following values:
Units: °C
Transducer: TC
Thermocouple: J
Temperature (simulated reference temperature): 0 °C
Integration Rate: 1 PLC
Auto Zero: On
19. Set the calibrator to the simulated thermocouple voltage you want (from the following table), first correcting for the offset voltage measured in step 14. For example, if the measured offset voltage was –2 µV, set the calibrator to –7.659 mV – (–0.002 mV), which equals –7.657 mV, to simulate –190 °C.
20. Verify that the DMM7510 reading is within lower and upper limits.
21. Repeat steps 19 and 20 for each value in the following table.
Use the following values to verify the performance of the DMM7510. Actual values depend on published specifications (see Example reading limit calculation (on page 2-7)).

Simulated temperature    Uncompensated calibrator source value    Lower limit    Upper limit
–190 °C                  –7.659 mV                                –190.2 °C      –189.8 °C
0 °C                     0.000 mV                                 –0.2 °C        0.2 °C
750 °C                   42.281 mV                                749.8 °C       750.2 °C
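A minimal TSP (Lua) sketch of steps 15 through 19 follows. Command names are taken from the DMM7510 TSP command set (confirm dmm.measure.refjunction and dmm.measure.simreftemperature against the Reference Manual); the offset value is the example from step 19.

-- Configure simulated Type J temperature as in step 18.
dmm.measure.func = dmm.FUNC_TEMPERATURE
dmm.measure.unit = dmm.UNIT_CELSIUS
dmm.measure.transducer = dmm.TRANS_THERMOCOUPLE
dmm.measure.thermocouple = dmm.THERMOCOUPLE_J
dmm.measure.refjunction = dmm.REFJUNCT_SIMULATED
dmm.measure.simreftemperature = 0     -- simulated reference temperature, 0 °C
dmm.measure.nplc = 1
dmm.measure.autozero.enable = dmm.ON
-- Correct the calibrator setting for the offset measured in step 14,
-- using the example offset of -2 uV from step 19.
local offset = -2e-6
local source = -7.659e-3 - offset     -- -7.657 mV simulates -190 °C
print(source)
print(dmm.measure.read())             -- temperature in °C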

Simulated RTD temperature verification

Use the following information to verify the performance of the DMM7510. Actual calibrator source values will vary. RTD verification is based on the calibrator sourcing resistance and the DMM7510 conversion of the resistance measurement to calculated temperature based on the Callendar-Van Dusen equation.
To verify RTD temperature accuracy, you will:
Apply accurate resistance from the calibrator to the terminals on the front panel of the DMM7510.
Verify that the displayed readings are within specified limits.
RTD equations
The temperature versus resistance readings listed in the RTD reference tables are calculated using the Callendar-Van Dusen equation. Two equations cover different temperature ranges: one for –200 °C to 0 °C and one for 0 °C to 850 °C.
Equation for –200 °C to 0 °C temperature range

R_RTD = R_0 [1 + AT + BT² + CT³(T – 100)]

where:
R_RTD is the calculated resistance of the RTD
R_0 is the known RTD resistance at 0 °C
T is the temperature in °C
A = alpha [1 + (delta/100)]
B = –1 (alpha)(delta)(1E-4)
C = –1 (alpha)(beta)(1E-8)
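As a worked example, the following TSP (Lua) sketch evaluates the –200 °C to 0 °C equation for a PT100. The alpha, beta, and delta values are typical PT100 coefficients assumed here for illustration; substitute the coefficients for the RTD type being verified.

-- Callendar-Van Dusen, -200 °C to 0 °C range, for a PT100 (R0 = 100 ohms).
-- alpha, beta, and delta are typical PT100 values, assumed for illustration.
local R0 = 100
local alpha = 0.00385
local delta = 1.4999
local beta = 0.10863
local A = alpha * (1 + delta / 100)
local B = -1 * alpha * delta * 1e-4
local C = -1 * alpha * beta * 1e-8
function rtdResistance(T)
  -- Valid for -200 °C <= T <= 0 °C.
  return R0 * (1 + A * T + B * T^2 + C * T^3 * (T - 100))
end
print(rtdResistance(-100))            -- approximately 60.26 ohms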