Agilent 3458A Data Sheet

Page 1
Shatters performance barriers of speed and accuracy!
Agilent 3458A Multimeter
Data Sheet
Page 2
Performance Highlights

dc Voltage
• 5 ranges: 0.1 V to 1000 V
• 8.5 to 4.5 digit resolution
• Up to 100,000 readings/sec (4.5 digits)
• Maximum sensitivity: 10 nV
• 0.6 ppm 24 hour accuracy
• 8 ppm (4 ppm optional) / year voltage reference stability
Ohms
• 9 ranges: 10 Ω to 1 GΩ
• Two-wire and four-wire Ohms with offset compensation
• Up to 50,000 readings/sec (5.5 digits)
• Maximum sensitivity: 10 µΩ
• 2.2 ppm 24 hour accuracy
ac Voltage
• 6 ranges: 10 mV to 1000 V
• 1 Hz to 10 MHz bandwidth
• Up to 50 readings/sec with all readings to specified accuracy
• Choice of sampling or analog true rms techniques
• 100 ppm best accuracy
dc Current
• 8 ranges: 100 nA to 1 A
• Up to 1,350 readings/sec (5.5 digits)
• Maximum sensitivity: 1 pA
• 14 ppm 24 hour accuracy
ac Current
• 5 ranges: 100 µA to 1 A
• 10 Hz to 100 kHz bandwidth
• Up to 50 readings/sec
• 500 ppm 24 hour accuracy
Frequency and Period
• Voltage or current ranges
• Frequency: 1 Hz to 10 MHz
• Period: 100 ns to 1 sec
• 0.01% accuracy
• ac or dc coupled
Maximum Speeds
• 100,000 readings/sec at 4.5 digits (16 bits)
• 50,000 readings/sec at 5.5 digits
• 6,000 readings/sec at 6.5 digits
• 60 readings/sec at 7.5 digits
• 6 readings/sec at 8.5 digits
Measurement Set-Up Speed
• 100,000 readings/sec over GPIB* or with internal memory
• 110 autoranges/sec
• 340 function or range changes/sec
• Post-processed math from internal memory
Page 3
Access speed and accuracy through a powerful, convenient front panel.
Display
• Bright, easy-to-read, vacuum fluorescent display
• 16 character alphanumeric display to easily read data, messages, and commands
Standard Function/Range Keys
• Simple to use, for bench measurements of dcV, acV, Ohms, current, frequency and period
• Select autorange or manual ranging
Menu Command Keys
• Immediate access to eight common commands
• Shifted keys allow simple access to complete command menu
Numeric/User Keys
• Numeric entry for constants and measurement parameters
• Shifted keys (f0 through f9) access up to ten user-defined setups
Volts/ Ohms /Ratio Terminals
• Gold-plated tellurium copper for minimum thermal emf
• 2-wire or 4-wire Ohms measurements
• dc/dc or ac/dc ratio inputs
Rear Input Terminals for convenient system use
External Trigger Input
Current Measurement Terminals
• Easy fuse replacement with fuse holder built into terminal
Guard Terminal and Switch
• For maximum common mode noise rejection
Front-Rear Terminal Switch
• Position selects front or rear measurement terminals
External Output
• Programmable TTL output pulse with 5 modes for flexible system interface
• Defaults to a voltmeter complete pulse
GPIB Interface Connector
Page 4
Finally!
A system multimeter with BOTH
high speed and high accuracy.
The Agilent Technologies 3458A Multimeter shatters long-standing performance barriers of speed and accuracy on the production test floor, in R&D, and in the calibration lab. The 3458A is simply the fastest, most flexible, and most accurate multimeter ever offered by Agilent Technologies. In your system or on the bench, the 3458A saves you time and money with unprecedented test system throughput and accuracy, seven function measurement flexibility, and low cost of ownership.
Contents
Test System Throughput / 6
Calibration Lab Precision / 9
High Resolution Digitizing / 10
Technical Specifications / 12
Specs Overview / 12
Section 1: DC Voltage / 13
Section 2: Resistance / 14
Section 3: DC Current / 16
Section 4: AC Voltage / 17
Section 5: AC Current / 22
Section 6: Frequency/ Period / 23
Section 7: Digitizing / 24
Section 8: System Specs / 26
Section 9: Ratio / 27
Section 10: Math Functions / 27
Section 11: General Specs / 28
Section 12: Ordering Information / 29
Accessories / 29
Other Meters / 30

Select a reading rate of 100,000 readings per second for maximum test throughput. Or achieve the highest levels of precision with up to 8.5 digits of measurement resolution and 0.1 part per million transfer accuracy. Add to this programming compatibility through the Agilent Multimeter Language (ML) and the 3458A’s simplicity of operation, and you have the ideal multimeter for your most demanding applications.
Page 5
The 3458A Multimeter for:

High test system throughput
Faster testing
• Up to 100,000 readings/sec
• Internal test setups >340/sec
• Programmable integration times from 500 ns to 1 sec
Greater test yield
• More accuracy for tighter test margins
• Up to 8.5 digits resolution
Longer up-time
• Two-source (10 V, 10 kΩ) calibration, including ac
• Self-adjusting, self-verifying auto-calibration for all functions and ranges, including ac

Calibration lab precision
Superb transfer measurements
• 8.5 digits resolution
• 0.1 ppm dc Volts linearity
• 0.1 ppm dc Volts transfer capability
• 0.01 ppm rms internal noise
Extraordinary accuracy
• 0.6 ppm for 24 hours in dc Volts
• 2.2 ppm for 24 hours in Ohms
• 100 ppm mid-band ac Volts
• 8 ppm (4 ppm optional) per year voltage reference stability

High resolution digitizing
Greater waveform resolution and accuracy
• 16 to 24 bits resolution
• 100,000 to 0.2 samples/sec
• 12 MHz bandwidth
• Timing resolution to 10 ns
• Less than 100 ps time jitter
• Over 75,000 reading internal memory
Page 6
For High Test System Throughput
• Faster system start-up
  Multimeter Language (ML) compatible
• Faster measurements and setups
  100,000 readings/sec at 4.5 digits
  50,000 readings/sec at 5.5 digits
  340 function or range changes/sec
• Longer system up-time
The Agilent 3458A System Multimeter heightens test performance in three phases of your production test: faster test system start-up, faster test throughput, and lower cost of ownership through longer system up-time, designed-in reliability, and fast and easy calibration.
Faster system start-up
The value of a fast system multimeter in production test is clear. But it is also important that the dmm programs easily to reduce the learning time for new system applications. The Agilent Multimeter Language (ML) offers a standard set of commands for the multimeter user that consists of easily understood, readable commands. Easier programming and clearer documentation reduce system development time.
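As a minimal illustration of how readable these commands are, the controller-side sketch below sends a few of the ML commands quoted elsewhere in this data sheet (PRESET, DCV, NPLC, OFORMAT) over GPIB. The PyVISA library and the GPIB address are assumptions made only for the example; they are not part of the 3458A specification.

import pyvisa

# Assumed I/O layer and address; any GPIB controller library would do.
rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::22::INSTR")
dmm.timeout = 20000                 # ms; NPLC 100 readings are slow

dmm.write("PRESET")                 # known starting state (see spec footnotes)
dmm.write("OFORMAT ASCII")          # ASCII readings for simple parsing
dmm.write("DCV 10")                 # dc Volts, 10 V range
dmm.write("NPLC 100")               # integrate 100 power-line cycles

# With the default triggering after PRESET, requesting data returns a reading.
print("10 V range reading:", float(dmm.read()), "V")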
Faster measurements and setups
Now you can have a system dmm with both fast and accurate measurements. The 3458A optimizes your measurements for the right combination of accuracy, resolution, and speed. The 3458A Multimeter fits your needs from 4.5 digit dc Volts measurements at 100,000 per second, to 8.5 digit dc Volts measurements at 6 per second, or anywhere in between in 100 ns steps.
Even the traditionally slower measurement functions, such as ac Volts, are quicker with the 3458A. For example, you can measure true rms acV at up to 50 readings per second with full accuracy for input frequencies greater than 10 kHz.
Besides high reading rates, the 3458A’s design was tuned for the many function and level changes required in testing your device. The 3458A can change function and range, take a measurement, and output the result at 340 per second. This is at least 5 times faster than other dmms. In addition, the 3458A transfers high speed measurement data over GPIB or into and out of its 75,000 reading memory at 100,000 readings per second.
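A high-speed burst might be requested as in the sketch below. It reuses the configuration named in the Section 8 system specifications (PRESET FAST, OFORMAT SINT, AZERO, DISP, ARANGE); the NRDGS reading-count command and the PyVISA transport are assumptions added only to make the example concrete.

import pyvisa
dmm = pyvisa.ResourceManager().open_resource("GPIB0::22::INSTR")  # assumed address

dmm.write("PRESET FAST")        # fast system preset (Section 8 setup)
dmm.write("APER 1.4E-6")        # 1.4 us aperture: 4.5 digits, 16 bits
dmm.write("AZERO OFF")          # conditions used in the reading-rate tables
dmm.write("DISP OFF")
dmm.write("ARANGE OFF")
dmm.write("OFORMAT SINT")       # 2-byte integers, needed for 100,000 readings/sec
dmm.write("NRDGS 10000,AUTO")   # assumed: 10,000 internally paced readings

raw = dmm.read_raw()            # 2 bytes per reading
# SINT values must still be multiplied by the instrument's scale factor
# (query not shown) to convert them to volts.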
You can reduce your data transfer overhead by using the unique nonvolatile Program Memory of the 3458A to store complete measurement sequences. These test sequences can be programmed and initiated from the front panel for stand-alone operation without a controller.
Finally, the 3458A Multimeter makes fast and accurate measurements. Consider the 3458A’s 0.6 ppm 24 hour dc Volts accuracy, 100 ppm ac Volts accuracy and its standard functions of dcV, acV, dcI, acI, Ohms, frequency and period. Greater measurement accuracy from your dmm means higher confidence and higher test yields. More functions mean greater versatility and lower-cost test systems.
Longer system up-time
The 3458A Multimeter performs a complete self-calibration of all functions, including ac, using high stability internal standards. This self- or auto-calibration eliminates measurement errors due to time drift and temperature changes in your rack or on your bench for superior accuracy. When it’s time for periodic calibration to external standards, simply connect a precision 10 V dc source and a precision 10 kΩ resistor. All ranges and functions, including ac, are automatically calibrated using precision internal ratio transfer measurements relative to the external standards.
The 3458A’s reliability is a product of Agilent’s “10 X” program of defect reduction. Through extensive environmental, abuse, and stress testing during the design stages of product development, Agilent has reduced the number of defects and early failures in its instruments by a factor of ten over the past ten years. Our confidence in the 3458A’s reliability is reflected in the low cost of the option for two additional years of return-to-repair. This option (W30), when combined with the standard one-year warranty, will give you three years of worry-free operation.
Page 9
• 8.5 digits resolution
• 0.1 ppm dcV linearity
• 100 ppm acV absolute accuracy
• 4 ppm/year optional stability
For Calibration Lab Precision
In the calibration lab, you’ll find the 3458A’s 8.5 digits to have extraordinary linearity, low internal noise, and excellent short term stability. The linearity of the 3458A’s Multislope A-to-D converter has been characterized with state-of-the-art precision. Using Josephson Junction Array intrinsic standards, linearity has been measured within ±0.05 ppm of 10 Volts. The 3458A’s transfer accuracy for 10 Volts dc is 0.1 ppm over 1 hour ±0.5°C. Internal noise has been reduced to less than 0.01 ppm rms, yielding 8.5 digits of usable resolution. So, the right choice for your calibration standard dmm is the 3458A.
dcV stability
The long term accuracy of the 3458A is a remarkable 8 ppm per year, more accurate than many system dmms are after only a day. Option 002 gives you a higher stability voltage reference specified to 4 ppm/year for the ultimate performance.
Reduced-error resistance
The 3458A doesn’t stop with accurate dcV. Similar measurement accuracy is achieved for resistance, acV, and current. You can measure resistance from 10 µΩ to 1 GΩ with midrange accuracy of 2.2 ppm.
Finally, the 3458A, like its dmm predecessors, offers offset-compensated Ohms on the 10 Ω to 100 kΩ ranges to eliminate the errors introduced by small series voltage offsets. Usable for both two- and four-wire ohms, the 3458A supplies a current through the unknown resistance, measures the voltage drop, sets the current to zero, and measures the voltage drop again. The result is reduced error for resistance measurements.
Precision acV
The 3458A introduces new heights of true rms ac volts performance with a choice of traditional analog or a new sampling technique for higher accuracy. For calibration sources and periodic waveforms from 1 Hz to 10 MHz, the 3458A’s precision sampling technique offers extraordinary accuracy. With 100 ppm absolute accuracy for 45 Hz to 1 kHz or 170 ppm absolute accuracy to 20 kHz, the 3458A will enhance your measurement capabilities. Accuracy is maintained for up to 2 years with only a single 10 Volt dc precision standard. No ac standards are necessary. For higher speed and less accuracy, the analog true rms ac technique has a midband absolute measurement accuracy of 300 ppm using the same simple calibration procedure. With a bandwidth of 10 Hz to 2 MHz and reading rates to 50/second, the analog technique is an excellent choice for high throughput computer-aided testing.
Easy calibration
The 3458A gives you low cost of ownership with a simple, two-source electronic calibration. With its superior linearity, the 3458A is fully calibrated, including ac, from a precision 10 V dc source and a precision 10 kΩ resistor. All ranges and functions are automatically calibrated using precise internal ratio transfer measurements relative to these external standards. In addition, the 3458A’s internal voltage standard and resistance standard are calibrated. Now you can perform a self-verifying, self- or auto-calibration relative to the 3458A’s low drift internal standards at any time with the ACAL command. So, if your dmm’s environment changes, auto-calibration optimizes your measurement accuracy.
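For example, an auto-calibration can be requested remotely before a critical measurement session. In the sketch below, the ACAL command itself is the one described above; the "ALL" argument, the timeout value, and the PyVISA transport are assumptions used only for illustration.

import pyvisa
dmm = pyvisa.ResourceManager().open_resource("GPIB0::22::INSTR")  # assumed address

dmm.timeout = 30 * 60 * 1000     # a full auto-calibration can run for many minutes
dmm.write("ACAL ALL")            # assumed argument: self-adjust all functions and ranges, including ac
# After ACAL completes, accuracy is re-optimized against the 3458A's internal
# standards with no external calibration equipment connected.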
Calibration security
Unlike other dmms, the 3458A goes to great lengths to assure calibration security. First, a password security code “locks” calibration values and the self-calibration function. Next, you can easily store and recall a secured message for noting items, such as calibration date and due date. Plus, the 3458A automatically increments a calibration counter each time you “unlock” the dmm, another safeguard against calibration tampering. If you have a unique situation or desire ultimate security, use the internal dmm hardwired switch to force removal of the instrument covers to perform calibration.
Page 10
• 16 bits at 100,000 samples/sec
• Effective rates to 100 Msamples/sec
• Signal bandwidth of 12 MHz
• 10 ns timing with <100 ps jitter
For High Resolution Digitizing
Easily acquire waveforms
Simple, application-oriented commands in the Agilent Multimeter Language (ML) make the task of waveform digitizing as easy as measuring dcV. Simply specify the sweep rate and number of samples.
Integration or track-and-hold paths
The 3458A gives you the choice of two configurations for high speed measurements: a 150 kHz bandwidth integrating path with a variable aperture from 500 ns to 1 second, or a 12 MHz bandwidth path with a fixed 2 ns aperture and 16-bit track-and-hold. Use the integration path for lower noise, but use the track-and-hold path to precisely capture the voltage at a single point on a waveform.
Direct sampling function
The 3458A has two sampling functions for digitizing waveforms: direct sampling and sequential or sub-sampling. With direct sampling, the 3458A samples through the 12 MHz path followed by the 2 ns track-and-hold, providing 16 bits of resolution. The maximum sample rate is 50,000 samples/second, or 20 µs between samples. Samples can be internally paced by a 0.01% accurate timebase with time increments in 100 ns steps. Data transfers directly to your computer at full speed or into the dmm’s internal reading memory. Waveform reconstruction consists of simply plotting the digitized voltage readings versus the sampling interval of the timebase.
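A direct-sampling acquisition might look like the sketch below. DSDC and the 20 µs minimum spacing come from this data sheet; the SWEEP-style interval/count command and the PyVISA transport are assumptions used only to make the example concrete.

import pyvisa
dmm = pyvisa.ResourceManager().open_resource("GPIB0::22::INSTR")  # assumed address

dmm.write("PRESET FAST")
dmm.write("DSDC 10")             # direct sampling, dc coupled, 10 V peak range
dmm.write("OFORMAT SINT")        # 16-bit samples
dmm.write("SWEEP 20E-6, 5000")   # assumed: 20 us per sample, 5000 samples (100 ms record)

raw = dmm.read_raw()             # samples return in time order; plot against 20 us steps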
Sequential sampling function
Sequential or sub-sampling uses the same measurement path as direct sampling; however, sequential sampling requires a periodic input signal. The 3458A will synchronize to a trigger point on the waveform set by a level threshold or external trigger. Once synchronized, the dmm automatically acquires the waveform by digitizing successive periods with time increment steps as small as 10 ns, effectively digitizing at rates up to 100 Msamples/second. All you specify is the effective timebase and the number of samples desired; the 3458A automatically optimizes its sampling to acquire the waveform in the least amount of time. Then, for your ease of use, the 3458A automatically re-orders the data in internal memory to reconstruct the waveform.
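Sub-sampling differs mainly in the function name and the effective interval requested. In the sketch below (same assumptions as the direct-sampling example: SWEEP-style command and PyVISA transport are illustrative only), a repetitive signal is acquired with 10 ns effective spacing, i.e. a 100 Msample/s effective record.

import pyvisa
dmm = pyvisa.ResourceManager().open_resource("GPIB0::22::INSTR")  # assumed address

dmm.write("SSDC 1")              # sub-sampled (effective-time), dc coupled, 1 V peak range
dmm.write("OFORMAT SINT")
dmm.write("SWEEP 10E-9, 4096")   # assumed: 10 ns effective spacing, 4096 samples

raw = dmm.read_raw()             # the instrument re-orders the data before output,
                                 # so the record plots directly against the 10 ns timebase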
Digitizing Configurations (figure)
Page 11
Page 12
3458A Technical Specifications
Section 1: DC Voltage / 13
Section 2: Resistance / 14
Section 3: DC Current / 16
Section 4: AC Voltage / 17
Section 5: AC Current / 22
Section 6: Frequency/Period / 23
Section 7: Digitizing / 24
Section 8: System Specifications / 26
Section 9: Ratio / 27
Section 10: Math Functions / 27
Section 11: General Specifications / 28
Section 12: Ordering Information / 29
Introduction
The Agilent 3458A accuracy is specified as a part per million (ppm) of the reading plus a ppm of range for dcV, Ohms, and dcI. In acV and acI, the specification is percent of reading plus percent of range. Range means the name of the scale, e.g. 1 V, 10 V, etc.; range does not mean the full scale reading, e.g. 1.2 V, 12 V, etc. These accuracies are valid for a specific time from the last calibration.
Absolute versus Relative Accuracy
All 3458A accuracy specifications are relative to the calibration standards. Absolute accuracy of the 3458A is determined by adding these relative accuracies to the traceability of your calibration standard. For dcV, 2 ppm is the traceability error from the factory. That means that the absolute error relative to the U.S. National Institute of Standards and Technology (NIST) is 2 ppm in addition to the dcV accuracy specifications. When you recalibrate the 3458A, your actual traceability error will depend upon the errors from your calibration standards. These errors will likely be different from the error of 2 ppm.
EXAMPLE 1: Relative Accuracy; 24 hour operating temperature is Tcal ±1°C
Assume that the ambient temperature for the measurement is within ±1°C of the temperature of calibration (Tcal). The 24 hour accuracy specification for a 10 V dc measurement on the 10 V range is 0.5 ppm + 0.05 ppm. That accuracy specification means:
0.5 ppm of Reading + 0.05 ppm of Range
For relative accuracy, the error associated with the measurement is:
(0.5 / 1,000,000 x 10 V) + (0.05 / 1,000,000 x 10 V) = ±5.5 µV, or 0.55 ppm of 10 V
Errors from temperature changes
The optimum technical specifications of the 3458A are based on auto-calibration (ACAL) of the instrument within the previous 24 hours and following ambient temperature changes of less than ±1°C. The 3458A’s ACAL capability corrects for measurement errors resulting from the drift of critical components from time and temperature.
The following examples illustrate the error correction of auto-calibration by computing the relative measurement error of the 3458A for various temperature conditions. Constant conditions for each example are:
• 10 V DC input
• 10 V DC range
• Tcal = 23°C
• 90 day accuracy specifications
EXAMPLE 2: Operating temperature is 28°C; with ACAL
This example shows basic accuracy of the 3458A using auto-calibration with an operating temperature of 28°C. Results are rounded to 2 digits.
(4.1 ppm x 10 V) + (0.05 ppm x 10 V) = 42 µV
Total relative error = 42 µV
EXAMPLE 3: Operating temperature is 38°C; without ACAL
The operating temperature of the 3458A is 38°C, 14°C beyond the range of Tcal ±1°C. Additional measurement errors result because of the added temperature coefficient without using ACAL.
(4.1 ppm x 10 V) + (0.05 ppm x 10 V) = 42 µV
Temperature Coefficient (specification is per °C):
(0.5 ppm x 10 V + 0.01 ppm x 10 V) x 14°C = 71 µV
Total error = 113 µV
EXAMPLE 4: Operating temperature is 38°C; with ACAL
Assuming the same conditions as Example 3, but using ACAL significantly reduces the error due to temperature difference from calibration temperature. Operating temperature is 10°C beyond the standard range of Tcal ±5°C.
(4.1 ppm x 10 V) + (0.05 ppm x 10 V) = 42 µV
Temperature Coefficient (specification is per °C):
(0.15 ppm x 10 V + 0.01 ppm x 10 V) x 10°C = 16 µV
Total error = 58 µV
EXAMPLE 5: Absolute Accuracy; 90 Day
Assuming the same conditions as Example 4, but now add the traceability error to establish absolute accuracy.
(4.1 ppm x 10 V) + (0.05 ppm x 10 V) = 42 µV
Temperature Coefficient (specification is per °C):
(0.15 ppm x 10 V + 0.01 ppm x 10 V) x 10°C = 16 µV
Factory traceability error of 2 ppm:
(2 ppm x 10 V) = 20 µV
Total absolute error = 78 µV
Additional errors
When the 3458A is operated at power line cycles below 100, additional errors due to noise and gain become significant. Example 6 illustrates the error calculation at 0.1 PLC.
EXAMPLE 6: Operating temperature is 28°C; 0.1 PLC
Assuming the same conditions as Example 2, but now add additional error.
(4.1 ppm x 10 V) + (0.05 ppm x 10 V) = 42 µV
Referring to the Additional Errors chart and RMS Noise Multiplier table, additional error at 0.1 PLC is:
(2 ppm x 10 V) + (0.4 ppm x 1 x 3 x 10 V) = 32 µV
Total relative error = 74 µV
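The arithmetic used in Examples 1 through 6 is simple enough to capture in a small helper. The sketch below merely re-computes Example 3; it is an illustration, not part of the specification.

def spec_error_volts(reading_v, range_v, ppm_of_reading, ppm_of_range):
    """Error in volts for a (ppm of Reading + ppm of Range) specification."""
    return (ppm_of_reading * reading_v + ppm_of_range * range_v) * 1e-6

# Example 3: 10 V input on the 10 V range, 90 day spec 4.1 + 0.05 ppm,
# plus (0.5 + 0.01) ppm/°C for 14 °C beyond Tcal ± 1 °C without ACAL.
base = spec_error_volts(10, 10, 4.1, 0.05)        # about 42 µV
tc   = spec_error_volts(10, 10, 0.5, 0.01) * 14   # about 71 µV
print(f"Total relative error = {(base + tc) * 1e6:.0f} µV")   # prints 113 µV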
Page 13
DC Voltage
Range Full Scale Maximum Resolution Input Impedance Temperature Coefficient (ppm of Reading + ppm of Range) / °C: Without ACAL [1] / With ACAL [2]
100 mV 120.00000 10 nV > 10 GΩ 1.2 + 1 0.15 + 1
1 V 1.20000000 10 nV > 10 GΩ 1.2 + 0.1 0.15 + 0.1
10 V 12.0000000 100 nV > 10 GΩ 0.5 + 0.01 0.15 + 0.01
100 V 120.000000 1 µV 10 MΩ ± 1% 2 + 0.4 0.15 + 0.1
1000 V 1050.00000 10 µV 10 MΩ ± 1% 2 + 0.04 0.15 + 0.01

Accuracy [3]
[ppm of Reading (ppm of Reading for Option 002) + ppm of Range]
Range 24 Hour [4] 90 Day [5] 1 Year [5] 2 Year [5]
100 mV 2.5 + 3 5.0 (3.5) + 3 9 (5) + 3 14 (10) + 3
1 V 1.5 + 0.3 4.6 (3.1) + 0.3 8 (4) + 0.3 14 (10) + 0.3
10 V 0.5 + 0.05 4.1 (2.6) + 0.05 8 (4) + 0.05 14 (10) + 0.05
100 V 2.5 + 0.3 6.0 (4.5) + 0.3 10 (6) + 0.3 14 (10) + 0.3
1000 V [6] 2.5 + 0.1 6.0 (4.5) + 0.1 10 (6) + 0.1 14 (10) + 0.1
Transfer Accuracy/ Linearity
Range (ppm of Reading + ppm of Range), 10 Min, Tref ± 0.5°C
100 mV 0.5 + 0.5
1 V 0.3 + 0.1
10 V 0.05 + 0.05
100 V 0.5 + 0.1
1000 V 1.5 + 0.05
Conditions
• Following 4 hour warm-up. Full scale to 10% of full scale.
• Measurements on the 1000 V range are within 5% of the initial measurement value and following measurement settling.
• Tref is the starting ambient temperature.
• Measurements are made on a fixed range (> 4 min.) using accepted metrology practices.
Section 1 / DC Voltage
1 Additional error from Tcal or last ACAL ± 1°C.
2 Additional error from Tcal ± 5°C.
3 Specifications are for PRESET; NPLC 100.
4 For fixed range (> 4 min.), MATH NULL and Tcal ± 1°C.
5 Specifications for 90 day, 1 year and 2 year are within 24 hours and ± 1°C of last ACAL; Tcal ± 5°C; MATH NULL and fixed range.
ppm of Reading specifications for High Stability (Option 002) are in parentheses.
Without MATH NULL, add 0.15 ppm of Range to 10 V, 0.7 ppm of Range to 1 V, and 7 ppm of Range to 0.1 V. Without math null and for fixed range less than 4 minutes, add 0.25 ppm of Range to 10 V, 1.7 ppm of Range to 1 V and 17 ppm of Range to 0.1 V.
Add 2 ppm of reading additional error for factory traceability to US NIST. Traceability error is the absolute error relative to National Standards associated with the source of last external calibration.
6 Add 12 ppm x (Vin / 1000)^2 additional error for inputs > 100 V.
Settling Characteristics
For first reading or range change error, add 0.0001% of input voltage step additional error. Reading settling times are affected by source impedance and cable dielectric absorption characteristics.
Additional Errors

Noise Rejection (dB)
NPLC AC NMR [8] AC ECMR [7] DC ECMR [7]
NPLC < 1 0 90 140
NPLC ≥ 1 60 150 140
NPLC 10 60 150 140
NPLC 100 60 160 140
NPLC = 1000 75 170 140

RMS Noise Multiplier
Range Multiplier
0.1 V x20
1 V x2
10 V x1
100 V x2
1000 V x1
For RMS noise error, multiply the RMS noise result from the graph by the multiplier in the chart. For peak noise error, multiply the RMS noise error by 3.

7 Applies for 1 kΩ unbalance in the LO lead and ± 0.1% of the line frequency currently set for LFREQ.
8 For line frequency ± 1%, AC NMR is 40 dB for NPLC 1, or 55 dB for NPLC 100. For line frequency ± 5%, AC NMR is 30 dB for NPLC 100.
13
Page 14
Section 1 / DC Voltage
Reading Rate (Auto-Zero Off)

Temperature Coefficient (Auto-Zero Off)
For a stable environment ± 1°C, add the following additional error for AZERO OFF:
Range Error
100 mV - 10 V 5 µV/°C
100 V - 1000 V 500 µV/°C

Selected Reading Rates [1] (Readings / Sec)
NPLC Aperture Digits Bits A-Zero Off A-Zero On
0.0001 1.4 µs 4.5 16 100,000 [3] 4,130
0.0006 10 µs 5.5 18 50,000 3,150
0.01 167 µs [2] 6.5 21 5,300 930
0.1 1.67 ms [2] 6.5 21 592 245
1 16.6 ms [2] 7.5 25 60 29.4
10 0.166 s [2] 8.5 28 6 3
100 8.5 28 36 / min 18 / min
1000 8.5 28 3.6 / min 1.8 / min
Maximum Input
 Rated Input Non-Destructive
HI to LO ± 1000 V pk ± 1200 V pk
LO to Guard [4] ± 200 V pk ± 350 V pk
Guard to Earth [5] ± 500 V pk ± 1000 V pk

Input Terminals
Terminal Material: Gold-plated Tellurium Copper
Input Leakage Current: < 20 pA at 25°C

1 For PRESET; DELAY 0; DISP OFF; OFORMAT DINT; ARANGE OFF.
2 Aperture is selected independent of line frequency (LFREQ). These apertures are for 60 Hz NPLC values where 1 NPLC = 1 / LFREQ. For 50 Hz and the NPLC indicated, aperture will increase by 1.2 and reading rates will decrease by 0.833.
3 For OFORMAT SINT.
4 > 10^10 Ω, LO to Guard with guard open.
5 > 10^12 Ω, Guard to Earth.
Section 2 / Resistance
Two-wire and Four-wire Ohms (OHM and OHMF Functions)
Range Full Scale Maximum Resolution Current Source [4] (OHMF) Test Voltage (OCOMP ON) Open Circuit Voltage Maximum Lead Resistance Maximum Series Offset Temperature Coefficient (ppm of Reading + ppm of Range) / °C: Without ACAL [5] / With ACAL [6]
10 Ω 12.00000 10 µΩ 10 mA 0.1 V 12 V 20 Ω 0.01 V 3 + 1 1 + 1
100 Ω 120.00000 10 µΩ 1 mA 0.1 V 12 V 200 Ω 0.01 V 3 + 1 1 + 1
1 kΩ 1.2000000 100 µΩ 1 mA 1.0 V 12 V 150 Ω 0.1 V 3 + 0.1 1 + 0.1
10 kΩ 12.000000 1 mΩ 100 µA 1.0 V 12 V 1.5 kΩ 0.1 V 3 + 0.1 1 + 0.1
100 kΩ 120.00000 10 mΩ 50 µA 5.0 V 12 V 1.5 kΩ 0.5 V 3 + 0.1 1 + 0.1
1 MΩ 1.2000000 100 mΩ 5 µA 5.0 V 12 V 1.5 kΩ 3 + 1 1 + 1
10 MΩ 12.000000 1 Ω 500 nA 5.0 V 12 V 1.5 kΩ 20 + 20 5 + 2
100 MΩ [7] 120.00000 10 Ω 500 nA 5.0 V 5 V 1.5 kΩ 100 + 20 25 + 2
1 GΩ [7] 1.2000000 100 Ω 500 nA 5.0 V 5 V 1.5 kΩ 1000 + 20 250 + 2

4 Current source is ± 3% absolute accuracy.
5 Additional error from Tcal or last ACAL ± 1°C.
6 Additional error from Tcal ± 5°C.
7 Measurement is computed from 10 MΩ in parallel with input.
Page 15
Accuracy [1] (ppm of Reading + ppm of Range)
Range 24 Hour [2] 90 Day [3] 1 Year [3] 2 Year [3]
10 Ω 5 + 3 15 + 5 15 + 5 20 + 10
100 Ω 3 + 3 10 + 5 12 + 5 20 + 10
1 kΩ 2 + 0.2 8 + 0.5 10 + 0.5 15 + 1
10 kΩ 2 + 0.2 8 + 0.5 10 + 0.5 15 + 1
100 kΩ 2 + 0.2 8 + 0.5 10 + 0.5 15 + 1
1 MΩ 10 + 1 12 + 2 15 + 2 20 + 4
10 MΩ 50 + 5 50 + 10 50 + 10 75 + 10
100 MΩ 500 + 10 500 + 10 500 + 10 0.1% + 10
1 GΩ 0.5% + 10 0.5% + 10 0.5% + 10 1% + 10
Two-Wire Ohms Accuracy
For Two-Wire Ohms (OHM) accuracy, add the following offset errors to the Four-Wire Ohms (OHMF) accuracy. 24 Hour: 50 mΩ. 90 Day: 150 mΩ. 1 Year: 250 mΩ. 2 Year: 500 mΩ.
Section 2 / Resistance
1 Specifications are for PRESET; NPLC 100; OCOMP ON; OHMF.
2 Tcal ± 1°C.
3 Specifications for 90 day, 1 year, and 2 year are within 24 hours and ± 1°C of last ACAL; Tcal ± 5°C.
Add 3 ppm of reading additional error for factory traceability of 10 kΩ to US NIST. Traceability is the absolute error relative to National Standards associated with the source of last external calibration.
Additional Errors

Selected Reading Rates [4, 5] (Readings / Sec)
NPLC Aperture Digits Auto-Zero Off Auto-Zero On
0.0001 1.4 µs 4.5 100,000 [7] 4,130
0.0006 10 µs 5.5 50,000 3,150
0.01 167 µs [6] 6.5 5,300 930
0.1 1.66 ms [6] 6.5 592 245
1 16.6 ms [6] 7.5 60 29.4
10 0.166 s [6] 7.5 6 3
100 7.5 36 / min 18 / min

Measurement Consideration
Agilent recommends the use of Teflon* cable or other high impedance, low dielectric absorption cable for these measurements.

RMS Noise Multiplier
Range Multiplier
10 Ω & 100 Ω x10
1 kΩ to 100 kΩ x1
1 MΩ x1.5
10 MΩ x2
100 MΩ x120
1 GΩ x1200
For RMS noise error, multiply the RMS noise result from the graph by the multiplier in the chart. For peak noise error, multiply the RMS noise error by 3.

Settling Characteristics
For first reading error following range change, add the total 90 day measurement error for the current range. Preprogrammed settling delay times are for < 200 pF external circuit capacitance.

Maximum Input
 Rated Input Non-Destructive
HI to LO ± 1000 V pk ± 1000 V pk
HI & LO Sense to LO ± 200 V pk ± 350 V pk
LO to Guard ± 200 V pk ± 350 V pk
Guard to Earth ± 500 V pk ± 1000 V pk

Temperature Coefficient (Auto-Zero Off)
For a stable environment ± 1°C, add the following error for AZERO OFF. (ppm of Range) / °C
Range Error
10 Ω 50
100 Ω 50
1 kΩ 5
10 kΩ 5
100 kΩ 1
1 MΩ 1
10 MΩ 1
100 MΩ 10
1 GΩ 100

4 For PRESET; DELAY 0; DISP OFF; OFORMAT DINT; ARANGE OFF. For OHMF or OCOMP ON, the maximum reading rates will be slower.
5 Ohms measurements at rates < NPLC 1 are subject to potential noise pickup. Care must be taken to provide adequate shielding and guarding to maintain measurement accuracies.
6 Aperture is selected independent of line frequency (LFREQ). These apertures are for 60 Hz NPLC values where 1 NPLC = 1 / LFREQ. For 50 Hz and the NPLC indicated, aperture will increase by 1.2 and reading rates will decrease by 0.833.
7 For OFORMAT SINT.
* Teflon is a registered trademark of E.I. duPont deNemours and Co.
Page 16
Section 3 / DC Current
DC Current (DCI Function )
Range Full Scale Maximum Resolution Shunt Resistance Burden Voltage Temperature Coefficient (ppm of Reading + ppm of Range) / °C: Without ACAL [1] / With ACAL [2]
100 nA 120.000 1 pA 545.2 kΩ 0.055 V 10 + 200 2 + 50
1 µA 1.200000 1 pA 45.2 kΩ 0.045 V 2 + 20 2 + 5
10 µA 12.000000 1 pA 5.2 kΩ 0.055 V 10 + 4 2 + 1
100 µA 120.00000 10 pA 730 Ω 0.075 V 10 + 3 2 + 1
1 mA 1.2000000 100 pA 100 Ω 0.100 V 10 + 2 2 + 1
10 mA 12.000000 1 nA 10 Ω 0.100 V 10 + 2 2 + 1
100 mA 120.00000 10 nA 1 Ω 0.250 V 25 + 2 2 + 1
1 A 1.0500000 100 nA 0.1 Ω < 1.5 V 25 + 3 2 + 2

Accuracy [3] (ppm of Reading + ppm of Range)
Range 24 Hour [4] 90 Day [5] 1 Year [5] 2 Year [5]
100 nA [6] 10 + 400 30 + 400 30 + 400 35 + 400
1 µA [6] 10 + 40 15 + 40 20 + 40 25 + 40
10 µA [6] 10 + 7 15 + 10 20 + 10 25 + 10
100 µA 10 + 6 15 + 8 20 + 8 25 + 8
1 mA 10 + 4 15 + 5 20 + 5 25 + 5
10 mA 10 + 4 15 + 5 20 + 5 25 + 5
100 mA 25 + 4 30 + 5 35 + 5 40 + 5
1 A 100 + 10 100 + 10 110 + 10 115 + 10

1 Additional error from Tcal or last ACAL ± 1°C.
2 Additional error from Tcal ± 5°C.
3 Specifications are for PRESET; NPLC 100.
4 Tcal ± 1°C.
5 Specifications for 90 day, 1 year, and 2 year are within 24 hours and ± 1°C of last ACAL; Tcal ± 5°C.
Add 5 ppm of reading additional error for factory traceability to US NIST. Traceability error is the sum of the 10 V and 10 kΩ traceability values.
6 Typical accuracy.
Settling Characteristics
For first reading or range change error, add 0.001% of input current step additional error. Reading settling times can be affected by source impedance and cable dielectric absorption characteristics.
Additional Errors

RMS Noise Multiplier
Range Multiplier
100 nA x100
1 µA x10
10 µA to 1 A x1
For RMS noise error, multiply the RMS noise result from the graph by the multiplier in the chart. For peak noise error, multiply the RMS noise error by 3.
Measurement Considerations
Agilent recommends the use of Teflon cable or other high impedance, low dielectric absorption cable for low current measurements. Current measurements at rates < NPLC 1 are subject to potential noise pickup. Care must be taken to provide adequate shielding and guarding to maintain measurement accuracies.
Selected Reading Rates [7]
NPLC Aperture Digits Readings / Sec
0.0001 1.4 µs 4.5 2,300
0.0006 10 µs 5.5 1,350
0.01 167 µs [8] 6.5 157
0.1 1.67 ms [8] 6.5 108
1 16.6 ms [8] 7.5 26
10 0.166 s [8] 7.5 3
100 7.5 18 / min

Maximum Input
 Rated Input Non-Destructive
I to LO ± 1.5 A pk < 1.25 A rms
LO to Guard ± 200 V pk ± 350 V pk
Guard to Earth ± 500 V pk ± 1000 V pk

7 For PRESET; DELAY 0; DISP OFF; OFORMAT DINT; ARANGE OFF.
8 Aperture is selected independent of line frequency (LFREQ). These apertures are for 60 Hz NPLC values where 1 NPLC = 1 / LFREQ. For 50 Hz and the NPLC indicated, aperture will increase by 1.2 and reading rates will decrease by 0.833.
Page 17
Section 4 / AC Voltage
General Information
The Agilent 3458A supports three techniques for measuring true rms AC voltage, each offering unique capabilities. The desired measurement technique is selected through the SETACV command. The ACV functions will then apply the chosen method for subsequent measurements.
The following section provides a brief description of the three operation modes along with a summary table helpful in choosing the technique best suited to your specific measurement need.
SETACV SYNC Synchronously Sub-sampled Computed true rms technique.
This technique provides excellent linearity and the most accurate measurement results. It does require that the input signal be repetitive ( not random noise for example ). The bandwidth in this mode is from 1 Hz to 10 MHz.
SETACV ANA Analog Computing true rms conversion technique.
This is the measurement technique at power-up or following an instrument reset. This mode works well with any signal within its 10 Hz to 2 MHz bandwidth and provides the fastest measurement speeds.
SETACV RNDM Random Sampled Computed true rms technique.
This technique again provides excellent linearity; however, the overall accuracy is the lowest of the three modes. It does not require a repetitive input signal and is therefore well suited to wideband noise measurements. The bandwidth in this mode is from 20 Hz to 10 MHz.
Selection Table
Technique Frequency Range Best Accuracy Repetitive Signal Required Readings / Sec (Minimum - Maximum)
Synchronous Sub-sampled 1 Hz - 10 MHz 0.010% Yes 0.025 - 10
Analog 10 Hz - 2 MHz 0.03% No 0.8 - 50
Random Sampled 20 Hz - 10 MHz 0.1% No 0.025 - 45
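Selecting a technique is a single command. The sketch below (PyVISA transport and GPIB address assumed; SETACV, ACV and the technique names come from this section) switches to the synchronously sub-sampled technique for the most accurate reading, then back to the analog technique for speed.

import pyvisa
dmm = pyvisa.ResourceManager().open_resource("GPIB0::22::INSTR")  # assumed address

dmm.write("SETACV SYNC")   # synchronously sub-sampled true rms technique
dmm.write("ACV 10")        # ac Volts, 10 V range
print("precision acV:", float(dmm.read()), "V")

dmm.write("SETACV ANA")    # analog true rms technique (power-up default)
print("fast acV:", float(dmm.read()), "V")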
Synchronous Sub-sampled Mode (ACV Function, SETACV SYNC)
Range Full Scale Maximum Resolution Input Impedance Temperature Coefficient [1] (% of Reading + % of Range) / °C
10 mV 12.00000 10 nV 1 MΩ ± 15% with < 140 pF 0.002 + 0.02
100 mV 120.00000 10 nV 1 MΩ ± 15% with < 140 pF 0.001 + 0.0001
1 V 1.2000000 100 nV 1 MΩ ± 15% with < 140 pF 0.001 + 0.0001
10 V 12.000000 1 µV 1 MΩ ± 2% with < 140 pF 0.001 + 0.0001
100 V 120.00000 10 µV 1 MΩ ± 2% with < 140 pF 0.001 + 0.0001
1000 V 700.0000 100 µV 1 MΩ ± 2% with < 140 pF 0.001 + 0.0001

AC Accuracy [2]
24 Hour to 2 Year (% of Reading + % of Range), ACBAND 2 MHz
Range 1 Hz to 40 Hz [3] 40 Hz to 1 kHz [3] 1 kHz to 20 kHz [3] 20 kHz to 50 kHz [3] 50 kHz to 100 kHz 100 kHz to 300 kHz 300 kHz to 1 MHz 1 MHz to 2 MHz
10 mV 0.03 + 0.03 0.02 + 0.011 0.03 + 0.011 0.1 + 0.011 0.5 + 0.011 4.0 + 0.02
100 mV - 10 V 0.007 + 0.004 0.007 + 0.002 0.014 + 0.002 0.03 + 0.002 0.08 + 0.002 0.3 + 0.01 1 + 0.01 1.5 + 0.01
100 V 0.02 + 0.004 0.02 + 0.002 0.02 + 0.002 0.035 + 0.002 0.12 + 0.002 0.4 + 0.01 1.5 + 0.01
1000 V 0.04 + 0.004 0.04 + 0.002 0.06 + 0.002 0.12 + 0.002 0.3 + 0.002

1 Additional error beyond ± 1°C, but within ± 5°C of last ACAL. For ACBAND > 2 MHz, use the 10 mV range temperature coefficient for all ranges.
2 Specifications apply full scale to 10% of full scale, DC < 10% of AC, sine wave input, crest factor = 1.4, and PRESET. Within 24 hours and ± 1°C of last ACAL. LO to Guard switch on.
Peak (AC + DC) input limited to 5 x full scale for all ranges in ACV function.
Add 2 ppm of reading additional error for factory traceability of 10 V DC to US NIST.
3 LFILTER ON recommended.
AC Accuracy continued on following page.
Page 18
Section 4 / AC Voltage
AC Accuracy continued: 24 Hour to 2 Year (% of Reading + % of Range)
ACBAND > 2 MHz
Range 45 Hz to 100 kHz 100 kHz to 1 MHz 1 MHz to 4 MHz 4 MHz to 8 MHz 8 MHz to 10 MHz
10 mV 0.09 + 0.06 1.2 + 0.05 7 + 0.07 20 + 0.08
100 mV - 10 V 0.09 + 0.06 2.0 + 0.05 4 + 0.07 4 + 0.08 15 + 0.1
100 V 0.12 + 0.002
1000 V 0.3 + 0.01

Transfer Accuracy [1]
Range % of Reading
100 mV - 100 V (0.002 + Resolution in %)
Conditions
• Following 4 hour warm-up
• Within 10 min and ± 0.5°C of the reference measurement
• 45 Hz to 20 kHz, sine wave input
• Within ± 10% of the reference voltage and frequency

AC + DC Accuracy (ACDCV Function)
For ACDCV Accuracy apply the following additional error to the ACV accuracy. (% of Range)
DC < 10% of AC Voltage
Range ACBAND 2 MHz ACBAND > 2 MHz Temperature Coefficient [2]
10 mV 0.09 0.09 0.03
100 mV - 1000 V 0.008 0.09 0.0025
DC > 10% of AC Voltage
Range ACBAND 2 MHz ACBAND > 2 MHz Temperature Coefficient [2]
10 mV 0.7 0.7 0.18
100 mV - 1000 V 0.07 0.7 0.025

1 Resolution in % is the value of the RES command or parameter (reading resolution as a percentage of measurement range).
2 Additional error beyond ± 1°C, but within ± 5°C of last ACAL. (% of Range) / °C. For ACBAND > 2 MHz, use the 10 mV range temperature coefficient. LO to Guard switch on.
Additional Errors
Apply the following additional errors as appropriate to your particular measurement setup. (% of Reading)

Source R [3] Input Frequency: 0 - 1 MHz 1 - 4 MHz 4 - 8 MHz 8 - 10 MHz
0 Ω 0 2 5 5
50 Ω Terminated 0.003 0 0 0
75 Ω Terminated 0.004 2 5 5
50 Ω 0.005 3 7 10

Crest Factor Resolution Multiplier
1 - 2 (Resolution in %) x 1
2 - 3 (Resolution in %) x 2
3 - 4 (Resolution in %) x 3
4 - 5 (Resolution in %) x 5

Reading Rates [4]
ACBAND Low Maximum Sec / Reading
1 - 5 Hz 6.5
5 - 20 Hz 2.0
20 - 100 Hz 1.2
100 - 500 Hz 0.32
> 500 Hz 0.02

% Resolution [1] Maximum Sec / Reading
0.001 - 0.005 32
0.005 - 0.01 6.5
0.01 - 0.05 3.2
0.05 - 0.1 0.64
0.1 - 1 0.32
> 1 0.1

3 Flatness error including instrument loading.
4 Reading time is the sum of the Sec / Reading shown for your configuration. The tables will yield the slowest reading rate for your configuration. Actual reading rates may be faster. For DELAY -1; ARANGE OFF.

Settling Characteristics
There is no instrument settling required.

Common Mode Rejection
For 1 kΩ imbalance in LO lead, > 90 dB, DC to 60 Hz.
Page 19
Section 4 / AC Voltage
High Frequency Temperature Coefficient
For outside Tcal ±5°C add the following error. (% of Reading) / °C
Frequency
Range 2 - 4 MHz 4 - 10 MHz
10 mV - 1 V 0.02 0.08
10 V - 1000 V 0.08 0.08
Analog Mode (ACV Function, SETACV ANA)
Range Full Scale Maximum Resolution Input Impedance Temperature Coefficient [1] (% of Reading + % of Range) / °C
10 mV 12.00000 10 nV 1 MΩ ± 15% with < 140 pF 0.003 + 0.006
100 mV 120.0000 100 nV 1 MΩ ± 15% with < 140 pF 0.002 + 0.0
1 V 1.200000 1 µV 1 MΩ ± 15% with < 140 pF 0.002 + 0.0
10 V 12.00000 10 µV 1 MΩ ± 2% with < 140 pF 0.002 + 0.0
100 V 120.0000 100 µV 1 MΩ ± 2% with < 140 pF 0.002 + 0.0
1000 V 700.000 1 mV 1 MΩ ± 2% with < 140 pF 0.002 + 0.0

Maximum Input
 Rated Input Non-Destructive
HI to LO ± 1000 V pk ± 1200 V pk
LO to Guard ± 200 V pk ± 350 V pk
Guard to Earth ± 500 V pk ± 1000 V pk
Volt - Hz Product 1 x 10^8

1 Additional error beyond ± 1°C, but within ± 5°C of last ACAL.
2 Specifications apply full scale to 1/20 full scale, sine wave input, crest factor = 1.4, and PRESET. Within 24 hours and ± 1°C of last ACAL. LO to Guard switch on.
Maximum DC is limited to 400 V in ACV function.
Add 2 ppm of reading additional error for factory traceability of 10 V DC to US NIST.

AC Accuracy [2]
24 Hour to 2 Year (% of Reading + % of Range)
Range 10 Hz to 20 Hz 20 Hz to 40 Hz 40 Hz to 100 Hz 100 Hz to 20 kHz 20 kHz to 50 kHz 50 kHz to 100 kHz 100 kHz to 250 kHz 250 kHz to 500 kHz 500 kHz to 1 MHz 1 MHz to 2 MHz
10 mV 0.4 + 0.32 0.15 + 0.25 0.06 + 0.25 0.02 + 0.25 0.15 + 0.25 0.7 + 0.35 4 + 0.7
100 mV - 10 V 0.4 + 0.02 0.15 + 0.02 0.06 + 0.01 0.02 + 0.01 0.15 + 0.04 0.6 + 0.08 2 + 0.5 3 + 0.6 5 + 2 10 + 5
100 V 0.4 + 0.02 0.15 + 0.02 0.06 + 0.01 0.03 + 0.01 0.15 + 0.04 0.6 + 0.08 2 + 0.5 3 + 0.6 5 + 2
1000 V 0.42 + 0.03 0.17 + 0.03 0.08 + 0.02 0.06 + 0.02 0.15 + 0.04 0.6 + 0.2
AC + DC Accuracy (ACDCV Function)
For ACDCV Accuracy apply the following additional error to the ACV accuracy. (% of Reading + % of Range)
 DC < 10% of AC Voltage DC > 10% of AC Voltage
Range Accuracy Temperature Coefficient [3] Accuracy Temperature Coefficient [3]
10 mV 0.0 + 0.2 0 + 0.015 0.15 + 3 0 + 0.06
100 mV - 1000 V 0.0 + 0.02 0 + 0.001 0.15 + 0.25 0 + 0.007

Additional Errors
Apply the following additional errors as appropriate to your particular measurement setup.

Low Frequency Error (% of Reading)
 ACBAND Low: 10 Hz - 1 kHz 1 - 10 kHz > 10 kHz
Signal Frequency NPLC >10 NPLC >1 NPLC > 0.1
10 - 200 Hz 0
200 - 500 Hz 0 0.15
500 Hz - 1 kHz 0 0.015 0.9
1 - 2 kHz 0 0 0.2
2 - 5 kHz 0 0 0.05
5 - 10 kHz 0 0 0.01

Crest Factor Error (% of Reading)
Crest Factor Additional Error
1 - 2 0
2 - 3 0.15
3 - 4 0.25
4 - 5 0.40

3 Additional error beyond ± 1°C, but within ± 5°C of last ACAL. (% of Reading + % of Range) / °C.
Page 20
Section 4 / AC Voltage
Reading Rates [1] (Maximum Sec / Reading)
ACBAND Low NPLC ACV ACDCV
10 Hz 10 1.2 1
1 kHz 1 1 0.1
10 kHz 0.1 1 0.02

Settling Characteristics
For first reading or range change error using default delays, add 0.01% of input step additional error. The following data applies for DELAY 0.
Function ACBAND Low DC Component Settling Time
ACV 10 Hz DC < 10% AC 0.5 sec to 0.01%
 DC > 10% AC 0.9 sec to 0.01%
ACDCV 10 Hz - 1 kHz 0.5 sec to 0.01%
 1 kHz - 10 kHz 0.08 sec to 0.01%
 10 kHz 0.015 sec to 0.01%

Maximum Input
 Rated Input Non-Destructive
HI to LO ± 1000 V pk ± 1200 V pk
LO to Guard ± 200 V pk ± 350 V pk
Guard to Earth ± 500 V pk ± 1000 V pk
Volt - Hz Product 1 x 10^8

Common Mode Rejection
For 1 kΩ imbalance in LO lead, > 90 dB, DC - 60 Hz.

1 For DELAY -1; ARANGE OFF. For DELAY 0; NPLC .1, unspecified reading rates of greater than 500 / sec are possible.
Random Sampled Mode (ACV Function, SETACV RNDM)
Range Full Scale Maximum Resolution Input Impedance Temperature Coefficient [2] (% of Reading + % of Range) / °C
10 mV 12.000 1 µV 1 MΩ ± 15% with < 140 pF 0.002 + 0.02
100 mV 120.00 10 µV 1 MΩ ± 15% with < 140 pF 0.001 + 0.0001
1 V 1.2000 100 µV 1 MΩ ± 15% with < 140 pF 0.001 + 0.0001
10 V 12.000 1 mV 1 MΩ ± 2% with < 140 pF 0.001 + 0.0001
100 V 120.00 10 mV 1 MΩ ± 2% with < 140 pF 0.001 + 0.0001
1000 V 700.0 100 mV 1 MΩ ± 2% with < 140 pF 0.001 + 0.0001

AC Accuracy [3]
24 Hour to 2 Year (% of Reading + % of Range)
 ACBAND 2 MHz ACBAND > 2 MHz
Range 20 Hz to 100 kHz 100 kHz to 300 kHz 300 kHz to 1 MHz 1 MHz to 2 MHz 20 Hz to 100 kHz 100 kHz to 1 MHz 1 MHz to 4 MHz 4 MHz to 8 MHz 8 MHz to 10 MHz
10 mV 0.5 + 0.02 4 + 0.02 0.1 + 0.05 1.2 + 0.05 7 + 0.07 20 + 0.08
100 mV - 10 V 0.08 + 0.002 0.3 + 0.01 1 + 0.01 1.5 + 0.01 0.1 + 0.05 2 + 0.05 4 + 0.07 4 + 0.08 15 + 0.1
100 V 0.12 + 0.002 0.4 + 0.01 1.5 + 0.01 0.12 + 0.002
1000 V 0.3 + 0.01 0.3 + 0.01

2 Additional error beyond ± 1°C, but within ± 5°C of last ACAL. For ACBAND > 2 MHz, use the 10 mV range temperature coefficient for all ranges.
3 Specifications apply from full scale to 5% of full scale, DC < 10% of AC, sine wave input, crest factor = 1.4, and PRESET. Within 24 hours and ± 1°C of last ACAL. LO to Guard switch on.
Add 2 ppm of reading additional error for factory traceability of 10 V DC to US NIST.
Maximum DC is limited to 400 V in ACV function.
Page 21
Section 4 / AC Voltage
AC + DCV Accuracy (ACDCV Function)
For ACDCV Accuracy apply the following additional error to the ACV accuracy. (% of Range)
 DC < 10% of AC Voltage DC > 10% of AC Voltage
Range ACBAND 2 MHz ACBAND > 2 MHz Temperature Coefficient [1] ACBAND 2 MHz ACBAND > 2 MHz Temperature Coefficient [1]
10 mV 0.09 0.09 0.03 0.7 0.7 0.18
100 mV - 1 kV 0.008 0.09 0.0025 0.07 0.7 0.025
Additional Errors
Apply the following additional errors as appropriate to your particular measurement setup. (% of Reading)

Source R [2] Input Frequency: 0 - 1 MHz 1 - 4 MHz 4 - 8 MHz 8 - 10 MHz
0 Ω 0 2 5 5
50 Ω Terminated 0.003 0 0 0
75 Ω Terminated 0.004 2 5 5
50 Ω 0.005 3 7 10

Crest Factor Resolution Multiplier
1 - 2 (Resolution in %) x 1
2 - 3 (Resolution in %) x 3
3 - 4 (Resolution in %) x 5
4 - 5 (Resolution in %) x 8

Reading Rates [3] (Sec / Reading)
% Resolution ACV ACDCV
0.1 - 0.2 40 39
0.2 - 0.4 11 9.6
0.4 - 0.6 2.7 2.4
0.6 - 1 1.4 1.1
1 - 2 0.8 0.5
2 - 5 0.4 0.1
> 5 0.32 0.022

Settling Characteristics
For first reading or range change error using default delays, add 0.01% of input step additional error. The following data applies for DELAY 0.
Function DC Component Settling Time
ACV DC < 10% of AC 0.5 sec to 0.01%
 DC > 10% of AC 0.9 sec to 0.01%
ACDCV No instrument settling required.

High Frequency Temperature Coefficient
For outside Tcal ± 5°C add the following error. (% of Reading) / °C
Range 2 - 4 MHz 4 - 10 MHz
10 mV - 1 V 0.02 0.08
10 V - 1000 V 0.08 0.08

Common Mode Rejection
For 1 kΩ imbalance in LO lead, > 90 dB, DC to 60 Hz.

Maximum Input
 Rated Input Non-Destructive
HI to LO ± 1000 V pk ± 1200 V pk
LO to Guard ± 200 V pk ± 350 V pk
Guard to Earth ± 500 V pk ± 1000 V pk
Volt - Hz Product 1 x 10^8

1 Additional error beyond ± 1°C, but within ± 5°C of last ACAL. (% of Reading) / °C. For ACBAND > 2 MHz, use the 10 mV range temperature coefficient for all ranges.
2 Flatness error including instrument loading.
3 For DELAY -1; ARANGE OFF. For DELAY 0 in ACV, the reading rates are identical to ACDCV.
Page 22
Section 5 / AC Current
AC Current (ACI and ACDCI Functions)
Range Full Scale Maximum Resolution Shunt Resistance Burden Voltage Temperature Coefficient [1] (% of Reading + % of Range) / °C
100 µA 120.0000 100 pA 730 Ω 0.1 V 0.002 + 0
1 mA 1.200000 1 nA 100 Ω 0.1 V 0.002 + 0
10 mA 12.00000 10 nA 10 Ω 0.1 V 0.002 + 0
100 mA 120.0000 100 nA 1 Ω 0.25 V 0.002 + 0
1 A 1.050000 1 µA 0.1 Ω < 1.5 V 0.002 + 0

1 Additional error beyond ± 1°C, but within ± 5°C of last ACAL.
2 Specifications apply full scale to 1/20 full scale, for sine wave inputs, crest factor = 1.4, and following PRESET within 24 hours and ± 1°C of last ACAL.
Add 5 ppm of reading additional error for factory traceability to US NIST. Traceability is the sum of the 10 V and 10 kΩ traceability values.
3 Typical performance.
4 1 kHz maximum on the 100 µA range.

AC Accuracy [2]
24 Hour to 2 Year (% of Reading + % of Range)
Range 10 Hz to 20 Hz 20 Hz to 45 Hz 45 Hz to 100 Hz 100 Hz to 5 kHz 5 kHz to 20 kHz 20 kHz to 50 kHz [3] 50 kHz to 100 kHz [3]
100 µA [4] 0.4 + 0.03 0.15 + 0.03 0.06 + 0.03 0.06 + 0.03
1 mA - 100 mA 0.4 + 0.02 0.15 + 0.02 0.06 + 0.02 0.03 + 0.02 0.06 + 0.02 0.4 + 0.04 0.55 + 0.15
1 A 0.4 + 0.02 0.16 + 0.02 0.08 + 0.02 0.1 + 0.02 0.3 + 0.02 1 + 0.04
AC + DC Accuracy (ACDCI Function)
For ACDCI Accuracy apply the following additional error to the ACI accuracy. (% of Reading + % of Range)
 DC < 10% of AC DC > 10% of AC
 Accuracy Temperature Coefficient [5] Accuracy Temperature Coefficient [5]
 0.005 + 0.02 0.0 + 0.001 0.15 + 0.25 0.0 + 0.007
Additional Errors
Apply the following additional errors as appropriate to your particular measurement setup.

Low Frequency Error (% of Reading)
 ACBAND Low: 10 Hz - 1 kHz 1 - 10 kHz > 10 kHz
Signal Frequency NPLC >10 NPLC >1 NPLC >0.1
10 - 200 Hz 0
200 - 500 Hz 0 0.15
500 Hz - 1 kHz 0 0.015 0.9
1 - 2 kHz 0 0 0.2
2 - 5 kHz 0 0 0.05
5 - 10 kHz 0 0 0.01

Crest Factor Error (% of Reading)
Crest Factor Additional Error
1 - 2 0
2 - 3 0.15
3 - 4 0.25
4 - 5 0.40

5 Additional error beyond ± 1°C, but within ± 5°C of last ACAL. (% of Reading + % of Range) / °C.
Reading Rates [6] (Maximum Sec / Reading)
ACBAND Low NPLC ACI ACDCI
10 Hz 10 1.2 1
1 kHz 1 1 0.1
10 kHz 0.1 1 0.02

6 For DELAY -1; ARANGE OFF. For DELAY 0; NPLC .1, unspecified reading rates of greater than 500/sec are possible.
Page 23
Settling Characteristics
For first reading or range change error using default delays, add 0.01% of input step additional error for the 100 µA to 100 mA ranges. For the 1 A range add 0.05% of input step additional error. The following data applies for DELAY 0.
Function ACBAND Low DC Component Settling Time
ACI 10 Hz DC < 10% AC 0.5 sec to 0.01%
DC > 10% AC 0.9 sec to 0.01%
ACDCI 10 Hz-1 kHz 0.5 sec to 0.01%
1 kHz - 10 kHz 0.08 sec to 0.01%
10 kHz 0.015 sec to 0.01%
Maximum Input
Rated Input Non-Destructive
I to LO ± 1.5 A pk < 1.25 A rms
LO to Guard ± 200 V pk ± 350 V pk
Guard to Earth ± 500 V pk ± 1000 V pk
Section 5 / AC Current
Section 6 / Frequency/Period

Frequency / Period Characteristics
 Voltage (AC or DC Coupled) [1] Current (AC or DC Coupled) [1]
 ACV or ACDCV Functions ACI or ACDCI Functions
Frequency Range 1 Hz - 10 MHz 1 Hz - 100 kHz
Period Range 1 sec - 100 ns 1 sec - 10 µs
Input Signal Range 700 V rms - 1 mV rms 1 A rms - 10 µA rms [2]
Input Impedance 1 MΩ ± 15% with < 140 pF 0.1 Ω - 730 Ω [2]

Accuracy: 24 Hour - 2 Year, 0°C - 55°C
1 Hz - 40 Hz (Period 1 s - 25 ms): 0.05% of Reading
40 Hz - 10 MHz (Period 25 ms - 100 ns): 0.01% of Reading

Measurement Technique: Reciprocal Counting
Time Base: 10 MHz ± 0.01%, 0°C to 55°C
Trigger Filter: Selectable 75 kHz Low Pass Trigger Filter
Slope Trigger: Positive or Negative
Level Trigger: ± 500% of Range in 5% steps

Reading Rates
Resolution Gate Time [3] Readings/Sec [4]
0.00001% 1 s 0.95
> 0.0001% 100 ms 9.6
> 0.001% 10 ms 73
> 0.01% 1 ms 215
> 0.1% 100 µs 270

1 The source of frequency measurements and the measurement input coupling are determined by the FSOURCE command.
2 Range dependent, see ACI for specific range impedance values.
3 Gate Time is determined by the specified measurement resolution.
4 Maximum speed is specified for fixed range operation. For auto range, the maximum speed is 30 readings/sec for ACBAND 1 kHz.
Actual reading speed is the longer of 1 period of the input, the chosen gate time, or the default reading time-out of 1.2 sec.
Page 24
Section 7 / Digitizing
General Information
The Agilent 3458A supports three independent methods for signal digitizing. Each method is discussed below to aid in selecting the appropriate setup best suited to your specific application.
DCV Standard DCV function.
This mode of digitizing allows signal acquisition at rates from 0.2 readings/sec at 28 bits resolution to 100 k readings/sec at 16 bits. Arbitrary sample apertures from 500 ns to 1 sec are selectable with 100 ns resolution. Input voltage ranges cover 100 mV to 1000 V full scale. Input bandwidth varies from 30 kHz to 150 kHz depending on the measurement range.
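For instance, a fast DCV-mode acquisition could be set up as in the sketch below. APER appears in this data sheet's footnotes; the reading-count command and the PyVISA transport are assumptions for illustration only.

import pyvisa
dmm = pyvisa.ResourceManager().open_resource("GPIB0::22::INSTR")  # assumed address

dmm.write("PRESET FAST")
dmm.write("DCV 10")
dmm.write("APER 1.4E-6")        # 1.4 us aperture: 16 bits at up to 100 k readings/sec
dmm.write("OFORMAT SINT")
dmm.write("NRDGS 2048,AUTO")    # assumed reading-count command

raw = dmm.read_raw()            # 2 bytes per reading; scale to volts before use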
DSDC Direct Sampling DC Coupled measurement technique. DSAC Direct Sampling AC Coupled measurement technique.
In these modes the input is sampled through a track/hold with a fixed 2 ns aperture, which yields a 16 bit resolution result. The sample rate is selectable from 6000 sec/sample to 20 µs/sample with 100 ns resolution. Input voltage ranges cover 10 mV peak to 1000 V peak full scale. The input bandwidth is limited to 12 MHz.
SSDC Sub-Sampling ( Effective time sampling ) DC Coupled. SSAC Sub-Sampling ( Effective time sampling ) AC Coupled.
These techniques implement synchronous sub-sampling of a repetitive input signal through a track / hold with a 2 ns sample aperture which yields a 16 bit resolution result. The effective sample rate is settable from 6000 sec / sample to 10 ns / sample with 10 ns resolution. Sampled data can be time ordered by the instrument and output to the GPIB. Input voltage ranges cover 10 mV peak to 1000 V peak full scale. The input bandwidth is limited to 12 MHz.
Summary of Digitizing Capabilities
Technique Function Input Bandwidth Best Accuracy Sample Rate
Standard DCV DC - 150 kHz 0.00005 - 0.01% 100 k / sec
Direct-sampled DSDC / DSAC DC - 12 MHz 0.02% 50 k / sec
Sub-sampled SSDC / SSAC DC - 12 MHz 0.02% 100 M / sec (effective)
Standard DC Volts Digitizing (DCV Function)
Range Input Impedance Offset Voltage Bandwidth Typical Settling Time (to 0.01% of Step)
100 mV > 10^10 Ω < 5 µV 80 kHz 50 µs
1 V > 10^10 Ω < 5 µV 150 kHz 20 µs
10 V > 10^10 Ω < 5 µV 150 kHz 20 µs
100 V 10 MΩ < 500 µV 30 kHz 200 µs
1000 V 10 MΩ < 500 µV 30 kHz 200 µs

DC Performance [1]
0.005 % of Reading + Offset

Maximum Sample Rate (See DCV for more data.)
Readings / sec Resolution Aperture
100 k 15 bits 0.8 µs
100 k 16 bits 1.4 µs
50 k 18 bits 6.0 µs

Sample Timebase
Accuracy: 0.01 % Jitter: < 100 ps rms
External Trigger [2]
Latency: < 175 ns Jitter: < 50 ns rms
Level Trigger
Latency: < 700 ns Jitter: < 50 ns rms

1 ± 1°C of an AZERO or within 24 hours and ± 1°C of last ACAL.
2 < 125 ns variability between multiple 3458As.
Page 25
Dynamic Performance
100 mV, 1 V, 10 V Ranges; Aperture = 6 µs
Test Input (2 x full scale pk-pk) Result
DFT-harmonics 1 kHz < -96 dB
DFT-spurious 1 kHz < -100 dB
Differential non-linearity dc < 0.003% of Range
Signal to Noise Ratio 1 kHz > 96 dB
Section 7 / Digitizing
Direct and Sub-sampled Digitizing [1] (DSDC, DSAC, SSDC and SSAC Functions)
Range Input Impedance Offset Voltage Typical Bandwidth [3]
10 mV 1 MΩ with 140 pF < 50 µV 2 MHz
100 mV 1 MΩ with 140 pF < 90 µV 12 MHz
1 V 1 MΩ with 140 pF < 800 µV 12 MHz
10 V 1 MΩ with 140 pF < 8 mV 12 MHz
100 V 1 MΩ with 140 pF < 80 mV 12 MHz
1000 V 1 MΩ with 140 pF < 800 mV 2 MHz

DC to 20 kHz Performance [2]
0.02 % of Reading + Offset

Maximum Sample Rate
Function Readings / sec Resolution
SSDC, SSAC 100 M (effective) [4] 16 bits
DSDC, DSAC 50 k 16 bits
Dynamic Performance
100 mV, 1 V, 10 V Ranges; 50,000 Samples/sec
Test Input (2 x full scale pk-pk) Result
DFT-harmonics 20 kHz < - 90 dB
DFT-harmonics 1.005 MHz < - 60 dB
DFT-spurious 20 kHz < - 90 dB
Differential non-linearity 20 kHz < 0.005 % of Range
Signal to Noise Ratio 20 kHz > 66 dB
Sample Timebase
Accuracy: 0.01 % Jitter: < 100 ps rms
External Trigger [5]
Latency: < 125 ns Jitter: < 2 ns rms
Level Trigger
Latency: < 700 ns Jitter: < 100 ps, for 1 MHz full scale input

1 Maximum DC voltage limited to 400 V DC in DSAC or SSAC functions.
2 ± 1°C and within 24 hours of last ACAL ACV.
3 Limited to 1 x 10^8 V-Hz product.
4 Effective sample rate is determined by the smallest time increment used during synchronous sub-sampling of the repetitive input signal, which is 10 ns.
5 < 25 ns variability between multiple 3458As.
Page 26
Section 8 / System Specifications
Function-Range-Measurement
The time required to program via GPIB a new measurement configuration, trigger a reading, and return the result to a controller with the following instrument setup: PRESET FAST; DELAY 0; AZERO ON; OFORMAT SINT; INBUF ON; NPLC 0.
TO - FROM Configuration Description GPIB Rate [1]
DCV 10 V to DCV 10 V 180 / sec 340 / sec
any DCV / OHMS to any DCV / OHMS 85 / sec 110 / sec
any DCV / OHMS to any DCV / OHMS with DEFEAT ON 150 / sec 270 / sec
TO or FROM any DCI 70 / sec 90 / sec
TO or FROM any ACV or ACI 75 / sec 90 / sec

Selected Operating Rates [2]
 Rate
DCV Autorange Rate (100 mV to 10 V) 110 / sec
Execute simple command changes (CALL, OCOMP, etc.) 330 / sec
Readings to GPIB, ASCII 630 / sec
Readings to GPIB, DREAL 1000 / sec
Readings to GPIB, DINT 50,000 / sec
Readings to internal memory, DINT 50,000 / sec
Readings from internal memory to GPIB, DINT 50,000 / sec
Readings to GPIB, SINT 100,000 / sec
Readings to internal memory, SINT 100,000 / sec
Readings from internal memory to GPIB, SINT 100,000 / sec
Maximum internal trigger reading rate 100,000 / sec
Maximum external trigger reading rate 100,000 / sec
Memory
 Standard Option 001
 Readings Bytes Readings Bytes
Reading Storage (16 bit) 10,240 20 k +65,536 +128 k
Subprogram Non-volatile, for subprograms and / or state storage 14 k

Delay Time: Accuracy ± 0.01% ± 5 ns; Maximum 6000 s; Resolution 10 ns; Jitter 50 ns pk-pk
Timer: Accuracy ± 0.01% ± 5 ns; Maximum 6000 s; Resolution 100 ns; Jitter < 100 ps rms

1 Using HP 9000 Series 350.
2 SINT data is valid for APER ≤ 10.8 µs.
Page 27
Section 9 / Ratio
Type of Ratio
DCV / DCV
ACV / DCV [1]
ACDCV / DCV [1]
Ratio = (Input) / (Reference)
Reference: (HI Sense to LO) - (LO Sense to LO)
Reference Signal Range: ± 12 V DC (autorange only)

1 All SETACV measurement types are selectable.
LO Sense to LO limited to ± 0.25 V.

Accuracy
± (Input Error + Reference Error)
Input Error = 1 x Total Error for the input signal measurement function (DCV, ACV, ACDCV)
Reference Error = 1.5 x Total Error for the range of the reference DC input
Section 10 / Math Functions
General Math Function Specifications
Math is executable as either a real-time or post processed operation.
Math function specifications do not include the error in X (the instrument reading) or errors in user entered values. The range of values input or output is -1.0 x 10^37 to +1.0 x 10^37. Out of range values indicate OVLD in the display and 1 x 10^38 to GPIB. The minimum execution time is the time required to complete one math operation after each reading has completed.

NULL: X - OFFSET. Minimum Execution Time = 180 µs
SCALE: (X - OFFSET) / SCALE. Minimum Execution Time = 500 µs
PERC: 100 x (X - PERC) / PERC. Minimum Execution Time = 600 µs
PFAIL: Based on MIN, MAX registers. Minimum Execution Time = 160 µs
dB: 20 x Log (X / REF). Minimum Execution Time = 3.9 ms
dBm: 10 x Log [(X^2 / RES) / 1 mW]. Minimum Execution Time = 3.9 ms
RMS: Computed rms of inputs, 1-pole digital filter. Minimum Execution Time = 2.7 ms
FILTER: Weighted average of inputs, 1-pole digital filter. Minimum Execution Time = 750 µs
STAT: MEAN, SDEV computed for sample population (N-1). NSAMP, UPPER, LOWER accumulated. Minimum Execution Time = 160 µs
CTHRM (FTHRM): °C (°F) temperature conversion for 5 kΩ thermistor (40653B). Minimum Execution Time = 900 µs
CTHRM2K (FTHRM2K): °C (°F) temperature conversion for 2.2 kΩ thermistor (40653A). Minimum Execution Time = 160 µs
CTHRM10K (FTHRM10K): °C (°F) temperature conversion for 10 kΩ thermistor (40653C). Minimum Execution Time = 160 µs
CRTD85 (FRTD85): °C (°F) temperature conversion for RTD of 100 Ω, Alpha = 0.00385 (40654A or 40654B). Minimum Execution Time = 160 µs
CRTD92 (FRTD92): °C (°F) temperature conversion for RTD of 100 Ω, Alpha = 0.003916. Minimum Execution Time = 160 µs
Page 28
Section 11 / General Specifications
Operating Environment
0°C to 55°C

Operating Humidity Range
Up to 95% RH at 40°C

Physical Characteristics
88.9 mm H x 425.5 mm W x 502.9 mm D
Net Weight: 12 kg (26.5 lbs)
Shipping Weight: 14.8 kg (32.5 lbs)

IEEE-488 Interface
Complies with the following:
IEEE-488.1 Interface Standard
IEEE-728 Codes/Formats Standard
HPML (Multimeter Language)

Storage Temperature
-40°C to +75°C

Warm-Up Time
4 Hours to published specifications

Power Requirements
100/120 V, 220/240 V ±10%
48-66 Hz, 360-420 Hz automatically sensed
< 30 W, < 80 VA (peak)
Fused: 1.5 A @ 115 V or 0.5 A @ 230 V
Designed in Accordance with
Safety: IEC 348, UL 1244, CSA
EMI: FTZ 1046, FCC Part 15-J
Classification: Classified under MIL-T-28800D as Type III, Class 5, Style E, and Color R.
Warranty Period
One year
Input Terminals
Gold-plated Tellurium Copper
Included with 3458A
Test Lead Set (34118B)
Power Cord
Operating Manual (P/N 03458-90004)
Calibration Manual (P/N 03458-90016)
Assembly Level Repair Manual (P/N 03458-90010)
Quick Reference Guide (P/N 03458-90005)
Field Installation Kits Part Number
Option 001 Extended Reading Memory       03458-87901
Option 002 High Stability Reference      03458-80002
Extra Keyboard Overlays (5 each)         03458-84303
Available Documentation Part Number
Product Note 3458A-1: Optimizing Throughput and Reading Rate       5953-7058
Product Note 3458A-2: High Resolution Digitizing with the 3458A    5953-7059
Product Note 3458A-3: Electronic Calibration of the 3458A          5953-7060
Extra Manual Set                                                   03458-90100
Page 29
Section 12 / Ordering Information
Agilent 3458A Multimeter
(with GPIB, 20k bytes reading memory, and 8 ppm stability)
Option 001 Extended Reading Memory (expands total to 148 k bytes)
Option 002 High Stability (4 ppm/year) Reference
Option 1BP MIL-STD-45662A Certificate of Calibration, with data
Option W30 Three-year customer return repair coverage
Option W32 Three-year customer return calibration coverage
Option 907 Front Handles Kit (P/N 5062-3988)
Option 908 Rack Mount Kit (P/N 5062-3974)
Option 909 Rack Mount Kit with handles (P/N 5062-3975)
Accessories
10833A GPIB Cable (1 m)
10833B GPIB Cable (2 m)
10833C GPIB Cable (4 m)
10833D GPIB Cable (0.5 m)
34118B Test Lead Set
11053A Low thermal test lead pair, spade lug to spade lug, 0.9 m
11174A Low thermal test lead pair, spade lug to banana, 0.9 m
11058A Low thermal test lead pair, banana to banana, 0.9 m
34301A 700 MHz RF Probe
34300A 40 kV ac/dc High Voltage Probe
34119A 5 kV dc/ac 1 MHz High Voltage Probe
34302A Clamp-on ac/dc Current Probe (100 A)
11059A Kelvin Probe Set (4-wire, 1 m)
11062A Kelvin Clip Set (2 each)
Top: Low thermal test leads
Bottom: Kelvin probe and clip set
Page 30
More High Performance Multimeters to Meet Your Needs
34401A Multimeter
• 6.5 digits of resolution
• 15 ppm basic 24-hr accuracy
• 11 measurement functions
• 1,000 readings per second
• GPIB and RS-232 standard
Agilent offers a full line of affordable, high performance DMMs, from 3.5 digit handhelds to the 8.5 digit 3458A. Please consult your T&M catalog or contact the nearest Agilent Technologies sales office for more information.
The new standard in price / performance
If you are looking for an affordable, high performance DMM, look no further. The 34401A brings you all the performance you expect from Agilent Technologies, but at a price that will surprise you.
Uncompromised performance
The 34401A combines a powerful measurement engine with an advanced feature set. The results are impressive: 6.5 digits of resolution, 1,000 readings per second, 11 measurement functions, standard GPIB and RS-232, built-in limit test, and room for 512 readings in volatile memory. The 34401A is at home either on your bench or in your test system.
Affordable workhorse
By leveraging 3458A measurement technology, replacing piles of discrete chips with custom ICs, and designing for manufacturability, we have eliminated costs without sacrificing reliability. The 34401A has a proven track record, with tens of thousands of units in the field today and an actual MTBF of over 150,000 hours. With numbers like that, chances are you’ll retire before it does.
6.5 digit accuracy
at a 5.5 digit price...
the 34401A
Multimeter
Page 31
34420A Nanovolt / Micro-ohm meter
• 7.5 digits of resolution
• 100 pV/100 nΩ of sensitivity
• 8 nVpp noise
• Built-in two channel dcV scanner
• ITS-90 temperature, including SPRTs
Take the uncertainty out of your low-level measurements
When every nanovolt counts, look to the 34420A for its low noise, accuracy, and reliability. Low-noise input amplifiers and a highly tuned input protection scheme bring reading noise down to 8 nVpp, half that of other nanovolt meters in its class. Now add 100 pV/100 nΩ of sensitivity, 2 ppm basic 24-hr dcV accuracy, and 7.5 digits of resolution, and you’ve got accurate, repeatable measurements you can rely on month after month.
More measurements for your money
Most existing nanovoltmeters measure only nanovolts. However, the 34420A provides a more complete solution for meeting your low-level needs. We’ve added a high precision current source to enable resistance measurements from 100 nΩ to 1 MΩ, all without the hassle and expense of an external supply. We’ve also included ITS-90 conversion routines so you can read thermocouples, thermistors, and RTDs, even SPRTs, directly in degrees. And if that isn’t enough, a built-in two channel scanner allows automated dcV ratio and difference measurements. Better still, the 34420A offers all this functionality for less than what you are used to paying for nanovolt-only products.
Nanovolt performance
at a Microvolt price...
the 34420A
Nanovolt /Micro-Ohm Meter
Page 32
Agilent Technologies’ Test and Measurement Support, Services, and Assistance
Agilent Technologies aims to maximize the value you receive, while minimizing your risk and problems. We strive to ensure that you get the test and measurement capabilities you paid for and obtain the support you need. Our extensive support resources and services can help you choose the right Agilent products for your applications and apply them successfully. Every instrument and system we sell has a global warranty. Support is available for at least five years beyond the production life of the product. Two concepts underlie Agilent’s overall support policy: “Our Promise” and “Your Advantage.”
Our Promise
“Our Promise” means your Agilent test and measurement equipment will meet its advertised performance and functionality. When you are choosing new equipment, we will help you with product information, including realistic performance specifications and practical recommendations from experienced test engineers. When you use Agilent equipment, we can verify that it works properly, help with product operation, and provide basic measurement assistance for the use of specified capabilities, at no extra cost upon request. Many self-help tools are available.
Your Advantage
“Your Advantage” means that Agilent offers a wide range of additional expert test and measurement services, which you can purchase according to your unique technical and business needs. Solve problems efficiently and gain a competitive edge by contracting with us for calibration, extra-cost upgrades, out-of-warranty repairs, and on-site education and training, as well as design, system integration, project management, and other professional services. Experienced Agilent engineers and technicians worldwide can help you maximize your productivity, optimize the return on investment of your Agilent instruments and systems, and obtain dependable measurement accuracy for the life of those products.
By internet, phone, or fax, get assistance with all your test and measurement needs.
Online Assistance
www.agilent.com/find/assist
Phone or Fax
United States: (tel) 1 800 452 4844
Canada: (tel) 1 877 894 4414 (fax) (905) 206 4120
Europe: (tel) (31 20) 547 2323 (fax) (31 20) 547 2390
Japan: (tel) (81) 426 56 7832 (fax) (81) 426 56 7840
Latin America: (tel) (305) 269 7500 (fax) (305) 269 7599
Australia: (tel) 1 800 629 485 (fax) (61 3) 9272 0749
New Zealand: (tel) 0 800 738 378 (fax) (64 4) 495 8950
Asia Pacific: (tel) (852) 3197 7777 (fax) (852) 2506 9284
Product specifications and descriptions in this document subject to change without notice.
Copyright © 1996, 2000 Agilent Technologies Printed in U.S.A. 8/00 5965-4971E