The following safety precautions apply to both operating and maintenance personnel and must be followed during
all phases of operation, service, and repair of this instrument.
Before applying power to this instrument:
• Read and understand the safety and operational information in this manual.
• Apply all the listed safety precautions.
• Verify that the voltage selector at the line power cord input is set to the correct line voltage. Operating the
instrument at an incorrect line voltage will void the warranty.
• Make all connections to the instrument before applying power.
• Do not operate the instrument in ways not specified by this manual or by B&K Precision.
Failure to comply with these precautions or with warnings elsewhere in this manual violates the safety standards
of design, manufacture, and intended use of the instrument. B&K Precision assumes no liability for a customer’s
failure to comply with these requirements.
Category rating
The IEC 61010 standard defines safety category ratings that specify the amount of electrical energy available and
the voltage impulses that may occur on electrical conductors associated with these category ratings. The category
rating is a Roman numeral: I, II, III, or IV. The rating is accompanied by a maximum voltage of the circuit to
be tested, which defines the expected voltage impulses and the required insulation clearances. These categories are:
Category I (CAT I): Measurement instruments whose measurement inputs are not intended to be connected to
the mains supply. The voltages in the environment are typically derived from a limited-energy transformer
or a battery.
Category II (CAT II): Measurement instruments whose measurement inputs are meant to be connected to the
mains supply at a standard wall outlet or similar sources. Example measurement environments are portable
tools and household appliances.
Category III (CAT III): Measurement instruments whose measurement inputs are meant to be connected to the
mains installation of a building. Examples are measurements inside a building’s circuit breaker panel or the
wiring of permanently-installed motors.
Category IV (CAT IV): Measurement instruments whose measurement inputs are meant to be connected to the
primary power entering a building or other outdoor wiring.
Do not use this instrument in an electrical environment with a higher category rating than what is specified in
this manual for this instrument.
You must ensure that each accessory you use with this instrument has a category rating equal to or higher than
the instrument’s category rating to maintain the instrument’s category rating. Failure to do so will lower the
category rating of the measuring system.
Electrical Power
This instrument is intended to be powered from a CATEGORY II mains power environment. The mains power
should be 115 V RMS or 230 V RMS. Use only the power cord supplied with the instrument and ensure it is
appropriate for your country of use.
Ground the Instrument
To minimize shock hazard, the instrument chassis and cabinet must be connected to an electrical safety ground.
This instrument is grounded through the ground conductor of the supplied, three-conductor AC line power cable.
The power cable must be plugged into an approved three-conductor electrical outlet. The power jack and mating
plug of the power cable meet IEC safety standards.
Do not alter or defeat the ground connection. Without the safety ground connection, all accessible conductive
parts (including control knobs) may provide an electric shock. Failure to use a properly-grounded approved outlet
and the recommended three-conductor AC line power cable may result in injury or death.
Unless otherwise stated, a ground connection on the instrument’s front or rear panel is for a reference of potential
only and is not to be used as a safety ground.
Do not operate in an explosive or flammable atmosphere
Do not operate the instrument in the presence of flammable gases or vapors, fumes, or finely-divided particulates.
The instrument is designed to be used in office-type indoor environments. Do not operate the instrument
• In the presence of noxious, corrosive, or flammable fumes, gases, vapors, chemicals, or finely-divided
particulates.
• In relative humidity conditions outside the instrument’s specifications.
• In environments where there is a danger of any liquid being spilled on the instrument or where any liquid can condense on the instrument.
• In air temperatures exceeding the specified operating temperatures.
• In atmospheric pressures outside the specified altitude limits or where the surrounding gas is not air.
• In environments with restricted cooling air flow, even if the air temperatures are within specifications.
• In direct sunlight.
This instrument is intended to be used in an indoor pollution degree 2 environment. The operating temperature
range is 0 °C to 40 °C and 20% to 80% relative humidity, with no condensation allowed. Measurements made by
this instrument may be outside specifications if the instrument is used in non-office-type environments. Such
environments may include rapid temperature or humidity changes, sunlight, vibration and/or mechanical shocks,
acoustic noise, electrical noise, strong electric fields, or strong magnetic fields.
Do not operate the instrument if damaged
If the instrument is damaged, appears to be damaged, or if any liquid, chemical, or other material gets on or inside
the instrument, remove the instrument’s power cord, remove the instrument from service, label it as not to be
operated, and return the instrument to B&K Precision for repair. Notify B&K Precision of the nature of any
contamination of the instrument.
Clean the instrument only as instructed
Do not clean the instrument, its switches, or its terminals with contact cleaners, abrasives, lubricants, solvents,
acids/bases, or other such chemicals. Clean the instrument only with a clean, dry, lint-free cloth or as instructed in
this manual.
Not for critical applications
This instrument is not authorized for use in contact with the human body or for use as a component in a
life-support device or system.
Do not touch live circuits
Instrument covers must not be removed by operating personnel. Component replacement and internal
adjustments must be made by qualified service-trained maintenance personnel who are aware of the hazards
involved when the instrument’s covers and shields are removed. Under certain conditions, even with the power
cord removed, dangerous voltages may exist when the covers are removed. To avoid injuries, always disconnect
the power cord from the instrument, disconnect all other connections (for example, test leads, computer interface
cables, etc.), discharge all circuits, and verify there are no hazardous voltages present on any conductors by
measurements with a properly-operating voltage-sensing device before touching any internal parts. Verify the
voltage-sensing device is working properly before and after making the measurements by testing with
known-operating voltage sources, and test for both DC and AC voltages. Do not attempt any service or adjustment
unless another person capable of rendering first aid and resuscitation is present.
Do not insert any object into an instrument’s ventilation openings or other openings.
Hazardous voltages may be present in unexpected locations in circuitry being tested when a fault condition in the
circuit exists.
Fuse replacement must be done by qualified service-trained maintenance personnel who are aware of the
instrument’s fuse requirements and safe replacement procedures. Disconnect the instrument from the power line
before replacing fuses. Replace fuses only with new fuses of the fuse types, voltage ratings, and current ratings
specified in this manual or on the back of the instrument. Failure to do so may damage the instrument, lead to a
safety hazard, or cause a fire. Failure to use the specified fuses will void the warranty.
Servicing
Do not substitute parts that are not approved by B&K Precision or modify this instrument. Return the instrument
to B&K Precision for service and repair to ensure that safety and performance features are maintained.
For continued safe use of the instrument
• Do not place heavy objects on the instrument.
• Do not obstruct cooling air flow to the instrument.
• Do not place a hot soldering iron on the instrument.
• Do not pull the instrument with the power cord, connected probe, or connected test lead.
• Do not move the instrument when a probe is connected to a circuit being tested.
The B&K Precision models BA6010 and BA6011 battery analyzers enable far better accuracy than previous offerings.
With the BA6011 featuring an input range up to 300V, measurement of a greater range of battery configurations is
possible.
Features:
• Best accuracy 0.05%
• Test frequency 1kHz
• Bin sorting comparator
• Adjustable measurement speed for fast readout or better accuracy
• USB, GPIB and Ethernet interfaces come standard
• Save and recall up to 100 measurement setups
(10 internal, and 90 external (USB stick) records)
• 4.3” color TFT LCD with 480 x 272 pixels
1.1 Package Contents
Please inspect the instrument mechanically and electrically upon receiving it. Unpack all items from the shipping
carton, and check for obvious signs of physical damage that may have occurred during transport. Report any
damage to the shipping agent immediately. Save the original packing carton for future shipping and storage.
Every instrument ships with the following contents:
• 1 x Model BA6010/BA6011 Battery Tester
• 1 x User Manual
• 1 x AC power cord
• 1 x 4-wire kelvin clips
• 1 x Certificate of calibration
• 1 x Test report
Verify that all items above are included in the shipping container. If anything is missing, please contact B&K
Precision.
Before connecting and powering up the instrument, thoroughly review the instructions and information in this
chapter.
2.1 Input Power Requirements
The instrument has a selectable AC input that accepts line voltage and frequency input within:
AC Input: 110 V ±10% or 220 V ±10%
Frequency: 47 Hz to 63 Hz
Before connecting to an AC outlet or external power source, be sure that the line voltage selector is installed in the
correct position of 110 V or 220 V and the power switch is in the OFF position. Also, verify that the AC power cord,
including any extension cord, is compatible with the rated voltage/current and that there is sufficient circuit
capacity for the power supply. Once verified, connect the cable firmly.
The included AC power cord is safety certified for this instrument operating within its rated
range. To change the cable or add an extension cable, be sure that it meets the
required power ratings for this instrument. Any misuse with wrong or unsafe cables
will void the warranty.
An AC input fuse is necessary when powering the instrument. The fuse is located at the back of the instrument. If
the fuse needs to be replaced, ensure the AC input power cord is disconnected from the instrument prior to
replacement. Refer to Table 2.1 for fuse requirements.
Before replacing the fuse, disconnect the AC power cord first to prevent electric shock.
Only use a fuse of the same rating. Using a different fuse may damage the instrument.
110 V Fuse: T 2 AL, 250 V
220 V Fuse: T 1 AL, 250 V
Table 2.1: Fuses
2.2.1 Fuse Replacement
1. Check and/or Change Fuse
• Locate the fuse box above the AC input in the rear panel.
• With two fingers, press both the left and right sides of the fuse box and pull it out.
• Check and replace fuse (if necessary) for the required line voltage operation.
2. Check and/or Change Line Voltage
Line voltage is selected and configured via fuse holder orientation.
• The beige colored piece is the fuse holder and line voltage selector. To change the line voltage
configuration between 110 V and 220 V, pull this piece out of the fuse box and rotate it 180 degrees.
• Re-insert the fuse box. The configured line voltage is visible through an opening at the end of
the holder. It will display either 110 or 220. If neither of these labels is shown, pull out the fuse holder
and turn it until it shows the desired line voltage configuration.
Figure 2.1: Change Line Voltage Configuration
Do not connect power to the instrument until the line voltage is configured correctly.
Applying an incorrect line voltage or configuring the line voltage improperly will
damage the instrument and void the warranty.
Disassembly of the case by any unauthorized persons will void the warranty.
1. Verify that the proper AC voltage is available to power the instrument. The AC voltage
range must meet the acceptable specification as explained in Section 2.1.
2. Connect Power
Connect the AC power cord to the AC receptacle in the rear panel. The power button on the front panel should
illuminate red. Press the power button to turn ON the instrument. The button should illuminate green and
show the boot screen while loading. After loading, the main screen will be displayed (Figure 2.3), and if
password protection is enabled, it will prompt the user for the password.
Enter the password and press the function key below [Enter].
Default Password: 2523
Figure 2.3: Password Entry
2.4 Password Protection
The instrument has a password protection feature that allows the instrument to be locked at boot time to prevent
unauthorized users from using it. The default password is: 2523
To change the password, select the Password parameter from the System Setup menu, and select the Modify
soft menu option. The user is prompted with “Input password:”. Enter the current password (or the default password if
this is being set up for the first time), then press Enter. The “New password:” prompt will then appear. Enter your new
password. The password MUST be numeric and must be 1 to 8 digits in length.
This instrument does not have a recovery mechanism to retrieve forgotten passwords.
Once password protection is enabled, the instrument will be locked during boot-up until the password is entered.
2.5 Connecting Kelvin Clips (TLKB1)
The instrument includes the TLKB1 Kelvin clips accessory (Figure 2.4), which connects to the four
BNC connectors. To connect, align all four BNC connectors of the TLKB1 to the input terminals of the instrument.
Ensure that the connectors slide all the way into each terminal (you may need to adjust the BNC lock rings).
Then, turn the lock rings of each terminal all the way to the right for a secure connection.
There are two main menu groups: “Display” and “Setup”. Each menu may also include the File and/or Tools menus.
Display Menu - Accessible by pressing the display menu button.
MEAS DISP Section 3.7
BIN DISP Section 3.9
TRACE DISP Section 3.10
STATIS DISP Section 3.11
Setup Menu - Accessible by pressing the setup menu button.
MEAS SETUP Section 3.8
BIN SETUP Section 3.9.5
TRACE SETUP Section 3.10.1
STATIS SETUP Section 3.11
File Menu - Accessible by using the arrow keys to select [FILE] on-screen. The file system is accessible in all
menus.
Tools Menu - Accessible by using the arrow keys to select [TOOLS] on-screen.
The Tools menu is only available for display menus and will not appear in setup menus. Each display menu (i.e.
MEAS DISP, BIN DISP, TRACE DISP, STATIS DISP) has different options available.
3.1 Front Panel Keys
3.1.1 Key Lock
The front panel buttons are locked by pressing the lock button or via remote command. When enabled, all
buttons except the lock and power buttons are disabled, the lock button will illuminate red, and the lock icon
appears in the upper right corner of the screen.
3.1.2 Screen Capture
The display screen may be captured and saved as a .GIF file to an external USB flash drive.
1. Insert a USB flash drive into the front USB host port and wait for the USB icon to appear in the upper
right corner of the screen.
2. Press the button.
A message prompt will say “Screen Copy ...”.
3. Wait until the message says “Copy completed.” and disappears.
The screenshot will be saved into the USB subfolder /PIC.
3.1.3 Reset Key
The reset key initiates a reboot of the system.
3.2 Menu Operation
1. Press the display menu button or the setup menu button to access the display or setup menus.
2. At the bottom of the screen, related soft menu items are displayed. Use the function keys to select the
corresponding soft menu items that are directly above them. Each item has its own unique display showing
setup parameters, measurements, and more.
3. Use the arrow keys to select the on-screen parameters of the [FILE] or [TOOLS]
menus. Selected items are highlighted in BLUE.
Figure 3.1: Soft Menu
4. Most on-screen parameters, when selected, will have options to select or change using the soft menu at the
bottom of the screen.
5. If an on-screen parameter is numeric, the keypad may be used to enter and change the values. Numeric
values are highlighted in RED, and additional items in the soft menu will be available to set the units
(u, m, k, x1). To enter a negative value, press the button first, and then enter the number. The
button is also the backspace button and works in the usual way.
When entering a parameter using the numeric keypad, entry is completed by selecting the units from the
on-screen soft keys as shown in Figure 3.3. Some parameters have fewer unit options than others. The list of all
options and their descriptions is as follows:
x1 - Denotes the x1 (no prefix) multiplier for the entered value.
u - Denotes the micro (x10−6) unit prefix for the entered value.
m - Denotes the milli (x10−3) unit prefix for the entered value.
k - Denotes the kilo (x10+3) unit prefix for the entered value.
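As a plain illustration of this prefix notation (the helper below is hypothetical and not part of the instrument firmware), a keypad entry and its selected prefix combine into a base-unit value as follows.

# Hypothetical helper mapping the keypad unit prefixes (x1, u, m, k) described
# above to their multipliers.
PREFIX_MULTIPLIERS = {
    "x1": 1.0,    # no prefix
    "u": 1e-6,    # micro
    "m": 1e-3,    # milli
    "k": 1e3,     # kilo
}

def to_base_units(entered_value, prefix):
    """Return the entered keypad value converted to base units."""
    return entered_value * PREFIX_MULTIPLIERS[prefix]

# Example: entering 300 and selecting the "m" soft key gives 0.3 (e.g. 300 mΩ = 0.3 Ω).
print(to_base_units(300, "m"))  # 0.3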
Other parameters (like “Delay” in Measurement Setup) may be changed by the keypad and another set of
on-screen soft keys. Table 3.1 describes these keys and their functions.
Figure 3.3: Numeric Notations
Note: The Baud Rate setting in the SYSTEM SETUP menu is an exception, where INCR+ and DECR- options are available to select between the following discrete rates: 9600, 19200, 28800, 38400, 96000, and 115200.
INCR++ - Coarse adjustment. Increments the selected numeric value by the hundreds digit
(i.e. 100 will increase to 200).
INCR+ - Fine adjustment. Increments the selected numeric value by the tens digit (i.e. 10
will increase to 20) or by the ones digit (i.e. 10 will increase to 11), depending on the
selected parameter.
DECR-- - Coarse adjustment. Decrements the selected numeric value by the hundreds digit
(i.e. 200 will decrease to 100).
DECR- - Fine adjustment. Decrements the selected numeric value by the tens digit (i.e. 20
will decrease to 10) or by the ones digit (i.e. 11 will decrease to 10), depending on the
selected parameter.
CLEAR - Sets the selected value to 0.
CLEAR LINE - Sets the value of all parameters in the selected row/line to 0.
Table 3.1: Coarse and Fine Numeric Adjustments
Figure 3.4: Increment/Decrement Values
3.3 System Setup Menu
The system menu is accessible by pressing the setup menu button and selecting SYSTEM SETUP from the on-screen
menu.
In this menu, all parameters can be configured by using the arrow keys for selection, and the function keys or
numeric keypad to make changes.
Table 3.2 lists the configurable parameters in the system menu: Key sound, Remote Interface, Language, Baud
Rate, Password Protection, Bus Address, Date, Time.
3.3.1 System Tools
The system tools menu is accessed from the System Setup display by pressing the setup menu button once, and then pressing
the right arrow key twice to select TOOLS. If the current selection is on a parameter, then use any of the arrow keys to
navigate to TOOLS. Selection of on-screen fields is always available; however, the selector may be hidden behind
the main screen label.
When TOOLS is selected, the soft menu displays three options described in Table 3.3. Use the corresponding function
keys to make a selection.
Key Sound - Enables/disables the beep sound after any button press.
Options: OFF, ON
Remote Interface - Selects the remote interface to use for remote communication.
See Section 5.1 for detailed operation instructions.
Options: RS232, GPIB, USBTMC, USBCDC
Language - Changes the display language.
Options: English, Chinese
Baud Rate - Selects the baud rate setting to use for remote communication
for the RS232 and USBCDC interface options.
Options: 9600, 19200, 28800, 38400, 96000, 115200
Password Protection - Enables/disables and configures password protection. See Section 2.4 for more information.
Options: OFF, ON, Modify
Bus Address - The address of the GPIB interface.
Valid Range: 1 - 31
Date - Sets the system date. The format is Year-Month-Day (YY-MM-DD).
Time - Sets the system time. The format is Hour-Minutes-Seconds (HH-MM-SS).
Table 3.2: System Setup Parameters
Figure 3.7: System Tools
System Reset (System Reset): Reboots the instrument.
Default Settings (Default Set): Changes all settings to defaults and reboots. See Section 3.3.2 for the default values.
Firmware Update (Update): Updates the instrument firmware using firmware loaded onto a USB memory stick.
Table 3.3: System Tools
5. When finished, the display briefly shows the “Short completed” message.
6. To apply the short compensation to the measurements, select SHORT ON from the soft menu.
Figure 3.8: Zeroing the Kelvin clip leads
Incorrectly performing the short correction will introduce measurement offsets and
reduce accuracy.
3.5 Measurement Accuracy
Resistance and voltage are the primary measurements of this analyzer, and their accuracy is specified in the main
specifications section. Measurement of resistance, or more accurately impedance, is accomplished by driving a
sinusoidal signal and measuring the voltage developed across the output terminals of the unit. For a purely resistive
device, the resistance and impedance are equal: Z = R. See Chapter 6, specifically the resistance section, for the
impedance accuracy, Ae.
The measurement accuracy of the reactive (X, L, and C) parameters depends on the real impedance component:
the smaller the resistive (real) component, the more accurate the measurement. The resistive element of an
inductor or capacitor is related to the Dissipation (D) and Quality (Q) factors. As dissipation increases, the
relative contribution of the reactive component to the impedance is reduced.
Essentially, when the resistive component of a device's complex impedance at 1 kHz (the frequency output by this
unit) is small compared to the reactive impedance, the accuracy is approximately the impedance accuracy, Ae.
Capacitance impedance |Z| and resistance accuracy are equivalent due to how the unit measures: the voltage
generated across the measurement terminals at a given current and frequency is measured and used to compute
the resistance/impedance. When D is greater than 0.1, the accuracy correction is dominated by it.
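For reference, the sketch below uses the textbook series-equivalent relationships (not specific to this instrument; the component values are made up) to show how the resistive component of a capacitor sets D and Q at the 1 kHz test frequency.

import math

# Textbook series-equivalent relationships (not specific to this instrument).
# For a capacitor X = 1 / (2*pi*f*C); dissipation factor D = R / |X|; Q = 1 / D.
def dissipation_factor(series_resistance, reactance):
    return series_resistance / abs(reactance)

f = 1000.0     # 1 kHz test frequency
c = 100e-6     # example capacitance: 100 uF (made up)
esr = 0.05     # example series resistance: 50 mOhm (made up)
xc = 1.0 / (2 * math.pi * f * c)
d = dissipation_factor(esr, xc)
print("Xc = %.4f ohm, D = %.3f, Q = %.1f" % (xc, d, 1.0 / d))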
The primary and secondary measurement readings can be saved to an external USB flash drive.
1. Connect a USB flash drive to the USB host port on the front panel.
Note: The USB flash drive must be formatted with a FAT16 or FAT32 file system.
2. Go to the TOOLS menu from the Measurement Display screen, and select SAVE from the soft menu.
3. When set to ON, primary and secondary measurements start saving to a .CSV file on the USB flash drive.
The file will be located inside a folder labeled CSV.
4. To stop saving, select SAVE from the soft menu to toggle it OFF.
The measurement display is accessed by pressing the display menu button. Shown are the selected measurements as well
as the voltage measured and current sourced to make the measurements.
The FUNC, SPEED, RNG_V, and RNG_R parameters are also configurable from this screen. Use the arrow keys
to select them, and the keypad and soft menu function keys to change them. Refer to Table 3.8 for a detailed
description.
Figure 3.9: Measurement Display
Figure 3.10: Measurement Setup Function Keys
Note: When the Trigger parameter in Measurement Setup is not set to INT (internal trigger), the measurement
display will not show primary and secondary measurements until a trigger is received.
When the Vm (measured voltage display) and/or Im (measured current display) parameters are set to OFF in Measurement Setup, “OFF” is shown beside Vm and/or Im.
3.8 Measurement Setup
Access the Measurement Setup display (Figure 3.11) by pressing the setup menu button, and then press the function
key corresponding to MEAS SETUP from the soft menu. All measurement parameters in this menu are described
in Table 3.8.
Trigger Source (TRIG): Selects the trigger source.
Options: INT, MAN, EXT, BUS
INT (Internal) - Automatic continuous measurement.
MAN (Manual) - Each measurement is made by pressing the trigger button.
EXT (External) - Measurement is made upon receiving a trigger signal from either the rear panel BNC or the handler interface terminals.
BUS (Bus) - Measurement is made upon receiving a trigger command from a remote interface.
Measurement Delay (DELAY): Sets a measurement delay time.
Valid Range: 0 ms - 60 seconds
Note: Values entered are in milliseconds (ms).
Measurement Voltage Display (Vm): Enables/disables the measured voltage display in the measurement display screen.
Options: OFF, ON
Deviation Primary Measurement Display (DEV_A): Enables/disables the deviation measurement display of the primary measurement function.
Options: OFF, ABS, %
ABS - The primary measurement display will show the difference between the measured value and the REF_A value.
% - Same as ABS, but the display will show the percentage difference instead.
Voltage Range (RNG_V): Configures the measurement DC voltage range.
Options: AUTO, HOLD, 300V (*60V), 30V (*6V)
*Model BA6010
AUTO - Autoranging.
HOLD - Locks the current voltage range.
300V (60V) - Selects the 300 V/60 V range.
30V (6V) - Selects the 30 V/6 V range.
Resistance Range (RNG_R): Configures the measurement resistance (impedance) range.
AUTO - Autoranging.
HOLD - Locks the current impedance range.
30mΩ, 300mΩ, 3Ω, 30Ω, 300Ω, 3kΩ - Selects the corresponding fixed range.
Measurement Averaging: Sets the number of samples for measurement averaging.
Valid Range: 1 to 255
Measurement Source Frequency: Selects the frequency of the measurement source.
Options: 50Hz, 60Hz
Select the same frequency as that of the AC power source to minimize measurement error.
Primary Measurement Reference Value (REF_A): Sets the reference value for the primary measurement when
displayed in ABS or % (see the DEV_A parameter).
Valid Range: Dependent on the measurement range of the selected primary function.
A MEAS option will appear in the soft menu. Select this to set the reference value to the last measured value. To update the
measured value, press the display menu button to go to the Measurement Display. Then, press the setup menu button to go
back to Measurement Setup, select REF_A, and press MEAS again.
Secondary Measurement Reference Value (REF_B): Sets the reference value for the secondary measurement when
displayed in ABS or % (see the DEV_A parameter).
Valid Range: Dependent on the measurement range of the selected secondary function.
A MEAS option will appear in the soft menu. Select this to set the reference value to the last measured value. To update the
measured value, press the display menu button to go to the Measurement Display. Then, press the setup menu button to go
back to Measurement Setup, select REF_B, and press MEAS again.
Table 3.8: Measurement Setup Parameters
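As an illustration of the DEV_A modes described in Table 3.8, the sketch below assumes the conventional definitions of absolute and percentage deviation relative to REF_A; the helper names and example values are hypothetical.

# Hypothetical illustration of the DEV_A ABS and % modes relative to REF_A,
# assuming the conventional definitions of absolute and percentage deviation.
def deviation_abs(measured, ref_a):
    return measured - ref_a

def deviation_percent(measured, ref_a):
    return 100.0 * (measured - ref_a) / ref_a

# Example: REF_A = 25.0 mOhm, measured value = 25.5 mOhm (values are made up).
print(deviation_abs(25.5e-3, 25.0e-3))      # ≈ 0.0005 (0.5 mOhm)
print(deviation_percent(25.5e-3, 25.0e-3))  # ≈ 2.0 (%)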
3.8.2 Measurement Tools
Access the measurement tools menu from the Measurement Display by pressing the display menu button once, and then press the right arrow key
twice to select TOOLS. If the current selection is on a parameter, use the arrow keys to navigate to TOOLS.
Figure 3.12: Measurement Tools Menu
When TOOLS is selected, the soft menu has five configurable and selectable options. Use the corresponding
function keys to make a selection. See Table 3.9 for details.
3.9 BIN Comparator Function
The bin comparator function allows for sorting, comparison against preset limits, and pass/fail testing of
components. A total of 9 bins are available, each with high and low limits for both primary and secondary
measurements.
To use the bin function:
1. Select the primary and secondary measurement parameters for the test. See Section 3.8, Measurement
Setup for details.
2. Set up the bin parameters from the BIN Setup display.
3. Navigate to the BIN Display to start using this function.
3.9.1 BIN Display
The BIN comparator measurements and test results are accessed in the BIN DISP screen. To access, press the
display menu button, and then press the function key corresponding to BIN DISP from the soft menu.
The Mode, Sound, COMP, and Load BIN parameters are configurable settings. Use the arrow keys to select them,
and the keypad and soft menu function keys to change them. See Table 3.10 for details.
Note: When the Trigger parameter in Measurement Setup is NOT set to INT (internal trigger),
the display will not show compare test results until a trigger is received from the selected source.
When Vm (Measured voltage display) and/or Im (Measured current display) parameters are set
to OFF in Measurement Setup, the measurement display will show OFF next to Vm: and/or Im:
below the secondary measurement.
Figure 3.13: BIN Comparator Display
3.9.2 BIN Sorting
The BIN comparator function can also sort measured values into BINs.
Go to BIN Setup, Section 3.9.5, and configure the upper and lower limits for the set of bins desired, up to 9 of
them. Set all other parameters as necessary; see Table 3.11.
Comparator Mode (Mode):
Compare - The comparator function tests the measurements against the upper and lower limits of the loaded
BIN, which are configured in BIN Setup.
BIN - The comparator function tests the measurements against the upper and lower limits of all nine bins, 1
through 9. The bin number that passes will be indicated. If the measurement passes multiple bin limits, only the
first bin will be indicated (based on bin numeric order). This option is useful for component sorting.
Pass/Fail Beep Sound (Sound):
NG - Enables the beep sound when the test fails.
GD - Enables the beep sound when the test passes.
OFF - Disables the beep sound regardless of test results.
Comparator Function State (COMP):
ON - Enables the BIN comparator.
OFF - Disables the BIN comparator.
Loaded Bin (Loaded BIN):
Comparison range (BIN). Uses the limits defined for the specified BIN.
Only available when in Compare mode.
Table 3.10: BIN Comparator Display Parameters
1. Go to the BIN Display, and select BIN for the Mode parameter.
2. Configure all other parameters as necessary; see Table 3.10.
3. Connect the TLKB1 Kelvin clips to the DUT.
Note: If both COMPA and COMPB are enabled, the test will apply to both primary and secondary
measurements. If either of the results fails, the instrument will indicate fail.
When measurements pass, the display will indicate the passing BIN number. If the measurements pass in multiple
bins, the first BIN (in numerical order) is displayed. For example, if the measurements pass in BIN3 and BIN6, the
display will indicate BIN 3. Figure 3.14 shows the display when the measurements pass in BIN 1.
When measurements fail, the display indicates OUT as shown in Figure 3.15.
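The BIN-mode decision just described (the first bin whose limits contain the measurement wins; otherwise the result is OUT) can be summarized in a short sketch. The limit values and helper below are hypothetical and only illustrate the logic; they are not part of the instrument.

# Hypothetical sketch of the BIN-mode decision logic described above. Each bin has
# (low, high) limits; the first bin, in numeric order, whose limits contain the
# measurement is reported, otherwise the result is OUT.
def sort_into_bin(measured, bin_limits):
    for number, (low, high) in enumerate(bin_limits, start=1):
        if low <= measured <= high:
            return "BIN %d" % number
    return "OUT"

# Example limits for three of the nine available bins (values are made up).
limits = [(0.9, 1.1), (0.8, 1.2), (0.5, 1.5)]
print(sort_into_bin(1.05, limits))  # BIN 1 (also inside bins 2 and 3, but bin 1 wins)
print(sort_into_bin(1.30, limits))  # BIN 3
print(sort_into_bin(2.00, limits))  # OUT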
3.9.3 Compare Pass/Fail Testing
The BIN comparator function can be used to compare the DUT (device under test) measurements against a
specified upper and lower limit for pass/fail (Go/No Go) style testing.
Go to BIN Setup, Section 3.9.5, and configure all the upper and lower limits for the bins to use in the test. Set up all
other parameters as necessary, as described in Table 3.10.
Note: If both COMPA and COMPB are enabled, the test will apply to both primary and secondary
measurements. If either of the results fails, the final result will indicate fail.
LO - The measured value is below the “Load BIN” range settings.
HI - The measured value is above the “Load BIN” range settings.
Figure 3.17: Compare Test - Below Limits
Figure 3.18: Compare Test - Above Limits
3.9.4 BIN Tools
The BIN tools menu is accessed from the BIN Display by pressing the display menu button and selecting BIN DISP. Then press
the right arrow key twice to select TOOLS. If the current selection is on a parameter, then use any of the arrow
keys to navigate to TOOLS.
Figure 3.19: BIN Tools Menu
3.9.5 BIN Function Setup
Press the setup menu button, and then press the function key corresponding to BIN SETUP to open the BIN SETUP
menu; Figure 3.20 will be displayed. Use the arrow keys to select each parameter, and the function keys to make
changes. Enter numeric parameters (i.e. nominal and high and low limits) using the keypad, and the function keys
to select the units. See Table 3.11 for parameter and option details in this menu.
Measurement Mode (Mode): Selects the measurement mode for the BIN function.
Options: %, ABS
% - The high and low limits used for testing are a tolerance percentage.
ABS - The high and low limits used for testing are absolute values of the selected parameter.
The HIGHA[], LOWA[], HIGHB[], and LOWB[] labels display either [X] units, where ‘X’ denotes the primary or secondary
measurement unit when applicable (i.e. Ω, V, H, F, r (radians)), or [%] with the limits defining a percentage.
Comparator State, Primary Measurement (COMPA): Enables or disables the primary measurement comparator
function. When disabled, primary measurements will not be tested.
Options: OFF, ON
Comparator State, Secondary Measurement (COMPB): Enables or disables the secondary measurement comparator
function. When disabled, secondary measurements will not be evaluated.
Options: OFF, ON
Nominal Value, Primary Measurement (NOMA): Sets the nominal value of the bin comparator function for
the primary measurement. This parameter only applies when Mode is set to %.
Valid Range: Dependent on the range of the selected primary measurement.
Nominal Value, Secondary Measurement (NOMB): Sets the nominal value of the bin comparator function for the
secondary measurement. This parameter only applies when Mode is set to %.
Valid Range: Dependent on the range of the selected secondary measurement.
High Limit, Primary Measurement (HIGHA): Sets the upper limit value for the comparator function that applies to the primary measurement.
Valid Range: -100 - 100 (%), -10000 - 10000 (ABS)
Low Limit, Primary Measurement (LOWA): Sets the lower limit value for the comparator function that applies to the primary measurement.
Valid Range: -100 - 100 (%), -10000 - 10000 (ABS)
High Limit, Secondary Measurement (HIGHB): Sets the upper limit value for the comparator function that applies to the secondary measurement.
Valid Range: -100 - 100 (%), -10000 - 10000 (ABS)
Low Limit, Secondary Measurement (LOWB): Sets the lower limit value for the comparator function that applies to the secondary measurement.
Valid Range: -100 - 100 (%), -10000 - 10000 (ABS)
Table 3.11: BIN Setup Parameters
3.9.6 Saving BIN Comparator Results
When TOOLS is selected, the soft menu will have the SAVE option; this saves the primary and secondary
measurements and the test result of the BIN comparator function to an external USB flash drive.
1. Insert an external USB flash drive into the front panel USB host port and wait until the USB icon appears at the top right of the
display.
During USB activity, the unit will pause and not respond until the activity is complete.
2. Press the function key below the SAVE START option to save. Data is saved as a .CSV (comma delimited)
file in the folder named CSV in the USB flash drive.
3. To stop saving, press the function key below the SAVE STOP option.
Note: The saved data has the following format: Primary measurement, Secondary measurement,
Test Result. Test Result is either LO, HI, or IN for Compare Mode. For BIN MODE, it is either
OUT or the BIN number (1 – 9). If measurements are made continuously, the instrument saves
data approximately twice per second.
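Because each saved row follows the primary measurement, secondary measurement, test result format described in the note above, a short offline sketch for reading such a file might look like the following (the file name is a placeholder; actual file names are assigned by the instrument).

import csv

# Read a BIN comparator log saved by the instrument. The file name is a placeholder;
# each data row holds: primary measurement, secondary measurement, test result
# (LO/HI/IN in Compare mode, or OUT / bin number 1-9 in BIN mode).
with open("BIN_LOG.CSV", newline="") as f:
    for row in csv.reader(f):
        if len(row) < 3:
            continue  # skip blank or malformed lines
        primary, secondary, result = row[0], row[1], row[2]
        print(primary, secondary, result.strip())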
3.10 Trace Function
The trace function measures the primary and secondary measurements and plots them versus time. To use the
trace function, there are three steps to follow:
1. Select the primary and secondary measurement parameters for the test. To do this, follow instructions in
Measurement Setup, Section 3.8.
2. Set up the trace parameters from the Trace Setup display.
3. Navigate to the Trace Display Tools menu to start the test.
3.10.1 Trace Setup
The trace setup menu can be accessed by pressing the setup menu button and then pressing the function key
corresponding to TRACE SETUP from the soft menu. Use the arrow keys to select each parameter, and the
function keys to make changes. For parameters that require a numeric value, use the keypad to enter the value and the
function keys to select the units.
The table below explains the parameters and their options/ranges.
Total Time (Total): Sets the total time to run the trace function. The test will complete when this time has elapsed.
Valid Range: 1 - 99999 seconds
Sampling Interval (Interval): Sets the time between each primary and secondary measurement captured and traced.
Valid Range: 1.0 - 86400.0 seconds
Primary Measurement Trace Scale Maximum (A MAX): Sets the maximum of the trace scale for the primary measurement.
Valid Range: 0 - 99999k
Primary Measurement Trace Scale Minimum (A MIN): Sets the minimum of the trace scale for the primary measurement.
Valid Range: 0 - 99999k
Secondary Measurement Trace Scale Maximum (B MAX): Sets the maximum of the trace scale for the secondary measurement.
Valid Range: 0 - 99999k
Secondary Measurement Trace Scale Minimum (B MIN): Sets the minimum of the trace scale for the secondary measurement.
Valid Range: 0 - 99999k
Primary Measurement Upper Limit (A Stop 1): Sets the upper limit value as a stop condition of the test. If the
primary measured value exceeds this limit for 2 consecutive samples, the test will stop.
Valid Range: Dependent on the measurement range of the selected primary function.
Primary Measurement Lower Limit (A Stop 2): Sets the lower limit value as a stop condition of the test. If the
primary measured value falls below this limit for 2 consecutive samples, the test will stop.
Valid Range: Dependent on the measurement range of the selected primary function.
Secondary Measurement Upper Limit (B Stop 1): Sets the upper limit value as a stop condition of the test. If the
secondary measured value exceeds this limit for 2 consecutive samples, the test will stop.
Valid Range: Dependent on the measurement range of the selected secondary function.
Secondary Measurement Lower Limit (B Stop 2): Sets the lower limit value as a stop condition of the test. If the
secondary measured value falls below this limit for 2 consecutive samples, the test will stop.
Valid Range: Dependent on the measurement range of the selected secondary function.
Table 3.12: Trace Setup Parameters
Figure 3.22: Trace Display
The Trace Display (Figure 3.22) shows the measurements and trace results on the screen. To show this display,
press the display menu button, and then press the function key corresponding to TRACE DISP from the soft menu.
3.10.2 Trace Display Tools
The trace display tools menu, Figure 3.23, is accessed from the Trace Display by pressing the display menu button and selecting
TRACE DISP, and then pressing the right arrow key twice to select TOOLS.
The trace display tools menu has options to start the trace test and to control what to display on the screen. See
Table 3.13.
3.10.3 Start a Trace
• Configure all the settings under Trace Setup, and then go to the Trace Display Tools menu.
• Select the TRACE to display, then select SCAN START.
• While testing, the display plots the measured values.
• Press CURSOR to see the values of a measured point on the trace. Use the arrow keys to move the
cursor along the trace.
• The maximum and minimum values of the primary and secondary measurements of the trace may also be
displayed by setting Max-Min to ON.
Scan (SCAN): Starts or stops the trace test.
Options: START, STOP
Trace Display (TRACE): Selects the trace to display.
Options: A, B, A+B
A - The primary measurement trace.
B - The secondary measurement trace.
Cursor Display (CURSOR): Enables/disables the cursor. When enabled, the cursor can be
controlled using the arrow keys. The cursor is used to view the primary and secondary
measurements, and the timestamp of any measured point on the trace.
Maximum/Minimum Value Display (Max-Min): Enables/disables the display of the maximum and minimum
values of both the primary and secondary measurement traces.
Options: ON, OFF
Table 3.13: Trace Display Tools Parameters
3.11 Statistical Measurement
The instrument can calculate statistical measurements, which are accessed by pressing the display menu button and selecting
STATIS DISP from the soft menu.
Before performing a statistical calculation, some parameters must be configured first. They are grouped together
on the display in a box highlighted in cyan. See Figure 3.26. Use the arrow keys to select each
parameter, and the function keys to make changes. Parameters requiring a numeric value are entered
with the keypad, and the function keys then select the units. See Table 3.14 for the parameters and their
options/ranges.
Measurement Mode (Mode): Selects the measurement mode for the statistical measurement function.
Options: %, ABS
% - The high and low limits used for measurement are a tolerance percentage:
Upper limit (Hi) = Nominal × (1 + UpperLimit%)
Lower limit (Lo) = Nominal × (1 + LowerLimit%)
ABS - The high and low limits used for testing are absolute values of the selected parameter.
When selected, the Hi and Lo parameters will display [X] units, where ‘X’ denotes the primary or secondary measurement
unit when applicable (i.e. Ω, V, H, F, r (radians)).
Hi(num) - Total number of times the measurement results exceed the upper limit value Hi[].
Lo(num) - Total number of times the measurement results are below the lower limit value Lo[].
In(num) - Total number of times the measurement result is within the upper and lower limits.
Max - Displays the maximum measurement value from all the captured measurement samples.
MaxIndex - Displays the index of the internal buffer entry, among all the captured measurement samples, that contains the maximum measurement value.
Min - Displays the minimum measurement value from all the captured measurement samples.
MinIndex - Displays the index of the internal buffer entry, among all the captured measurement samples, that contains the minimum measurement value.
Table 3.14: Statistical Measurement Parameters
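For illustration only, the counts and extrema listed above can be reproduced offline from a list of captured samples using the Hi/Lo formulas given for % mode; the helper below is hypothetical and not part of the instrument firmware.

# Hypothetical offline reproduction of the statistical results described above.
# In % mode the manual gives Hi = Nominal x (1 + UpperLimit%) and
# Lo = Nominal x (1 + LowerLimit%); the limits here are passed in percent.
def statistics(samples, nominal, upper_limit_pct, lower_limit_pct):
    hi = nominal * (1 + upper_limit_pct / 100.0)
    lo = nominal * (1 + lower_limit_pct / 100.0)
    hi_num = sum(1 for s in samples if s > hi)    # Hi(num)
    lo_num = sum(1 for s in samples if s < lo)    # Lo(num)
    in_num = len(samples) - hi_num - lo_num       # In(num)
    max_index = max(range(len(samples)), key=lambda i: samples[i])
    min_index = min(range(len(samples)), key=lambda i: samples[i])
    return {
        "Hi(num)": hi_num, "Lo(num)": lo_num, "In(num)": in_num,
        "Max": samples[max_index], "MaxIndex": max_index,
        "Min": samples[min_index], "MinIndex": min_index,
    }

# Example: nominal 1.0 with +5 % / -5 % limits (all values are made up).
print(statistics([0.99, 1.02, 1.08, 0.93, 1.00], 1.0, 5.0, -5.0))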
3.11.1 Statistical Measurement Operation
To operate the statistical measurement function, press the display menu button and select STATIS DISP from the soft
menu. Press the right arrow key twice to select TOOLS. If the current selection is on a parameter, then use any of
the arrow keys to navigate to TOOLS.
Figure 3.28: Statistics Tools Menu
In the TOOLS menu, there are two options: Statis START, TRIG.
Operation steps
1. Select the primary and secondary measurement parameters to perform the calculation. To do this, follow
instructions in Section 3.8, Measurement Setup.
2. Configure all statistical parameters (described in Table 3.14) as necessary.
3. From the TOOLS menu, press the function key corresponding to the Statis START soft key to begin the
calculation.
• If the trigger source is not set to internal trigger (INT), select TRIG from the soft menu to manually
trigger a single capture of a measurement sample.
• To complete the calculation, continue to press TRIG until the message prompt displays “Measurement Completed”.
4. The instrument will begin capturing measurement samples and store them in an internal buffer. In the
message prompt, the display will indicate the sample index and the value stored at that index in this format:
num = X, val = #
where
X = measurement sample index
# = measurement sample value
5. When the calculation completes, the message prompt will display “Measurement Completed”. All
statistical measurement results from the calculation will be displayed on the same screen.
The instrument file system menu is used to browse, save, and load files from the internal storage memory or an
external USB flash drive connected to the front panel USB port.
The file system is accessible from any display screen by using the arrow keys and selecting FILE. See Table 4.1 for
available options shown in the soft menu.
Internal Memory (INTER File): Select to browse the internal memory directory.
External USB Memory (EXTER File): Select to browse the external USB flash drive directory.
Table 4.1: File Menu
Figure 4.1: File Menu
Internal Memory Internal memory can save and recall instrument settings. The storage file has a .STA file
extension. To access internal memory, select INTER File from the FILE menu. Shown on the display is the
internal memory directory listing and any instrument setting files stored there. Three options are available. See
Table 4.2 for details.
To exit out of the directory, press the arrow key until the Exit option in the soft menu becomes available.
Then, select it to exit.
When an external USB flash drive is connected to the instrument’s front panel USB host port, the directory can be
viewed from the FILE menu.
The instrument supports USB drives that meet the following requirements:
• Support USB 2.0
• FAT16 or FAT32 (recommended) file system
When connected, the instrument will automatically create three subfolders in the root directory of the drive:
/CSV, /STA, /PIC.
/CSV - The CSV subfolder is where all measurement data files (in .CSV format) are stored.
/STA - The STA subfolder is where all instrument settings files (in .STA format) are stored.
/PIC - The PIC subfolder is where all screen capture images (in .GIF format) are stored.
To access the directory, select EXTER File from the FILE menu. The display will show the external USB flash
drive directory listing (Figure 4.3). Use the Sub Dir and Parent Dir soft keys to navigate through the directories.
See Table 4.3 for available option details.
Note: ONLY setup files from the /STA subfolder may be saved or recalled in the FILE menu system.
Measurement data or screen captures cannot be saved from within this menu system.
Note: The instrument’s file system can only save/select/view 100 files within the root directory
and any subfolders contained in the external USB flash drive.
Figure 4.3: External USB Drive Directory
To exit out of the directory, press the arrow key until the Exit option in the soft menu becomes available, and
then select it to exit.
Recall Instrument Settings (Load): Loads the selected .STA instrument settings file.
Note: The selected file must have a .STA file extension and be located inside the /STA subfolder of the drive.
Copy File to Internal Memory (Copy to I:): Copies the selected .STA instrument settings file to the
internal memory of the instrument.
Delete File (Delete): Deletes the selected file.
Enter Sub Directory (Sub Dir): Enters the selected subfolder.
Browse Root Directory (Parent Dir): Goes back to the root directory of the drive.
Table 4.3: Save/Recall Instrument Settings
4.4 Save/Recall Instrument Settings
Instrument settings, including all measurement setup settings, BIN setup settings, Trace setup settings, and system
setup settings, can be stored in and recalled from internal memory.
4.4.1 Save settings to internal memory
1. Select the FILE menu from any display, and then select INTER File from the soft menu.
2. Use the arrow keys to browse and select an empty location in the internal memory directory. A Save
option will appear in the soft menu when an empty location is selected.
3. Press Save using the corresponding function key, and confirm by pressing Yes in the soft menu.
4. The message prompt then shows “Input FileName:”, and an on-screen soft keyboard will be displayed,
prompting for a file name to save the instrument settings to.
5. Use the arrow keys to navigate the soft keyboard. The character highlighted in BLUE is the
selected character. Press Add Char to add the selected character to the file name. Repeat the selection to
enter the complete file name.
6. Select Enter from the soft menu when finished. The file is then saved to internal memory. The saved file
includes the date and time.
Recall settings from internal memory
1. Select the FILE menu from any display, and then select the INTER File option from the soft menu.
2. Use the arrow keys to browse and select setting files from the internal memory directory. Once the
desired file is selected, press the Load option from the soft menu.
3. The message prompt will ask for confirmation of this action. Select Yes to recall the settings.
4. The recalled file will have a check mark in the Load column. The check mark identifies the file
containing the current instrument settings.
5. The soft menu Load option will change to UnLoad when the file with the check mark is shown.
6. To clear the check mark and unset the setting file as the default, select the UnLoad option from the soft menu.
1. Select FILE from any display, and then select EXTER File from the soft menu using the corresponding
function key.
2. Use the arrow keys to browse and select an empty location within the /STA subfolder. A Save option will
appear in the soft menu when an empty location is selected.
3. Press the Save function key, and confirm by pressing the Yes option in the soft menu.
4. The message prompt will then say “Input FileName:”. An on-screen soft keyboard is displayed, prompting
the operator to enter a file name where the instrument settings will be saved. See Figure 4.5.
5. Use the arrow keys to navigate the soft keyboard. The character highlighted in BLUE is the selected
character. Add the selected character to the file name using the Add Char option from the soft menu, and
repeat this to enter the complete file name.
6. Select the Enter option from the soft menu when finished. The file is then saved to the external memory.
The saved file includes the date and time.
1. Select the FILE menu from any display, and then select the EXTER File option from the soft menu.
2. Use the arrow keys to browse and select the setting files from within the /STA subfolder of the drive. Once
the desired file is selected, press the Load option from the soft menu.
3. The message prompt will ask for confirmation. Select Yes to recall the seings.
The instrument supports numerous SCPI commands and some instrument-specific commands. These commands
enable a computer to remotely communicate with and control the instrument over any of the supported remote
interfaces: USBTMC, USBCDC (Virtual COM), RS-232, and GPIB.
Refer to the programming manual for details, which can be downloaded from www.bkprecision.com.
5.1 Remote Interfaces
5.1.1 RS232
For the RS232 interface, use the DB-9 serial port on the rear panel. The baud rate setting must be configured to
match the baud rate configured on the computer that is connected to the instrument for remote control.
Select the Baud Rate parameter to change/set the baud rate. The instrument uses the following serial settings:
• Parity: None
• Data bits: 8
• Stop bits: 1
• Flow Control: None
Note: The DB-9 serial connector (RS-232C) on the back of the instrument requires using
a null modem or crossover DB-9 serial cable.
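For reference, a minimal connection sketch using the third-party pyserial package is shown below. It assumes the instrument accepts the standard *IDN? identification query and a newline terminator; consult the programming manual for the actual command set and terminator, and substitute your own port name and baud rate.

import serial  # third-party pyserial package

# Minimal sketch: open the rear panel RS-232 port with the settings listed above
# (8 data bits, no parity, 1 stop bit, no flow control). The port name, baud rate,
# and line terminator are assumptions; adjust them to your setup.
port = serial.Serial("COM3", baudrate=9600, bytesize=8,
                     parity=serial.PARITY_NONE, stopbits=1, timeout=1)
port.write(b"*IDN?\n")                   # standard identification query (see programming manual)
print(port.readline().decode().strip())  # instrument identification string
port.close()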
5.1.2 USBTMC
The instrument supports the USBTMC (Test & Measurement Class) interface. Use a USB Type A to Type B cable to
connect the USB device port in the rear panel to the computer. A driver is required before it can be used for
remote communication. A computer with a VISA implementation (such as NI-VISA) installed will automatically have the driver
available, and upon connecting the USB cable to the device, drivers will install automatically on Windows 7 or later.
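With a VISA layer installed, a minimal pyvisa sketch such as the following can communicate over USBTMC. The resource string is a placeholder; use the entry reported by rm.list_resources() for your instrument, and consult the programming manual for the supported commands.

import pyvisa  # third-party pyvisa package

# Minimal USBTMC sketch. The resource string is a placeholder; list the available
# resources and substitute the entry reported for this instrument.
rm = pyvisa.ResourceManager()
print(rm.list_resources())
inst = rm.open_resource("USB0::0xFFFF::0x6010::INSTR")  # placeholder resource string
print(inst.query("*IDN?"))  # standard identification query (see programming manual)
inst.close()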
5.1.3 USBVCP
USBVCP is a USB virtual COM interface option. A driver (available for download from www.bkprecision.com) must
be installed before it can be used. The setup is similar to the RS232 interface, except that a USB Type A to Type B
cable is required for the remote connection instead of a DE-9 serial cable.
5.2 Handler Interface
The instrument has a 36-pin handler interface, which is primarily used for outputting BIN sorting results. This
section describes the pin definition of this interface.
Mating connector: 36 pin Centronics type, male.
Pin - Signal Name - Description
1 - /BIN1 - Bin sorted result. Outputs are all open collector.
2 - /BIN2
3 - /BIN3
4 - /BIN4
5 - /BIN5
6 - /BIN6
7 - /BIN7
8 - /BIN8
9 - /BIN9
10 - /OUT
11 - (not used)
12, 13 - /EXT. TRIG - External trigger (when TRIG is set to EXT in System Setup): triggered by a
positive-edge pulse signal on this pin.
14, 15 - EXT.DCV2 - External DC voltage 2: the DC supply pin for the optocoupler signals (/EXT_TRIG, /Key
Lock, /ALARM, /INDEX, /EOM).
All specifications apply to the unit after a temperature stabilization time of 15 minutes over an ambient temperature
range of 20 °C ± 5 °C. Specifications are subject to change without notice.
The message prompt on the display may show different error messages during operation. The table below lists
the error messages and their descriptions.
Error: LO Sense Open - The LPOT terminal is open or disconnected from the DUT.
Error: LO Drive Open - The LCUR terminal is open or disconnected from the DUT.
Error: HIGH Sense Open - The HPOT terminal is open or disconnected from the DUT.
Error: HIGH Drive Open - The HCUR terminal is open or disconnected from the DUT.
Error: Measure line open - All four input terminals are open.
Short Failed - The short operation failed or was aborted. This may be because the terminals are not shorted properly.
Load Failed - Either a wrong instrument settings file type was selected (the file must have a .STA extension)
or the selected file is corrupted, preventing the instrument from recalling instrument settings from the file.