
Maintenance & Testing

MATCHING TESTER CAPABILITIES TO REAL-WORLD REQUIREMENTS: Testing to Level III Accuracy

As needs change and standards shift, installation contractors are faced with a dilemma: how to make testing choices that meet their certification requirements and do not force them to over-invest beyond near-term needs.


November 1, 2000  



Driven by increasing requirements for higher-speed networks running over relatively low-cost twisted-pair copper wiring, the cabling industry has been grappling with a steadily shifting set of standards for cable test and certification during the past few years, and chances are that changes will continue well into the foreseeable future. As a result, cable installation contractors constantly face a myriad of test equipment choices and often contradictory vendor claims regarding the accuracy levels needed today and predicted for tomorrow.

The contractors’ dilemma revolves around ensuring they can cost-effectively test to the current requirements, without over-investing in undefined future possibilities or running the risk of near-term obsolescence. The best way to address this situation is to take a hard look at the test requirements actually specified by TSB-67, TSB-95 and the draft Category 6 standard, as well as the specific Radio Frequency (RF) parameters that enable a tester to meet the required accuracy levels.

CERTIFYING TO REQUIRED LEVELS

Regardless of the test methodologies used, the overriding issue has always been whether or not the tester can meet the specified parameters for certifying the cable installation to the required categories and accuracy levels. These requirements are currently migrating to cable Categories 5e and 6, as well as accuracy Levels II-E and III.

Level II accuracy requirements for testers originated with the adoption of TSB-67. The Level I category was created to grandfather in existing test equipment that could not meet the TSB-67 Level II requirements. As network speeds increased, new categories of UTP/STP cabling evolved, pushing test requirements and accuracy levels for test equipment well beyond Levels I and II.

TSB-67 initially defined four measurements for certifying cables to Categories 3, 4, and 5: wire map, link length, attenuation and crosstalk (defined as near-end crosstalk, or NEXT). It also specified the definitions for basic link and channel test configurations, as well as the test methodologies for certifying the cabling installations. TSB-95 augments TSB-67 by defining the additional Category 5 measurement parameters of return loss and equal level far end crosstalk (ELFEXT). These two measurements were incorporated at the request of the IEEE 802.3ab committee, which is responsible for defining the standard for transmitting Gigabit Ethernet over copper wiring (1000Base-T). The more stringent measures were needed because 1000Base-T requires a multi-transmit environment in which all four pairs transmit in both directions simultaneously.

Enhanced Category 5 (Category 5e) also entails the definition of new Level II-E certification standards that enable existing Category 5 cabling to reliably carry 100 MHz traffic. In addition, Level II-E testing for Category 5e incorporates power sum capabilities, which combine the crosstalk contributions from all of the other wiring pairs to yield a worst-case measurement for each pair. Power sum also provides a solid method for assessing attenuation-to-crosstalk ratio (ACR) characteristics. From a test parameter standpoint, Category 5e and Level II-E accuracy essentially tightened each measurement limit by about 3 dB over existing Category 5 requirements.
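As a rough sketch of what power sum means in practice, the following Python snippet combines the crosstalk coupled onto one pair from the other three pairs into a single power sum figure. The NEXT values are purely illustrative, not limits from any standard:

```python
import math

def power_sum_next(pair_next_db):
    """Combine individual pair-to-pair NEXT readings (in dB, larger is
    better) into a single power sum NEXT value for the disturbed pair.
    Each dB value becomes a linear power ratio, the ratios are summed,
    and the total is converted back to dB."""
    total = sum(10 ** (-next_db / 10) for next_db in pair_next_db)
    return -10 * math.log10(total)

# NEXT coupled onto one pair from the other three pairs (illustrative values)
print(round(power_sum_next([41.0, 44.5, 47.2]), 1))  # ~38.7 dB, worse than any single pair
```

Because the contributions add as powers, the combined figure is always worse than the single worst pair-to-pair reading, which is exactly why power sum matters for multi-transmit schemes like 1000Base-T.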

As the next step in the speed evolution, Category 6 will be the nomenclature applied to cabling using RJ-45-style connectors and certified to carry 200 MHz traffic, which will require testing to a new Level III accuracy. The adoption of Category 6 and Level III should also harmonize TIA Category 6 with ISO Class E cabling. Level III accuracy will incorporate all of the existing tests used in TSB-95 at the Category 5e level. However, the movement from 100 MHz to 200 MHz traffic levels (250 MHz for the test suite) requires a significant improvement, by as much as 10 dB, in each of the critical RF parameters that characterize the test device's accuracy. This is spawning a whole new generation of field testers designed to meet Level III requirements.

CONTROLLING RF PARAMETERS

The evolution to Level III accuracy represents a monumental challenge for test equipment designers, because doubling the frequency can be equivalent to quadrupling the difficulty of maintaining test parameters. For example, because attenuation is a function of frequency, it will continue to increase predictably as frequency increases. However, to maintain a positive ACR as attenuation increases, the crosstalk level has to shift downward by the same amount that the attenuation increased. As you move out in frequency, the tester must therefore simultaneously measure deeper into its dynamic range.
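A simple illustration, using made-up link values rather than limits from any standard, shows why the crosstalk requirement must tighten in step with attenuation:

```python
def acr_db(next_db, attenuation_db):
    """Attenuation-to-crosstalk ratio in dB: the margin between the
    attenuated signal and the crosstalk noise at the receiver. It must
    stay positive for the receiver to recover the signal."""
    return next_db - attenuation_db

# Illustrative link values (not limits from any standard):
print(acr_db(next_db=35.0, attenuation_db=22.0))  # 13.0 dB of margin
print(acr_db(next_db=35.0, attenuation_db=32.0))  # only 3.0 dB left at the
# higher frequency; to preserve the original 13 dB margin, crosstalk would
# have to drop by the same 10 dB that attenuation rose.
```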

For any cable test device, the heart of test accuracy lies in its ability to meet the following strictly defined RF parameters:

Dynamic Accuracy

Residual Noise

Residual NEXT

Output Signal Balance

Common-mode Rejection

Return Loss of the test port

Return Loss Measurement accuracy parameters

DYNAMIC ACCURACY

Dynamic accuracy is the device's ability to measure a signal and display its amplitude in dB throughout the top portion of the dynamic range (e.g. the uppermost 50-60 dB of the range). This represents how well the detector can be controlled and calibrated to maintain its flatness and accuracy across the relevant range. For example, a known 80 MHz test signal set 15 dB down in the dynamic range could be applied to characterize how accurately the device detects it. TSB-67 tightly specifies dynamic accuracy at a maximum of ±0.75 dB, measured down to a level 10 dB below the NEXT limit curve.

Dynamic accuracy is the parameter with the most critical impact on overall device accuracy. As you get further into the dynamic range and more distant from the NEXT limit curve, other parameters take over to maintain accuracy within significantly less stringent limits. Because each of these parameters represents a random error function, they can all be combined (i.e. the square root of the sum of the squares) and added to the dynamic accuracy to form an overall accuracy picture. These other parameters include residual noise, residual NEXT, return loss, output signal balance and common-mode rejection.
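A minimal sketch of that error model, using a hypothetical error budget rather than figures from any particular instrument, might look like this:

```python
import math

def overall_accuracy_db(dynamic_accuracy_db, random_error_terms_db):
    """Combine independent random error contributions by the root of the
    sum of their squares, then add the dynamic accuracy term, mirroring
    the error model described above."""
    rss = math.sqrt(sum(e ** 2 for e in random_error_terms_db))
    return dynamic_accuracy_db + rss

# Hypothetical error budget (dB): residual NEXT, residual noise,
# return loss, output signal balance, common-mode rejection
print(round(overall_accuracy_db(0.75, [0.6, 0.4, 0.5, 0.3, 0.3]), 2))  # ~1.72 dB
```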

RESIDUAL NOISE

As cable testers essentially sweep a frequency domain and take dB log-magnitude measurements, they exhibit characteristics that are quite similar to a network or spectrum analyzer — including the existence of a definable noise floor. Although highly accurate measurements can be taken through most of the dB range, as the noise floor is approached (within 10-15 dB), the signal tends to compress or flatten out until it finally reaches the noise. The maximum acceptable residual noise floor as defined by TSB-67 is represented by the following formula: 65 – 15log (f/100) dB.

RESIDUAL NEXT

Residual NEXT is the core level of crosstalk created by the instrument itself and represents the floor to any meaningful crosstalk measurements that can be taken by the device. Residual NEXT can be measured by simply terminating each of the tester’s pair connections to a precision resistor (e.g. 100 ohm) and then taking a crosstalk reading. Since the simulated “test environment” has no crosstalk, the reading represents the tester’s internally generated residual crosstalk. TSB-67 specifies the maximum acceptable residual NEXT as 55 – 15log (f/100) dB.
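Both floors quoted above share the same form, k - 15log(f/100) dB, so they are straightforward to evaluate; the short snippet below tabulates them at a few frequencies within the TSB-67 sweep range:

```python
import math

def floor_limit_db(k, f_mhz):
    """Evaluate a TSB-67 floor limit of the form k - 15*log10(f/100) dB."""
    return k - 15 * math.log10(f_mhz / 100)

for f_mhz in (10, 62.5, 100):
    noise = floor_limit_db(65, f_mhz)   # residual noise floor
    rnext = floor_limit_db(55, f_mhz)   # residual NEXT floor
    print(f_mhz, round(noise, 1), round(rnext, 1))
# 10 MHz: 80.0 / 70.0 dB, 62.5 MHz: 68.1 / 58.1 dB, 100 MHz: 65.0 / 55.0 dB
```

Note that both limits are most demanding at low frequencies, where link attenuation is smallest and measurements sit deepest in the dynamic range.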

OUTPUT SIGNAL BALANCE & COMMON-MODE REJECTION

Output signal balance and common-mode rejection are two interrelated parameters; they can be grouped together and referred to as the "balance functions" of a tester. These parameters are important because a tester must be able to make accurate differential-mode measurements between the twisted wiring pairs. Unfortunately, in an environment where there are four pairs within the same jacket, there are more coupling modes than just differential to differential. With both common-mode and differential-mode signals on the twisted-pair cable, consideration must be given to:

Common-mode to differential-mode

Differential-mode to common-mode

Common-mode to common-mode

Differential-mode to differential-mode

Differential-mode signal current passes out on one wire of a pair and returns on the other wire, forming a loop. At any point along the wire pair, equal currents flow in two opposite directions, which tends to almost perfectly cancel the external field. Therefore, differential-to-differential crosstalk coupling is very low. Common-mode current, however, flows longitudinally, in the same direction on both wires; this creates large fields around the pair, which can easily couple to other pairs. The presence of common-mode signals therefore tends to increase coupling problems across the wiring pairs. The field tester must measure the smaller differential-mode to differential-mode coupled signals and ignore the presence of coupled signals generated via the other three coupling modes.
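One common convention treats the differential component as the difference between the two wire voltages and the common-mode component as their average. The toy decomposition below (values illustrative only) shows how transmitter imbalance leaks signal into the common mode:

```python
def decompose(v1, v2):
    """Split the instantaneous voltages on the two wires of a pair into
    a differential-mode component (what the receiver should measure)
    and a common-mode component (what couples readily to other pairs)."""
    v_diff = v1 - v2            # differential mode: equal and opposite
    v_cm = (v1 + v2) / 2        # common mode: same direction on both wires
    return v_diff, v_cm

print(decompose(+0.5, -0.5))   # (1.0, 0.0)  perfectly balanced, pure differential
print(decompose(+0.6, -0.4))   # (1.0, 0.1)  imbalance leaks into common mode
```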

The tester's differential signal transmitter must keep inadvertent common-mode emissions to a minimum. Output signal balance is a measure of how well the tester puts a differential signal out onto the driving lines. It is defined as the dB ratio of the desired differential-mode signal to the undesired common-mode signal.

Common-mode rejection is essentially the opposite side of the coin from output signal balance. It represents a measure of how well the tester’s receiver is able to identify and reject common-mode signals from the differential-mode signals on the driven pair. This is measured by intentionally injecting a known level of common-mode signal into the test receiver. The ratio of injected signal to detected signal represents the tester’s common-mode rejection.

In TSB-67, both of these balance functions must meet the same degree of circuit balance, defined by the following Level II accuracy equation: 37 – 15log (f/100) dB.
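Evaluating that equation, the required balance is 37 dB at 100 MHz (where log(f/100) = 0), and it tightens to 37 + 15 = 52 dB at 10 MHz, since the limit rises by 15 dB for every decade of decreasing frequency.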

RETURN LOSS OF THE TEST PORT

Return loss at the test port essentially represents how much of the signal is reflected back at the tester's source/load interface rather than delivered to the link. Depending on the structure of the wiring link and the number of impedance mismatches along the link, the amount of signal being reflected back to the tester's transmitter can represent a significant degradation of the test signal being passed to the tester's receiver. Therefore, signal reflections at the tester's source/load interface must be minimized through strict design disciplines, component selection and impedance matching. In order to keep reflections to a minimum for Level II accuracy, TSB-67 specifies a value of 20 dB for return loss (impedance match), looking into the tester's test ports.

RETURN LOSS MEASUREMENT ACCURACY PARAMETERS

For Level II testers, the impedance match (or mismatch) of the measurement port was considered an error term in calculating attenuation and crosstalk accuracy. For Level III instruments, return loss is a new measurement that has its own set of accuracy parameters:

Source Match

Tracking

Directivity

The return loss measurement is a measure of the signal reflected back to the transmit end, rather than passed on to the receive end of the pair. The dB ratio of transmitted to reflected power gives the value of return loss. If the cabling and load are all very close to the correct impedance, there will be very little reflected signal and the dB value of return loss will be a large number (30 dB or greater is excellent). Less than 10 dB of return loss means that significant power is lost to reflections. For example, a return loss of 10 dB means that one-tenth of the transmit power is reflected back to the transmitter, while the power delivered to the load is about 0.5 dB less than expected.
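The dB-to-power arithmetic above is easy to generalize; the following sketch converts a return loss figure into the reflected power fraction and the resulting mismatch loss at the load:

```python
import math

def reflected_fraction(return_loss_db):
    """Fraction of transmitted power reflected back, given return loss in dB."""
    return 10 ** (-return_loss_db / 10)

def mismatch_loss_db(return_loss_db):
    """Reduction in power actually delivered to the load, in dB."""
    return -10 * math.log10(1 - reflected_fraction(return_loss_db))

for rl in (10, 20, 30):
    print(rl, reflected_fraction(rl), round(mismatch_loss_db(rl), 2))
# 10 dB: 10% reflected, ~0.46 dB lost; 20 dB: 1% reflected, ~0.04 dB lost;
# 30 dB: 0.1% reflected, negligible loss.
```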

Source match is the return loss (or match) looking into the measurement port, just like the Level II parameter. However, instead of being specified as 20 dB to 100 MHz, there is a Level III equation line that starts out much lower than 20 dB at lower frequencies and ends up at approximately 20 dB at 250 MHz. This is a considerable tightening of specification, so Level III products must have very tightly controlled source and load impedances to control unwanted signal reflections during the return loss measurement.

When a field tester is making a return loss measurement into either a short or an open, the entire transmitted signal is reflected. This has to be a “return loss” of 0 dB. The tracking parameter is simply a verification of this measurement. Level III will require a tracking accuracy of better than 1 dB into both the open and the short.

When a field tester makes a return loss measurement into a “perfect” match termination, it will read a very high value in dB. This is analogous to the residual NEXT parameter, where the cable is removed and each pair is terminated into a precision 100 ohm resistor. The resulting measurement is the “floor” for that measurement, the best the instrument is capable of with its own internal errors. The directivity parameter is the measurement floor for the return loss measurement, and it represents how well the tester can tell the difference between transmitted and reflected signals. Level III has an equation for directivity that has a 250 MHz value of approximately 25 dB.
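As a simplified, first-order picture of why directivity matters (an illustrative model, not the standard's uncertainty method), the instrument's directivity error vector can either add to or cancel the true reflection, smearing the reading:

```python
import math

def rl_uncertainty_bounds(true_rl_db, directivity_db):
    """First-order bounds on a measured return loss when the instrument's
    directivity error adds to or cancels the true reflection. The gamma
    terms are voltage reflection magnitudes (20*log10 convention)."""
    gamma = 10 ** (-true_rl_db / 20)
    error = 10 ** (-directivity_db / 20)
    best = -20 * math.log10(abs(gamma - error))   # error cancels the reflection
    worst = -20 * math.log10(gamma + error)       # error adds to the reflection
    return worst, best

# A link with a true 20 dB return loss, measured with 25 dB directivity
# (values illustrative): the reading could fall anywhere in this range.
worst, best = rl_uncertainty_bounds(20.0, 25.0)
print(round(worst, 1), round(best, 1))   # roughly 16.1 to 27.2 dB
```

The closer the true return loss sits to the directivity floor, the wider this spread becomes, which is why Level III must specify directivity explicitly.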

COMPARISONS OF TESTER ARCHITECTURES

Recently, there has been a significant amount of industry discussion regarding the best test methodologies and architectures to deploy in response to the evolution toward tighter testing and certification standards. Much of this debate centres around scalar vs. vector methodologies. While there are trade-offs regarding cost and capabilities, the key point is that test instruments must ultimately be able to meet all of the industry certification requirements and protect the cable installers’ investments in equipment.

SCALAR VS. VECTOR MEASUREMENT ISSUES

For each of the above-defined RF parameters, either a scalar or vector tester must begin by accurately measuring the magnitude of the parameter. This objective can only be accomplished through stringent hardware design methodologies and component selection. Conformance with the basic RF parameters cannot be “added in” through software features or compensated for by adding vector capabilities. In essence, accuracy is either designed in at the hardware level or it simply is not there.

Some of the critical factors in hardware design include:

Balancing transmitters to achieve required output signal balance

Providing shielding and isolation across the tester’s PCB

Minimizing internal coupling through careful circuit layout

Ensuring cleanliness and adequate isolation of power supplies

Using low-crosstalk connectors with careful pin isolation

Matching source and load impedance characteristics

The bottom line is that when a Level III scalar instrument design is complete, it must be able to meet all appropriate third-party certification requirements for Level III. If it does, it is deemed sufficient to test and certify any customer’s cable installations up through Category 5e and Category 6 levels.

In addition to following all of the above-described scalar design disciplines for accurately measuring magnitude, vector-based testers measure phase as well. With vector devices, the phase information can be used to further characterize the test instrument so that subsequent field measurements can subtract out the residual errors of some parameters and give a higher level of accuracy. The more complex mathematical algorithms needed for vector processing demand a significantly greater hardware investment in the tester's microprocessor, memory and related components. Although vector measurements can theoretically provide a higher level of "headroom" over and above the current requirements for Level II-E or Level III accuracy, from an installer's perspective the additional cost must be weighed against the potential benefits.
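A toy numerical sketch, with illustrative values only, shows the difference: a vector instrument that has characterized its residual crosstalk as a complex quantity can subtract it outright, while a scalar instrument, knowing only the magnitude, can merely bound the result:

```python
import cmath, math

# Residual crosstalk of the instrument itself, characterized during
# calibration as a complex (magnitude and phase) quantity. Illustrative.
residual = cmath.rect(10 ** (-50 / 20), math.radians(120))    # -50 dB floor

# True crosstalk of the link under test, unknown to the instrument.
true_signal = cmath.rect(10 ** (-40 / 20), math.radians(30))  # -40 dB

measured = true_signal + residual   # what the receiver actually sees

# A vector instrument subtracts the stored complex residual outright.
corrected = measured - residual
print(round(-20 * math.log10(abs(corrected)), 1))   # recovers 40.0 dB exactly

# A scalar instrument knows only |residual|, so the best it can do is
# bound the error: the true value lies somewhere between these readings.
lo = -20 * math.log10(abs(measured) + abs(residual))
hi = -20 * math.log10(abs(abs(measured) - abs(residual)))
print(round(lo, 1), round(hi, 1))   # about 37.3 to 42.7 dB
```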

WHEN ALL IS SAID AND DONE

In essence, the issuance of a Level III certification for a tester provides the solid foundation and basis for an installation contractor to test and certify their customers’ cabling. While additional investments beyond that level might provide a comfortable feeling of being “future-proofed”, the truth is that you cannot certify to a standard that does not yet exist.
Furthermore, if the cabling industry’s history has shown us anything, it is that change is both inevitable and unpredictable.

When all is said and done, field installation contractors still need access to rigorously designed and cost-effective solutions that fully meet their present-day certification requirements and do not force them to over-invest beyond requirements for the near future. At the same time, what is really required for effective "future proofing" is a test equipment vendor relationship that demonstrates the ability to evolve product designs as industry standards change, and a proven commitment to assisting the contractor with managing the cost of upgrading.

Richard Helstrom is the Director of Product Marketing for the LAN division of Acterna, Inc. of Germantown, Maryland. The company is a subsidiary of Acterna Corp., which was formed earlier this year by the merger of Wavetek Wandel Goltermann and TTC. Mr. Helstrom’s responsibilities include worldwide product planning and marketing strategy for the entire LAN Division product line.


