
The Timefold Factor

Agilent has coined a term for reducing operating expenses in the optical OSP. The intent is to create faster and easier fiber testing methods.

April 1, 2003  


If you are involved in fiber optic network testing or manage groups or contractors that are, I would suggest reading this article. It will save you more time than it will take to read it.

Indeed, the art of compressing an expensive, time-consuming process into a shorter one is the idea behind the coined term, and play on words, TimeFold.

Choosing test equipment optimized for the speed of the whole process can have a dramatic effect on the return on that investment and on the effectiveness of your team using it.

Testing fiber is a necessary but time-consuming process, and the number of tests needed in a given network is rising:


More fiber means more measurements. The number of segments in a CATV network is rising with the need to get the fiber closer to the consumer.

Additionally, more than one fiber is now being supplied to a node, with up to six fibers ready at the pedestal. Narrowband services and planning for business and telephone services are the most common reasons. All of this leads to many more fiber segments that, regardless of length, require attenuation and reflectance testing.


Where testing was originally done only at 1310nm, testing at 1550nm to handle cable modems and narrowcasting is the new standard. Additionally, 1625nm is beginning to be used to test for bending and to check that the fiber is ready for future services at that wavelength.

Some operators are already using wavelength division multiplexing (WDM) to combine broadcast with narrowcast services, creating a need for wavelength, power and optical signal-to-noise measurements.

Hundreds of measurements now need to be made where none existed before, when fiber didn't extend past the hub. A single 144-fiber cable can require 432 measurements to characterize loss and reflectance at two wavelengths and in both directions.
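As a back-of-the-envelope check, the trace count scales multiplicatively with fibers, wavelengths and directions. A small sketch (the function name is illustrative):

```python
def trace_count(fibers: int, wavelengths: int, directions: int = 2) -> int:
    """Number of OTDR traces needed to characterize a cable:
    one trace per fiber, per wavelength, per test direction."""
    return fibers * wavelengths * directions

# A 144-fiber cable tested at two wavelengths from both ends needs
# 144 * 2 * 2 = 576 traces; the 432 figure above corresponds to three
# recorded measurements per fiber (432 / 144 = 3), depending on which
# wavelength/direction combinations an operator actually records.
print(trace_count(144, 2))  # 576
```

Either way, the count grows fast enough that per-measurement overhead dominates the total job time.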

No fiber operator has increased their workforce proportionally to the increase in the number of fiber segments. In some cases, they have fewer people than two or three years ago due to increased competition and the shrinking of capital markets.

The key to reducing time is to look at the whole picture, from start to printed report, and optimize every step.

As Carl Sagan from the popular show Cosmos once said: "To make an apple pie from scratch, you must first invent the universe." Following that advice, we have to start at the beginning and look at every detail in the measurement process to see how it can be improved. Since most of you don't have several billion years to go into that kind of detail, I'll summarize the major points below.

We’ll look at the most common measurement made today: Optical Time Domain Reflectometry (OTDR).

It’s not a given that people know how to use the equipment, so time must be spent learning to use the instrument. Every fiber connection must be cleaned and attached.

The setup must be defined and project information entered with every measurement. Measurements are made at multiple wavelengths, scanned for events and either manually or automatically checked for deviations from limits.

The measurement is usually stored and the measurement process is repeated for another wavelength. After all fibers have been tested in one direction, the entire process is repeated from the other end of the fiber to ensure accurate loss measurements.
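The reason for shooting from both ends is that a one-way OTDR loss reading is biased when the fibers on either side of an event have different backscatter coefficients; averaging the two directional readings cancels that bias. A minimal sketch (the function name is illustrative):

```python
def bidirectional_loss(loss_a_to_b_db: float, loss_b_to_a_db: float) -> float:
    """Two-direction averaged loss for a splice or link segment.

    A one-way OTDR reading can over- or under-state a splice loss when
    the fibers on either side have different backscatter coefficients;
    the arithmetic mean of the two directions cancels that offset.
    """
    return (loss_a_to_b_db + loss_b_to_a_db) / 2.0

# A splice that reads 0.30 dB from one end and -0.10 dB (an apparent
# "gainer") from the other actually loses about 0.10 dB:
print(round(bidirectional_loss(0.30, -0.10), 2))  # 0.1
```

This is why a one-direction-only test plan can pass a bad splice or fail a good one.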

On page 22 is a graphic of the core measurement process and the dramatic effect speed optimization at most steps can have on the total test time. This is the essence of the TimeFold principle.

The bottom line: testing both ends of a 144-fiber cable coming from a node, a job that could take two work days, can now easily be done in a morning.

TimeFold technologies include:


How can test equipment speed fiber prep? By reducing re-work. A video microscope can verify clean, scratch-free connections; dirty or scratched connectors degrade measurement results or cause expensive rework.


The keystrokes necessary to set up, change, analyze and store a measurement can not only take time, they can drive even the most determined operator to distraction (if not carpal tunnel syndrome). After dozens of repetitions, mistakes become more likely. With preprogrammed measurement lists and automatically incrementing, intelligent filenames, measurement, storage and future analysis are all automated.
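One way such automation works is to build each filename from project fields plus an auto-incrementing fiber counter, so the operator never types a name between shots. A hypothetical sketch (the naming scheme is an assumption, not any vendor's actual format):

```python
def trace_filename(project: str, wavelength_nm: int, fiber_no: int) -> str:
    """Build a predictable trace filename from the project name, test
    wavelength and a zero-padded fiber number, so downstream batch
    tools can pair files by name alone."""
    return f"{project}_{wavelength_nm}nm_F{fiber_no:03d}.sor"

# Auto-increment across a 144-fiber cable at 1310 nm:
names = [trace_filename("NODE12", 1310, n) for n in range(1, 145)]
print(names[0])    # NODE12_1310nm_F001.sor
print(names[-1])   # NODE12_1310nm_F144.sor
```

Predictable names are what make the batch post-processing described later possible at all.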


The quality of an OTDR measurement is governed by its dynamic range (the dB difference between the initial returned signal power and the receiver noise floor). Thousands of measurements are made per minute and averaged together. An instrument with a 1dB advantage at a given measurement time needs only a fraction of the averaging time. By optimizing performance at the short measurement times and pulsewidths actually used in the field, rather than chasing datasheet figures, speed increases of 6 to 11 times can be achieved. Put simply: for the same money, would you rather buy a car that goes 0-200 km/h in two seconds, or one with a top speed of 201 km/h that takes three minutes to get there?
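The time saving follows from how averaging improves a trace: noise drops with the square root of the number of averages, which on the OTDR's dB scale works out to roughly 0.75 dB of dynamic range per doubling of averaging time (a common rule of thumb, assumed here). A hedged sketch of that arithmetic:

```python
DB_PER_DOUBLING = 0.75  # assumed rule of thumb for OTDR averaging gain

def averaging_time_factor(advantage_db: float) -> float:
    """How many times less averaging time an instrument needs if it
    holds an `advantage_db` dynamic-range lead at a given pulsewidth,
    under the dB-per-doubling rule above."""
    return 2.0 ** (advantage_db / DB_PER_DOUBLING)

# Under this rule, a 1 dB advantage buys roughly a 2.5x reduction in
# averaging time, consistent with (in fact better than) a simple halving.
print(round(averaging_time_factor(1.0), 2))  # 2.52
```

The exact dB-per-doubling figure varies by instrument, but the exponential relationship is why small dynamic-range advantages compound into large time savings.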


By entering pass/fail criteria into the OTDR for values such as total link loss and reflectance, the instrument can report which fibers passed and where the issues lie. This not only improves speed but also eliminates mistakes due to fatigue and endless repetition.
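A pass/fail check of this kind is just a comparison of each result against the entered limits. A minimal sketch (the limit values are illustrative, not from any standard):

```python
def check_fiber(link_loss_db: float, reflectance_db: float,
                max_loss_db: float = 3.0, max_refl_db: float = -40.0):
    """Compare one fiber's results against pass/fail limits.

    Reflectance is a negative dB value; a reading *above* the limit
    (e.g. -32 dB against a -40 dB limit) is a failing, more reflective
    event.  Returns (passed, list_of_failure_reasons).
    """
    reasons = []
    if link_loss_db > max_loss_db:
        reasons.append(f"link loss {link_loss_db} dB exceeds {max_loss_db} dB")
    if reflectance_db > max_refl_db:
        reasons.append(f"reflectance {reflectance_db} dB exceeds {max_refl_db} dB")
    return (not reasons), reasons

print(check_fiber(2.1, -48.0))     # (True, [])
print(check_fiber(3.6, -32.0)[0])  # False
```

Encoding the limits once, rather than eyeballing every trace, is what removes the fatigue-driven mistakes.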


Total performance can be increased by a factor of three simply by using a CompactFlash memory card to store measurements in place of slow, unreliable floppy disks. Internal hard drives might seem the logical choice, but in some cases an hour is wasted daily transferring the data to a PC, to say nothing of the disaster of a hard disk crash. Laptops accept CompactFlash cards, which can transfer 144 measurements in less than 10 seconds; desktop readers are also available.


Not only is transfer speed important, but PC analysis of traces must be automated in today's multifiber environments. Batch processing of everything from event-table generation to pass/fail testing and creation of two-direction averaged loss measurements is now available, reducing post-processing tasks from hours to seconds while also organizing the files. Free trace viewers are available, allowing a report covering thousands of fiber segments to be burned onto a CD along with the viewer.
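Batch post-processing amounts to pairing the two directional results for each fiber and emitting one report row per fiber. A small sketch, assuming the loss values have already been parsed out of the trace files (the record layout and limit are hypothetical):

```python
# Each record: (fiber_no, loss_a_to_b_db, loss_b_to_a_db); values illustrative.
results = [(1, 2.10, 2.30), (2, 2.05, 1.95), (3, 4.10, 3.90)]

MAX_LOSS_DB = 3.0  # illustrative pass/fail limit

def batch_report(records, max_loss_db=MAX_LOSS_DB):
    """One report line per fiber: two-direction averaged loss plus verdict."""
    lines = []
    for fiber_no, ab, ba in records:
        avg = (ab + ba) / 2.0  # two-direction averaged loss
        verdict = "PASS" if avg <= max_loss_db else "FAIL"
        lines.append(f"F{fiber_no:03d}  avg {avg:.2f} dB  {verdict}")
    return lines

for line in batch_report(results):
    print(line)
```

Running this over hundreds of file pairs is seconds of work for a PC, versus hours of opening traces one at a time.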


We've discussed the pure mechanics of a measurement and the time it takes, but in the real world mistakes can happen. Any instrument that can reduce the 3R's gives its owner a competitive advantage. Contractors who make inaccurate measurements may need to redo them, doubling their costs and potentially fatally harming their ability to win another contract. In-house users risk embarrassment, and in the worst case a defect that goes unnoticed can require expensive repair or downtime when the link goes live. Instruments that make the most repeatable loss measurements, warn of poor connections and measure reflections more accurately can eliminate the 3R's.

As mass fiber measurements progress into measurement of WDM signals and even fiber bandwidth characterization (such as chromatic dispersion), investing in instrument platforms that share a common look and feel or allow near-simultaneous measurements can fold the total measurement time as well.

The effect of optimizing all of the mentioned parameters can be dramatic.

Like taxes and pop music, fiber is here to stay. In today’s competitive environment, there is no choice but to increase workforce productivity and achieve a faster break even ROI.

Peter Schweiger is the North American Technical Trainer with Agilent Technologies Canada Inc. He is based in Mississauga, Ont.

