Connections+

The Rise Of SAN

An upsurge of data, falling prices and disaster-recovery planning have conspired to make storage area networks a much-needed technology for businesses large and small. The increased adoption rate of SANs should soon reap benefits for structured cabling manufacturers.

January 1, 2005  


In the late 1990s, the city of Saskatoon was facing a problem common to many organizations: there was too much data and too little storage space. Over the years, the city diligently increased its storage capacity through additional servers for not only online transactions such as utility and tax bills, but also for all e-mails and internal correspondence.

Even so, data continued to increase noticeably to the point where staff could not always access the information they needed in a timely fashion. The problem stemmed, to a large degree, from how the data was classified and archived.

Low-level data such as old e-mails, which users might access once or twice a year, were slowing down the system for high-level data such as online transactions.

“We are not unique in our storage requirements,” says Peter Farquharson, manager of technology integration and corporate information services for the city.

The city first adopted storage area networks (SANs) technology in 1999 from IBM Canada Inc. as part of an infrastructure reorganization to make operations more efficient.

Farquharson and others found that one of the major benefits of a SAN is its flexibility. “Having storage dedicated to a single server meant that you always had to buy more storage, which meant that you had to shut down your system. It was a constant state of flux,” he says. “The beauty of a SAN is that you have this pool of storage and you allocate it, as required, to meet your needs.”

8.5 TB of capacity

Two-tiered storage is undoubtedly one of the chief benefits of the technology, says Kyle Foster, general manager for storage sales at Markham, Ont.-based IBM Canada.

“(It) has evolved to the point that deploying two-tiered storage doesn’t mean doubling your costs.”

A SAN system allows network managers to segregate applications by performance or criticality.

High-performance devices back up critical applications, and slower devices back up archival applications (such as e-mails). Today, after five years of experience with SAN technology and after a number of upgrades and additions to its system, the city has about 8.5 terabytes (TB) of storage capacity.
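The tiering decision described above can be sketched in a few lines. Everything here (the `Dataset` class, the access-frequency threshold) is an illustrative assumption, not a detail of the city's actual system:

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    accesses_per_year: int
    critical: bool

def assign_tier(d: Dataset) -> str:
    """Place critical or frequently read data on fast storage,
    everything else on slower archival devices."""
    if d.critical or d.accesses_per_year > 52:  # roughly weekly access or better
        return "tier-1 (high-performance)"
    return "tier-2 (archival)"

datasets = [
    Dataset("online-transactions", 100_000, True),
    Dataset("old-email-archive", 2, False),
]
for d in datasets:
    print(d.name, "->", assign_tier(d))
```

The point of the sketch is simply that the rule is cheap to apply: classify once by criticality and access pattern, and the slow, rarely touched data stops competing with transactions for fast spindles.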

Currently, Saskatoon has a single data centre; however, plans call for it to be split in two. Equipment from IBM will be moved to another location across the street, and the two centres will be connected by a 2-Gbps Fiber Channel serial interface running over fiber-optic cable.

“We have the advantage of being the city, so we own the streets,” Farquharson says. “We are simply going to put a duct between the streets and lay some fiber across it.”

Such advances explain, to a large degree, why SANs have finally achieved the presence analysts have been predicting for years.

In many ways, the technology solves the problem of how best to access data quickly and efficiently at a time when the amount of data stored is increasing at a rate of more than 100% annually.

Foster suggests that the first trick is to deploy storage subsystems.

“You have to stop deploying servers with storage inside the server, tightly coupled to the server, and start deploying storage external to the server,” he says. “Now the storage resource is not owned by the server.”
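Foster's pooled-storage model can be illustrated with a minimal sketch. The `StoragePool` class and the workload names are hypothetical; only the 8.5-TB total echoes a figure from the article:

```python
class StoragePool:
    """SAN-style pooled allocation: capacity lives in one shared pool
    and is carved out per server on demand, instead of being bought
    disk-by-disk for each individual server."""

    def __init__(self, capacity_gb: int):
        self.capacity_gb = capacity_gb
        self.allocations: dict[str, int] = {}

    def free_gb(self) -> int:
        return self.capacity_gb - sum(self.allocations.values())

    def allocate(self, server: str, gb: int) -> bool:
        if gb > self.free_gb():
            return False  # pool exhausted: time to add capacity, not a new server
        self.allocations[server] = self.allocations.get(server, 0) + gb
        return True

pool = StoragePool(capacity_gb=8_500)  # ~8.5 TB, as in Saskatoon's SAN
pool.allocate("utility-billing", 2_000)
pool.allocate("email-archive", 1_500)
print(pool.free_gb())  # 5000
```

The contrast with server-attached disk is visible in `allocate`: unused gigabytes stay in the pool for any server, rather than sitting idle behind whichever box happened to buy them.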

Reitmans has signed on

Such thinking was not lost on Reitmans Canada Ltd., a national clothing retailer based in Montreal. Like many companies operating enterprise resource planning (ERP) software, it was using Unix and NT servers with dedicated disk storage.

“Whenever we needed more storage space, we were faced with the problem that another disk was attached to another server, which became a huge problem,” says Claude Martineau, IT manager for Reitmans.

“We needed, say, 100 GB on one side, but we couldn’t access it because it was on another server. That triggered our consolidation process.”

Reitmans bought its first SAN system from Hewlett-Packard (Canada) Co. in 2000. With the technology in place, the company did not have to purchase excess capacity that would often sit idle.

“Before, I had to calculate the storage for each of the servers, without really knowing if I had enough,” Martineau says. “Now, I can rationalize storage and calculate how much I will need for the next three years.”
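Martineau's three-year calculation is straightforward compound growth. A minimal sketch, using the article's roughly 100%-a-year growth rate and Reitmans' 7-TB base purely as illustrative inputs:

```python
def projected_capacity_tb(current_tb: float, annual_growth: float, years: int) -> float:
    """Compound-growth projection: capacity needed after `years` of growth."""
    return current_tb * (1 + annual_growth) ** years

# Illustrative numbers only: 7 TB today, doubling every year,
# over the three-year horizon Martineau mentions.
print(round(projected_capacity_tb(7, 1.0, 3), 1))  # 56.0
```

With pooled storage, one projection like this sizes the whole SAN; with server-attached disk, the same guesswork had to be repeated per server, with no way to shift a miss from one box to another.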

Reitmans’ total storage capacity is about 7 TB, which, given the company’s current rate of growth, will likely not be enough for the future.

As of late last year there were 866 stores in operation consisting of 351 Reitmans, 167 Smart Set, 140 Penningtons, 30 RW & CO., 113 Addition Elle and 65 Thyme Maternity outlets.

When two companies with distinct IT systems come together as a result of a merger or buyout (Reitmans has acquired several companies over the last few years), the result can be a difficult-to-manage technology infrastructure.

The problem is magnified when companies operate inefficient and outdated legacy systems.

“Moving to a storage area network is a huge step in simplifying the infrastructure,” says Foster. “You gain physical access to the storage devices by combining them all on one network, which facilitates the allocation of the resources to business applications that need the capacity.”

Crunching the numbers

When SANs were introduced about 10 years ago, they were very expensive (costs ran into millions of dollars per unit), so large enterprises were their chief users. In addition, the technology suffered from many interoperability problems, making systems difficult to install and run smoothly.

Today, SANs not only cost substantially less, but have also, on average, become easier to install and operate.

“There has been a systematic decrease of about 15% per year in the price per port of SANs,” says James Opfer, an analyst with Stamford, Conn.-based Gartner Group, a technology-consulting firm. A business can now buy a SAN system for less than $30,000.

Leading companies include IBM, HP, EMC Corp. and Computer Associates International Inc., which combined hold a commanding market share.

At Storage Networking World in Orlando, Fla. late last year, for example, low-cost SAN packages were introduced that are easier to implement and use.

In large corporations, with more than 20,000 employees, IT managers are either upgrading their legacy storage systems or installing secondary capacity. Technology upgrades are taking place, on average, every two years.

While worldwide revenues for the SAN market approached US$1.7 billion in 2003, storage network infrastructure market revenue is forecast to grow at a compound annual rate of 21% and hit US$4.9 billion by 2008, Opfer says.

There is no doubt that SANs have become, and will continue to be, a technological force in the years to come.

“We still haven’t reached the point where more than half of the small- and medium-sized businesses have deployed SAN technology,” says Foster. “But the ones that have deployed it are moving quickly to exploit all of its benefits.”

That includes turning data into information.

“Companies want to leverage that data for competitive advantage,” says Parag Suri, category business manager (network storage solutions) at HP Canada in Mississauga, Ont.

“SANs play a vital role in that business application and that’s one reason why we are seeing a huge adoption increase. It’s not only enterprise-class customers that are deploying SANs, but also small- and medium-sized businesses, a trend that started in 2002.”

The rapid decrease in price, from millions of dollars in the late 1990s to less than $30,000 today, makes storage-area technologies affordable for many small- and medium-sized companies.

The use of SANs can also form an integral part of an organization’s disaster recovery and business continuity plans, which is smart business.

Simply put, customers do not want to lose data, and those operating critical applications, such as financial-services companies, have built in multiple levels of redundancy to minimize potential data loss. (See CNS, November/December 2004.)

Such is the case with Reitmans, which operates two SAN systems. One is a disaster-recovery mirror site, linked by a 10-kilometre dark-fiber line to the main site. The Fiber Channel connectivity has increased the flow of data.

“When we upgraded to Fiber Channel, we gained a 34% uptick in speed,” Martineau says. “Equally important, fiber connectivity allows us to do real-time replication. So, if one site goes down for any reason, there is a backup of all the data.”
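The real-time replication Martineau describes behaves like a synchronous mirror: a write is acknowledged only after it lands at both sites, so either site can serve the data if the other goes down. The class and key names below are illustrative assumptions:

```python
class MirroredStore:
    """Toy sketch of synchronous replication across two sites."""

    def __init__(self):
        self.primary: dict[str, bytes] = {}
        self.mirror: dict[str, bytes] = {}

    def write(self, key: str, value: bytes) -> None:
        # Both copies are made before the write is considered complete.
        self.primary[key] = value
        self.mirror[key] = value

    def read(self, key: str, primary_up: bool = True) -> bytes:
        # If the main site is down, serve from the mirror site.
        site = self.primary if primary_up else self.mirror
        return site[key]

store = MirroredStore()
store.write("eod-sales-2004-12-31", b"end-of-day batch")
print(store.read("eod-sales-2004-12-31", primary_up=False))
```

Synchronous mirroring is what makes the dark-fiber link matter: every write pays the round trip to the second site, so low-latency connectivity directly determines application speed.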

To be sure, the connectivity between server and storage device is critical, and the type of connectivity depends on factors such as the size of an operation and cost.

There is no doubt that Fiber Channel is the incumbent connectivity solution, at least for now.

Even so, Ethernet technologies such as IP-based storage area networks (iSCSI) will likely make inroads in the next few years. “iSCSI will undoubtedly benefit companies that have offices spread out over many locations, allowing them to simplify and automate their operations,” says Foster.

Both will co-exist as viable solutions for managing and accessing information, notably for small and medium-sized businesses. There are storage devices that offer both iSCSI and Fiber Channel connectivity.

“As for iSCSI, some will use copper and some will use fiber optic — just as they do in Fiber Channel today,” says Robert Gray, vice-president, worldwide storage systems research for IDC, a technology-consulting firm based in Framingham, Mass. “It will depend on such factors as cost and future requirements.”

Companies today have many choices, which might make the decision-making process difficult.

But, in this case, it just might be an embarrassment of riches: businesses employing SANs will see an increase in productivity and efficiency.

Perry Greenbaum is a writer based in Montreal and a rural community near Concord, N.H. He can be reached at