News

SQL Server: Ready for the Big Time?

The world’s largest databases are approaching the 50TB mark, driven by scientific applications, decision support and data warehousing. To date, just about all of these installations have been built on Teradata or Oracle databases running on Unix. However, Microsoft SQL Server may join these behemoths in the not-too-distant future. In fact, SQL Server already supports a number of multi-terabyte installations. Brian Goldstein, test lead for the Microsoft SQL Server Reliability Test Team at Microsoft Corp., reported at a recent Web conference that approximately 15 SQL Server customers are currently scaling above a terabyte.

A number of analysts and database developers say that there is now little difference between SQL Server and its competitors in terms of performance. “We’re reaching the point of unsubstantiated differentiation,” says Mark Shainman, analyst with Meta Group. “An end user can buy a packaged app and run it on all three platforms [SQL Server, Oracle, DB2].”

Recent benchmarks, in fact, show that SQL Server now edges out many other databases in terms of scalability. In August, the Transaction Processing Performance Council posted a benchmark of almost 787,000 transactions per minute for a 64-bit version of SQL Server running on an Itanium-based HP Integrity Superdome server. That result ranks a close second to the current top performer, an Oracle 10g database running on an HP Superdome, which churned out 824,000 transactions per minute. SQL Server occupies four of the top 10 database benchmark results posted by the TPC.
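
For readers keeping score, the gap between those two results works out to under five percent; the snippet below is illustrative arithmetic on the rounded figures quoted above, not official TPC data:

    # Approximate transactions-per-minute figures reported above (rounded)
    sql_server_tpm = 787_000   # 64-bit SQL Server on HP Integrity Superdome
    oracle_tpm = 824_000       # Oracle 10g on HP Superdome

    gap = (oracle_tpm - sql_server_tpm) / oracle_tpm
    print(f"SQL Server trails the leader by about {gap:.1%}")  # ~4.5%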

Of course, analysts point out that few companies actually need this kind of processing power today. But with comparable performance and scalability among the leading database brands, the key remaining differentiator is price, an area in which SQL Server is highly competitive because it runs on commodity hardware. “That’s why Oracle jumped Linux full-force, because they realize that they’re not going to be competing against SQL Server on Sun boxes, but on Intel boxes,” says Shainman.

Consultants working with SQL Server report that while Oracle and Teradata still rule the high-end decision-support space, Microsoft’s database is coming on strongest in OLTP systems. “The VLDB market is pretty small,” observes Tyson Hartman, .NET practice director with Avanade. “More opportunities for SQL Server are coming up in high-volume, high-transaction rate environments, such as airline reservation systems, ATMs and point of sale systems.” Such systems typically scale into the hundreds of gigabytes range, Hartman adds.

Sage Telecom Inc., a regional phone carrier that serves half a million customers, employs SQL Server to support almost 400GB of data on both 32-bit and 64-bit platforms. “As we grew initially, we were able to satisfy all our processing needs through SQL Server,” says Russell Clarkson, vice president of information systems for Sage Telecom. “Since then, we’ve been growing 200 percent to 300 percent a year. We’ve always felt Microsoft was going to stay out ahead of us. We never looked at Oracle, though many people outside of Sage felt we needed to, because at some point we were going to hit the scalability wall. We’re now running Windows 2000 Datacenter for our core customer database. With Windows Server 2003, and our 64-bit environment, there are absolutely no scalability issues.”

The availability of 64-bit capabilities within SQL Server has also dramatically improved the database’s competitiveness. “There are very large performance gains that we are seeing, both on the relational process side, and on the multi-dimensional, or OLAP processing side, particularly on the 64-bit platform,” says Rajiv Mistry, director of data warehousing and business intelligence at Avanade. “Some of the limitations that existed in SQL Server 2000 and Analysis Services 2000 on the 32-bit platform have been removed, enabling the building of extremely large dimensions and cubes.”
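
The address-space arithmetic alone suggests why those ceilings disappear; the figures below are a general illustration of 32-bit versus 64-bit addressing, not a statement of SQL Server’s specific memory limits:

    # General illustration: theoretical virtual address space per process.
    # A 32-bit Windows process typically sees only 2-3 GB of its 4 GB space
    # for user data -- the sort of ceiling Mistry refers to.
    addr_32 = 2 ** 32
    addr_64 = 2 ** 64

    print(f"32-bit: {addr_32 / 2**30:.0f} GB of theoretical address space")
    print(f"64-bit: {addr_64 / 2**60:.0f} EB of theoretical address space")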

This will move SQL Server more strongly into the OLAP space within the next few years, analysts agree. “We’re seeing SQL Server compete today with Oracle on most applications, except on the very high end, in decision support,” says Meta Group’s Shainman. “Organizations that were once exclusively Unix-centric are now open to the idea of running major database platforms on Intel technology, because of the price of the platform.” This is increasing the presence of both Windows and Linux in large data centers, he adds. “You just can’t beat the price performance of cheap, commodity-based Intel hardware.”

At this point, lingering perceptions appear to be the only obstacle to SQL Server’s growth in the large database market. “Unfortunately, there’s the perception that SQL Server isn’t the equal of Oracle or DB2,” says Sal Terillo, database architect for Intrasphere Technologies. Ironically, ease of use can work against perceptions of a database’s sophistication, he points out. “SQL Server is significantly easier to install than Oracle or DB2. It’s very easy for developers that don’t necessarily have a DBA background to slap in an installation. There’s not as much understanding of security features, or of how to tune a large database. The ease-of-use thing is a double-edged sword.”

Scalability concerns that were evident in the late 1990s also continue to color current perceptions. In many cases, upper management still needs to be sold on moving mission-critical applications to a Microsoft platform. “I’m amazed at how many CIOs I’ve talked to that say they haven’t looked at the Microsoft platform since NT 4.0 and SQL 6.0,” says Avanade’s Hartman. “Everything has changed, including benchmarks, and the size and scale of deployments.”

In perhaps the largest SQL Server database deployment to date, SQL Server 2000 supports a 10TB database for the Human Genome Project. The system is built on a Dell eight-way enterprise server with EMC storage arrays and Emulex host bus adapters, and holds 20 billion records. System planners expect to grow the database to 100TB in the near future. “It’s a fairly complex architecture, and it did require some work to tune the system for excellent throughput,” says Microsoft’s Goldstein.
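
Those figures imply a fairly lean average row, which helps explain how a single eight-way server copes; the following is rough arithmetic on the rounded numbers quoted above:

    # Rough estimate from the figures reported above
    db_bytes = 10 * 2**40   # 10 TB
    rows = 20 * 10**9       # 20 billion records

    print(f"Average bytes per record: {db_bytes / rows:.0f}")  # ~550 bytes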

One of the largest commercially available Windows architectures, the Unisys ES7000, is able to process billions of records of relational data and load multi-dimensional OLAP cubes into Analysis Services in record time, with throughput of about a quarter-billion records per hour. “It’s definitely a very stiff competitor in the multi-dimensional space to multi-dimensional products,” says Mistry.

Sage Telecom brought in a 32-bit Unisys ES7000 to run billing, trouble management and other CRM activity on SQL Server. A second, 64-bit ES7000 stores customers’ actual call records within the 400GB database. “We’re doing a lot of historical analysis of our call records, and will continue to grow that function relatively aggressively,” says Clarkson. “From a data warehousing perspective, we’re looking for the ability to better slice and dice call record data to look for trends and anomalies.”
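
As a minimal sketch of the kind of slice-and-dice analysis Clarkson describes, consider the Python fragment below; the call-record schema, column names and anomaly threshold are invented for illustration and do not reflect Sage’s actual system:

    import pandas as pd

    # Hypothetical call-record extract; schema and values are invented.
    calls = pd.DataFrame({
        "customer_id": [101, 101, 102, 103, 103, 103],
        "call_date": pd.to_datetime(["2003-09-01", "2003-09-02", "2003-09-01",
                                     "2003-09-01", "2003-09-02", "2003-09-03"]),
        "minutes": [12, 3, 45, 7, 8, 220],
    })

    # Slice: total minutes per customer per day.
    usage = calls.groupby(["customer_id", calls["call_date"].dt.date])["minutes"].sum()

    # Dice: flag days far above a customer's average usage
    # (the 2x threshold is arbitrary, chosen for illustration).
    typical = usage.groupby(level=0).transform("mean")
    print(usage[usage > 2 * typical])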

In scoping out system requirements for deploying SQL Server as a very large database, Goldstein recommends first “identifying areas where your application will potentially be bottlenecked. If you’re doing a large amount of batch loading, or running complex queries, or have a relational data warehouse, having fast CPUs and large numbers of CPUs is a really great investment. If you expect thousands of users, and require multiple databases, investing in additional memory or even looking at 64-bit technology is something to consider also. Or, if you want rapid backup and restore, an I/O subsystem with many disks, many disk controllers, and a fibre channel backbone could make sense.”
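
Goldstein’s guidance boils down to a simple decision table. The Python sketch below paraphrases that logic; the workload labels and recommendation wording are this article’s shorthand, not a Microsoft sizing tool:

    # A paraphrase of Goldstein's sizing advice as a decision table.
    # Workload labels and wording are assumptions, not an official tool.
    def hardware_priorities(workload):
        recs = []
        if workload & {"batch_loading", "complex_queries", "relational_dw"}:
            recs.append("invest in fast CPUs, and plenty of them")
        if workload & {"thousands_of_users", "multiple_databases"}:
            recs.append("add memory; consider moving to a 64-bit platform")
        if "rapid_backup_restore" in workload:
            recs.append("build out I/O: many disks and controllers, fibre channel backbone")
        return recs

    # Example: a shop doing heavy batch loads that also serves thousands of users.
    print(hardware_priorities({"batch_loading", "thousands_of_users"}))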
