Microsoft Pushes Deeper Into Data Warehouse Market

The SQL Server Fast Track Data Warehouse initiative is Microsoft's latest push into the market for data warehouse systems that can handle tens, and even hundreds, of terabytes of data. Sometime in the first half of 2010, Microsoft is slated to debut massively parallel data warehouse technology -- being developed under the code name Project Madison -- using technology it acquired in September when it bought data warehouse appliance maker DATAllegro.

The SQL Server Fast Track Data Warehouse effort will provide customers with preconfigured, pretested data warehouse systems that cost as little as $13,000 per terabyte of data, said Stuart Frost, who was CEO of DATAllegro and was named general manager of SQL Server data warehousing after the acquisition. Data warehouses built using the reference architectures will scale up to 32TB and process up to 200MB of data per second per CPU core, according to Microsoft.

"It's part of the overall move by Microsoft and the SQL Server organization to address the high end of the data warehousing market," Frost said. IBM, Oracle and Teradata sell technology for assembling massive data warehouses that can cost millions, even tens of millions, of dollars. Systems based on the reference architectures will also compete with smaller vendors like Netezza in the market for data warehouse appliances.

By optimizing all hardware components for SQL Server, the reference architecture configurations will reduce the time, effort and cost of deploying data warehouse systems, according to Microsoft. The company is also partnering with systems integrators Avanade, Cognizant Technology Solutions, HP and Hitachi Consulting to provide solution templates tailored for the hardware reference configurations.

Data warehouses built using the reference architectures will run on symmetric multiprocessing (SMP) servers, according to Frost. The "Madison" technology will sit on top of arrays of SQL Server databases, creating a massively parallel computing system that can handle multiple petabytes of data. That development work is progressing on schedule, Frost said.