
Gartner: Poor Data Quality Dooms Many IT Projects

Corporations routinely make decisions based on remarkably inaccurate or incomplete data, a bad habit that's a leading cause of failure in high-profile, high-cost IT projects such as business intelligence and customer relationship management deployments, a research firm says.

"Most enterprises don't fathom the magnitude of the impact that data quality problems can have," said Ted Friedman, principal analyst with Gartner. According to his research, a quarter of the Fortune 1000 companies are working with poor-quality data.

Friedman isn't talking only about corrupted data -- although that can be part of the problem -- when he points to the pitiful state of data. Instead, data quality is defined by a number of components, ranging from consistency -- whether the data is identical when stored in multiple locations -- to accuracy and relevance. If data exists but isn't relevant to the process or project at hand, it's worthless, said Friedman.
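The consistency dimension Friedman describes can be illustrated with a minimal sketch: comparing the same customer records held in two systems. The store names, keys, and fields below are invented for illustration, not any particular product's data model.

```python
# Illustrative sketch: flag customer records that differ between two
# hypothetical data stores (e.g., a CRM system and a billing system).

crm_store = {
    "cust-001": {"name": "Acme Corp", "zip": "60601"},
    "cust-002": {"name": "Globex", "zip": "94105"},
}
billing_store = {
    "cust-001": {"name": "Acme Corp", "zip": "60601"},
    "cust-002": {"name": "Globex Inc.", "zip": "94105"},  # name out of sync
}

def find_inconsistencies(a, b):
    """Return the keys present in both stores whose records differ."""
    return sorted(k for k in a.keys() & b.keys() if a[k] != b[k])

print(find_inconsistencies(crm_store, billing_store))  # ['cust-002']
```

Even this toy check shows why synchronization matters: the two stores disagree about one customer's name, and any report built on either copy alone would silently pick a side.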

Other problems stem from the fact that the data collected is often incomplete.

"More important is the fact that enterprises haven't done a good job of collecting all the data that they should have, or have, but it's not in sync across the enterprise," he said. Synchronization will become an even bigger issue for businesses in the near future as they struggle to integrate the enormous amounts of data gathered from RFID projects.

One of the biggest causes of poor data quality is executives' habit of handing the data-acquisition chore off to IT, which may have no idea what data is actually needed.

"None of these are new problems," said Friedman. "People have always avoided the data quality issue. It often starts when [corporate] leadership says 'We're going to do CRM,' or some other project, and tells IT to get the data. They just assume that IT knows what data to get."

Bad or incomplete data can have an enormous impact on major IT initiatives, such as business intelligence (BI) and customer relationship management (CRM) roll-outs, Friedman claimed.

"BI is the best example of the old adage, 'garbage in, garbage out,'" said Friedman. Executives who expect to make strategic decisions based on BI software using poor data will be in for a shock. "How can they have a high confidence in the decisions when the data is bad?"

Lousy data is a minefield for CRM as well. There, the problem is exacerbated by the fact that these projects touch not only those inside the enterprise but customers, too. "A company can get a very big black eye in short order if it doesn't pay attention to the quality of data."

Many companies are being forced to address data quality by stricter regulations such as Sarbanes-Oxley, which demands that the data collected be relevant.

"But there isn't a silver bullet," said Friedman. "This isn't an IT problem, it's a business problem." Rolling out new technology alone won't fix things; instead, companies must change their cultures, and get the business side involved in data acquisition.

"If you're only throwing technology at the problem, at best you'll only get short-term, lukewarm benefits," he warned.

"People, processes, and technology have to come together on this."

This story courtesy of TechWeb.
