Cloud Security Risks Lurk In Big Data Projects

It is fairly common for businesses to use Amazon Elastic Compute Cloud, Microsoft Windows Azure or myriad other cloud infrastructure providers for big data analytics projects, which require substantial computing power to conduct large-scale data analysis. While these services help capture, manage and analyze terabytes or more of structured and unstructured data, they introduce the potential for data loss, account or service hijacking, or abuse if systems aren't protected and overseen with due diligence, said David Barton, principal and practice leader of the technology assurance group at Atlanta-based UHY Advisors, a business consulting firm.

Speaking at the MIS Training Institute's Conference on Big Data Security, Barton said IT teams are often bypassed by business units that can use a credit card to rent cloud infrastructure in minutes. Executives want to get the most value out of the data, and they want to do it quickly, he said.

[Related: Protecting Data In The Cloud: 10 Top Security Measures]

"The primary driver of most big data projects is not going to be security, it's going to be about sales," Barton said. "Rapid analysis and deployment is the whole point of using the cloud, and data security and privacy slow things down. Rather than alert security, they will go around you, under you or over you to get it done."

The Cloud Security Alliance and other organizations have outlined the potential risks of cloud computing. Experts told CRN that the abundance of improper cloud deployments offers potential business opportunities to solution providers in the channel. Infrastructure-as-a-service offerings are typically the cheapest way to rent computing power, but they also carry the most risk and responsibility, Barton said. Unless an organization opts to lease a private cloud, the infrastructure in a public cloud environment is typically shared among different users; the location of the data is often uncertain and open to an increased risk of exposure, Barton said.

"You've got the problem of the virtual nature of storage coupled with no perimeter to secure," Barton said. "Up until now, all of our security for most part has been perimeter security, but businesses today are connected in ways that they probably don't know."

Systems can also be open to shared technology vulnerabilities, making them ripe for attack by cybercriminals using automated tools. Denial-of-service attacks can result in cloud outages, making systems inaccessible for extended periods of time, he said.

The tools associated with big data are also typically not very secure, Barton said. For example, Hadoop, the framework commonly used to run deep analytics on large volumes of data, lacks built-in security features and often defaults to "all access."

Barton advocated for users of Hadoop to implement Kerberos, a network authentication protocol, to protect access to the information. The protocol is supported in Hadoop, but not used as often as it should be, he said.
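As a rough illustration of what that looks like from the client side, the sketch below assumes a Hadoop cluster whose core-site.xml has already been configured for Kerberos and a keytab issued for an analytics service account; the property names are Hadoop's standard security settings, while the principal name and keytab path are placeholders.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberizedHdfsClient {
    public static void main(String[] args) throws Exception {
        // Tell the Hadoop client to use Kerberos instead of the default
        // "simple" (trust-the-supplied-username) authentication.
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        conf.set("hadoop.security.authorization", "true");

        // Authenticate as the analytics service principal using its keytab.
        // The principal and keytab path below are placeholders.
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(
                "analytics@EXAMPLE.COM", "/etc/security/keytabs/analytics.keytab");

        // Any HDFS access from this point on carries the Kerberos identity.
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Authenticated; home dir: " + fs.getHomeDirectory());
        fs.close();
    }
}
```

The client-side call is only half the story: the cluster itself must distribute keytabs to every node and enable the same settings, or the login above simply has nothing to authenticate against.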

File encryption and properly implemented key management should also be employed to protect data, Barton said. Servers need to be validated to ensure each node comes online with the baseline security measures in place. "Before bringing on a new node, make sure it is preset with all the security that the prior nodes have," Barton said.
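One common pattern that pairs file encryption with key management is envelope encryption: each file is encrypted with a fresh data key, and only the wrapped (encrypted) data key is stored alongside the file, while the master key stays inside a key management system. The sketch below is a minimal illustration using the standard Java crypto APIs; the locally generated masterKey stands in for a key that would in practice live in an HSM or key management service, and the file names are placeholders.

```java
import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class EnvelopeEncryptDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in for a master key held by a key management service or HSM.
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey masterKey = kg.generateKey();

        // Fresh data key per file: losing one file's key does not expose the rest.
        SecretKey dataKey = kg.generateKey();

        // Encrypt the file contents with the data key; AES-GCM provides
        // confidentiality plus an integrity check.
        byte[] plaintext = Files.readAllBytes(Paths.get("dataset.csv"));
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);
        Cipher fileCipher = Cipher.getInstance("AES/GCM/NoPadding");
        fileCipher.init(Cipher.ENCRYPT_MODE, dataKey, new GCMParameterSpec(128, iv));
        byte[] ciphertext = fileCipher.doFinal(plaintext);

        // Wrap the data key with the master key; only the wrapped key and IV
        // are stored next to the ciphertext, never the master key itself.
        Cipher wrapCipher = Cipher.getInstance("AESWrap");
        wrapCipher.init(Cipher.WRAP_MODE, masterKey);
        byte[] wrappedDataKey = wrapCipher.wrap(dataKey);

        Files.write(Paths.get("dataset.csv.enc"), ciphertext);
        Files.write(Paths.get("dataset.csv.key"), wrappedDataKey);
        Files.write(Paths.get("dataset.csv.iv"), iv);
    }
}
```

The same separation applies to Barton's node-validation point: a new node should come online with the wrapping key reachable only through the key service, not baked into its image.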

Log management and active log review help detect attacks, diagnose failures and assist IT teams when investigating an issue. Secure transport protocols such as SSL and TLS should also be supported and used when transmitting sensitive information, he said.
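As a small illustration of the transport-side point, the snippet below (the ingest URL is a placeholder) sends a record only over an HTTPS connection; because the connection object is typed as HttpsURLConnection, a plain-HTTP endpoint or a failed certificate check surfaces as an error instead of data quietly leaving over an untrusted channel.

```java
import java.io.OutputStream;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import javax.net.ssl.HttpsURLConnection;

public class TlsUploadDemo {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint; the https:// scheme plus the HttpsURLConnection
        // cast is what rules out an unencrypted connection.
        URL endpoint = new URL("https://analytics.example.com/ingest");
        HttpsURLConnection conn = (HttpsURLConnection) endpoint.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/json");

        byte[] payload = "{\"record\":\"sensitive value\"}".getBytes(StandardCharsets.UTF_8);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(payload);
        }

        // Certificate and hostname verification happen during the TLS handshake;
        // a failure is thrown as an exception before any payload is sent.
        System.out.println("Server responded: " + conn.getResponseCode());
        conn.disconnect();
    }
}
```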

As an auditor, Barton said he also looks for documentation that controls are reviewed regularly and policies are in place. Security policies must be communicated effectively and have enforcement mechanisms, he said.

"Big data and cloud go hand-in-hand so cloud risk equals big data risk," Barton said. "The key question is: Are the risks that your organization is undertaking acceptable?"

PUBLISHED JULY 17, 2013