The defining qualities of big data – volume, variety and velocity – make it a seemingly natural fit for cloud storage. Businesses investing in analytics must manage an ever-growing amount of data that needs to be accessed in real time. Amazon CTO Werner Vogels recently suggested public cloud services would be the next step for big data, highlighting the benefits of highly scalable infrastructure.
Despite Vogels' optimism, InfoWorld's David Linthicum said the Amazon CTO is only half correct about the public cloud's ability to handle big data. It has one aspect of the technology covered: volume. However, the information held in corporate IT environments is also valuable and may become an easy target if the infrastructure housing it is not well guarded. Linthicum warned that public services are not yet ready to meet the needs of real-time big data analytics initiatives.
"[W]hile I think security and compliance are typically solvable problems in public cloud computing, they are easier to work with if the data is local," Linthicum wrote. "Moreover, performance is better with local data because you're not dealing with the latency of sending requests and returning data sets over the open Internet."
That does not mean businesses shouldn't consider cloud services for big data. Private cloud infrastructure solutions can deliver some scalability while meeting more stringent security demands, and Linthicum predicted that the public cloud may be better able to handle latency issues within the next five years.
Wikibon's Jeff Kelly recently argued that public cloud services may already be able to handle big data. He pointed out that much of the data companies want to analyze starts off in the cloud. As a result, it makes sense to use the technology for initiatives such as social media analytics. In addition, cloud providers have focused on features that make it easier to integrate information from disparate data sets, potentially reducing the cost of forming a comprehensive view of enterprise information.