July 26, 2019
7 Factors to Consider for Successful Cloud Transformation
Last week, my good friend David Menninger and I discussed the state of Big Data analytics and its future in our webinar, “How Distributed, Diverse Data has Transformed Big Data Analytics.” As the Research Director at Ventana, David highlighted some important industry and market trends in Big Data, and I was able to corroborate many of those trends with my own experience with AtScale customer use cases. We had a great discussion on a number of topics, but I’ve highlighted a few key ones in this post.
How the Cloud Makes Big Data Easier
Ventana research shows that 86% of the organizations surveyed expect the majority of their data to be in the cloud and a whopping 99% expect to do their analytics in the cloud. One compelling reason enterprises are moving to the cloud en masse is to simplify their analytics stack. In the cloud, customers are freed from the burden of managing their own data platform clusters and can expand and contract their data resources to meet demand without the friction of dealing with traditional data center provisioning.
See how a major retailer and a Fortune 500 food processor migrated to the cloud to simplify their analytics stack (8 minute video):
What to Watch Out for When Migrating to the Cloud
Running Big Data in the cloud is not without its pitfalls. Migrating analytics workloads often means migrating data models that may not translate well to cloud-based data warehouses. I hear from customers all the time that they assumed the cloud platforms would be faster than their current on-premises technologies. Often, that is not the case unless they re-architect their data models and schemas to take advantage of the cloud data platforms’ characteristics. Finally, migrating to the cloud is rarely an all-or-nothing endeavor. It may take years to migrate applications and data to the cloud, which means data resides in multiple places, complicating the lives of business users and data scientists.
See how you can avoid the cloud migration gotchas and hear how a major retailer did it with AtScale (5 minute video):
The New Analytics Stack in the Cloud
It’s not a homogeneous data world anymore (was it ever?). Hadoop was the beginning of the data lake concept and challenged the dominance of the highly structured, traditional data warehouse. Now, each of the public cloud vendors has its own version of a data lake. Their respective distributed storage systems (S3 for AWS, ADLS for Azure and Google Cloud Storage for Google Cloud) have become the landing zone for data in the cloud. Meanwhile, the data warehouse has been reborn in the cloud: offerings like Snowflake, Redshift and BigQuery provide the comfort of the traditional data warehouse but with the elasticity and operational ease of the cloud.
All this means that there is no single repository for data in the cloud. Yet customers strive for semantic consistency, security and governance, and they want to manage all of it centrally. To achieve this goal, a virtualization layer for analytics has emerged to complete the new cloud analytics stack.
Learn about the new cloud analytics stack and see how AtScale provides a single control plane for your analytics no matter where data is stored (10 minute video):
It’s a brave new world with the emergence of the public cloud for analytics. This brings us new opportunities along with the potential for new pitfalls. Ventana and AtScale both agree that the public cloud is a game changer for analytics, and we encourage you to challenge the status quo and consider a modern analytics stack that takes full advantage of the cloud’s capabilities.
Click here to watch the on-demand webinar AtScale recently co-presented with Ventana research, “How Distributed, Dynamic and Diverse Data Has Transformed Big Data Analytics”.