Regardless of how or where data is stored, AtScale’s data abstraction layer insulates business users from the complexity of existing and future data platforms. AtScale’s intelligent data virtualization makes hybrid and multi-cloud data architectures a reality while protecting your users from disruption.
Take advantage of the flexibility, scalability, and cost savings of a schema-on-read data lake, whether you’re connecting to cloud object storage like S3 or to HDFS on-premises or in the cloud.
Connect AtScale to traditional data platforms like Microsoft SQL Server and Teradata, and new cloud data warehouses such as Amazon Redshift, Google BigQuery and Snowflake.
Keep your options open by insulating your downstream business users from the disruption of a platform migration. Lift-and-shift to new data platforms without business users ever noticing.
Leverage your tools’ native drivers to connect your BI and ML tools to your data platforms. No client-side software is needed to get live connections to your data today, whether that data resides on-premises or in the cloud.
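As a rough sketch of what “native drivers, no client-side software” means in practice: because the virtualization layer speaks standard SQL protocols, a client tool connects with an ordinary DSN-less ODBC/JDBC connection string. The host, port, database name, and driver name below are illustrative assumptions, not actual AtScale endpoints.

```python
# Illustrative only: host, port, and database names are hypothetical.
# The point is that a standard connection string is all a client tool
# needs -- no vendor-specific client-side software.

def odbc_connection_string(host: str, port: int, database: str,
                           driver: str = "ODBC Driver 18 for SQL Server") -> str:
    """Build a DSN-less ODBC connection string for a live connection."""
    return (
        f"DRIVER={{{driver}}};"
        f"SERVER={host},{port};"
        f"DATABASE={database};"
        "Encrypt=yes;"
    )

# The same call works whether the underlying data sits on-premises or
# in the cloud; only the host changes.
conn_str = odbc_connection_string("atscale.example.com", 11111, "sales_cube")
print(conn_str)
# A BI tool (or pyodbc.connect(conn_str)) would then issue ordinary SQL.
```

A tool like Tableau or Excel would consume an equivalent connection through its own built-in driver dialog, which is why no extra software needs to be installed on the client side.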
A Fortune 50 retailer was using Microsoft SSAS cubes on Teradata, a setup that could not scale in either cost or performance. Analysts could view just one month of data at a time when they wanted to review 2.5-year ranges. Analysts were also unable to benchmark stores in the same region or benchmark different regions against each other. Compounding the problem further, cube rebuilds took four days, and analysts could not use the data during this manual process.
First, the retailer implemented Hadoop to replace Teradata and AtScale’s Adaptive Analytics Fabric to replace Microsoft SSAS. However, partway through the Hadoop migration, the retailer decided to move to the cloud rather than sink time and money into managing a large Hadoop cluster. By repointing its AtScale virtual cubes, the retailer switched seamlessly from Hadoop to Google BigQuery in a single weekend without disrupting business users.
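Conceptually, “repointing” a virtual cube means swapping only the physical connection underneath while the semantic model that business users see stays identical. The sketch below is a hypothetical illustration of that idea; the field names and values are assumptions for this example, not AtScale’s actual model format.

```python
# Hypothetical illustration of repointing a virtual model: the semantic
# layer (what business users query) is unchanged; only the physical
# connection underneath is swapped.

virtual_cube = {
    "name": "retail_sales",
    "measures": ["net_sales", "units_sold"],
    "dimensions": ["store", "region", "date"],
    "connection": {  # before: on-premises Hadoop
        "platform": "hadoop_hive",
        "host": "hive.internal.example.com",  # hypothetical host
    },
}

def repoint(cube: dict, new_connection: dict) -> dict:
    """Swap the physical connection; leave the semantic model intact."""
    return {**cube, "connection": new_connection}

migrated = repoint(virtual_cube, {
    "platform": "bigquery",
    "project": "retail-analytics-example",  # hypothetical project id
})

# Users' queries reference measures and dimensions, which the
# migration does not touch.
assert migrated["measures"] == virtual_cube["measures"]
assert migrated["dimensions"] == virtual_cube["dimensions"]
assert migrated["connection"]["platform"] == "bigquery"
```

Because downstream tools only ever address the measures and dimensions, the platform swap is invisible to them, which is why the cutover could fit in a single weekend.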
Now that they are using A3, the retailer’s analysts can view data across their full preferred range of 2.5 years, enabling deeper insights on key business questions. Analysts can now benchmark stores and regions against one another. Data refreshes in 20 minutes, meaning analysts work with current data rather than information that is almost a week old.
The retailer uses AtScale to deliver over 17,000 queries per day to 4,000 business users, each returning in under one second.
White Paper: The Rise of Intelligent Data Virtualization
Learn how the prevalence of cloud transformation as a critical enterprise initiative has led to the emergence of intelligent data virtualization as a new paradigm in enterprise data architecture.
Checklist: The Ultimate Data Virtualization Buyer’s Checklist
This checklist helps you determine the capabilities you need from a data virtualization vendor.