The leadership team of a Fortune 50 DIY retailer tasked their IT department with moving data from an on-premises enterprise data warehouse to a cloud enterprise data warehouse. The retailer needed an economical rollout of broad analytics use cases, with a focus on advanced analytics at a scalable price and on improving their BI tools' performance when querying a more modern data store.
Ultimately, the retailer selected Google BigQuery. Its IT department then needed to determine how to move data from Teradata to BigQuery without disrupting business users. Additionally, the retailer had to connect multiple front-end business intelligence tools to BigQuery simultaneously, while maintaining high levels of performance and data governance across each BI tool.
The retailer's IT department had already connected its different BI tools to the on-premises warehouse using AtScale's Adaptive Analytics Fabric. Once the data had been moved to the cloud, the team was able to re-point AtScale Adaptive Analytics from the on-premises data warehouse to the new cloud data warehouse within a single weekend. Not one BI user was impacted when performing reporting tasks on the following Monday morning: business analysts ran the exact same reports they had run on Friday without realizing that the data store they were querying against had changed.
AtScale's ability to move customers' data to the cloud without business interruption enabled this retailer to save the millions of dollars it had been spending on legacy data warehouses. Once the data was in Google BigQuery, AtScale's accelerated structures provided further cost savings, as queries no longer had to be executed against raw fact tables.
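The cost advantage of pre-aggregated structures can be illustrated with a generic sketch (using SQLite for convenience; this is not AtScale's actual implementation): a summary query is answered from a small aggregate table instead of scanning every row of the raw fact table.

```python
import sqlite3

# Generic illustration of aggregate acceleration (not AtScale's
# implementation). A summary query is served from a small
# pre-aggregated table rather than the raw fact table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Raw fact table: one row per sale.
cur.execute("CREATE TABLE sales_fact (store TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO sales_fact VALUES (?, ?)",
    [("A", 10.0), ("A", 20.0), ("B", 5.0), ("B", 15.0)],
)

# Aggregate table maintained ahead of time: one row per store.
cur.execute(
    "CREATE TABLE sales_agg AS "
    "SELECT store, SUM(amount) AS total FROM sales_fact GROUP BY store"
)

# A BI query asking for totals by store can now be answered from the
# aggregate, reading 2 rows instead of 4; on a warehouse that bills
# per bytes scanned, that difference is where the savings come from.
rows = cur.execute("SELECT store, total FROM sales_agg ORDER BY store").fetchall()
print(rows)  # [('A', 30.0), ('B', 20.0)]
```

With a toy table the savings are trivial, but the ratio between aggregate size and fact-table size grows with data volume, which is why avoiding raw fact-table scans matters on a pay-per-scan warehouse.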
AtScale also greatly improved the performance of the organization's various BI tools. Business analysts now receive insights in seconds, whereas prior to using AtScale it could take hours or even days to view the answer to a query. AtScale delivers 17,000 sub-second queries per day to a constituency of 3,000 business users, regardless of whether those users are in Excel, Cognos, or Tableau.
Toyota was in the process of merging 35+ constituent North American companies into a single structure, forcing it to rethink how data warehousing and analytics would be architected across the business. As a result of the consolidation, the business tasked its IT department with creating a single, performant semantic layer for its business analyst teams.
The primary goal of this initiative was to accelerate the time to insight on queries written by the automaker's finance and IoT analysts. Prior to the project's implementation, analysts would need to wait weeks to see the results of a query, a delay that greatly hindered their ability to provide actionable insights on key business questions. Toyota's backend infrastructure was partly to blame for the slow query response times, as data was siloed across thousands of individual data marts.
Toyota’s IT department migrated data from a variety of legacy tools to a Cloudera-based data lake. Once data was in the data lake, a solution was needed to ensure that Toyota’s primary BI tools—Tableau and PowerBI—did not frequently consume all available Cloudera resources due to simultaneous requests for full table scans.
Toyota leveraged AtScale’s Adaptive Analytics Fabric to write one multi-dimensional model of their data, and to connect that model with their Cloudera data lake on the back end and with Tableau and PowerBI on the front end.
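The value of a single shared model can be sketched generically (hypothetical names throughout; this is not AtScale's modeling language): every front-end tool resolves logical measure and dimension names through the same definitions, so a metric means the same thing in every tool.

```python
# Generic sketch of a shared semantic model (hypothetical names; not
# AtScale's modeling language). Each BI tool resolves logical
# measure/dimension names through the same definitions, so "revenue"
# is computed identically no matter which tool issues the query.
MODEL = {
    "table": "sales_fact",
    "dimensions": {"region": "region"},
    "measures": {"revenue": "SUM(amount)"},
}

def to_sql(measure: str, dimension: str, model: dict = MODEL) -> str:
    """Translate a logical (measure, dimension) request into SQL
    against the backing data lake table."""
    return (
        f"SELECT {model['dimensions'][dimension]}, "
        f"{model['measures'][measure]} AS {measure} "
        f"FROM {model['table']} GROUP BY {model['dimensions'][dimension]}"
    )

# Tableau and PowerBI would each issue the same logical request and
# receive the same generated SQL, keeping results consistent.
sql = to_sql("revenue", "region")
print(sql)
```

The design point is that the model is defined once, centrally, rather than re-implemented per tool, which is what prevents the same metric from drifting apart across front ends.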
With AtScale acting as a single semantic layer between Toyota's data lake and its BI tools, business analysts receive insight from their queries within seconds, whereas previously response times could exceed four weeks. Additionally, AtScale has allowed Toyota's Tableau and PowerBI users to utilize their preferred tools in a self-service, ad hoc fashion.
The enhancements AtScale has brought to Toyota's operational analytics have enabled its dealers to be more competitive, as they can now make decisions based on a dynamic analytics environment.
Processing close to 25% of all US credit card transactions, this global financial services leader has more than 120 million customers worldwide. To better anticipate and identify fraudulent activity, the company sought to analyze credit card activity in real time, but its traditional data architecture could no longer keep up with such hyper-speed analysis demands.
Historically, to analyze activity for risk and fraud, analytics professionals at this financial services company had to move anonymized credit card activity data into multi-dimensional cubes on a regular basis.
This process entailed moving raw card data into Hadoop, processing it via ETL into a data warehouse, structuring the data in SQL Server, and updating a cube via Microsoft Analysis Services. This multi-step process introduced recurring four-day cube rebuilds and obstructed analysts' ability to identify and get ahead of fraudulent trends. The process was cumbersome and inefficient; on top of that, the organization's data warehouse provider charged $40 million for an overage, further increasing the incentive to modernize its operational analytics processes.
To proactively decipher fraudulent activity, analysts require the ability to review transaction-level card data as it is collected, meaning all of the data had to be in the same place at the same time.
AtScale Adaptive Analytics was implemented to meet this need, enabling analysts to access and query all the data in Hadoop with sub-second response times, regardless of which front-end BI tool they are using.
AtScale has allowed this global financial services company to analyze multiple years of credit card activity simultaneously, whereas previously it could review data only one fiscal quarter at a time. When data related to fraudulent or risky activity is identified, AtScale enables business users to immediately drill down to credit card transaction details. Prior to installing A3, drilling to transaction-level data required business users to file an IT ticket and then wait hours, or even days, for the ticket to be resolved.
AtScale has removed the need to move data three times, and to wait four days for a cube rebuild, in order to access the data analysts require. As a result, analysts can pinpoint fraudulent transactions almost immediately after they occur, and they can use that information to predict future fraud. Eliminating data movement has also facilitated better tracking of data lineage, a key regulatory requirement. Rather than risking data interference through ETL and data movement, the company can confidently accommodate regulators by quickly identifying the when, where, and who behind transaction data.
With real-time, complete, governed, self-service access to all credit card data in Hadoop, this global financial services company is better able to track and update fraud protection algorithms, to deliver consistent protection and services to customers worldwide.
This leading digital eCommerce company provides a fast and reliable online experience through which customers can receive digital cashback offers from retailers across the U.S. Working with 3,000+ merchants, they needed to deliver daily transaction and event tracking reports across 5TB of usage data (growing to 30TB in 2017) and nearly $4M in cashback transactions per year. Retail merchant decision-makers depend on these reports to adjust offers in line with daily fluctuating market demand.
Report creation entailed moving all online-activity data, offer-redemption data, and other data into Hadoop. Next came ETL (extract, transform, load) processes that turned unstructured data into structured data within SQL Server. Finally, the data landed in three separate SQL Server Analysis Services (OLAP) cubes (marketing acquisition, shopping, and merchant); there was too much data to fit into a single cube.
Data grew at such a pace and volume that delivering the latest day's data via data and cube refreshes began to take 24+ hours. Not only did 24+ hours fall outside the reporting SLAs for retail clients, but the insights revealed were in some cases no longer relevant; consumer desires and demands fluctuate quickly in the online offer market. To keep pace with consumer behavior and merchants' insight needs, this eCommerce leader recognized that eliminating multiple data writes and data movement was key to faster analytics and reporting insights.
With AtScale, they capitalized on their Hadoop and BI investments, both the tools and the skills. Their analytics professionals can now execute Tableau and Excel queries without requiring IT to move data out of Hadoop and into a relational database or cube. AtScale's Universal Semantic Layer™ adapts aggregates in response to user queries, so Tableau queries and reports respond at the interactive pace that analysts require. For the first time, analysts can analyze current and historical usage data (not just the data subsets pulled into cubes), and they can do it in less time. They are now finding trends, outliers, and opportunities they never found before.
By driving faster analytics on data that does not have to be moved out of Hadoop, this digital cashback leader supports more offers that better meet end customers' cashback demands. They deliver better service to retail clients and an end-customer experience that truly differentiates them from the competition.