November 4, 2020

The Three Opportunities for Cloud OLAP to Improve Enterprise Analytics
The data and analytics space continues to evolve at an incredibly rapid pace as enterprises use data from across the organization to report on and predict the future. As enterprises recognize new ways to drive revenue, reduce costs and/or mitigate risks, new products are being developed to realize these desired business outcomes.
Our team is in the fortunate position of working with the Global 2000, learning extensively about their businesses and understanding how technology can unlock the value of their data across the organization. By working with these complex customers and partnering with the leading technology providers in the Big Data space, we’ve identified three keys that successful enterprises use to build a business case for purchasing and implementing enterprise data and analytics technology.
1. Understand the “Why”
There is a bevy of innovative technology solutions on the market, each targeted at solving specific challenges. Every day we read about the latest, greatest data visualization product, data science application or cloud data platform. Typically, these articles describe how the technology and its features are so much better than what currently exists. Unfortunately, a list of features doesn’t help us understand “why” an enterprise needs to purchase the solution.
At AtScale, we have the pleasure of talking to many business leaders from companies around the world who are facing very complex data and analytics challenges. Throughout our initial conversations, we learn a lot about the role data and analytics plays in their businesses. These conversations usually uncover technical pains but struggle to identify the “why” of the business.
The person who will ultimately sign off on purchasing new technology is typically the one who manages a business unit’s profit and loss statement. This individual is commonly referred to as the “economic buyer”. The economic buyer has the ultimate authority for their organization’s use of funds. Understanding the “why” is about aligning the technology solution that solves your problem to the economic buyer’s top priorities.
As an example, when speaking with technical data engineering teams, we hear a lot about the need to analyze data with multidimensional / Cloud OLAP analysis no matter where the data is stored or what form it takes. During our conversations with those data engineering teams’ internal customers in Finance, Sales, Inventory, Supply Chain, Marketing, and Human Resources, we hear about two common needs:
- Reducing the reliance on data engineering teams when analyzing data.
- Enabling business users to work with their preferred business intelligence (BI) tools.
These challenges are important and reflect what companies should want to achieve, but they overlook the overarching positive business impact: the “why”.
The economic buyer within an enterprise who will ultimately sign off on a technology purchase wants to understand its impact on the business in at least one of three ways. Therefore, the “why” needs to make the case of how the technology will drive revenue, lower costs or reduce risk for the enterprise.
While it’s great to work towards an ideal “after” scenario, it’s important to quantify the impact of a technology solution on the business. We tend to see a few common business outcomes that economic buyers prioritize, including:
- Reducing the cost of labor to deliver Cloud OLAP analytics.
- Enabling businesses to sell a reporting analytics platform, creating an additional revenue stream for the organization.
- Reducing time-to-market risk by orders of magnitude when it comes to making data-driven decisions.
As an example, one of our customers had a multi-million dollar labor budget and was looking for a solution like AtScale’s Intelligent Data Virtualization to deliver Cloud OLAP analytics on Snowflake. They had already gone to their Board of Directors to allocate funds for a new technology stack, and they hoped to secure the funding by reallocating money from their labor budget if AtScale could reduce the need to hire additional data engineers. Working with the prospect’s team, we helped them make the business case for “why” AtScale: by using AtScale to model their data for analysis rather than creating views of the data, the prospect saved roughly $1 million per year in labor costs. With the hiring market as competitive as ever, they appreciated a more cost-efficient approach to achieving their analytics goals.
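The labor-savings side of a business case like this is simple arithmetic. As a back-of-the-envelope sketch with hypothetical numbers (the headcount and fully loaded cost below are illustrative, not the customer’s actual figures), the calculation might look like this:

```python
# Hypothetical business-case inputs -- illustrative only, not actual customer data.
LOADED_ANNUAL_COST = 200_000   # fully loaded cost per data engineer, USD/year (assumed)
ENGINEERS_AVOIDED = 5          # additional hires no longer needed (assumed)

def annual_labor_savings(engineers_avoided: int, loaded_cost: float) -> float:
    """Annual labor savings from headcount the enterprise no longer needs to add."""
    return engineers_avoided * loaded_cost

savings = annual_labor_savings(ENGINEERS_AVOIDED, LOADED_ANNUAL_COST)
print(f"Estimated annual labor savings: ${savings:,.0f}")  # prints $1,000,000
```

Plugging in the economic buyer’s own numbers for headcount and loaded cost turns a feature discussion into a dollar figure the “why” conversation can anchor on.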
2. Identify the “How”
The “how” tends to be the most fun part of putting together a business case. This is where evaluators, technologists and practitioners get to put their hands on shiny new technology. With the “why” of the business case laid out in terms of revenue creation, cost reduction and risk reduction, the “how” explains the specific technology features that will unlock the ability to achieve the desired business outcome.
Most of the folks we talk to have an idea of the “how” well before the “why”, and that’s okay. Understanding your challenge and the feature set that can alleviate it likely coincides with a larger pain within the business. The hardest part of putting together a business case is adding additional perspectives to the conversation to ensure a holistic enterprise solution. We often work with teams to help them include voices from technical data engineering teams, business stakeholders, and executive teams to uncover the real pain that exists within the organization and the impact that alleviating it will have on the business.
For teams that are trying to map their technical pain to the pain of the business, it’s helpful to talk with stakeholders who are most affected by the outcome of this solution. Within those conversations it’s important to understand:
- How long have they been dealing with the pain?
- What happens if the pain isn’t dealt with?
- Who else is feeling the impact of it?
- What is the ultimate cost of the pain to the business in terms of time, money and missed opportunities?
As an example, one of our customers told us that it took the company too much time to report on their data because it was stored in multiple data sources and required a lot of manual effort to prepare for analysis. As we discussed the challenge with our customer’s technical team, we learned that they were looking for a solution to virtualize their data and eliminate many of the data preparation steps. Upon further discussion, we learned that the team that sparked the interest in solving this data virtualization challenge was in the Procurement Office, which rolled up to the Chief Financial Officer (CFO). Our customer’s technical team was proactive in solving this technical issue to alleviate the extra work their colleagues in Procurement were placing on them.
While we could have simply worked with the technical team to solve their problem, we instead helped them better understand the perspective of the Procurement Office. By including the perspectives of the Procurement team and the CFO, we were able to uncover how the inability to quickly report on financial data impacted the CFO’s ability to accurately report expenditures to the company’s board, which ultimately affected the macro decisions of the business.
This one additional conversation aligned both teams on which technology would be important to solving their pains. We ultimately identified which capabilities of a solution mattered most, ensuring that the solution didn’t just solve a data virtualization challenge but enabled the business to reduce enterprise risk by making more accurate, board-level business decisions.
3. Determine How to Measure Ongoing Success
A common mistake among teams purchasing software is to focus solely on the first two keys of building a successful business case. The pain is well identified among stakeholders across technical, business and executive teams. Business outcomes are quantified in terms of how the solution will drive revenue, reduce costs or reduce risks. Solution requirements are prioritized and tested within the team to ensure success within the enterprise’s architecture. All of this preparation and diligence is important before purchasing a solution, but great investments also require ongoing metrics that can measure the impact the solution makes across the enterprise.
At AtScale, we work with our customers’ teams to understand what success would look like over various periods of time. Depending on the enterprise and their goals, we’ll identify checkpoints every three to six months over the course of two to three years. At the outset of our partnership with a customer, we will have identified which metrics are most relevant to the enterprise’s goals and attempt to quantify what success will look like over these periods of time.
As an example, we work with many enterprises that perform analytics on cloud data warehouses like Google BigQuery, Snowflake, Amazon Redshift, Databricks and Azure. Chief Data Officers (CDOs) have made significant investments in modernizing their analytics architecture in the cloud. However, when they enable their teams to do multidimensional analysis against these cloud data warehouses, the compute costs quickly exceed the expected budget, and the CDO starts restricting how many people can perform analysis.
We work with the CDO, the data engineering teams who manage the cloud architecture and the lines of business who actually perform the multidimensional analysis to identify what success looks like. Typically, we identify the expected budget spend (credits, slots, etc.) and how many business analysts need to be able to run analytics. From there we quantify the delta between how many users can perform analysis today within the expected analytics budget and the total number of analysts the enterprise would like to have performing analysis. We then set a metric such as the percentage of additional analysts who will be able to do analysis while keeping cloud spend constant.
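That metric, the percentage of additional analysts enabled at constant cloud spend, is straightforward to compute at each checkpoint. A minimal sketch with hypothetical numbers (the analyst counts below are illustrative, not from any specific customer):

```python
def added_analyst_pct(current_analysts: int, target_analysts: int) -> float:
    """Percentage of additional analysts enabled while cloud spend stays constant.

    current_analysts: analysts who fit within today's analytics budget
    target_analysts:  analysts the business would like performing analysis
    """
    if current_analysts <= 0:
        raise ValueError("current_analysts must be positive")
    return (target_analysts - current_analysts) / current_analysts * 100

# Hypothetical checkpoint: 40 analysts fit in today's budget, 100 desired.
print(f"{added_analyst_pct(40, 100):.0f}% more analysts within the same budget")
```

Tracking this figure at each three-to-six-month checkpoint gives the CDO a single number that ties cloud spend directly to how many people the platform serves.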
In this specific example, we’ve been able to celebrate a lot of success with teams who purchase AtScale. We’ve seen orders-of-magnitude increases in the number of users running analysis without additional spend on cloud compute. The executive team is elated because they get the value of the cloud without the excessive spend; data engineering teams are happy because they can serve more of their internal customers without having to teach analysts how to conserve compute; and analysts are happy because the accelerated time to insight lets them do their jobs more efficiently.
Leveraging the Three Keys
Executives, technical teams and business stakeholders are always looking for ways to improve their ability to perform their jobs, and technology solutions offer incredible opportunities to realize the goals of the enterprise. When evaluating and purchasing technology, it’s important to apply the three keys as you outline the impact it will have across the enterprise. Understanding the “why”, identifying the “how”, and determining how to measure ongoing success will not only make the difference in purchasing the solution of choice but also set the enterprise up for success.