
How to Build Business Forecasts Using Microsoft Excel Without Data Limits or Extracts

Build robust models on billions of data points


In uncertain times, a forecast can underpin a company’s success or failure. Whether by keeping prices low through optimized business operations (cash flow, production, staffing, and financial management) or by deepening knowledge of the market, business forecasting gives a business an essential tool for adapting to change and fostering competitive advantage.

But relevant forecasting isn’t easy. It requires the right set of data: data that’s in a usable format, accessible in existing tools, and digestible in a reasonable amount of time. With the right data set and the right analysis tools, an organization can not only conduct initial analysis that moves it quickly from confusion to clarity, but also drill down, asking second and third questions for deeper insight, greater refinement, and accelerated business value.


Business forecasting provides the opportunity to anticipate how the business can capitalize on or circumvent economic trends. Executives and business leaders work with analysts to answer specific business questions. Analysts then collect and analyze historical data relevant to those questions in order to identify patterns that can inform decisions on marketing operations, financial operations, demand planning, supply chain, production, and more. Typically, enterprises forecast across different time horizons (daily, monthly, quarterly, yearly) depending on the business use case. At a minimum, businesses develop annual forecasts, which smooth out sudden economic changes that affect the business in the short term but have little long-term impact, and which draw on data from across the enterprise.

Anyone who conducts business forecasts knows they’re challenging. Most of these challenges are tied to the size of the data set being analyzed. Data sets are growing exponentially and often consist of billions of data points. That size, and that growth, puts pressure on existing forecasting tools. Many organizations find they’re losing the ability to conduct full-fledged analysis quickly enough to support business decision-making; the tools they use aren’t agile or scalable enough to handle emerging data volumes and complexities.

Let’s consider an example: Microsoft Excel™, which many organizations use for business forecasting. Excel is a powerful tool for analysis, but it has a fundamental obstacle: it’s limited to 1,048,576 rows per worksheet. Traditionally, organizations address this limit with data extracts, pulling a subset or sample of data that fits within it. Though this is a relatively quick way to move forward, using extracts for forecasting comes with its own set of problems:

  1. Getting an extract isn’t automatic; it’s a manual process, and its complexities and delays can impede time to insight.
  2. Selecting an extract is an art rather than a science. The sample isn’t random, so forecasts can be skewed or incomplete.
  3. Different extracts can result in KPI inconsistencies from analyst to analyst and over time.
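The second problem is easy to see in miniature. The sketch below uses synthetic data (a hypothetical trending sales series, not any real customer’s) to show how a convenience extract taken from the front of the series misstates the level a forecast would be built on:

```python
import random
import statistics

random.seed(42)
# Synthetic daily sales with an upward trend (hypothetical data).
sales = [100 + 0.5 * day + random.gauss(0, 5) for day in range(10_000)]

# A "convenience extract": the first 1,000 rows, chosen because they fit the tool.
extract = sales[:1_000]

full_mean = statistics.mean(sales)       # roughly 2,600 for the full trending series
extract_mean = statistics.mean(extract)  # roughly 350, since the extract only sees early, low values

# The non-random extract systematically understates the level of the series.
print(f"full: {full_mean:.0f}  extract: {extract_mean:.0f}")
```

Any statistic computed from such an extract (a mean, a seasonal factor, a trend slope) inherits the bias, which is why extract selection being an “art” is a real forecasting risk.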

Excel has other problems. Refreshing data as you go isn’t easy, because Excel often isn’t responsive enough to answer questions as quickly as they’re asked. It’s also hard to do deep analysis in the short time frames most businesses demand. It’s not unusual for a forecast to take days of work instead of the desired minutes.


Excel does offer a way to move past these challenges. By using Excel’s MDX Connector and SQL with an OLAP tool that supports data virtualization, organizations can mitigate the weaknesses of Excel, reducing complexity and accelerating time to value. The right OLAP tool offers a path past the million-row limit, the need for extracts, consistency problems, update handling, and slowdowns in data analysis.

What happens when organizations use the right OLAP tool?

  1. The need for extracts is eliminated. By connecting Excel “live” via its built-in OLAP interface, analysts can keep using Excel with full access to the data warehouse. Data access happens in minutes instead of days, and all the challenges of data extracts go away.
  2. By using Excel’s GETPIVOTDATA and CUBE functions with a live data connection, analysts have a new, robust way to not only add data sets to their analysis but also automatically update and refresh all of the data in their analyses.
  3. Finally, they end up with a “virtual cube” that creates a semantic layer capturing the business definition of the data, enforcing consistency and data governance so that one analysis doesn’t differ from another, no matter which BI tool is used.

But what’s the best tool that provides these capabilities and solves the challenges of Excel? It’s AtScale.


What is it?

AtScale powers the analysis used by the Global 2000 to make million-dollar business decisions. AtScale virtualizes data sources without data movement, empowering organizations with scalable, accelerated business intelligence and insights.

How do analysts rely on AtScale to power better business forecasts in Excel?

Using AtScale, analysts have a “live” OLAP connection to data rather than trying to use Excel as a database. AtScale offers virtual cubes (virtual because they require no data movement or pre-computation) that let an analyst talk directly to Snowflake, Google BigQuery, Azure Synapse, Amazon Redshift, or other data sources. All the summarization, slicing, and dicing of data happens server-side.
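The live-connection pattern itself is simple: the client sends an aggregate query and only the summarized result travels back, never the raw rows. A minimal sketch, using an in-memory SQLite database as a stand-in for the warehouse (in production the target would be Snowflake, BigQuery, or similar):

```python
import sqlite3

# Hypothetical stand-in for a cloud data warehouse: an in-memory table of raw fact rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EMEA", 120.0), ("EMEA", 80.0), ("APAC", 50.0), ("APAC", 150.0)],
)

# Live-connection pattern: summarization happens server-side, and the client
# receives only the small aggregated result, not the underlying rows.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 200.0), ('EMEA', 200.0)]
```

Because only the aggregate crosses the wire, the client tool never needs to hold the full fact table, which is how a spreadsheet with a row limit can still analyze billions of rows.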

Analysts also get to use Excel’s native OLAP functions to weave data into forecasts. AtScale offers full access to all the relevant data with the granularity needed to create a powerful forecast, and analysts can apply Excel’s powerful GETPIVOTDATA and CUBE functions without any limitations or added complexity.

Finally, AtScale provides a Universal Semantic Layer™ accessible to every analyst across the organization. Data analyzed in Excel isn’t a stale or partial extract, isn’t inconsistent from user to user or over time, and isn’t delayed by data movement. Everyone who creates models and forecasts works with the same data and the same definitions of that data. Whether an organization has cloud data warehouses, on-premises data warehouses, data lakes, or a mix of the three, AtScale provides live access to the data, resulting in business forecasting that’s faster and more accurate.
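The consistency argument can be sketched in a few lines. This is a hypothetical illustration of the semantic-layer idea, not AtScale’s actual API: when every client resolves a metric through one shared definition, any two tools necessarily compute the same KPI.

```python
# Hypothetical shared semantic layer: one place where "net_revenue" is defined.
SEMANTIC_LAYER = {
    "net_revenue": lambda row: row["gross"] - row["discount"],
}

data = [{"gross": 100.0, "discount": 10.0}, {"gross": 200.0, "discount": 30.0}]

def kpi(metric, rows):
    calc = SEMANTIC_LAYER[metric]  # every tool resolves the same definition
    return sum(calc(r) for r in rows)

excel_view = kpi("net_revenue", data)     # e.g. an Excel-connected session
other_bi_view = kpi("net_revenue", data)  # e.g. another BI tool
assert excel_view == other_bi_view
```

Without the shared layer, each tool (or each analyst’s extract) would encode its own version of “net revenue,” which is exactly the KPI-drift problem described above.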


All of this sounds good, but are there concrete advantages and how can you quantify them?

AtScale delivers anywhere from a 2.5x to 9x improvement in data and analytics ROI, a result of its positive impact on query performance, user concurrency, query compute costs, and SQL complexity.

Based on cloud benchmarks, AtScale has an order-of-magnitude impact on the ROI of each major cloud data warehouse. Customers powering their analysis with AtScale control the costs and complexity of their cloud analytics, maintain a consistent and compliant view of data across the enterprise, and force-multiply the effectiveness of their data and analytics teams.

With AtScale, forecasts are performed faster and delivered sooner, accelerating operational decision making from days to seconds. They’re also higher quality, built on consistent data, even across multiple users and over time. Businesses that use AtScale make smarter decisions faster, which gives them unprecedented opportunities to take advantage of evolving economic conditions.

AtScale Accelerates Business Forecasting

We wanted to move our fast-growing business to the cloud but didn’t want to lose the capabilities we had in our on-premises environment. AtScale helped us do that and was a major part of driving cloud adoption for our data consumers and data producers across our organization.


Matt Hartwig

Associate Director, Data Infrastructure Team

One of our top priorities was to have the ability to run rapid-fire, multi-dimensional analytics at large scale, directly from the BI tools our data users prefer. With AtScale, users can run live queries, straight to Google BigQuery at great speeds. It is not something that we saw anyone else able to deliver.


Maurice Lacroix

BI Product Owner

Without AtScale, analytics is too slow. We would have to devote significant data engineering time and resources to even come close to what AtScale provides automatically. This is critical to our team’s ability to be successful with production-level analytics.


Mark Stange-Tregear

VP of Analytics, Rakuten Rewards

AtScale Product Overview

AtScale provides the premier platform for data architecture modernization. AtScale connects you to live data using one set of semantics, without moving any data. Leveraging AtScale’s Autonomous Data Engineering™, query performance is improved by an order of magnitude. AtScale inherits native security and provides additional governance and security controls to enable self-service analytics with consistency, safety, and control. AtScale’s Intelligent Data Virtualization™ and intuitive data modeling enable access to new data sources and platforms without ETL or calls to data engineering.


The Global 2000 relies on AtScale – the intelligent data virtualization company – to provide a single, secured and governed workspace for distributed data. The combination of the company’s Cloud OLAP Engine, Autonomous Data Engineering™ and Universal Semantic Layer™ powers business intelligence and machine learning resulting in faster, more accurate business decisions at scale. For more information, visit
