Conceiving Valuable Data-Driven Insights

The Last Mile In Business Intelligence

How valuable is an insight if you don’t know what’s driving it?

Companies have invested heavily in big data in recent years, and many are now looking to capitalize on the insights waiting to be discovered in their expansive data lakes. Developing an analytic solution is a difficult and laborious process, and more than a few projects have been abandoned long before any conclusive benefit is realized. Some efforts end due to constraints on time and money, others as a result of bad design or poor end-user adoption. That last point is significant. You’re going to spend considerable resources to empower your decision makers. If you build it and they don’t come, then what?

See the AtScale Intelligence Platform in action: sign up for this webinar!

Are your users internalizing the data you present to them?

At the end of the day, any analytic solution should be consumable by its target audience. As data sets grow larger, and discovering key insights in the overlap between data sources becomes more complicated, it is critical for business users to understand the underlying business logic behind the reports they consume. Semantic layer transparency gives the data consumer a view into the principal drivers of an insight, letting them grasp its true meaning.

Creating this mental model within the end user is vital to the success of an insight. Understanding the data presented empowers the decision maker to take action. They can now be responsive to conditions, isolate the principal drivers of a data visualization, and make any necessary adjustments to effect change. Gone are the days of presenting charts and bubble graphs that seem to materialize from some cryptic black box. Audiences need to understand the pivotal indicators. Provide value to the data consumer, assist them in performing their daily business functions, and user adoption will handle itself.

The tide on the data lake is shifting

As more departments within the business demand access to data, the typical stewards of that data are waking up to a realization: they are going to have to support the increased adoption of self-service analytics across their organization. There is a definite market trend toward opening more data to exploratory analysis and experimentation by business-centered audiences. Moving to a model in which people across functions use their expertise to discover data connections that would otherwise go unnoticed benefits everyone involved.

Exploratory analysis is labor intensive and usually the domain of highly skilled (and highly paid) data scientists. Most companies have a limited supply of these data scientists and data engineers. They end up struggling to keep pace with the multitude of requests for OLAP and schema changes, leaving little time for new data models and turning them into a barrier to progress. They also frequently miss the mark when translating their hard work into visualizations and interfaces that make key findings clear.

Allowing the natural exploration of data by finance, sales, marketing, or any other business segment can turn your data lake into sets of valuable insights that inform the decision-making processes specific to each job area. Providing these subject matter experts with ways to assemble and combine large data sets, and allowing them to decide which metrics and dimensions matter, makes those insights sharper. Business users can now traverse previously siloed data, discover new relationships, and furnish the business with a more contextual, relevant, and valuable view.
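To make that concrete, here is a minimal sketch of what combining two previously siloed extracts might look like for a marketing analyst. The file names, columns, and the revenue-per-dollar metric are hypothetical, and pandas stands in for whatever tool the analyst actually uses.

```python
# A minimal sketch of a business user combining two previously siloed extracts.
# File names, columns, and the metric are hypothetical examples.
import pandas as pd

# Hypothetical extracts: one from the sales system, one from marketing
sales = pd.read_csv("sales_orders.csv")    # columns: campaign_id, region, revenue
campaigns = pd.read_csv("campaigns.csv")   # columns: campaign_id, channel, spend

# Roll sales up to campaign level so spend isn't double-counted after the join
revenue_per_campaign = sales.groupby("campaign_id", as_index=False)["revenue"].sum()

# Join the siloed sources on their shared key and roll up to channel
combined = campaigns.merge(revenue_per_campaign, on="campaign_id", how="left")
by_channel = (
    combined.groupby("channel", as_index=False)
    .agg(revenue=("revenue", "sum"), spend=("spend", "sum"))
    .assign(revenue_per_dollar=lambda df: df["revenue"] / df["spend"])
    .sort_values("revenue_per_dollar", ascending=False)
)
print(by_channel)
```

The point is not the tooling: it is that the person who knows which metric matters is the one assembling it.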

Insightful conclusions evolve through iteration

Let’s face it: no one gets it right the first time, and building a loop into your design process is your best bet to hone your insights with domain expertise and feedback. There is an iterative nature to exploratory data analysis. The discovery of new data point connections can occur organically, and a solution should be flexible enough to accommodate continual improvement as business dynamics and priorities shift. The end goal is the ability to react to, and exploit, the advantages revealed by an insight and by these new connections.

How far into the semantic layer should I be able to reach?

Data should be purpose driven. This necessitates access to a semantic layer where different segments of the business can iteratively model and mine insights. Many cloud implementations place very rigid constraints on the abstractions they provide, and the effort of getting changes made to those models can prevent business users from reaching the key drivers and aggregated metrics vital to their segment. Organizations should permit the business to align its processes with the data points relevant to it.

Balancing the advantages of self-service exploratory insights with their pitfalls exposes a need for flexibility, as well as a need for a few constraints on your semantic layer. Matthew Baird presents a great case for having a single layer in his blog post “What is a ‘Semantic Layer’ and Why Would You Want One?”, addressing business-level abstraction and user perspectives.

Intelligent designs avoid pitfalls

There is no single model that can cover every business use case. Forging meaningful mental constructs from complex data sets requires numerous case-focused abstractions. Opening up your semantic layer to diverse, intent-based modeling from various segments of the organization lets you scale out the relational data feeding the fantastic BI tool investments you’ve already made.

All of this business-user manipulation of data on the fly does have its risks. Javier Guillen points them out nicely in his blog post “The Hidden Costs of Self-Service BI Initiatives.” Unleashing this new self-service power to more users is a tremendous step. It is, however, paramount to the success of any analytics initiative that some process constraints be put in place to ensure that the models built by the various business segments don’t flood the production environment with redundancy or poorly designed relationships.

How does AtScale fit in?

Companies want their decision makers to work interactively with the information stored in their data lakes. AtScale’s design center canvas extends the ability to build interactive virtual cubes to business designers, permitting them to connect large data sets from various data sources in a way that relates to their business drivers. These multidimensional analytical models represent answers to clear-cut questions that affect a specific business function, and they work directly on top of a variety of big data platforms (Hadoop, Google BigQuery, Amazon Redshift, and Microsoft Azure HDInsight). An illustrative sketch of what such a model might capture follows the list of capabilities below.

Business modelers can use AtScale’s design center canvas to…

  • Visually assemble relational models using a library of their datasets and dimensions
  • Construct new datasets and dimensions to extend their library
  • Create new measures and calculated measures to extend business models
  • Get a preview of how their attributes and metrics will be presented to their BI tool
  • Design end user perspectives to constrain the list of attributes available by business function
  • Publish their models to selected environments
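As a purely illustrative sketch (not AtScale’s actual model format, which isn’t shown here), the pieces such a virtual cube pulls together can be written down as a plain data structure: source datasets, the relationships between them, dimensions, measures, calculated measures, and perspectives that narrow the view for a given business function. All names below are hypothetical.

```python
# Illustrative only: a virtual-cube-style model expressed as plain Python data.
# Table names, columns, and measures are hypothetical examples.
virtual_cube = {
    "name": "orders_cube",
    "datasets": {
        "orders": "hive.sales.orders",        # hypothetical source tables
        "customers": "hive.crm.customers",
    },
    "relationships": [
        {"from": "orders.customer_id", "to": "customers.id"},
    ],
    "dimensions": ["order_date", "region", "customers.segment"],
    "measures": {
        "revenue": {"dataset": "orders", "column": "amount", "agg": "sum"},
        "order_count": {"dataset": "orders", "column": "order_id", "agg": "count"},
    },
    "calculated_measures": {
        "avg_order_value": "revenue / order_count",
    },
    "perspectives": {
        # The finance team sees only the attributes relevant to its function
        "finance": {
            "dimensions": ["order_date", "region"],
            "measures": ["revenue", "avg_order_value"],
        },
    },
}
```

Once a model like this is published, the BI tool sees only the business-friendly dimensions and measures, not the underlying joins.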

Queries need to run fast on big data, yet interactive support for BI-style queries remains a challenge, and multidimensional analytical queries often demand complex OLAP-style calculations and functions.

AtScale’s dimensional calculation engine enables complex business queries directly on your big data. Our algorithms optimize your BI query workloads on demand to deliver performance. Your queries run fast because we automatically generate and maintain your aggregates.
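The idea behind those aggregates can be shown with a simplified sketch (the general technique, not AtScale’s engine): roll detail rows up once, at the grain the dashboards actually query, then serve repeated BI-style queries from the much smaller aggregate. The table, columns, and file name here are hypothetical.

```python
# A simplified illustration of the aggregate idea (not AtScale's engine):
# precompute a small rollup once, then answer repeated BI-style queries from it
# instead of rescanning the detail-level fact table. Names are hypothetical.
import pandas as pd

# Detail-level fact table (millions of rows in practice)
fact = pd.read_parquet("fact_orders.parquet")  # columns: order_date, region, amount

# Build the aggregate once, at the grain the dashboards actually query
fact["month"] = pd.to_datetime(fact["order_date"]).dt.to_period("M")
agg_region_month = fact.groupby(["region", "month"], as_index=False)["amount"].sum()

# A dashboard query such as "revenue by region for March 2024" now scans a
# handful of aggregate rows rather than every detail row.
march = agg_region_month[agg_region_month["month"] == pd.Period("2024-03", freq="M")]
print(march)
```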

AtScale’s advanced machine learning performance engine excels at:

  • Learning to anticipate queries across your data lake cluster
  • Generating automatic and dynamic ‘smart aggregates’ of these queries
  • Scaling to meet end user performance expectations
  • Offering control, via optional overrides, over the manual creation and maintenance of aggregates

Now you know! AtScale supports current business processes by letting your analysts use the BI tools they already know and love. Our world-class semantic layer allows business users to answer complex business questions in a fraction of the time it would normally take, and best of all, to do it live on big data.

We invite you to learn more about AtScale today!
