Data Virtualization and Data Architecture Modernization

How enterprises are leveraging intelligent data virtualization for better, faster and more secure data

Enterprises that have moved to the cloud are implementing data virtualization to connect the dots of disparate analytics tools and systems, and create shared insights across the business.

Gartner projects that by 2020, 60 percent of organizations will implement data virtualization in their data integration tool sets. And according to Enterprise Strategy Group research, companies that have armed their business with shared data intelligence are 18x more likely to make better and faster data-driven decisions than their competition.

Let’s first take a step back, however, and explain the concept at a high level.

Intelligent data virtualization is defined as an approach to data management that allows applications to use data without requiring technical details about it, such as how it is structured, where it is physically located, or how the data is accessed. All the various data sources are accessible from a common logical data access point.

Intelligent data virtualization provides a bridge across data warehouses, data marts, and data lakes, delivering a single view of an organization’s data without having to physically integrate it. A universal semantic layer abstracts away the technical aspects of stored data, such as location, storage structure, API, access language, and storage technology. This abstraction enables enterprises to use their data, no matter where it’s actually stored, to produce fast, timely insights.
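
To make that idea concrete, here is a minimal, hypothetical sketch of what a semantic layer does conceptually: callers ask for business terms, and a routing layer resolves where the data physically lives and how to query it. The model contents, source names, and function below are illustrative assumptions only, not AtScale's actual API.

```python
# A minimal sketch of the idea behind a universal semantic layer: callers ask
# questions in business terms, and a routing layer resolves where the data
# physically lives and how to query it. All names here are hypothetical.

# Logical model: business names mapped to physical locations and expressions.
SEMANTIC_MODEL = {
    "revenue": {"source": "snowflake", "table": "sales.orders", "expr": "SUM(order_total)"},
    "web_sessions": {"source": "databricks", "table": "events.sessions", "expr": "COUNT(*)"},
}

def query_metric(metric: str, group_by: str) -> str:
    """Translate a business-level request into SQL for whichever platform holds the data."""
    entry = SEMANTIC_MODEL[metric]
    sql = f"SELECT {group_by}, {entry['expr']} AS {metric} FROM {entry['table']} GROUP BY {group_by}"
    # The caller never needs to know the source system, its dialect, or its location.
    return f"-- routed to {entry['source']}\n{sql}"

if __name__ == "__main__":
    print(query_metric("revenue", "region"))
```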

With intelligent data virtualization, you can gain meaning from data across platforms, leveraging all of your data automatically, without manual intervention. The technology reads query signals to understand user intent, helping to avoid costly missteps. It can run in the background over extended periods, performing the tasks that help your business users derive insights from data across the enterprise.

AtScale provides autonomous data engineering, which alleviates complex and time-consuming data movement and transformations, and adds security and governance for true self-service analytics—no matter where the data is stored or how it’s formatted—clearing the path to a shared organizational data intellect.

Agility, Security and Performance: Benefits of an Adaptive Analytics Fabric

Accelerated query performance

Think for a moment about the conundrum business decision makers are in: There is immense pressure to be agile in making data-driven decisions, yet queries on databases with billions of records can take days to return. Organizations can only make proactive business decisions that drive booming, start-up-like growth when they have agile tools at their disposal that can handle this query load quickly.

Shortening the cycle time required to query big data improves an organization’s ability to make agile decisions. With AtScale’s autonomous data engineering, as queries are run against datasets in the adaptive analytics fabric (the virtualized data and associated tools to aid analytics speed, accuracy, and ease of use), machine learning is applied to determine what data within the larger set is needed. AtScale’s A3 platform optimizes analytical workloads based on user behaviors, using artificial intelligence.
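
As a rough illustration of the concept (not AtScale's implementation), the sketch below shows how observing which dimension and measure combinations users actually request can drive automatic materialization of the aggregates they need most; the sample data, threshold, and function names are hypothetical.

```python
# A simplified illustration of aggregate-aware acceleration: observe which
# dimension/measure combinations users query, materialize a summary for the
# hot combinations, and answer later queries from the summary instead of
# scanning the full fact table.
from collections import Counter

query_log = Counter()   # (dimension, measure) -> times requested
aggregates = {}         # (dimension, measure) -> pre-computed rollup

FACT_TABLE = [
    {"region": "EMEA", "product": "A", "sales": 120.0},
    {"region": "EMEA", "product": "B", "sales": 80.0},
    {"region": "AMER", "product": "A", "sales": 200.0},
]

def answer(dimension: str, measure: str) -> dict:
    key = (dimension, measure)
    query_log[key] += 1

    # Serve from a materialized aggregate when one exists (fast path).
    if key in aggregates:
        return aggregates[key]

    # Otherwise scan the full table (slow path) ...
    rollup: dict = {}
    for row in FACT_TABLE:
        rollup[row[dimension]] = rollup.get(row[dimension], 0.0) + row[measure]

    # ... and materialize the rollup once the pattern looks "hot".
    if query_log[key] >= 2:
        aggregates[key] = rollup
    return rollup

if __name__ == "__main__":
    answer("region", "sales")            # first request: full scan
    print(answer("region", "sales"))     # second request triggers materialization
```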

Because extraneous data is bypassed altogether during the query process, data is delivered between 2x and 1000x faster, depending on which data sets are queried. With query times shortened from days to hours or minutes, BI teams can deliver time-sensitive analytics to the line of business when they need them and have time left over to explore data more creatively beyond the minimal mission-critical needs.

One business-centric, shared view of data, agnostic of platform

According to Forbes, over 60% of enterprises use two or more business intelligence (BI) tools across their teams and company. Without normalizing the different results those disparate BI tools return, teams end up with conflicting answers to the same question, a costly breakdown during data analysis.

AtScale normalizes data with a common organizational business logic that is analytics/BI tool-agnostic. In other words, by applying a universal semantic layer on top of the data infrastructure, organizations can bring together disparate data systems. Users will be able to leverage the analytics tool of their preference, without needing to worry about the integrity of the data, as all BI tools are accessing the same data with the same business logic.
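
Conceptually, a universal semantic layer acts like a single, governed dictionary of business definitions that every tool resolves against. The short sketch below is a simplified, hypothetical illustration of that "define once, consume anywhere" idea; the metric names and SQL expressions are assumptions, not AtScale's model format.

```python
# A sketch of the "define once, consume anywhere" idea behind a universal
# semantic layer: every BI tool resolves the same business definition, so
# "gross margin" means the same thing in every downstream tool.

METRICS = {
    # Single, governed definition shared by every tool that connects.
    "gross_margin": "(SUM(revenue) - SUM(cost)) / SUM(revenue)",
    "average_order_value": "SUM(revenue) / COUNT(DISTINCT order_id)",
}

def expand_metric(tool_name: str, metric: str) -> str:
    """Return the governed expression for a metric, whichever tool asks for it."""
    expression = METRICS[metric]
    return f"-- requested by {tool_name}\nSELECT {expression} AS {metric} FROM sales"

if __name__ == "__main__":
    # Two different BI tools receive identical business logic.
    print(expand_metric("Tableau", "gross_margin"))
    print(expand_metric("Power BI", "gross_margin"))
```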

Because enterprises using an adaptive analytics fabric see their entire analytics software portfolio speaking the same language, they can easily gain a complete view of all data from a single portal, no matter where siloed data is stored. AtScale’s intelligent virtualization “stitches” data from multiple data sources into one fabric or “cube”. Unlike data federation, AtScale’s Intelligent Data Virtualization leaves data in place.

Remember when we mentioned earlier the importance of being agile in decision-making to deliver significant business results? It’s a lot easier when the line of business and BI teams are looking at the same, and consistently accurate, data regardless of the tool they use to get it.

Data security at rest and in flight

Every time BI teams work with data sets across multiple siloed databases, each with its own security policies, they open their organization up to risk. While DataOps teams need to be agile (a common theme here) in supplying insights to the business quickly, they can’t do so at the expense of their security practices.

AtScale’s security and data governance capabilities leverage a company’s security best practices and add another layer of protection, including but not limited to: end-to-end TLS to protect data in flight; LDAPS, Active Directory, IdP, and SAML for authentication; and JWT, CORS, and REST for API access.
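
For readers unfamiliar with JWT-protected API access, the sketch below shows the general pattern of issuing and verifying a signed, short-lived token before serving data. It is a generic example built on the PyJWT library, not AtScale's code; the secret, claims, and function names are placeholders.

```python
# A minimal, generic sketch of JWT-protected API access: the client presents a
# signed token, and the service verifies the signature and expiry before any
# data is served. Requires PyJWT (`pip install pyjwt`); the secret and claims
# below are placeholders.
import datetime
import jwt

SECRET = "replace-with-a-real-signing-key"   # placeholder secret

def issue_token(user: str) -> str:
    """Issue a short-lived token after the user authenticates (e.g. via SAML or LDAP)."""
    claims = {
        "sub": user,
        "exp": datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(minutes=15),
    }
    return jwt.encode(claims, SECRET, algorithm="HS256")

def verify_token(token: str) -> str:
    """Reject expired or tampered tokens before any query is executed."""
    claims = jwt.decode(token, SECRET, algorithms=["HS256"])
    return claims["sub"]

if __name__ == "__main__":
    token = issue_token("analyst@example.com")
    print(verify_token(token))
```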

Also of note: When accessing data, security leakage can occur when enterprises utilize connection pools for BI tools or depend on security aggregation systems. An adaptive analytics fabric solves these challenges by checking security requirements at the source databases and applying those requirements to query results. User identities are considered, even when accessing data through a shared connection pool, and security policies from all data sources are collected and merged to filter results appropriately. These same security policies are applied to data aggregates, eliminating unintentional exposure of restricted or private data.
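
The sketch below illustrates that policy-merging idea in deliberately simplified form: each source contributes its own row-level restriction for the querying user, the combined filter is applied to every result, and the same check covers rows served from aggregates. The policies, user attributes, and sample rows are hypothetical, not AtScale's engine.

```python
# A simplified sketch of policy merging across sources: every source system
# contributes a row-level restriction for the querying user, the restrictions
# are combined, and the combined filter is applied to all results, including
# results served from pre-built aggregates.

# Per-source policies: each maps (user, row) to allow/deny. Illustrative only.
SOURCE_POLICIES = {
    "warehouse": lambda user, row: row["region"] in user["allowed_regions"],
    "data_lake": lambda user, row: not row.get("pii") or user["can_see_pii"],
}

def merged_filter(user: dict, row: dict) -> bool:
    """A row is visible only if every source's policy allows it."""
    return all(policy(user, row) for policy in SOURCE_POLICIES.values())

def run_query(user: dict, rows: list) -> list:
    # The same check runs whether `rows` came from a live query or an aggregate,
    # so restricted data never leaks through a shared connection pool.
    return [row for row in rows if merged_filter(user, row)]

if __name__ == "__main__":
    analyst = {"allowed_regions": {"EMEA"}, "can_see_pii": False}
    rows = [
        {"region": "EMEA", "pii": False, "sales": 120.0},
        {"region": "AMER", "pii": False, "sales": 200.0},
        {"region": "EMEA", "pii": True, "sales": 75.0},
    ]
    print(run_query(analyst, rows))   # only the first row is visible to this user
```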

Liberate your data now

By applying AtScale’s intelligent data virtualization in the enterprise, organizations can overcome the challenges of siloed data, long query cycles, and the enterprise risk associated with multiple security policies across multiple databases. They can start getting value from data right away, whether it’s on-premises or in the cloud (e.g. multiple relational database management system (RDBMS) instances with different configurations or locations).

Regardless of how data is stored or how it is formatted, a data abstraction layer protects business users from the complexity of existing and future data platforms. Intelligent data virtualization makes hybrid and multi-cloud data architectures a reality and protects users from disruption.

Most importantly, enterprises that transform their data operations are able to bring business and BI users together with consistent data from the analytics platform of their choice. These shared insights across the business are ultimately what will fuel the business now and well into the future.

ABOUT ATSCALE

AtScale is the leading provider of adaptive analytics for data architecture modernization, empowering citizen data scientists to accelerate and scale their business’ data analytics and science capabilities and ultimately build insight-driven enterprises. For more information, visit us at atscale.com.
