The 2025 GigaOm Semantic Layer Radar Report recognized AtScale as the Leader and Fast Mover for our innovations in composable modeling, open semantics, and AI readiness. Composable models treat every metric and dimension as modular, reusable components. This architecture eliminates semantic drift, scales across BI and AI platforms, and provides the governance enterprises need without sacrificing speed or consistency.
The Future of the Semantic Layer
Enterprises are at a crossroads. As AI copilots, conversational BI, and real-time analytics move from experimentation to daily workflows, many organizations are discovering that traditional semantic layer approaches create more problems than they solve.
Definitions drift across tools. “Revenue” in Tableau doesn’t match “Revenue” in Power BI, and neither matches what an AI agent returns in Slack or Google Meet. Teams spend hours reconciling numbers that are “almost right” but never aligned. Inconsistency erodes trust, slows down decisions, and makes AI adoption riskier.
This is exactly why GigaOm’s 2025 Semantic Layer Radar Report named AtScale the Leader and Fast Mover. Their independent validation highlights what we’ve built our platform on: composability is the only sustainable way to scale semantics across BI and AI without drift, duplication, or chaos.
“Semantic layers help organizations achieve fundamental business objectives by making sure analytics results are thorough, meaningful, deterministic, and consistent.”
– GigaOm 2025 Radar Report
Composable Models: Modular, Governed, AI-Ready
Embedded semantic layers, the models built into individual BI platforms, often amplify semantic drift rather than eliminate it. Over time, they produce tightly coupled, brittle architectures that cannot evolve as definitions change.
“Semantic layers tied to individual platforms inevitably fragment. Without openness and modularity, drift is unavoidable.”
– GigaOm 2025 Radar Report
Composable semantic models take a fundamentally different approach. Every metric, dimension, and hierarchy is treated as a modular, reusable component. Using the open-source Semantic Modeling Language (SML), these objects can be versioned in Git, extended without overwriting, and governed independently.
Rather than fragile integrations, composable semantics provide durable building blocks: objects that can be assembled and reassembled across BI platforms, AI copilots, and real-time applications. This aligns with Gartner’s vision of composable analytics, which they describe as an adaptable, modular framework that enables enterprises to adjust as business needs evolve.
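To make the idea of modular, reusable semantic objects concrete, here is a minimal illustrative sketch in Python. It does not show actual SML syntax; the class and field names are hypothetical, and the point is simply that a metric or dimension can be declared once, carry its own metadata, and be referenced by any number of downstream consumers.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Dimension:
    """A reusable dimension, defined once and referenced by any metric."""
    name: str
    source_column: str

@dataclass(frozen=True)
class Metric:
    """A reusable metric: an aggregation over a source column, plus metadata."""
    name: str
    source_column: str
    aggregation: str                      # e.g. "SUM" or "COUNT_DISTINCT"
    dimensions: tuple = ()                # dimensions this metric can be sliced by
    owner: str = "enterprise-data-team"   # governance metadata travels with the object

# Define each object exactly once; every consumer (a BI dashboard, an AI copilot,
# a real-time application) references the same definition instead of re-implementing it.
order_date = Dimension(name="order_date", source_column="orders.order_date")
region = Dimension(name="region", source_column="orders.region")

revenue = Metric(
    name="revenue",
    source_column="orders.net_amount",
    aggregation="SUM",
    dimensions=(order_date, region),
)
```

Because the definitions are plain, declarative objects, they can be serialized to text files and versioned in Git, which is what makes them extendable and governable independently of any single BI tool.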
The Cost of Semantic Drift
Semantic drift is more than an inconvenience; it’s a systemic risk. Imagine this common scenario:
• Finance reports “Revenue” as $10.2M in Power BI
• Marketing shows “Revenue” as $10.4M in Tableau
• An AI copilot surfaces “Revenue” as $9.8M in Slack
Which one do you bring to the board meeting?
Embedded semantic models don’t solve this. Because they exclude advanced modeling objects and require a separate governance layer, they often exacerbate drift. Over time, organizations are left reconciling “almost consistent” metrics that undermine trust in both BI and AI systems.
How Composability Eliminates Drift
Composable semantics solve drift at the architectural level. Business units can extend enterprise definitions without overwriting them. Updates propagate seamlessly across platforms through Git versioning, so “Revenue” is defined once and reused everywhere from Excel to Power BI to GenAI copilots.
Governance is embedded at the object level. Lineage and permissions travel with each metric and dimension, ensuring compliance and traceability by design.
Just as importantly, composability resolves the false tradeoff between speed and trust. BI dashboards and AI agents require low-latency responses, but those responses are only valuable if they are governed. The result is consistent, accurate answers delivered at the speed modern enterprises demand.
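To illustrate “extend without overwriting” and object-level governance, here is a second hypothetical Python sketch in the same spirit as the one above. The GovernedMetric class and extend helper are invented for illustration, not AtScale APIs; the behaviors they show are that the enterprise definition stays immutable, that an extension is a new object recording its parent, and that lineage and role permissions ride along with each metric.

```python
from dataclasses import dataclass, replace
from typing import Optional

@dataclass(frozen=True)
class GovernedMetric:
    """A metric whose lineage and permissions travel with the object itself."""
    name: str
    expression: str                       # e.g. "SUM(orders.net_amount)"
    filters: tuple = ()                   # predicates added by extensions
    derived_from: Optional[str] = None    # lineage back to the parent definition
    allowed_roles: tuple = ("finance", "analytics")

def extend(base: GovernedMetric, new_name: str, extra_filter: str) -> GovernedMetric:
    """Extend an enterprise definition without overwriting it.

    The base object is frozen, so the extension is a brand-new object that
    records its parent. Lineage is preserved, and when the base definition
    changes in Git, the extension picks up the update on the next merge
    instead of being re-implemented by hand.
    """
    return replace(
        base,
        name=new_name,
        filters=base.filters + (extra_filter,),
        derived_from=base.name,
    )

# Enterprise-wide definition of revenue, defined once.
revenue = GovernedMetric(name="revenue", expression="SUM(orders.net_amount)")

# Marketing extends it for channel reporting without touching the original.
online_revenue = extend(revenue, "online_revenue", "orders.channel = 'online'")

print(online_revenue.derived_from)  # "revenue"  -> lineage is explicit
print(revenue.filters)              # ()         -> the shared definition is unchanged
```

The design choice doing the work here is immutability plus explicit lineage: extensions never mutate the shared definition, which is what keeps “Revenue” meaning the same thing in Excel, Power BI, and a GenAI copilot.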
Bluemercury: A Composable Success Story
Bluemercury, a luxury beauty retailer, illustrates the business impact of composable semantics. Before AtScale, Finance, Marketing, and Store Operations all used different systems (ERP, CRM, and POS), each with its own definitions of sales, revenue, and margin. Meetings were dominated by reconciliation rather than strategy.
By adopting AtScale’s composable semantic layer, Bluemercury created a single source of truth across all platforms. Power BI, Tableau, and ERP users now access identical definitions, each object governed with role-based permissions and complete lineage. Importantly, the same objects also power their AI and real-time analytics use cases.
Today, 80 percent of Bluemercury’s enterprise data flows through AtScale. Conflicting metrics have been eliminated, and adoption of self-service analytics has soared. As Praful Deshpande, Managing Director of Data and Technology, explained:
“Once people understood that the semantic layer gives them a trusted version of the truth and that they didn’t have to manually reconcile reports anymore, the adoption just skyrocketed.”
Why Analyst Validation Matters
Independent validation is critical in a market where every vendor claims to “own” the semantic layer. GigaOm’s 2025 Radar Report underscored three themes that align directly with AtScale’s approach to composable modeling:
• Semantic layers drive consistency
• Semantic layers are increasingly essential for GenAI
• Semantic layers ensure that agreed-upon metrics remain reliable across every platform
GigaOm also reminded readers that semantic layers aren’t new. They’ve been part of the enterprise stack for decades, but the industry has only recently rediscovered their full potential. As GigaOm put it, modern semantic layers “benefit from technologies and features that bring them squarely into the modern era.”
In other words, the modern semantic layer combines the best of both worlds: the proven dimensional modeling approaches that enterprises trust, and the scalability, automation, and interoperability required for today’s AI-driven environments. Or as GigaOm noted, “The modern semantic layer solution now represents the best of both worlds.”
AtScale’s placement as both the Leader and Fast Mover reflects our role in carrying this concept forward, not just preserving what worked in the past, but extending it with open standards, GenAI integration, and a composable foundation for scale.
Why Composability is the Future
The industry is moving from monolithic stacks to composable frameworks. Gartner refers to this as “data dynamism,” and it’s essential for organizations that need to adapt quickly to new tools, market conditions, and AI requirements.
By making semantics open, modular, reusable, and governed, AtScale delivers:
• Flexibility to avoid vendor lock-in
• Scalability to expand individual components instead of entire systems
• Agility to adapt to changing market and business demands
• Governance to ensure consistency across BI, AI, and real-time applications
This is why enterprises like Bluemercury and independent analysts, such as GigaOm, see composability as the future of the semantic layer.
The Bottom Line
By embedding governance into modular, reusable objects, AtScale delivers the foundation to eliminate semantic drift, enable AI readiness, and align business intelligence with generative AI at scale.
That is why GigaOm named AtScale the Leader and Fast Mover in the 2025 Semantic Layer Radar Report and why forward-looking enterprises are embracing composability as the foundation of their data strategy.