Why GigaOm Recognized Universal Semantic Modeling as the Future of AI Analytics

Estimated Reading Time: 1 minute

Enterprise teams today are juggling Power BI and Tableau, Python and R, plus a growing collection of AI copilots and agents. The problem? Every tool seems to have its own version of what “revenue per customer” actually means.

You’ve probably seen this mess firsthand. Marketing’s customer segmentation doesn’t match what Finance uses. The AI chatbot gives different numbers than the executive dashboard. And getting everyone aligned on new metrics? That’s a three-week project involving five other teams and at least two heated Slack threads.

The 2025 GigaOm Semantic Layer Radar Report positioned AtScale as both a Leader and Fast Mover, specifically recognizing our universal semantic modeling approach. As GigaOm noted: 

“AtScale sponsors and continues to develop SML, an object-oriented, YAML-based modeling language for defining semantic model objects and models with code.”

Universal Semantic Modeling Languages 

Here’s where the industry conversation gets interesting. There is considerable debate about where the semantic model should live: in your BI tool, in your data warehouse, or somewhere else entirely. Avoiding vendor lock-in and proprietary semantic models is crucial for achieving enterprise scale and flexibility.

An open-source option removes that lock-in, which is exactly what SML does for semantic modeling. With an open-source repository that anyone can pull from and contribute to, semantic definitions evolve organically as the market coalesces around a universal semantic layer.

Open-source Semantic Modeling Language (SML)

When you define customer segmentation in SML, it works the same way in Power BI, Python, and your AI agents. No translation layer. No “close enough” approximations. Just consistent business logic everywhere.

GigaOm highlighted this advantage, noting SML’s “modularization and composability of semantic definitions” with “Git native” architecture enabling “version history, pull requests, and CI/CD.”
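To make that concrete, here is a rough, hypothetical sketch of what a metric defined once in a YAML-based language like SML might look like. The keys used below (object_type, unique_name, calculation, and so on) are illustrative assumptions for this example, not the exact SML schema; the open-source SML repository defines the real syntax.

```yaml
# Illustrative sketch only -- key names are assumptions, not the exact SML schema.
# A single metric definition, kept as a plain text file under Git version control.
object_type: metric
unique_name: revenue_per_customer
label: Revenue per Customer
description: Total recognized revenue divided by the count of distinct active customers.
dataset: fact_orders                        # hypothetical source dataset
calculation:
  numerator: SUM(order_amount)
  denominator: COUNT(DISTINCT customer_id)
format: "$#,##0.00"
```

Because the definition is just text in a repository, every consumer, whether a Power BI report, a Python notebook, or an AI agent, reads from the same single source of truth, and changes go through ordinary pull-request review.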

Why This Actually Matters for AI Analytics

GigaOm called out something important about AtScale: we’re not just retrofitting old BI technology for AI. Our platform was built to “serve as a semantic foundation for GenAI workloads” and “supply the business context for LLMs to produce refined and tailored output.”

In practical terms, your AI agents can leverage the same trusted business logic that powers your quarterly board presentations. When someone asks the AI chatbot about Q3 revenue, it’s pulling from the exact same definitions that the CFO uses in their executive dashboard.

We support all the protocols that matter: MCP, our AI-Link Python SDK, and direct integrations with OpenAI, Anthropic, and frameworks like LangChain. But the real win isn’t the technical connectivity. It’s that your AI tools finally give answers you can trust because they’re using the same business rules as everything else.

Real Enterprise Impact

Take one of our Fortune 500 retail customers. Their executives were receiving revenue numbers from Power BI that didn’t align with what the merchandising team saw in Tableau: different teams, different tools, and different definitions of the same basic business metric. Every time they wanted to launch a new analytics project, it turned into a coordination nightmare. IT had to map out which definition each team was using, business analysts had to reconcile the differences, and by the time everyone agreed on what “revenue” meant, the business need had probably changed.

Figure: Define metrics once across an organization using the AtScale Semantic Layer and MCP server.

Now? They define revenue once in their AtScale semantic layer, and it flows consistently everywhere. Their new AI demand forecasting system gives recommendations based on the exact same business logic the CEO sees in quarterly reviews. New analytics projects that used to take weeks now get deployed in days.

How We Actually Built This

AtScale’s approach comes down to a few key ideas that work in the real world:

Define once, use everywhere: Write your business logic in SML, and it automatically translates to whatever each tool needs: SQL for your database, DAX for Power BI, Python for data science work, or structured context for AI agents.

Let tools be tools: We don’t force everyone to use the same interface. Power BI users get native Power BI features, Python folks get their familiar APIs, and AI agents get the protocols they expect. But they’re all pulling from the same semantic foundation.

Build like software: Your semantic models work like code: version control, collaboration, testing, and automated deployment (see the pipeline sketch after this list). GigaOm specifically called out SML’s “Git native” approach that enables “version history, pull requests, and CI/CD.”

Mix and match freely: Business concepts combine however you need them. The same customer segments and product hierarchies that power your sales dashboard can drive marketing attribution analysis or financial reporting without any extra work.
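To illustrate the “build like software” idea, here is a minimal sketch of what a CI check on a semantic model repository could look like. It assumes a GitHub Actions-style pipeline and a hypothetical sml validate command; neither is prescribed by SML or by GigaOm, they are stand-ins for whatever validation your team runs on pull requests.

```yaml
# Hypothetical CI workflow for a Git-native semantic model repository.
# The `sml validate` command below is an assumption used for illustration.
name: semantic-model-ci
on:
  pull_request:
    paths:
      - "models/**/*.yml"           # run only when semantic model files change
jobs:
  validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Validate semantic model definitions
        run: sml validate models/   # swap in your own linting or test step
```

The point isn’t the specific tooling; it’s that semantic definitions move through the same review and deployment gates as application code.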

Why This Beats the Alternatives

GigaOm noted that we’ve achieved “full parity with the modeling features of Power BI, including model inheritance, live connections, and client-side measures.” That’s not just a technical feature; it shows how universal semantic modeling actually works.

You get all the native capabilities of each tool, but without the semantic fragmentation that usually comes with it. Power BI users get Power BI features. Tableau users get Tableau features. But everyone’s working with the same underlying business definitions.

Vendor-specific formats can’t deliver that. They’re purpose-built to support narrow use cases, which can slow things down and dilute functionality. We provide best-in-breed capabilities that can be used across your entire enterprise.

The Bottom Line

The companies that will win in AI-driven analytics aren’t the ones with the fanciest individual tools; they’re the ones that have figured out how to keep their business logic consistent across everything: traditional BI, data science work, and AI applications.

GigaOm’s recognition of AtScale as both Leader and Fast Mover validates something we’ve seen with our customers: universal semantic modeling isn’t just a nice architectural concept. It’s the practical foundation that enables you to move quickly with AI while actually trusting the results.

The decisions you make about semantic infrastructure today will determine whether your AI initiatives enhance your analytics or create new silos to manage. SML and universal semantic modeling provide a path forward that actually works.

Ready to see how universal semantic modeling accelerates your AI analytics? Download the 2025 GigaOm Semantic Layer Radar Report for the complete analysis.
