It’s easy to set up a data model in AtScale that works with data in the Snowflake Data Cloud. In fact, if you follow these steps, you should be up and running in just a few minutes.
Step 1. Log into AtScale and open the AtScale Design Center.
This is the area of AtScale where you can see all the tools you have at your disposal. This is where you’ll build your data model.
Step 2. Choose the data model you want to work on.
You can start from scratch with a new model, import an XML file, or work on an existing project.
Here, we’ll add a new AtScale data model to an existing project. To do that, just select the Project, and then select Add New Cube. You’ll need to give it a name and add a description. Notice that you can also give access to all users – this means that anyone accessing this particular model will get access to every modification and improvement you make as soon as you hit Publish.
You’ll see that your new data cube model has been added to your Project.
Step 3. Select Enter Model to work on the main Cube Canvas.
This is where you’ll do all your work on your data models. We’re going to create a logical model that sits on top of the physical tables that already exist within Snowflake.
Step 4. Set up a Preview Box.
The Preview box will let you see what your data is going to look like within your BI tools.
Right now, you can see we don’t have any measures or dimensions in place. Our next step will be to add those in.
Step 5. Select the data you want to see in the model.
Go into your Library and select the data you want to see by dragging and dropping it into your canvas. Here we’ve selected Factinternetsales as the fact table.
Step 6. Create your measures.
Select the Create a Measure button and then select the columns from the database that you want as your measures. For instance, here we’re working with Unit Price.
Once you’ve saved your measure, you’ll need to set how the data should be aggregated. Here we’ll set it to Average.
If you look in your Preview, you’ll see the measures that the BI tool can now easily access. It’s really as simple as a quick drag and drop.
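Conceptually, a measure with an Average aggregation boils down to an aggregate SQL expression over the fact-table column. Here’s a minimal sketch of that idea using Python’s built-in sqlite3 as a stand-in for the warehouse – the table and column names (factinternetsales, unitprice) are hypothetical:

```python
import sqlite3

# In-memory stand-in for the fact table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE factinternetsales (orderkey INTEGER, unitprice REAL)")
conn.executemany(
    "INSERT INTO factinternetsales VALUES (?, ?)",
    [(1, 10.0), (2, 20.0), (3, 30.0)],
)

# A "Unit Price" measure with an Average aggregation is conceptually
# just AVG(unitprice) over the fact table.
(avg_price,) = conn.execute(
    "SELECT AVG(unitprice) FROM factinternetsales"
).fetchone()
print(avg_price)  # → 20.0
```

When a BI tool drags that measure onto a report, a query shaped like this is what ultimately runs against the warehouse.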
Step 7. Enrich the data set.
Data modeling is iterative – you often want to go back and add to the data set. Without AtScale, you’d probably have to go back to the data administrator to ask for modifications to the data model, or to revise the ETL process. With AtScale, you can just do it yourself.
For instance, let’s say you want to add a calculated measure to your data model – meaning that you take a couple of measures and use them in formulas and functions to create higher level measures.
You don’t need to modify the underlying data set — you can create a virtual element called a calculated column. That way, you have the comfort of knowing that you’re not modifying the data model itself, just adding virtual elements.
With this process, you have a lower risk of introducing human error. You’re also making your data model richer and more useful for business users.
Anyone who accesses this data model will be able to access the calculated measures, with all the latest updated formulas and calculations. If you want to go back and change your formula again, you can – and the revision will be propagated across every instance where the calculated column is being used for analytics.
Here’s an example of what that looks like. We’re creating a virtual calculated column (Calculated Tax) by adding a simple SQL formula to create a new column. Don’t forget to check your syntax by clicking the Test SQL Syntax button.
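The screenshot’s exact formula isn’t reproduced here, but a calculated column of this kind is just a SQL expression evaluated over existing columns at query time. A sketch of the concept with sqlite3, assuming a hypothetical 8% tax rate applied to a unitprice column:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE factinternetsales (orderkey INTEGER, unitprice REAL)")
conn.executemany(
    "INSERT INTO factinternetsales VALUES (?, ?)",
    [(1, 100.0), (2, 50.0)],
)

# A virtual calculated column: the expression is computed at query time,
# so the underlying table is never modified (hypothetical 8% tax rate).
rows = conn.execute(
    "SELECT orderkey, unitprice, ROUND(unitprice * 0.08, 2) AS calculated_tax "
    "FROM factinternetsales"
).fetchall()
print(rows)  # → [(1, 100.0, 8.0), (2, 50.0, 4.0)]
```

Because the new column exists only in the SELECT expression, the source rows stay exactly as they were – which is the same guarantee the virtual calculated column gives you.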
Step 8. Add dimensions.
You can add pre-existing dimensions and create relationships within your data model with the same simple drag and drop process. Go into the Library to add the dimensions you want. Then use the Relationship creator to establish the relationships between the two data points. Again, the data model will now reflect these relationships in the BI tool of your choice.
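Under the hood, the relationship you draw between a fact table and a dimension is a join on a shared key. A small sqlite3 sketch of what that relationship means for a query like “average unit price by product” – the table and column names are illustrative, not AtScale’s:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE factinternetsales (productkey INTEGER, unitprice REAL);
CREATE TABLE dimproduct (productkey INTEGER, productname TEXT);
INSERT INTO factinternetsales VALUES (1, 10.0), (1, 12.0), (2, 30.0);
INSERT INTO dimproduct VALUES (1, 'Helmet'), (2, 'Gloves');
""")

# The modeled relationship is a join on the shared key; a BI query
# "average unit price by product" then resolves to:
rows = conn.execute("""
    SELECT d.productname, AVG(f.unitprice)
    FROM factinternetsales f
    JOIN dimproduct d ON d.productkey = f.productkey
    GROUP BY d.productname
    ORDER BY d.productname
""").fetchall()
print(rows)  # → [('Gloves', 30.0), ('Helmet', 11.0)]
```

The drag-and-drop relationship saves you from writing this join yourself – the model generates it whenever a BI tool slices a measure by the dimension.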
Step 9. Publish your model.
When you publish, you’re allowing access to your data model. Again, everyone across the organization will now have access to the model – without moving or making any changes to your data in the Snowflake Data Cloud.
Step 10. Connect your data model to your BI tools.
Now for the fun part – using your model to create business insights in your BI tool of choice. To do that, just select your model and hit Connect. You’ll see all the different ways that you can connect your model.
One of AtScale’s big differentiators is that the data model accesses the data within the Snowflake database, while a variety of BI tools can all connect to and use that same model.
In this example, we’ll connect with Tableau. Simply select Tableau and download the TDS file (Tableau’s standard file format).
Without AtScale, you’d usually have to work with a TDE file (a Tableau data extract). That’s not ideal, because it means pulling the data out of your Snowflake warehouse into Tableau. The data is no longer real-time, and you’ve created a data silo instead of leaving the data where it was. With AtScale – because you’re working with a live connection between your data model, the Snowflake data warehouse, and Tableau – data is always up to date. If you make any data changes in Snowflake, the new data will show up the next time you run a query in Tableau.
Step 11. Log into your BI tool.
When you log into Tableau, you’ll see all the data in your data model available to you in real-time. You can then run your query in Tableau as usual. The query will go to AtScale, and then AtScale will run the query against Snowflake.
Step 12. Monitor your queries in the Design Center.
You can track and view your queries back in your AtScale Design Center. Just go to the Queries tab and use the filter function to find the particular project you want.
Step 13. AtScale will automatically aggregate the data to dramatically speed up query times.
AtScale automatically learns over time. It will identify the kinds of queries you’re likely to run, and then optimize the data structures by creating and using aggregated data tables to make the queries run faster.
AtScale then decides, query by query, when answering from an aggregate will be the faster option.
The aggregates are stored as tables in the database (in this case, in Snowflake). Again, the data never needs to leave Snowflake. You’re not modifying the data structure in Snowflake; you’re virtually mapping the model in the way that you need. You can set up an automated API trigger to fire off a rebuild of the aggregates when new data is added to the data model.
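The idea behind aggregate tables can be sketched in a few lines. This illustrates the concept only – using sqlite3 and made-up table names, not AtScale’s actual implementation: a precomputed summary table answers the same question as scanning every detail row, but touches far fewer rows.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE factinternetsales (productkey INTEGER, salesamount REAL);
INSERT INTO factinternetsales VALUES (1, 10.0), (1, 20.0), (2, 5.0), (2, 15.0);

-- Precomputed aggregate: one row per product instead of one per sale.
CREATE TABLE agg_sales_by_product AS
    SELECT productkey, SUM(salesamount) AS salesamount
    FROM factinternetsales GROUP BY productkey;
""")

# A total-sales query can be answered from the (much smaller) aggregate...
(from_agg,) = conn.execute(
    "SELECT SUM(salesamount) FROM agg_sales_by_product"
).fetchone()
# ...and gives the same result as scanning every detail row.
(from_detail,) = conn.execute(
    "SELECT SUM(salesamount) FROM factinternetsales"
).fetchone()
print(from_agg == from_detail, from_agg)  # → True 50.0
```

At real warehouse scale, the detail table might have billions of rows while the aggregate has thousands – which is where the dramatic query speedups come from.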
And that’s really all it takes to build a data model in AtScale on the Snowflake Data Cloud. Once you’ve built that model, you can access it immediately – there’s no wait time! The entire model lives in the semantic layer (not just the formulas and calculations), meaning that every user will see your modifications as soon as you make them, no matter which BI tool they’re working in.
Even better, while you’re working with your data model, AtScale is learning from your behavior and creating aggregates to speed up query times.
For a full deep dive into using AtScale with Snowflake, join AtScale’s CTO and founder Dave Mariani and AtScale’s director of education and enablement Bob Kelly for an on-demand demo. Or if you’d like to experience the power of Snowflake + AtScale for yourself, schedule a live demo today!