In this ten-minute demo, see Passerelle’s Snowflake Watch dashboards in action. With Snowflake Watch, organizations gain clear visibility into their Snowflake Data Cloud in order to monitor warehouse usage, query performance, security setup and database stats.
Hello, my name is Sarah Davis and I’m a consultant with Passerelle. Today I’m going to walk you through Passerelle Snowflake Watch. Snowflake Watch was created to build visibility into the Data Cloud, so we can quickly understand the cost associated with usage and data, how data is consumed and queried, and our security environment.
As part of Data Rocket, Snowflake Watch provides dashboards for warehouse usage, query performance and details, security setup and details, and database stats. With Snowflake Watch, Snowflake managers can quickly gauge key volumetric stats with customizable visual dashboards and drill down to the query and user level for deeper insights.
We’ll begin with the Warehouse Usage Dashboard. This dashboard comprises metrics that help us understand the correlation between warehouse time, compute credits, cloud service credits and cost. It can be used to understand credit burn and to plan for budget allocation, providing a clearer picture of usage and a way to monitor and optimize it.
The metered cost provides a cumulative cost across the time period selected as well as the total cost for each day. The daily cost enables us to see daily usage patterns to identify warehouse usage anomalies or outliers, so we can reduce unnecessary spend. We can also identify days when utilization is lower or higher to help adjust and optimize warehouse sizing.
Below, we can see the distribution of compute credits versus cloud service credits, and next to this the cost, total credits, cloud service credits and compute credits by warehouse and start time. The filters provide options to customize the time period, the warehouse and the discount. For example, let’s look at usage over the last year for the consume warehouse with a discount of 5%. Each metric is now filtered to include only the consume warehouse’s data with a discount of 5%. We assume a cost of $2 per credit hour for our dashboard; however, this parameter can be changed to the actual dollar amount for your Snowflake account upon dashboard installation.
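The arithmetic behind the metered cost view can be sketched as follows. The function name, parameters and example figures are illustrative assumptions, not part of Snowflake Watch itself; only the $2-per-credit default and the discount filter come from the demo.

```python
# Sketch of the metered-cost calculation described above.
# The $2/credit-hour rate is the demo's default and is configurable
# at dashboard installation; the credit figures below are made up.

PRICE_PER_CREDIT = 2.00  # dollars per credit hour (configurable)

def metered_cost(compute_credits, cloud_service_credits, discount=0.0):
    """Cost = (compute + cloud service credits) * rate * (1 - discount)."""
    total_credits = compute_credits + cloud_service_credits
    return total_credits * PRICE_PER_CREDIT * (1 - discount)

# Example: a day with 40 compute credits, 2.5 cloud service credits,
# and the 5% discount applied in the demo's filter
print(round(metered_cost(40, 2.5, discount=0.05), 2))  # 80.75
```

Summing this daily figure across the selected time period yields the cumulative metered cost shown at the top of the dashboard.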
Now that we have filtered our data to see only the consume warehouse’s data, we can drill down even further by clicking on a day in our metered cost view. We now see the usage for the consume warehouse for the day of February 1. Notice that the consume warehouse is broken out by each run, with the start time, the end time, the cost, total credits, cloud service credits and compute credits.
Query Performance dashboards provide key insights into the queries processed by Snowflake, as well as usage tracking across users for adoption and trends. The first Query Performance view is a summary view with adjustable filters that allow for exploration into queries by execution status, query type, warehouse size, warehouse, database and user. For the last year, our database has processed about 1,000 queries over 3,000 minutes, with 12 warehouses using 15 cores. With a core runtime of almost 2,000 minutes, the processing comes out to approximately $385 in estimated cost based on query elapsed time. We then have our main metrics broken down by month for number of queries, elapsed time and estimated cost, and then by execution status, warehouse size, number of queries by warehouse, queries by database, queries by query type, and queries by user.
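Estimating cost from query elapsed time, as the summary view does, can be sketched like this. The credits-per-hour table reflects Snowflake’s standard warehouse sizing (each size up doubles the rate), but the exact function and figures here are illustrative assumptions rather than the dashboard’s actual implementation.

```python
# Sketch of a cost estimate based on query elapsed time.
# Credits/hour follow standard Snowflake warehouse sizing; the
# $2/credit price mirrors the demo's configurable default.

CREDITS_PER_HOUR = {"X-Small": 1, "Small": 2, "Medium": 4, "Large": 8}
PRICE_PER_CREDIT = 2.00  # dollars per credit (configurable)

def estimated_query_cost(elapsed_minutes, warehouse_size):
    """Cost ~= elapsed hours * warehouse credits/hour * price per credit."""
    hours = elapsed_minutes / 60
    return hours * CREDITS_PER_HOUR[warehouse_size] * PRICE_PER_CREDIT

# Example: a query that ran for 90 minutes on a Medium warehouse
print(round(estimated_query_cost(90, "Medium"), 2))  # 12.0
```

Because this attributes the warehouse’s full hourly rate to a single query’s elapsed time, it is an upper-bound estimate rather than an exact billing figure.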
The filters allow us to drill down deeper into our queries to answer specific questions about our query data. For example, say we want to mitigate failed insert statements and need to evaluate which queries are the problem. To do this, we change our execution status to failed and our query type to insert. From here we can see the metrics have all filtered to our specifications. If we want to view the failed insert query details, we can then move to the Query Details view.
The Query Details view persists the filters from our summary view; we now see the individual query IDs of the failed insert statements for the last year. This view allows us to review the details and statistics for the overall query, including the elapsed runtime, the error message and the user that ran the query.
The Database Stats Dashboard provides insight to answer critical questions such as how storage is growing over time, the quantity of database objects over time, and how your data is being used across the organization. Let’s now review the Database Stats Dashboard. This dashboard provides insight into our Snowflake databases with customized timeframes and the ability to drill down by database size and database owner. Let’s take a quick look at the account-level summary of our database usage and storage. We have the number of databases, the database size and the fail-safe size. In this view, we can tell that the majority of our databases were created between 2019 and 2020, with growth leveling out from 2020 on. Below, the databases are sorted by size from largest to smallest, and then by database owner and the number of databases each owns. If we select a graph marker in the number of databases by created date view, we can drill down to see which databases were created on that day, as well as the owners that created them. If we select an owner under databases by owner, we can see which databases that owner owns. For example, if we choose Data Rocket admin, these are the databases that this owner owns.
And lastly, we will navigate to the Security Setup Dashboard. The Snowflake Security Setup Dashboard provides one place to view permissions across your organization. Users can filter by role, object type, object or privilege. The dashboard supports accelerated design of the data security model and auditing of existing frameworks. With the Snowflake security dashboard, we have the ability to map Snowflake user roles to Snowflake objects such as databases, schemas and tables, map Snowflake user roles to inherited roles, and view role and object security mappings in tabular format in the Details tab. With the object type roles selected in the filter, we can quickly choose a role to investigate. We can now see that the Data Rocket admin role inherits permissions from these five roles. If we choose the object type schema, we can now see the privileges that the Data Rocket admin role has for these schemas. We can even go further and choose one privilege to view for a more in-depth look into security. The security detail dashboard provides a comprehensive tabular view of these relationships: we have the object type, starting with role, and then the database and the privileges for the database, file format, function and so on, all the way to warehouse. This concludes our demo. Thank you for joining us to review Passerelle Snowflake Watch. We look forward to empowering your data journey to turn data into action. Ready to learn more? Request a complimentary consultation.