This standalone Meltano project, together with the Matatika Lab, is a quick and easy way to monitor your Snowflake costs.
NB - Currently this project is only supported on Linux and macOS.
To run this project you will need:

- Docker - https://docs.docker.com/get-docker/
- Your Snowflake database credentials
Using Matatika, you can run this example with only Docker, and we create all of the following for you:
- Postgres data warehouse
- Meltano jobs for running dbt models
- Lab (UI for Meltano) to run and schedule jobs
- Simple charts that can be embedded anywhere https://github.com/Matatika/dataset-component-example
You can run this as a standalone Meltano project. You will need to provide all your Snowflake credentials for the dbt plugin through `meltano config` or a `.env` file, then run `meltano run dbt:deps dbt:run`. Finally, you can check your processed Snowflake cost data in the new tables in your Snowflake database.
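For the standalone route, the credentials can live in a `.env` file at the project root. The variable names below are illustrative assumptions, not the project's documented settings; run `meltano config dbt list` in your project to see the exact setting and environment variable names the dbt plugin expects:

```shell
# .env - example Snowflake credentials for the dbt plugin
# NOTE: these variable names are assumptions; confirm with `meltano config dbt list`
DBT_SNOWFLAKE_ACCOUNT=xy12345.eu-west-1   # account identifier (placeholder)
DBT_SNOWFLAKE_USER=COST_MONITOR_USER
DBT_SNOWFLAKE_PASSWORD=change-me
DBT_SNOWFLAKE_ROLE=ACCOUNTADMIN           # or any role with access to your usage data
DBT_SNOWFLAKE_WAREHOUSE=COMPUTE_WH
DBT_SNOWFLAKE_DATABASE=ANALYTICS
```

With the file in place, `meltano run dbt:deps dbt:run` installs the dbt package dependencies and builds the models.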
Alternatively, you can follow the steps below to configure your project through a UI and see datasets in the Matatika Lab:
- Clone and start up the project:

  ```sh
  git clone git@github.com:Matatika/snowflake-cost-monitoring.git
  cd snowflake-cost-monitoring
  meltano install
  meltano invoke matatika lab
  ```
- Your web browser automatically opens https://localhost:3443
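If the browser does not open automatically, one way to confirm the Lab is reachable is a quick probe from a terminal. This is a sketch assuming the default port 3443 and the Lab's self-signed certificate:

```shell
# -k skips TLS verification (self-signed cert), -s silences progress, -f fails on HTTP errors
curl -ksf https://localhost:3443 -o /dev/null && echo "Lab is up"
```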
- You will see a task to complete: *Complete your 'Snowflake' store configuration*. Click **LET'S GO**, fill in your Snowflake credentials, and click **SAVE**
- On the left-hand menu, go to the **Stores** screen, click the three dots at the end of the **Snowflake** data store, and click **Make default**
- Go to the **Pipelines** screen and click the Run (play) button on the **Cost analysis** pipeline. This should take less than 2 minutes; you can check the job logs by expanding the pipeline with the button to the right of the Run (play) button
- When this pipeline has completed, go to the **Datasets** screen to see insights into your Snowflake costs!
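To double-check the results outside the Lab, you could also query the tables the dbt run created. This assumes you have the SnowSQL CLI installed and configured; the database and schema names below are placeholders, so use whatever your dbt plugin is configured to write to:

```shell
# List the tables produced by the dbt models (ANALYTICS.PUBLIC is a placeholder)
snowsql -q "SHOW TABLES IN SCHEMA ANALYTICS.PUBLIC;"
```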
Join our community on the Matatika Slack to get help and updates.
You can read more about Matatika and our Lab in our Documentation.