datadb/WeatherDataProject


DATASOURCE

GlobalLandTemperaturesByCountry.csv

from the Climate Change: Earth Surface Temperature Data dataset on Kaggle
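If you do not already have the CSV, it can be fetched with the Kaggle CLI; the dataset slug below is an assumption based on the dataset title, so verify it on the Kaggle page:

# download the Kaggle dataset (slug assumed from the dataset title; verify on Kaggle)
kaggle datasets download -d berkeleyearth/climate-change-earth-surface-temperature-data
# extract only the file used by this project
unzip climate-change-earth-surface-temperature-data.zip GlobalLandTemperaturesByCountry.csv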

SCOPE

INSTRUCTIONS

clone the repo to your local/virtual machine

cp dev.env .env

project name: MyWeatherDataProject

project ID: myweatherdataproject
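A minimal sketch of the setup steps above; the repo URL is taken from this page, and the gcloud project creation assumes the name and ID listed:

# clone the repository and copy the environment template
git clone https://github.com/datadb/WeatherDataProject.git
cd WeatherDataProject
cp dev.env .env

# create the GCP project with the name/ID above
# (project IDs are globally unique, so you may need to pick a variant)
gcloud projects create myweatherdataproject --name="MyWeatherDataProject"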

CREATE A GOOGLE STORAGE BUCKET
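For example, with gsutil (the bucket name and US location are placeholders to replace with your own choices):

# create the bucket in the project; <your-bucket-name> and the US location are placeholders
gsutil mb -p myweatherdataproject -l US gs://<your-bucket-name>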

CREATE A SERVICE ACCOUNT WITH THE FOLLOWING ROLES (a gcloud sketch follows this list)

Viewer (roles/viewer)

Storage Admin (roles/storage.admin)

Storage Object Admin (roles/storage.objectAdmin)

BigQuery Admin (roles/bigquery.admin)

BigQuery Data Owner (roles/bigquery.dataOwner)
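A gcloud sketch of the service-account setup above; the account name weather-sa is a hypothetical placeholder:

# create the service account (the name weather-sa is a placeholder)
gcloud iam service-accounts create weather-sa --display-name="weather-sa"

# bind each of the roles listed above to the new service account
for ROLE in roles/viewer roles/storage.admin roles/storage.objectAdmin roles/bigquery.admin roles/bigquery.dataOwner; do
  gcloud projects add-iam-policy-binding myweatherdataproject \
    --member="serviceAccount:weather-sa@myweatherdataproject.iam.gserviceaccount.com" \
    --role="$ROLE"
done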

CREATE A JSON KEY FOR THIS SERVICE ACCOUNT

download it and rename the file to my-creds.json

CREATE A keys/ FOLDER IN YOUR HOME DIRECTORY AND PLACE THE my-creds.json FILE THERE

COPY THE my-creds.json FILE INTO THE mage/ FOLDER (see the sketch below)
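Putting the key steps together (still assuming the hypothetical weather-sa account from the sketch above):

# create the key directly as my-creds.json (instead of downloading and renaming)
gcloud iam service-accounts keys create my-creds.json \
  --iam-account="weather-sa@myweatherdataproject.iam.gserviceaccount.com"

# place the key in ~/keys and copy it into the repo's mage/ folder
mkdir -p ~/keys
cp my-creds.json ~/keys/
cp my-creds.json mage/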

follow the Terraform instructions below

follow the Mage instructions below

follow the BigQuery instructions below

create your dashboard in Looker Studio

Tools/ Resources Used

1. Terraform

2. Mage

3. GCP Cloud Storage

4. GCP BigQuery

5. Docker/docker-compose

6. Google Looker Studio

Change into the Terraform folder with cd and run:

terraform init

terraform plan

terraform apply
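Before running Terraform, it needs credentials for GCP; a common approach, assumed here since the repo's Terraform variables are not shown, is to point GOOGLE_APPLICATION_CREDENTIALS at the key created earlier:

# point Terraform (and other Google tools) at the service-account key
export GOOGLE_APPLICATION_CREDENTIALS=~/keys/my-creds.json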

Change into the mage/ folder with cd and run:

docker compose build

docker compose up -d

forward port 6789 if not automatically forwarded
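Once the container is up, you can check that the Mage UI answers on its default port (6789, as noted above):

# confirm the Mage UI is reachable (forward port 6789 first if you are on a remote VM)
curl -I http://localhost:6789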

run the .py scripts in the following order:

load_weather_data_dt.py

transform_weather_data_dt.py

export_weather_data_gs.py

load_weather_data_from_parguet.py

export_weather_data_bigquery.py
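After export_weather_data_gs.py has run, a quick sanity check that the export landed in the bucket (same placeholder bucket name as before):

# confirm the exported files are in the bucket
gsutil ls gs://<your-bucket-name>/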

@ the BigQuery console

run the sql.sql query to create 2 tables

(partitioned_clustered and report)
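The same script can also be run from the command line with the bq CLI; the assumption here is that sql.sql sits at the repo root:

# run the repo's SQL script without opening the console (path to sql.sql assumed)
bq query --use_legacy_sql=false "$(cat sql.sql)"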

@ Looker Studio

create a data source from the report table

create charts

when you are done, do not forget to run

terraform destroy

to delete your Google Cloud resources
