Countdown Innovation Challenge

This repo contains participant information for the Countdown Innovation Challenge event. Please use the information in this repo to explore the data and develop operational apps.

Summary

EMBRACE A SUSTAINABLE FUTURE AND HARNESS YOUR INNOVATIVE IDEAS TO CREATE A GREENER ENVIRONMENT FOR ALL

The focus for the event is anchored to our strategic theme of Good and Green. A few examples of topics that align with the theme include:

  1. Reduction of food waste from farm to fork
  2. Increased energy efficiency in stores
  3. Decarbonisation of our supply chain
  4. Reduction of water usage
  5. Other environmental sustainability areas

Please use this repo to explore and develop ideas with your team, drawing on the examples provided.

Please find the project links for your team.

Team Resources

Every team will be provisioned with a dedicated GCP project with project Owner permissions. The project naming convention is <team_name>-<random_suffix>.
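As a quick illustration of the naming convention, the sketch below builds a project ID of the form <team_name>-<random_suffix>. The suffix length and generation method are assumptions for illustration only; the actual provisioning is done for you.

```python
import secrets


def project_id(team_name: str) -> str:
    """Illustrative only: build a project ID following the
    <team_name>-<random_suffix> convention described above.
    The 4-character hex suffix is an assumption, not the real scheme."""
    suffix = secrets.token_hex(2)  # 4 hex characters, e.g. 'a3f9'
    return f"{team_name}-{suffix}"


print(project_id("green-team"))  # e.g. green-team-a3f9
```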

By default, the following services/APIs will be enabled for those team projects. Other services/APIs can be enabled when required.

  • aiplatform.googleapis.com
  • artifactregistry.googleapis.com
  • bigquery.googleapis.com
  • compute.googleapis.com
  • cloudbuild.googleapis.com
  • cloudfunctions.googleapis.com
  • datacatalog.googleapis.com
  • dataflow.googleapis.com
  • datastudio.googleapis.com
  • dlp.googleapis.com
  • eventarc.googleapis.com
  • logging.googleapis.com
  • sourcerepo.googleapis.com
  • run.googleapis.com
  • pubsub.googleapis.com
  • monitoring.googleapis.com
  • notebooks.googleapis.com

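Before requesting a new API, it can help to check which ones your workload needs beyond the defaults. The sketch below does that as a simple set difference; the abridged `ENABLED_APIS` set and the `secretmanager.googleapis.com` example are illustrative assumptions (it is simply one well-known API not in the default list above).

```python
# Abridged copy of the default API list above; the full list is in the
# section this example follows.
ENABLED_APIS = {
    "aiplatform.googleapis.com",
    "bigquery.googleapis.com",
    "cloudbuild.googleapis.com",
    "run.googleapis.com",
    "pubsub.googleapis.com",
}


def missing_apis(required, enabled=ENABLED_APIS):
    """Return the required APIs that are not enabled by default,
    so you know which ones to ask to have enabled."""
    return sorted(set(required) - set(enabled))


print(missing_apis(["run.googleapis.com", "secretmanager.googleapis.com"]))
# → ['secretmanager.googleapis.com']
```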
By default, each team will be provided with a data landing zone Cloud Storage bucket. The naming convention is <team_name>-bigquery-csv-import. Please note that we only support the CSV format at the moment, and CSV files must be uploaded to the bucket root. We are working on supporting more data formats and nested directory structures.
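The two landing-zone rules above (CSV only, bucket root only) can be checked before you upload. The sketch below validates a local path and returns the target gs:// URI; the function name is an illustrative assumption, and <team_name> is left as a placeholder.

```python
import pathlib

BUCKET = "<team_name>-bigquery-csv-import"  # per the naming convention above


def landing_zone_uri(local_path: str) -> str:
    """Check a file against the landing-zone rules (CSV format only,
    bucket root only) and return the gs:// URI it should be uploaded to."""
    name = pathlib.PurePosixPath(local_path).name
    if not name.lower().endswith(".csv"):
        raise ValueError(f"only CSV files are supported, got {name!r}")
    # Nested directories are not supported, so only the base name is kept.
    return f"gs://{BUCKET}/{name}"


print(landing_zone_uri("data/waste/waste_2022.csv"))
# → gs://<team_name>-bigquery-csv-import/waste_2022.csv
```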

By default, each team will be provided with a Cloud Source Repository. Please use this command to clone it locally: gcloud source repos clone countdown-<team_name>-repo --project=<project_id>.

By default, each team will be provisioned with a Vertex AI managed notebook instance (JupyterLab) with owner permission. The imported datasets should be accessible via the GCS bucket browser on the instance.

Datasets

These datasets are available to teams, either loaded into your BigQuery project or available in the storage bucket gs://<team_name>-bigquery-csv-import.

The following custom datasets, grouped by category, are available in the GCS bucket:

  • Retail Data Analytics
  • Environmental Store Data
  • Transport Data (how long goods spend in trucks)
  • Waste Data
  • Grocery Products
  • Emissions Data
  • Recipe Data

Public Datasets

In addition to the supplied datasets, Google also has a repository of sample datasets, which can be found under BigQuery Public Datasets.

Please feel free to add them via the "ADD DATA" button at the top left of your BigQuery Explorer.
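Once a public dataset is added, you query it in BigQuery Standard SQL with a fully qualified, backtick-quoted table reference. The sketch below builds such a reference; `samples.shakespeare` is one of the long-standing public sample tables, and the helper name is an illustrative assumption. (The query string is only constructed here, not executed.)

```python
def public_table(dataset: str, table: str) -> str:
    """Build a fully qualified, backtick-quoted reference to a table in
    the bigquery-public-data project, as used in BigQuery Standard SQL."""
    return f"`bigquery-public-data.{dataset}.{table}`"


# Example: the five most frequent words in the shakespeare sample table.
query = (
    "SELECT word, SUM(word_count) AS n "
    f"FROM {public_table('samples', 'shakespeare')} "
    "GROUP BY word ORDER BY n DESC LIMIT 5"
)
print(query)
```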

Sample Apps

These are available as simple one-click deployments for hosting applications on managed Cloud Run.

| Framework | Description | Deploy |
| --- | --- | --- |
| [React.js](boilerplate-react) | React Sample | Run on Google Cloud |
| Sapper.js | Sapper Sample | Run on Google Cloud |
| Svelte Kit | SvelteKit with TailwindCSS | Run on Google Cloud |
| Nuxt.js | Nuxt.js with TailwindCSS and TypeScript | Run on Google Cloud |
| Next.js | Next.js with TailwindCSS | Run on Google Cloud |

Resources

Additional information and resources are available at the links below:

Additional Resources
