# Seismic Alerts Streamer

A service that uses Apache Kafka to listen for and collect real-time global seismic events from various producers and stream them to consumer clients that alert users and keep them updated and safe in real time. Additionally, all microservices are fully Dockerized, ready to run and deploy just about anywhere! Web UI

## Table of Contents

1. About The Project
2. Future Prospects
3. Technologies Used
4. Getting Started

## About The Project

At its core, Seismic Alerts Streamer uses Apache Kafka to listen for and collect real-time global seismic event data from producers and stream it to consumers.

*Architecture diagram for this microservice*

All producers are managed via a Python interface. The producers consist of:

  1. A WebSocket endpoint from the European-Mediterranean Seismological Centre (EMSC).
  2. A Flask REST API that lets a user report seismic activity around them (POST) or fetch log archives from the database (GET); a minimal sketch of these endpoints follows below.

*Intuitive error handling*
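A minimal sketch of how the two endpoints could look in Flask is shown below. The route and field names match the API described under Getting Started, but the handler bodies (an in-memory list standing in for the Kafka/Postgres pipeline) are illustrative only.

```python
# Minimal sketch of the producer-side REST API, assuming Flask; an
# in-memory list stands in for the real Kafka/Postgres pipeline.
from flask import Flask, jsonify, request

app = Flask(__name__)
events = []  # placeholder store for reported events

@app.route("/seismic_events", methods=["POST"])
def report_event():
    payload = request.get_json(silent=True)
    required = {"magnitude", "region", "time", "co_ordinates"}
    if not payload or not required.issubset(payload):
        return jsonify({"error": "missing required fields"}), 400
    events.append(payload)
    return jsonify({"status": "received"}), 201

@app.route("/seismic_events", methods=["GET"])
def fetch_events():
    return jsonify(events)

if __name__ == "__main__":
    app.run(port=5000)
```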

These events are then published to one of two Kafka topics, `minor_seismic_events` or `severe_seismic_events`, depending on their magnitude.
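The routing step might look roughly like this (a sketch assuming the `kafka-python` client and a broker at `localhost:9092`; the 4.5 magnitude cutoff is a placeholder, not the project's actual threshold):

```python
# Sketch of magnitude-based topic routing, assuming kafka-python and a
# broker at localhost:9092; the 4.5 threshold is illustrative only.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def publish_event(event: dict) -> None:
    """Send a seismic event to the topic matching its severity."""
    topic = ("severe_seismic_events"
             if event["magnitude"] >= 4.5
             else "minor_seismic_events")
    producer.send(topic, value=event)

publish_event({"magnitude": 5.2, "region": "Crete, Greece",
               "time": "2024-01-01T12:00:00", "co_ordinates": [35.3, 25.1]})
producer.flush()  # ensure buffered messages are delivered
```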

Consumers connected to the Kafka broker subscribe to these topics and start receiving seismic logs. All consumers are managed as clients via a multi-threaded Java interface. The consumers consist of:

  1. A live log feed that reads from both topics, letting the user conveniently view a real-time feed of all seismic activity around the world (the subscribe pattern is sketched after this list).

  2. A Java SMTP client that reads from `severe_seismic_events` and alerts the user of potentially dangerous seismic activity via email.

  3. A Postgres database, connected directly via a Kafka JDBC Sink Connector (initialized by the Java interface), that conveniently maintains an archive of all seismic activity recorded through Kafka.

  4. An interactive web UI (inspired by EMSC) featuring a map view of all seismic events, read from the Postgres database via our REST API.
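The real consumers are Java clients, but the subscribe pattern they follow can be sketched in a few lines of Python (again assuming `kafka-python` and a broker at `localhost:9092`):

```python
# Sketch of the consumers' subscribe pattern (the project's real clients
# are Java), assuming kafka-python and JSON-encoded event payloads.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "minor_seismic_events",
    "severe_seismic_events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for record in consumer:  # blocks, yielding records as they arrive
    event = record.value
    print(f"[{record.topic}] M{event['magnitude']} near {event['region']}")
```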

## Future Prospects

The overall goal of the project is to build a portable, efficient, and scalable system that connects various seismology providers to consumer clients that serve users in innovative ways, keeping them safe and alerted.

Apache Kafka's high throughput enables the system to scale easily to heavy traffic using many brokers and clusters. The service is also fully containerized, making it ready for deployment.

The service can also be extended and scaled by adding more seismic data providers, more consumer clients such as mobile apps and other safety protocols, and data-analytics tools on top of the web UI.

## Technologies Used

The project is developed using the following technologies, libraries, and frameworks:

  • Apache Kafka & Kafka Connect (Confluent)
  • Docker
  • Python
  • Java & Maven
  • React.js (JavaScript)
  • Tornado
  • Flask
  • Leaflet.js
  • PostgreSQL
  • JavaMail API & Google SMTP server
  • Shell

## Getting Started

To set up the project locally, follow the steps below.

  • Prerequisites:

    • Docker: Told you, it's fully containerized!

  • Running the project:

    • As a user

    1. Fork and clone the project to your local system.
    2. Set the necessary environment variables: create a `.env` file in the root directory and put your Gmail credentials in it for the Gmail SMTP server. This email will be used to send the alerts to the users. Name the variables as follows:
    [email protected]
    SERVICE_EMAIL_PASSWORD=password
    

    If you have 2-Step Verification set up for your Google Account, create a new app password and use that password in your `.env` file.


    3. Now build and run the Docker Compose file and `exec -it` into the consumers container. To do this, run from the project directory:

    ```sh
    docker compose up -d --build   # Build & run the containers
    docker exec -it <containerId_of_consumers> sh   # Attach a shell to the consumers container
    ```

    This should spin up all of the project's containers.

    To find the `<containerId_of_consumers>`, run `docker ps` or use Docker Desktop, and copy the container ID of the container named `consumers` or `seismic-alerts-streamer-consumers`.


    4. Once inside the consumers container's shell, run:

    ```sh
    mvn -q exec:java   # Run the Maven project
    ```

You have successfully entered the program as a user!

  • Usage of the REST API

    The REST API, accessible at `localhost:5000`, has the following endpoints:

  1. `POST /seismic_events`: Self-report seismic activity in your region (a client example follows the endpoint list).

     The request body should adhere to the following format for a successful submission:

```json
{
  "magnitude": "float",
  "region": "string",
  "time": "ISO-8601 string. Min: YYYY, Max: YYYY-MM-DD(T)hh:mm:ss.ssssss(Zone)",
  "co_ordinates": "array of floats"
}
```

  2. `GET /seismic_events`: Returns the entire recorded archive from the database.

  3. `GET /seismic_events/minor` or `GET /seismic_events/severe`: Returns records filtered by the severity of the seismic activity.
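For example, reporting an event and querying the archive could look like this (a sketch assuming the Python `requests` library and the service running at `localhost:5000`; the field values are made up):

```python
# Sketch of a client for the REST API, assuming the requests library and
# the service at localhost:5000; all field values below are illustrative.
import requests

BASE = "http://localhost:5000/seismic_events"

# Self-report a seismic event (POST /seismic_events)
event = {
    "magnitude": 3.1,
    "region": "Reykjanes Peninsula, Iceland",
    "time": "2024-01-01T12:00:00",
    "co_ordinates": [63.9, -22.4],
}
print(requests.post(BASE, json=event).status_code)

# Fetch only the severe records (GET /seismic_events/severe)
print(requests.get(f"{BASE}/severe").json())
```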
  • For Developers

    All microservices are set up with meaningful logs that can help with debugging and further development.

    To view the logs of other containers, run:

    ```sh
    docker logs <container_id>
    ```

    If you wish to develop locally without Docker, manually install the technologies used, then run the following commands to install the necessary dependencies:

    ```sh
    cd Producers && pip install -r requirements.txt   # Python producer dependencies
    cd ../Consumers && mvn clean install              # Java consumer dependencies
    cd ../Producers/Web && npm install                # Web UI dependencies
    ```