pino-elasticsearch

Load pino logs into Elasticsearch.

Install

npm install pino-elasticsearch -g

Usage

  pino-elasticsearch

  To send pino logs to Elasticsearch:

     cat log | pino-elasticsearch --host 192.168.1.42

  If using AWS Elasticsearch:

     cat log | pino-elasticsearch --host https://your-url.us-east-1.es.amazonaws.com --port 443 -c ./aws_config.json

  Flags
  -h  | --help              Display Help
  -v  | --version           Display Version
  -H  | --host              the IP address of Elasticsearch; default: 127.0.0.1
  -p  | --port              the port of Elasticsearch; default: 9200
  -i  | --index             the name of the index to use; default: pino;
                            will replace %{DATE} with the YYYY-MM-DD date
  -t  | --type              the name of the type to use; default: log
  -b  | --size              the number of documents for each bulk insert
  -l  | --trace-level       trace level for the Elasticsearch client; default: 'error' (also: info, debug, trace)
  -c  | --aws-credentials   path to aws_config.json (if using AWS Elasticsearch)

You can then use Kibana to browse and visualize your logs.
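
Any process that writes pino-formatted JSON to stdout can be piped into pino-elasticsearch. A minimal sketch, assuming a hypothetical app.js that is not part of this repository:

    // app.js: illustrative only, not part of this repository
    'use strict'

    const pino = require('pino')
    const logger = pino()

    logger.info('hello world')
    logger.error({ reqId: 42 }, 'something failed')

Its output can then be shipped to a dated index, for example:

     node app.js | pino-elasticsearch --host 192.168.1.42 --index pino-%{DATE}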

Setup and Testing

Setting up pino-elasticsearch is easy, and you can use the bundled docker-compose.yml file to bring up both Elasticsearch and Kibana.

You will need docker and docker-compose; then, in this project folder, launch docker-compose up.

You can test it by running node example | pino-elasticsearch in this project folder, as shown below. Note that pino-elasticsearch must be installed globally.
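
Put together, the local test loop looks roughly like this (assuming docker, docker-compose, and a global pino-elasticsearch install):

     # start Elasticsearch and Kibana from the bundled docker-compose.yml
     docker-compose up

     # in a second terminal, from this project folder
     node example | pino-elasticsearch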

Acknowledgements

This project was kindly sponsored by nearForm.

License

Licensed under MIT.
