From 841f5be18bc5a2a9c5f65dabafae4b9f8a4904b6 Mon Sep 17 00:00:00 2001 From: morenol <22335041+morenol@users.noreply.github.com> Date: Wed, 19 Jun 2024 23:07:26 +0000 Subject: [PATCH] [create-pull-request] automated change --- content/connectors/inbound/http.md | 2 +- content/connectors/inbound/kafka.md | 2 +- content/connectors/inbound/mqtt.md | 2 +- content/connectors/outbound/duckdb.md | 2 +- content/connectors/outbound/graphite.md | 2 +- content/connectors/outbound/sql.md | 2 +- embeds/connectors/inbound/kafka.md | 7 +- embeds/connectors/inbound/mqtt.md | 5 +- embeds/connectors/outbound/duckdb.md | 12 ++- embeds/connectors/outbound/graphite.md | 136 +----------------------- embeds/connectors/outbound/http.md | 6 +- embeds/connectors/outbound/kafka.md | 35 +++++- embeds/connectors/outbound/sql.md | 43 ++++++-- 13 files changed, 92 insertions(+), 164 deletions(-) diff --git a/content/connectors/inbound/http.md b/content/connectors/inbound/http.md index 8068ac982..30b849b23 100644 --- a/content/connectors/inbound/http.md +++ b/content/connectors/inbound/http.md @@ -1,5 +1,5 @@ --- -menu: HTTP +title: HTTP --- {{% inline-embed file="embeds/connectors/inbound/http.md" %}} \ No newline at end of file diff --git a/content/connectors/inbound/kafka.md b/content/connectors/inbound/kafka.md index fca54c8be..ae8bc1eb1 100644 --- a/content/connectors/inbound/kafka.md +++ b/content/connectors/inbound/kafka.md @@ -1,5 +1,5 @@ --- -menu: Kafka +title: Kafka --- {{% inline-embed file="embeds/connectors/inbound/kafka.md" %}} \ No newline at end of file diff --git a/content/connectors/inbound/mqtt.md b/content/connectors/inbound/mqtt.md index 668e85567..67ad50c5b 100644 --- a/content/connectors/inbound/mqtt.md +++ b/content/connectors/inbound/mqtt.md @@ -1,5 +1,5 @@ --- -menu: MQTT +title: MQTT --- {{% inline-embed file="embeds/connectors/inbound/mqtt.md" %}} \ No newline at end of file diff --git a/content/connectors/outbound/duckdb.md b/content/connectors/outbound/duckdb.md 
index 5ccdb55d2..1cf2ba5cd 100644 --- a/content/connectors/outbound/duckdb.md +++ b/content/connectors/outbound/duckdb.md @@ -1,5 +1,5 @@ --- -menu: DuckDB +title: DuckDB --- {{% inline-embed file="embeds/connectors/outbound/duckdb.md" %}} \ No newline at end of file diff --git a/content/connectors/outbound/graphite.md b/content/connectors/outbound/graphite.md index 41ba7b10c..c3115b1b9 100644 --- a/content/connectors/outbound/graphite.md +++ b/content/connectors/outbound/graphite.md @@ -1,5 +1,5 @@ --- -menu: Graphite +title: Graphite --- {{% inline-embed file="embeds/connectors/outbound/graphite.md" %}} \ No newline at end of file diff --git a/content/connectors/outbound/sql.md b/content/connectors/outbound/sql.md index dfd499022..c986f0b9a 100644 --- a/content/connectors/outbound/sql.md +++ b/content/connectors/outbound/sql.md @@ -1,5 +1,5 @@ --- -menu: SQL +title: SQL --- {{% inline-embed file="embeds/connectors/outbound/sql.md" %}} \ No newline at end of file diff --git a/embeds/connectors/inbound/kafka.md b/embeds/connectors/inbound/kafka.md index f989ebaf1..c4ea1a138 100644 --- a/embeds/connectors/inbound/kafka.md +++ b/embeds/connectors/inbound/kafka.md @@ -17,7 +17,7 @@ Example: ```yaml apiVersion: 0.1.0 meta: - version: 0.2.5 + version: 0.2.8 name: my-kafka-connector type: kafka-source topic: kafka-topic @@ -28,12 +28,9 @@ kafka: ``` ### Usage - To try out Kafka Source connector locally, you can use Fluvio CDK tool: - -%copy% ```bash -$ cdk deploy -p kafka-source start --config crates/kafka-source/sample-config.yaml +cdk deploy -p kafka-source start --config crates/kafka-source/config-example.yaml ``` ## Transformations diff --git a/embeds/connectors/inbound/mqtt.md b/embeds/connectors/inbound/mqtt.md index bf07925d5..38db4b0bd 100644 --- a/embeds/connectors/inbound/mqtt.md +++ b/embeds/connectors/inbound/mqtt.md @@ -50,7 +50,7 @@ This is an example of connector config file: # config-example.yaml apiVersion: 0.1.0 meta: - version: 0.2.5 + version: 0.2.7 
name: my-mqtt-connector type: mqtt-source topic: mqtt-topic @@ -65,7 +65,6 @@ mqtt: ``` Run connector locally using `cdk` tool (from root directory or any sub-directory): - ```bash cdk deploy start --config config-example.yaml @@ -104,7 +103,7 @@ The previous example can be extended to add extra transformations to outgoing re # config-example.yaml apiVersion: 0.1.0 meta: - version: 0.2.5 + version: 0.2.7 name: my-mqtt-connector type: mqtt-source topic: mqtt-topic diff --git a/embeds/connectors/outbound/duckdb.md b/embeds/connectors/outbound/duckdb.md index 8bc504bba..58c2c799b 100644 --- a/embeds/connectors/outbound/duckdb.md +++ b/embeds/connectors/outbound/duckdb.md @@ -41,7 +41,7 @@ To connect to Motherduck server, use prefix: `md`. For example, `md://motherduc ```yaml apiVersion: 0.1.0 meta: - version: 0.1.0 + version: 0.1.3 name: duckdb-connector type: duckdb-sink topic: fluvio-topic-source @@ -99,7 +99,7 @@ Connector configuration file: apiVersion: 0.1.0 meta: version: 0.1.0 - name: duckdb-connector + name: json-sql-connector type: duckdb-sink topic: sql-topic create-topic: true @@ -127,14 +127,16 @@ transforms: ``` You can use Fluvio `cdk` tool to deploy the connector: - +```bash +fluvio install cdk +``` +and then: ```bash cdk deploy start --config connector-config.yaml ``` - To delete the connector run: ```bash -cdk deploy shutdown --name duckdb-connector +cdk deploy shutdown --config connector-config.yaml ``` After you run the connector you will see records in your database table. diff --git a/embeds/connectors/outbound/graphite.md b/embeds/connectors/outbound/graphite.md index 0a977bd1c..a8c23c378 100644 --- a/embeds/connectors/outbound/graphite.md +++ b/embeds/connectors/outbound/graphite.md @@ -1,21 +1,16 @@ -# InfinyOn Graphite Sink Connector -The [Graphite][4] Sink connector reads records from Fluvio topic and sends them to -the configured Graphite's Metric using the PlainText approach. 
+# graphite-sink +Graphite Metrics Server Fluvio Sink Connector -# Configuration +## Usage This connector establishes a TCP stream against the specified host on Graphite, -records are sent as UTF-8 encoded strings following [Graphite's PlainText][5] format. - -The following example connector configuration can be used to send records to -the Graphite Metric `weather.temperature.ca.sandiego`; the Graphite TCP -server address is specified in the `addr` field. +records are sent as UTF-8 encoded strings following Graphite's PlainText format. ```yaml # sample-config.yaml apiVersion: 0.1.0 meta: - version: 0.1.2 + version: 0.1.0 name: my-graphite-connector-test-connector type: graphite-sink topic: test-graphite-connector-topic @@ -24,124 +19,3 @@ graphite: metric-path: "weather.temperature.ca.sandiego" addr: "localhost:2003" ``` - -## Configuration Fields - -| Model | Data Type | Description | -|:----------------|:----------|:------------------------------------| -| `metric-path` | `String` | Graphite Metric to send records to | -| `addr` | `String` | Graphite TCP Address to stream out | - -## Usage - -This section will walk you through the process of setting up a Graphite -instance and using Fluvio to send metrics to this Graphite instance. - -> This section assumes you have Docker and Fluvio installed in your system. - -### Setting Up Graphite - -We will run our Graphite instance on Docker using the `docker compose` command -for simplicity. - -The Graphite container will set up [Carbon Configuration][6] files in your -working directory; we need to update these files to reduce Carbon's persistence -intervals, making them more frequent. - -Create a copy of our [`docker-compose.yml`][7] file and execute the container: - -```bash -docker compose up --build -d -``` - -This will generate a directory with the name `.graphite`, which contains configuration files.
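The PlainText format referenced above is a one-line-per-metric wire format. The following Python snippet is an illustrative sketch only, not part of the connector; the metric path and port reuse the example values from the config above, and `send_metric` assumes a reachable Carbon listener:

```python
import socket
import time


def format_plaintext(metric_path, value, timestamp=None):
    # Graphite plaintext protocol: one "<path> <value> <timestamp>\n" line per metric.
    ts = int(time.time()) if timestamp is None else timestamp
    return f"{metric_path} {value} {ts}\n".encode("utf-8")


def send_metric(host, port, metric_path, value):
    # The sink keeps one TCP stream open; a connection per write keeps this sketch short.
    with socket.create_connection((host, port)) as sock:
        sock.sendall(format_plaintext(metric_path, value))


line = format_plaintext("weather.temperature.ca.sandiego", 120, 1700000000)
# line == b"weather.temperature.ca.sandiego 120 1700000000\n"
```

Any tool that writes this line to Carbon's TCP port produces the same result the connector does for one record.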
- -Replace the contents of `.graphite/conf/storage-schemas.conf` to record on an -interval of 10 seconds and persist the last 12 hours of data. - -```conf -[all] -pattern = .* -retentions = 10s:12h -``` - -Now we need to re-run the Graphite container so Carbon uses the new -configuration. - -```bash -docker compose down -docker compose up --build -d -``` - -You can visit `http://localhost:12345` in your browser to access the Dashboard. - -> Credentials for the Dashboard are User: `root` and Password: `root` - -With the Graphite instance set, we can move into [Setting Up Fluvio with Graphite Sink Connector][8]. - -### Setting Up Fluvio with Graphite Sink Connector - -In this section we are going to use the CDK to spin up the Graphite Sink Connector -to send metrics from Fluvio Records to the Graphite instance. - -Make sure the Connector Development Kit is set up in your system by issuing the following command in your terminal. - -%copy% -```bash -cdk -``` - -> If you don't have the Fluvio CLI installed already, visit the [CLI][2] section - -Create a YAML file with the name `weather-monitor-config.yaml` and specify connector settings: - -%copy% -```yaml -apiVersion: 0.1.0 -meta: - version: 0.1.2 - name: weather-monitor-sandiego - type: graphite-sink - topic: weather-ca-sandiego -graphite: - # https://graphite.readthedocs.io/en/latest/feeding-carbon.html#step-1-plan-a-naming-hierarchy - metric-path: "weather.temperature.ca.sandiego" - addr: "localhost:2003" -``` - -Deploy the Connector using the CDK: - -```bash -cdk deploy start --config weather-monitor-config.yaml -``` - -> Make sure your Graphite instance is running on `localhost:2003`; use the `cdk log` subcommand to read logs from the connector instance. - -Then produce records as usual: - -%copy% -```bash -echo 120 | fluvio produce weather-ca-sandiego -``` - -> Remember that Carbon's retention is set to `10s:12h`; this means that it will
- -Use Graphite's REST API to check on the stored data. - -%copy% -```bash -curl -o ./data.json http://localhost:12345/render\?target\=weather.temperature.ca.sandiego\&format\=json\&noNullPoints -``` - - -[1]: https://infinyon.cloud/login -[2]: https://www.fluvio.io/cli/ -[3]: https://github.com/infinyon/graphite-sink-connector/blob/main/CONTRIBUTING.md -[4]: https://graphiteapp.org/ -[5]: https://graphite.readthedocs.io/en/latest/feeding-carbon.html#the-plaintext-protocol -[6]: https://graphite.readthedocs.io/en/latest/config-carbon.html#storage-schemas-conf -[7]: https://github.com/infinyon/graphite-sink-connector/blob/main/docker-compose.yml -[8]: #setting-up-fluvio-with-graphite-sink-connector diff --git a/embeds/connectors/outbound/http.md b/embeds/connectors/outbound/http.md index d7d89ec8a..2bf49f2d3 100644 --- a/embeds/connectors/outbound/http.md +++ b/embeds/connectors/outbound/http.md @@ -159,15 +159,15 @@ In this case, additional transformation will be performed before records are sen Read more about [JSON to JSON transformations](https://www.fluvio.io/smartmodules/certified/jolt/). ### Offset Management -Fluvio Consumer Offset feature allows for a connector to store the offset in the Fluvio cluster and use it on restart. -To activate it, you need to provide the `consumer` name and set the `strategy: auto`. +Fluvio Consumer Offset feature allows for a connector to store the offset in the Fluvio cluster and use it on restart. +To activate it, you need to provide the `consumer` name and set the `strategy: auto`. 
See the example below: ```yaml apiVersion: 0.2.0 meta: version: 0.2.9 name: my-http-sink - type: http-sink + type: http-sink topic: meta: name: http-sink-topic diff --git a/embeds/connectors/outbound/kafka.md b/embeds/connectors/outbound/kafka.md index f481f9d6c..335671ac8 100644 --- a/embeds/connectors/outbound/kafka.md +++ b/embeds/connectors/outbound/kafka.md @@ -29,7 +29,7 @@ Example without security: ```yaml apiVersion: 0.1.0 meta: - version: 0.2.7 + version: 0.2.9 name: my-kafka-connector type: kafka-sink topic: kafka-topic @@ -44,7 +44,7 @@ Example with security enabled: ```yaml apiVersion: 0.1.0 meta: - version: 0.2.7 + version: 0.2.9 name: my-kafka-connector type: kafka-sink topic: kafka-topic @@ -68,9 +68,38 @@ kafka: ### Usage To try out the Kafka Sink connector locally, you can use the Fluvio CDK tool: +```bash +cdk deploy -p kafka-sink start --config crates/kafka-sink/config-example.yaml +``` + +### Offset Management +The Fluvio Consumer Offset feature allows a connector to store the offset in the Fluvio cluster and use it on restart. +To activate it, you need to provide the `consumer` name and set the `strategy: auto`. +See the example below: +```yaml +apiVersion: 0.2.0 +meta: + version: 0.2.9 + name: my-kafka-connector + type: kafka-sink + topic: + meta: + name: kafka-sink-topic + consumer: + id: my-kafka-sink + offset: + strategy: auto +kafka: + url: "localhost:9092" + topic: fluvio-topic + create-topic: true +``` +After the connector has processed any records, you can check the last stored offset value via: ```bash -cdk deploy -p kafka-sink start --config crates/kafka-sink/sample-config.yaml +$ fluvio consumer list + CONSUMER TOPIC PARTITION OFFSET LAST SEEN + my-kafka-sink kafka-sink-topic 0 0 3s ``` ### Testing with security diff --git a/embeds/connectors/outbound/sql.md b/embeds/connectors/outbound/sql.md index 830b3d8de..06c99253a 100644 --- a/embeds/connectors/outbound/sql.md +++ b/embeds/connectors/outbound/sql.md @@ -40,7 +40,7 @@ in the config.
If a SmartModule requires configuration, it is passed via `with` ```yaml apiVersion: 0.1.0 meta: - version: 0.3.3 + version: 0.4.2 name: my-sql-connector type: sql-sink topic: sql-topic @@ -62,7 +62,7 @@ The connector can use secrets in order to hide sensitive information. ```yaml apiVersion: 0.1.0 meta: - version: 0.3.3 + version: 0.4.2 name: my-sql-connector type: sql-sink topic: sql-topic @@ -71,6 +71,37 @@ meta: sql: url: ${{ secrets.DATABASE_URL }} ``` + +### Offset Management +The Fluvio Consumer Offset feature allows a connector to store the offset in the Fluvio cluster and use it on restart. +To activate it, you need to provide the `consumer` name and set the `strategy: auto`. +See the example below: +```yaml +apiVersion: 0.2.0 +meta: + version: 0.4.2 + name: my-sql-connector + type: sql-sink + topic: + meta: + name: sql-sink-topic + consumer: + id: my-sql-sink + offset: + strategy: auto + secrets: + - name: DATABASE_URL +sql: + url: ${{ secrets.DATABASE_URL }} +``` + +After the connector has processed any records, you can check the last stored offset value via: +```bash +$ fluvio consumer list + CONSUMER TOPIC PARTITION OFFSET LAST SEEN + my-sql-sink sql-sink-topic 0 0 3s +``` + ## Insert Usage Example Let's look at an example of the connector with one transformation named [infinyon/json-sql](https://github.com/infinyon/fluvio-connectors/blob/main/smartmodules/json-sql/README.md). The transformation takes records in JSON format and creates a SQL insert operation for the `topic_message` table.
The value from `device.device_id` @@ -95,7 +126,7 @@ Connector configuration file: # connector-config.yaml apiVersion: 0.1.0 meta: - version: 0.3.3 + version: 0.4.2 name: json-sql-connector type: sql-sink topic: sql-topic @@ -124,16 +155,12 @@ transforms: ``` You can use Fluvio `cdk` tool to deploy the connector: - ```bash cdk deploy start --config connector-config.yaml ``` - To delete the connector run: - ```bash cdk deploy shutdown --name json-sql-connector - ``` After you run the connector you will see records in your database table. @@ -155,7 +182,7 @@ Connector configuration file for upsert (assuming `device_id` is a unique column # connector-config.yaml apiVersion: 0.1.0 meta: - version: 0.3.3 + version: 0.4.2 name: json-sql-connector type: sql-sink topic: sql-topic
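To make the JSON-to-SQL mapping in the Insert Usage Example concrete, here is an illustrative Python sketch of the idea behind such a transform. It is not the actual json-sql SmartModule; the `topic_message` table name comes from the example above, while the column names and the `json_to_insert` helper are hypothetical:

```python
def json_to_insert(table, mapping, record):
    """Build a parameterized INSERT from a JSON record.

    `mapping` is {dotted.json.path: column_name}; nested fields
    (e.g. "device.device_id") are resolved by walking the record.
    """
    def lookup(rec, dotted):
        # Follow each dot-separated key into the nested dict.
        for part in dotted.split("."):
            rec = rec[part]
        return rec

    columns = list(mapping.values())
    values = [lookup(record, key) for key in mapping]
    placeholders = ", ".join("?" for _ in columns)
    sql = f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"
    return sql, values


sql, params = json_to_insert(
    "topic_message",
    {"device.device_id": "device_id", "payload": "record"},  # hypothetical mapping
    {"device": {"device_id": 17}, "payload": "hello"},
)
# sql    == "INSERT INTO topic_message (device_id, record) VALUES (?, ?)"
# params == [17, "hello"]
```

Using placeholders rather than interpolating values into the SQL string mirrors how a real sink would hand parameters to the database driver.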