Commit

[create-pull-request] automated change
morenol authored and github-actions[bot] committed Jun 19, 2024
1 parent 8889580 commit 841f5be
Showing 13 changed files with 92 additions and 164 deletions.
2 changes: 1 addition & 1 deletion content/connectors/inbound/http.md
@@ -1,5 +1,5 @@
---
-menu: HTTP
+title: HTTP
---

{{% inline-embed file="embeds/connectors/inbound/http.md" %}}
2 changes: 1 addition & 1 deletion content/connectors/inbound/kafka.md
@@ -1,5 +1,5 @@
---
-menu: Kafka
+title: Kafka
---

{{% inline-embed file="embeds/connectors/inbound/kafka.md" %}}
2 changes: 1 addition & 1 deletion content/connectors/inbound/mqtt.md
@@ -1,5 +1,5 @@
---
-menu: MQTT
+title: MQTT
---

{{% inline-embed file="embeds/connectors/inbound/mqtt.md" %}}
2 changes: 1 addition & 1 deletion content/connectors/outbound/duckdb.md
@@ -1,5 +1,5 @@
---
-menu: DuckDB
+title: DuckDB
---

{{% inline-embed file="embeds/connectors/outbound/duckdb.md" %}}
2 changes: 1 addition & 1 deletion content/connectors/outbound/graphite.md
@@ -1,5 +1,5 @@
---
-menu: Graphite
+title: Graphite
---

{{% inline-embed file="embeds/connectors/outbound/graphite.md" %}}
2 changes: 1 addition & 1 deletion content/connectors/outbound/sql.md
@@ -1,5 +1,5 @@
---
-menu: SQL
+title: SQL
---

{{% inline-embed file="embeds/connectors/outbound/sql.md" %}}
7 changes: 2 additions & 5 deletions embeds/connectors/inbound/kafka.md
@@ -17,7 +17,7 @@ Example:
```yaml
apiVersion: 0.1.0
meta:
-version: 0.2.5
+version: 0.2.8
name: my-kafka-connector
type: kafka-source
topic: kafka-topic
@@ -28,12 +28,9 @@ kafka:
```
### Usage
To try out the Kafka Source connector locally, you can use the Fluvio CDK tool:
-%copy%
```bash
-$ cdk deploy -p kafka-source start --config crates/kafka-source/sample-config.yaml
+cdk deploy -p kafka-source start --config crates/kafka-source/config-example.yaml
```

## Transformations
5 changes: 2 additions & 3 deletions embeds/connectors/inbound/mqtt.md
@@ -50,7 +50,7 @@ This is an example of connector config file:
# config-example.yaml
apiVersion: 0.1.0
meta:
-version: 0.2.5
+version: 0.2.7
name: my-mqtt-connector
type: mqtt-source
topic: mqtt-topic
@@ -65,7 +65,6 @@ mqtt:
```

Run the connector locally using the `cdk` tool (from the root directory or any sub-directory):

```bash
cdk deploy start --config config-example.yaml
@@ -104,7 +103,7 @@ The previous example can be extended to add extra transformations to outgoing records:
# config-example.yaml
apiVersion: 0.1.0
meta:
-version: 0.2.5
+version: 0.2.7
name: my-mqtt-connector
type: mqtt-source
topic: mqtt-topic
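The mqtt hunk above is truncated before the `transforms` section it refers to. For orientation, a `transforms` block in a Fluvio connector config pairs a SmartModule reference with its parameters; the sketch below is illustrative only (the jolt version and spec are not taken from this commit):

```yaml
transforms:
  - uses: infinyon/jolt@0.4.1
    with:
      spec:
        - operation: shift
          spec:
            payload:
              device: "device"
```

Each entry's `uses` names a SmartModule from the hub, and `with` carries that module's own configuration.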
12 changes: 7 additions & 5 deletions embeds/connectors/outbound/duckdb.md
@@ -41,7 +41,7 @@ To connect to Motherduck server, use prefix: `md`. For example, `md://motherduc
```yaml
apiVersion: 0.1.0
meta:
-version: 0.1.0
+version: 0.1.3
name: duckdb-connector
type: duckdb-sink
topic: fluvio-topic-source
@@ -99,7 +99,7 @@ Connector configuration file:
apiVersion: 0.1.0
meta:
version: 0.1.0
-name: duckdb-connector
+name: json-sql-connector
type: duckdb-sink
topic: sql-topic
create-topic: true
@@ -127,14 +127,16 @@ transforms:
```

You can use the Fluvio `cdk` tool to deploy the connector:
-
+```bash
+fluvio install cdk
+```
+and then:
```bash
cdk deploy start --config connector-config.yaml
```

To delete the connector run:
```bash
-cdk deploy shutdown --name duckdb-connector
+cdk deploy shutdown --config connector-config.yaml
```
After you run the connector, you will see records in your database table.
136 changes: 5 additions & 131 deletions embeds/connectors/outbound/graphite.md
@@ -1,21 +1,16 @@
-# InfinyOn Graphite Sink Connector
-The [Graphite][4] Sink connector reads records from Fluvio topic and sends them to
-the configured Graphite's Metric using the PlainText approach.
+# graphite-sink
+Graphite Metrics Server Fluvio Sink Connector

-# Configuration
+## Usage

This connector establishes a TCP stream against the specified host on Graphite;
-records are sent as UTF-8 encoded strings following [Graphite's PlainText][5] format.
-
-The following example connector configuration can be used to send records to
-the Graphite's Metric `weather.temperature.ca.sandiego`, the Graphite's TCP
-server address is specified on the `addr` field.
+records are sent as UTF-8 encoded strings following Graphite's PlainText format.

```yaml
# sample-config.yaml
apiVersion: 0.1.0
meta:
-version: 0.1.2
+version: 0.1.0
name: my-graphite-connector-test-connector
type: graphite-sink
topic: test-graphite-connector-topic
@@ -24,124 +19,3 @@ graphite:
metric-path: "weather.temperature.ca.sandiego"
addr: "localhost:2003"
```
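Since the description above says records are sent as UTF-8 encoded strings in Graphite's PlainText format, it may help to see what one such frame looks like. A minimal sketch (the helper name and sample values are illustrative, not from this repository):

```python
import time

def plaintext_line(metric_path, value, timestamp=None):
    # Graphite PlainText frame: "<metric-path> <value> <unix-timestamp>\n",
    # sent UTF-8 encoded over the TCP connection to Carbon.
    ts = int(time.time()) if timestamp is None else timestamp
    return f"{metric_path} {value} {ts}\n".encode("utf-8")

frame = plaintext_line("weather.temperature.ca.sandiego", 21.5, 1718800000)
print(frame)  # b'weather.temperature.ca.sandiego 21.5 1718800000\n'
```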
## Configuration Fields
| Field           | Data Type | Description                            |
|:----------------|:----------|:---------------------------------------|
| `metric-path`   | `String`  | Graphite metric to send records to     |
| `addr`          | `String`  | Graphite TCP address to stream out to  |

## Usage

This section will walk you through setting up a Graphite instance and using
Fluvio to send metrics to it.

> This section assumes you have Docker and Fluvio installed in your system.

### Setting Up Graphite

We will run our Graphite instance on Docker using the `docker compose` command
for simplicity.

The Graphite container will set up [Carbon configuration][6] files in your
working directory; we need to update these files to shorten Carbon's
persistence interval so that metrics are written more frequently.

Create a copy of our [`docker-compose.yml`][7] file and run the container:

```bash
docker compose up --build -d
```

This will generate a directory with the name `.graphite`, which contains
configuration files.

Replace the contents of `.graphite/conf/storage-schemas.conf` to record on an
interval of 10 seconds and persist the last 12 hours of data.

```conf
[all]
pattern = .*
retentions = 10s:12h
```

Now we need to re-run the Graphite container so Carbon uses the new
configuration.

```bash
docker compose down
docker compose up --build -d
```

You can visit `http://localhost:12345` in your browser to access the Dashboard.

> Credentials for the Dashboard are User: `root` and Password: `root`

With the Graphite instance set, we can move into [Setting Up Fluvio with Graphite Sink Connector][8].

### Setting Up Fluvio with Graphite Sink Connector

In this section we are going to use the CDK to spin up the Graphite Sink Connector
and send metrics from Fluvio records to the Graphite instance.

Make sure the Connector Development Kit is set up on your system by issuing the following command in your terminal.

%copy%
```bash
cdk
```

> If you don't have the Fluvio CLI installed already, visit the [CLI][2] section.

Create a YAML file with the name `weather-monitor-config.yaml` and specify connector settings:

%copy%
```yaml
apiVersion: 0.1.0
meta:
version: 0.1.2
name: weather-monitor-sandiego
type: graphite-sink
topic: weather-ca-sandiego
graphite:
# https://graphite.readthedocs.io/en/latest/feeding-carbon.html#step-1-plan-a-naming-hierarchy
metric-path: "weather.temperature.ca.sandiego"
addr: "localhost:2003"
```

Deploy the connector using the CDK:

```bash
cdk deploy start --config weather-monitor-config.yaml
```

> Make sure your Graphite instance is running on `localhost:2003`; use the `cdk log` subcommand to read logs from the connector instance.

Then produce records as usual:

%copy%
```bash
echo 120 | fluvio produce weather-ca-sandiego
```

> Remember that Carbon's retention is set to `10s:12h`; this means that it will
> write metrics every 10s.

Use Graphite's REST API to check on the stored data.

%copy%
```bash
curl -o ./data.json http://localhost:12345/render\?target\=weather.temperature.ca.sandiego\&format\=json\&noNullPoints
```
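The `data.json` written by this call is a list of series objects, each carrying a `target` name and `datapoints` as `[value, timestamp]` pairs (with `null` values for intervals that recorded nothing). A small sketch of pulling out the newest non-null point per series (the sample payload is illustrative):

```python
import json

def latest_points(render_json):
    # Map each series' target name to its newest non-null (timestamp, value) pair.
    result = {}
    for series in json.loads(render_json):
        points = [(ts, value) for value, ts in series["datapoints"] if value is not None]
        if points:
            result[series["target"]] = max(points)
    return result

# Illustrative sample of the render API's JSON shape.
sample = ('[{"target": "weather.temperature.ca.sandiego", '
          '"datapoints": [[120.0, 1718800000], [null, 1718800010], [121.0, 1718800020]]}]')
print(latest_points(sample))  # {'weather.temperature.ca.sandiego': (1718800020, 121.0)}
```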


[1]: https://infinyon.cloud/login
[2]: https://www.fluvio.io/cli/
[3]: https://github.com/infinyon/graphite-sink-connector/blob/main/CONTRIBUTING.md
[4]: https://graphiteapp.org/
[5]: https://graphite.readthedocs.io/en/latest/feeding-carbon.html#the-plaintext-protocol
[6]: https://graphite.readthedocs.io/en/latest/config-carbon.html#storage-schemas-conf
[7]: https://github.com/infinyon/graphite-sink-connector/blob/main/docker-compose.yml
[8]: #setting-up-fluvio-with-graphite-sink-connector
6 changes: 3 additions & 3 deletions embeds/connectors/outbound/http.md
@@ -159,15 +159,15 @@ In this case, additional transformation will be performed before records are sent.
Read more about [JSON to JSON transformations](https://www.fluvio.io/smartmodules/certified/jolt/).

### Offset Management
The Fluvio Consumer Offset feature allows a connector to store its offset in the Fluvio cluster and resume from it on restart.
To activate it, provide a `consumer` name and set `strategy: auto`.
See the example below:
```yaml
apiVersion: 0.2.0
meta:
version: 0.2.9
name: my-http-sink
type: http-sink
topic:
meta:
name: http-sink-topic
35 changes: 32 additions & 3 deletions embeds/connectors/outbound/kafka.md
@@ -29,7 +29,7 @@ Example without security:
```yaml
apiVersion: 0.1.0
meta:
-version: 0.2.7
+version: 0.2.9
name: my-kafka-connector
type: kafka-sink
topic: kafka-topic
@@ -44,7 +44,7 @@ Example with security enabled:
```yaml
apiVersion: 0.1.0
meta:
-version: 0.2.7
+version: 0.2.9
name: my-kafka-connector
type: kafka-sink
topic: kafka-topic
@@ -68,9 +68,38 @@ kafka:
### Usage
To try out the Kafka Sink connector locally, you can use the Fluvio CDK tool:
```bash
-cdk deploy -p kafka-sink start --config crates/kafka-sink/sample-config.yaml
+cdk deploy -p kafka-sink start --config crates/kafka-sink/config-example.yaml
```

### Offset Management
The Fluvio Consumer Offset feature allows a connector to store its offset in the Fluvio cluster and resume from it on restart.
To activate it, provide a `consumer` name and set `strategy: auto`.
See the example below:
```yaml
apiVersion: 0.2.0
meta:
version: 0.2.9
name: my-kafka-connector
type: kafka-sink
topic:
meta:
name: kafka-sink-topic
consumer:
id: my-kafka-sink
offset:
strategy: auto
kafka:
url: "localhost:9092"
topic: fluvio-topic
create-topic: true
```
After the connector has processed any records, you can check the last stored offset value via:
```bash
$ fluvio consumer list
  CONSUMER       TOPIC             PARTITION  OFFSET  LAST SEEN
  my-kafka-sink  kafka-sink-topic  0          0       3s
```

### Testing with security