Kafka (Official)

This plugin is in preview.


Publisher: cloudquery
Repository: github.com
Latest version: v4.1.2
Type: Destination
Price: Free

Overview

Kafka Destination Plugin

This destination plugin lets you sync data from a CloudQuery source to Kafka in various formats such as CSV and JSON. Each table is pushed to a separate topic.

Example

This example connects to a Kafka destination using SASL/PLAIN authentication and pushes messages in JSON format.
The (top level) spec section is described in the Destination Spec Reference.
kind: destination
spec:
  name: "kafka"
  path: "cloudquery/kafka"
  registry: "cloudquery"
  version: "v4.1.2"
  write_mode: "append"
  spec:
    # required - list of brokers to connect to
    brokers: ["<broker-host>:<broker-port>"]
    # optional - if connecting via SASL/PLAIN, the username and password to use. If not set, no authentication will be used.
    sasl_username: "${KAFKA_SASL_USERNAME}"
    sasl_password: "${KAFKA_SASL_PASSWORD}"
    format: "json" # options: parquet, json, csv
    format_spec:
      # CSV-specific parameters:
      # delimiter: ","
      # skip_header: false

    # Optional parameters
    # compression: "" # options: gzip
    # client_id: cq-destination-kafka
    # verbose: false
    # batch_size: 1000
    # topic_details:
      # num_partitions: 1
      # replication_factor: 1
Note that the Kafka plugin only supports the append write_mode.
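As a rough sketch of what this configuration produces, here is how a record might be serialized and routed. The table name and record fields below are hypothetical (the real schema depends on the source plugin); the point is that with format: "json" each record becomes a JSON message on a topic named after its table:

```python
import json

# Hypothetical table and record synced from a CloudQuery source.
# The actual table names and columns depend on the source plugin.
table_name = "aws_ec2_instances"
record = {"instance_id": "i-0abc123", "region": "us-east-1", "state": "running"}

# With format: "json", each record is serialized as a JSON document
# and published to a topic named after its table.
topic = table_name
message = json.dumps(record).encode("utf-8")

print(topic)
print(message.decode("utf-8"))
```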

Plugin Spec

This is the (nested) plugin spec.
  • brokers ([]string) (required)
    List of brokers to connect to.
  • format (string) (required)
    Format of the output file. Supported values are csv, json and parquet.
  • format_spec (format_spec) (optional)
    Optional parameters to change the format of the file.
  • compression (string) (optional) (default: empty)
    Compression algorithm to use. Supported values are empty or gzip. Not supported for parquet format.
  • sasl_username (string) (optional) (default: empty)
    If connecting via SASL/PLAIN, the username to use.
  • sasl_password (string) (optional) (default: empty)
    If connecting via SASL/PLAIN, the password to use.
  • client_id (string) (optional) (default: cq-destination-kafka)
    Client ID to be set for Kafka API calls.
  • verbose (boolean) (optional) (default: false)
    If true, the plugin will log all underlying Kafka client messages to the log.
  • batch_size (integer) (optional) (default: 1000)
Number of records to write before starting a new batch.
  • topic_details (topic_details) (optional)
    Optional parameters to set topic details.
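The batch_size option controls how many records are grouped together before being written. A minimal sketch of that chunking behavior (the helper below is illustrative, not the plugin's actual implementation):

```python
from itertools import islice

def batches(records, batch_size=1000):
    """Yield successive lists of at most batch_size records."""
    it = iter(records)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# 2500 records with the default batch_size of 1000 are written
# as batches of 1000, 1000, and 500.
sizes = [len(b) for b in batches(range(2500))]
print(sizes)  # [1000, 1000, 500]
```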
 

format_spec

  • delimiter (string) (optional) (default: ,)
    Character used as the field delimiter when the format is csv.
  • skip_header (boolean) (optional) (default: false)
    Specifies whether to skip writing the header row (when the format is csv).
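To see what these CSV options do in practice, here is a small illustrative renderer built on Python's standard csv module (the function and sample rows are our own sketch, not the plugin's actual writer):

```python
import csv
import io

def render_csv(rows, header, delimiter=",", skip_header=False):
    """Render rows as CSV text, mirroring the format_spec options:
    a configurable delimiter and an optional header row."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter)
    if not skip_header:
        writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()

# Hypothetical rows; real columns depend on the source table.
rows = [["i-0abc123", "us-east-1"], ["i-0def456", "eu-west-1"]]

# Default options: comma delimiter, header row included.
print(render_csv(rows, header=["instance_id", "region"]))

# Custom delimiter with the header skipped.
print(render_csv(rows, header=["instance_id", "region"],
                 delimiter=";", skip_header=True))
```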
 

topic_details

  • num_partitions (integer) (optional) (default: 1)
    Number of partitions for the newly created topic.
  • replication_factor (integer) (optional) (default: 1)
    Replication factor for the topic.

