Export from OracleDB to Kafka

CloudQuery is an open-source data integration platform that allows you to export data from any source to any destination.

The CloudQuery OracleDB plugin allows you to sync data from OracleDB to any destination, including Kafka. It takes only minutes to get started.

OracleDB (oracledb)
Official, Premium

The CloudQuery OracleDB plugin syncs your OracleDB database to any of the supported CloudQuery destinations.

Publisher: cloudquery
Latest version: v4.4.9
Type: Source

Kafka (kafka)
Official

This destination plugin lets you sync data from a CloudQuery source to Kafka in various formats such as CSV and JSON. Each table will be pushed to a separate topic.

Publisher: cloudquery
Latest version: v5.1.1
Type: Destination

macOS Setup

Step 1. Install CloudQuery

brew install cloudquery/tap/cloudquery
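
To confirm the CLI is installed and on your PATH, you can print its version (the exact output varies by release):

cloudquery --version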

Step 2. Log in to CloudQuery CLI

cloudquery login

Step 3. Configure OracleDB source plugin

You can find more information about the configuration in the plugin documentation.

kind: source
spec:
  name: oracledb
  path: cloudquery/oracledb
  registry: cloudquery
  version: "v4.4.9"
  tables: ["*"]
  destinations: ["kafka"] # must match the name of the destination plugin (see Step 4)
  spec:
    # Connection string to connect to the database in the format oracle://user:password@server:port/service_name.
    # To use the default 1521 port, you can omit it from the connection string, but still need to keep the :, for example oracle://user:password@server:/service_name.
    connection_string: "${ORACLE_DB_CONNECTION_STRING}"
    # Optional parameters
    # queries: []
    # rows_per_record: 500
    # concurrency: 100
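
The spec above reads the connection string from the ORACLE_DB_CONNECTION_STRING environment variable. As a minimal sketch, you can export it in your shell before running the sync, using the documented oracle://user:password@server:port/service_name format (the values below are placeholders):

export ORACLE_DB_CONNECTION_STRING="oracle://user:password@server:1521/service_name"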

Step 4. Configure Kafka destination plugin

You can find more information about the configuration in the plugin documentation.

kind: destination
spec:
  name: "kafka"
  path: "cloudquery/kafka"
  registry: "cloudquery"
  version: "v5.1.1"
  write_mode: "append"
  spec:
    # required - list of brokers to connect to
    brokers: ["<broker-host>:<broker-port>"]
    # optional - if connecting via SASL/PLAIN, the username and password to use. If not set, no authentication will be used.
    sasl_username: "${KAFKA_SASL_USERNAME}"
    sasl_password: "${KAFKA_SASL_PASSWORD}"
    format: "json" # options: parquet, json, csv
    format_spec:
      # CSV specific parameters:
      # delimiter: ","
      # skip_header: false
      # Parquet specific parameters:
      # version: "v2Latest"
      # root_repetition: "repeated"

    # Optional parameters
    # compression: "" # options: gzip
    # verbose: false
    # batch_size: 1000
    # topic_details:
      # num_partitions: 1
      # replication_factor: 1
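
If your brokers require SASL/PLAIN, the spec above reads the credentials from the KAFKA_SASL_USERNAME and KAFKA_SASL_PASSWORD environment variables; a minimal sketch of exporting them is shown below (the values are placeholders). If your cluster does not use SASL, you can remove the sasl_username and sasl_password lines instead.

export KAFKA_SASL_USERNAME="your-username"
export KAFKA_SASL_PASSWORD="your-password"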

Step 5. Run Sync

cloudquery sync oracledb.yml kafka.yml
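
After the sync completes, each synced table is pushed to its own Kafka topic. As a rough check, you can list the topics on your broker and read a few messages with the console tools that ship with Apache Kafka; the broker address is the one from your kafka.yml, and the topic name below is a placeholder, so list the topics first to see what the sync actually created:

kafka-topics.sh --bootstrap-server <broker-host>:<broker-port> --list
kafka-console-consumer.sh --bootstrap-server <broker-host>:<broker-port> --topic <topic-name> --from-beginning --max-messages 5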