firehose (Official)

Amazon Kinesis Firehose

This destination plugin lets you sync data from a CloudQuery source to Amazon Kinesis Firehose.

Publisher: cloudquery
Repository: github.com
Latest version: v2.5.16
Type: Destination
Price: Free

Overview #

Amazon Kinesis Firehose Destination Plugin

This destination plugin lets you sync data from a CloudQuery source to Amazon Kinesis Firehose.

Authentication #

Authentication is handled by the AWS SDK, which sources credentials and configuration from the environment in the following order:
  1. Environment variables: static credentials (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_SESSION_TOKEN) or a Web Identity Token (AWS_WEB_IDENTITY_TOKEN_FILE).
  2. Shared configuration files: by default, the SDK reads the credentials and config files from the .aws folder in your home directory.
  3. If your application uses an ECS task definition or the RunTask API operation, the IAM role for tasks.
  4. If your application is running on an Amazon EC2 instance, the IAM role for Amazon EC2.
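For example, to authenticate with static credentials (option 1 above), you can export the standard AWS environment variables before running a sync. This is a minimal sketch; the spec file names are placeholders for your own source and destination specs.

export AWS_ACCESS_KEY_ID="<your-access-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-secret-access-key>"
export AWS_SESSION_TOKEN="<your-session-token>"   # only needed for temporary credentials
cloudquery sync source.yml firehose.yml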

Example #

kind: destination
spec:
  name: "firehose"
  path: "cloudquery/firehose"
  registry: "cloudquery"
  version: "v2.5.16"
  write_mode: "append" # this plugin only supports 'append' mode
  spec:
    # Required parameters e.g. arn:aws:firehose:us-east-1:111122223333:deliverystream/TestRedshiftStream
    stream_arn: "${FIREHOSE_STREAM_ARN}"
    # Optional parameters
    # max_retries: 5
    # max_record_size_bytes: 1024000 # optional
    # max_batch_records: 500 # optional
    # max_batch_size_bytes: 4194000 # optional
The (top level) spec section is described in the Destination Spec Reference.
The Amazon Kinesis Firehose destination utilizes batching and supports batch_size and batch_size_bytes.
Note that Amazon Kinesis Firehose has the following limits, which cannot be changed:
  • The maximum size of a record sent to Kinesis Data Firehose, before base64-encoding, is 1,000 KiB.
  • The PutRecordBatch operation can take up to 500 records per batch or 4 MiB per batch, whichever is smaller.
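For example, with the default max_batch_records of 500 (see the Firehose Spec below), records averaging around 8 KiB each already approach the 4 MiB batch cap (500 × 8 KiB = 4,000 KiB, versus a cap of roughly 4,096 KiB), so for larger records the byte limit, rather than the record count, determines the effective batch size.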

Firehose Spec #

  • stream_arn (string) (required)
    Kinesis Firehose delivery stream ARN where data will be sent.
    Format: arn:${Partition}:firehose:${Region}:${Account}:deliverystream/${DeliveryStreamName}.
  • max_retries (integer) (optional) (default: 5)
    Number of retries to perform when writing a batch.
  • max_record_size_bytes (integer) (optional) (default: 1024000 (1000 KiB))
    Number of bytes (as Arrow buffer size) to write before starting a new record.
  • max_batch_records (integer) (optional) (default: 500)
    Number of records allowed in a single batch.
  • max_batch_size_bytes (integer) (optional) (default: 4194000 (~4 MiB))
    Number of bytes allowed in a single batch.
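To make the reference concrete, here is a sketch of a spec that overrides some of the optional parameters; the values are illustrative rather than recommendations, and the stream ARN placeholder is reused from the example above.

kind: destination
spec:
  name: "firehose"
  path: "cloudquery/firehose"
  registry: "cloudquery"
  version: "v2.5.16"
  write_mode: "append" # this plugin only supports 'append' mode
  spec:
    stream_arn: "arn:aws:firehose:us-east-1:111122223333:deliverystream/TestRedshiftStream"
    max_retries: 10               # retry failed batch writes more times than the default 5
    max_batch_records: 250        # stay well under the 500-record PutRecordBatch limit
    max_batch_size_bytes: 2097000 # roughly half of the ~4 MiB default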