
Export from Bitly to S3

CloudQuery is an open-source data integration platform that allows you to export data from any source to any destination.

The CloudQuery Bitly plugin allows you to sync data from Bitly to any destination, including S3. It takes only minutes to get started.

Bitly (Official)

This plugin is in preview.

Sync Bitly link stats to a destination of your choice.

Publisher: cloudquery
Repository: github.com
Latest version: v1.1.0
Type: Source
S3 (Official)

This destination plugin lets you sync data from a CloudQuery source to remote S3 storage in various formats such as CSV, JSON, and Parquet.

Publisher: cloudquery
Repository: github.com
Latest version: v7.2.8
Type: Destination

macOS Setup

Step 1. Install CloudQuery

brew install cloudquery/tap/cloudquery
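
Once the install finishes, an optional quick check is to run the CLI's built-in help to confirm the binary is on your PATH and see the available commands:

cloudquery --help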

Step 2. Log in to CloudQuery CLI

cloudquery login

Step 3. Configure Bitly source plugin

You can find more information about the configuration in the plugin documentation.

kind: source
spec:
  name: "bitly"
  registry: "docker"
  path: "docker.cloudquery.io/cloudquery/source-bitly:v1.0.0"
  tables: ['*']
  destinations: ["s3"]
  spec:
    group_id: ${BITLY_GROUP_ID}     # mandatory
    api_token: ${BITLY_API_TOKEN}   # mandatory
    extract_utm: true               # optional. If set, extracts utm_tags from the long_url into separate columns
    # optional. Unit used to query the last 1 {unit} of clicks by country. Default: month. Values: hour, day, week, month.
    countries_summary_unit: "month"
    # optional. Unit used to query the last 1 {unit} of clicks by referrer. Default: month. Values: hour, day, week, month.
    referrers_summary_unit: "month"
    # optional. Get data only for the links in the list below.
    only:
      - bit.ly/1234567
    # optional. Includes only links created after the specified date. Supported formats:
    # - "YYYY-MM-DD"
    # - "-<number> <unit>" where number is an integer and unit is "day" or "week"
    created_after: "-1 day"
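
The group_id and api_token values above are expanded from environment variables, so export them in the shell you will run the sync from. This is a minimal sketch; replace the placeholder values with your own Bitly group ID and API token:

export BITLY_GROUP_ID="your-bitly-group-id"
export BITLY_API_TOKEN="your-bitly-api-token"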

Step 4. Configure S3 destination plugin

You can find more information about the configuration in the plugin documentation.

kind: destination
spec:
  name: "s3"
  path: "cloudquery/s3"
  registry: "cloudquery"
  version: "v7.2.8"
  write_mode: "append"
  # Learn more about the configuration options at https://cql.ink/s3_destination
  spec:
    bucket: "bucket_name"
    region: "region-name" # Example: us-east-1
    path: "path/to/files/{{TABLE}}/{{UUID}}.{{FORMAT}}"
    format: "parquet" # options: parquet, json, csv
    format_spec:
      # CSV-specific parameters:
      # delimiter: ","
      # skip_header: false

    # Optional parameters
    # compression: "" # options: gzip
    # no_rotate: false
    # athena: false # <- set this to true for Athena compatibility
    # write_empty_objects_for_empty_tables: false # <- set this to true if using with the CloudQuery Compliance policies
    # test_write: true # tests the ability to write to the bucket before processing the data
    # endpoint: "" # Endpoint to use for S3 API calls.
    # endpoint_skip_tls_verify: false # Disable TLS verification if using an untrusted certificate
    # use_path_style: false
    # batch_size: 10000 # 10K entries
    # batch_size_bytes: 52428800 # 50 MiB
    # batch_timeout: 30s # 30 seconds
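
The S3 plugin needs AWS credentials with permission to write to the bucket, and it typically resolves them through the standard AWS SDK credential chain. One common sketch, assuming you authenticate with static keys rather than an instance profile or SSO, is to export the standard AWS environment variables before running the sync (the values below are placeholders):

export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
export AWS_REGION="us-east-1"  # match the region set in the spec above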

Step 5. Run Sync

cloudquery sync bitly.yml s3.yml
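
The CLI also accepts a directory of spec files. Assuming the current directory contains only bitly.yml and s3.yml, the following is equivalent:

cloudquery sync .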
