
Export from DigitalOcean to Databricks

CloudQuery is an open-source data integration platform that allows you to export data from any source to any destination.

The CloudQuery DigitalOcean plugin allows you to sync data from DigitalOcean to any destination, including Databricks. It takes only minutes to get started.

DigitalOcean (official, premium)

The CloudQuery DigitalOcean plugin pulls configuration from DigitalOcean and loads it into any supported CloudQuery destination.

Publisher: cloudquery
Repository: github.com
Latest version: v6.1.1
Type: Source

Databricks (official)

This plugin is in preview.

Sync your data from any supported CloudQuery source into the Databricks Data Intelligence Platform.

Publisher: cloudquery
Repository: github.com
Latest version: v1.0.3
Type: Destination

macOS Setup

Step 1. Install CloudQuery

brew install cloudquery/tap/cloudquery
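
A quick way to confirm the CLI is installed and on your PATH is to list its available commands:

cloudquery --help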

Step 2. Log in to CloudQuery CLI

cloudquery login
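
The login command opens a browser-based flow. For non-interactive environments such as CI, the CLI also accepts an API key from the environment instead; the key value below is a placeholder:

export CLOUDQUERY_API_KEY=<your-api-key>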

Step 3. Configure DigitalOcean source plugin

You can find more information about the configuration options in the DigitalOcean source plugin documentation.

kind: source
spec:
  # Source spec section
  name: digitalocean
  path: cloudquery/digitalocean
  registry: cloudquery
  version: "v6.1.1"
  tables: ["*"]
  destinations: ["databricks"]
  spec:
    # Required unless the DIGITALOCEAN_TOKEN or DIGITALOCEAN_ACCESS_TOKEN environment variable is set
    token: "${DIGITALOCEAN_ACCESS_TOKEN}"
    # Optional parameters
    # spaces_regions: ["nyc3", "sfo3", "ams3", "sgp1", "fra1", "syd1"]
    # spaces_access_key: ""
    # spaces_access_key_id: ""
    # spaces_debug_logging: false
    # concurrency: 10000
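
The ${DIGITALOCEAN_ACCESS_TOKEN} reference in the spec is expanded from your shell environment, so export the token before running the sync. The value shown is a placeholder; personal access tokens can be generated in the DigitalOcean control panel under API:

export DIGITALOCEAN_ACCESS_TOKEN="dop_v1_xxxxxxxxxxxx"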

Step 4. Configure Databricks destination plugin

You can find more information about the configuration options in the Databricks destination plugin documentation.

kind: destination
spec:
  name: "databricks"
  path: "cloudquery/databricks"
  registry: "cloudquery"
  version: "v1.0.3"
  write_mode: "append"
  spec:
    hostname: ${DATABRICKS_HOSTNAME} # may optionally include the protocol, e.g. https://abc.cloud.databricks.com
    http_path: ${DATABRICKS_HTTP_PATH} # HTTP path for SQL compute
    staging_path: ${DATABRICKS_STAGING_PATH} # Databricks FileStore or Unity Catalog volume path for temporary staging files
    auth:
      access_token: ${DATABRICKS_ACCESS_TOKEN}
    # Optional parameters
    # protocol: https
    # port: 443
    # catalog: ""
    # schema: "default"
    # migration_concurrency: 10
    # timeout: 1m
    # batch:
    #   size: 10000
    #   bytes: 5242880 # 5 MiB
    #   timeout: 20s
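
As with the source spec, the ${...} references are expanded from the environment. A minimal sketch, assuming a Databricks SQL warehouse and a Unity Catalog volume for staging (all values below are illustrative placeholders):

export DATABRICKS_HOSTNAME="abc.cloud.databricks.com"
export DATABRICKS_HTTP_PATH="/sql/1.0/warehouses/<warehouse-id>"
export DATABRICKS_STAGING_PATH="/Volumes/<catalog>/<schema>/<volume>"
export DATABRICKS_ACCESS_TOKEN="<personal-access-token>"

The HTTP path for a SQL warehouse can be copied from the warehouse's Connection details tab in the Databricks workspace.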

Step 5. Run Sync

Save the source spec as digitalocean.yml and the destination spec as databricks.yml, then run the sync:

cloudquery sync digitalocean.yml databricks.yml
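
Syncing every table with "*" can take a while on large accounts. To sync only specific resources, narrow the tables list in digitalocean.yml; the table name below is one of the DigitalOcean plugin's tables, shown for illustration:

  tables: ["digitalocean_droplets"]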