Sync data from Azure to Snowflake

CloudQuery is the simple, fast data integration platform that can fetch your data from Azure APIs and load it into Snowflake.

Self-hosted

Start locally, then deploy to a Virtual Machine, Kubernetes, or anywhere else. Full instructions on CLI setup are available in our documentation.

Cloud-hosted

Start syncing in a few clicks. No need to deploy your own infrastructure.

Fast and reliable

CloudQuery’s efficient design means syncs are fast: a sync from Azure to Snowflake completes in a fraction of the time taken by other tools.

Easy to use, easy to maintain

Syncing Azure data with CloudQuery is easy to set up and maintain thanks to its simple YAML configuration. Once synced, you can use standard SQL queries to work with your data.

A huge library of supported destinations

Snowflake isn’t the only place we can sync your Azure data to. Whatever you need to do with your Azure data, CloudQuery can make it happen. We support a huge range of destinations, customizable transformations for ETL, and we regularly release new plugins.

Extensible and Open Source SDK

Write your own connectors in any language by utilizing the CloudQuery open source SDK powered by Apache Arrow. Get out-of-the-box scheduling, rate-limiting, transformation, documentation and much more.

Step-by-step guide to exporting data from Azure to Snowflake

macOS Setup

Step 1: Install CloudQuery

To install CloudQuery, run the following command in your terminal:

brew install cloudquery/tap/cloudquery

Step 2: Create a Configuration File

Next, run the following command to initialize a sync configuration file for Azure to Snowflake:

cloudquery init --source=azure --destination=snowflake

This will generate a config file named azure_to_snowflake.yaml. Follow the instructions to fill out the necessary fields to authenticate against your own environment.
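
For reference, the generated file contains a source block for Azure and a destination block for Snowflake, roughly like the sketch below. The plugin versions, table list, and Snowflake connection string are placeholders you will need to replace for your own environment; check the plugin documentation for the exact options your plugin versions support.

kind: source
spec:
  name: "azure"
  path: "cloudquery/azure"
  registry: "cloudquery"
  version: "vX.Y.Z"  # placeholder: use the latest Azure source plugin version
  destinations: ["snowflake"]
  tables: ["azure_compute_virtual_machines"]  # or ["*"] to sync every supported table
---
kind: destination
spec:
  name: "snowflake"
  path: "cloudquery/snowflake"
  registry: "cloudquery"
  version: "vX.Y.Z"  # placeholder: use the latest Snowflake destination plugin version
  spec:
    # placeholder, e.g. user:password@account_identifier/db/schema?warehouse=wh_name
    connection_string: "${SNOWFLAKE_CONNECTION_STRING}"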

Step 3: Log in to CloudQuery CLI

Next, log in to the CloudQuery CLI. If you haven't already, you can sign up for a free account as part of this step:

cloudquery login

Step 4: Run a Sync

cloudquery sync azure_to_snowflake.yaml

This will start syncing data from the Azure API to your Snowflake database! 🚀

See the CloudQuery documentation portal for more deployment guides, options and further tips.

FAQs

What is CloudQuery?
CloudQuery is an open-source tool that helps you extract, transform, and load cloud asset data from various sources into databases for security, compliance, and visibility.
Why does CloudQuery require login?
Logging in allows CloudQuery to authenticate your access to the CloudQuery Hub and monitor usage for billing purposes. Data synced with CloudQuery remains private to your environment and is not shared with our servers or any third parties.
What data does CloudQuery have access to?
CloudQuery accesses only the metadata and configuration of the cloud resources that you specify; it does not touch sensitive data or workloads.
How is CloudQuery priced?
CloudQuery offers flexible pricing based on the number of cloud accounts and usage. Visit our pricing page for detailed plans.
Is there a free version of CloudQuery?
Yes, CloudQuery offers a free plan that includes basic features, perfect for smaller teams or personal use. More details can be found on our pricing page.
What data can CloudQuery sync from Azure to Snowflake?
The CloudQuery Azure plugin can sync a wide variety of Azure data to Snowflake. You can also limit the sync to the data contained in particular subscriptions, as sketched below.
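A minimal sketch of that filter, assuming the subscriptions option of the Azure source plugin spec; the version and subscription IDs are placeholders to replace with your own:

kind: source
spec:
  name: "azure"
  path: "cloudquery/azure"
  registry: "cloudquery"
  version: "vX.Y.Z"
  destinations: ["snowflake"]
  tables: ["*"]
  spec:
    subscriptions:  # only these subscriptions will be synced
      - "<subscription-id-1>"
      - "<subscription-id-2>"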
What permissions does the CloudQuery Azure plugin require?
We recommend granting the plugin read-only permissions, as that is all it needs to sync your data from Microsoft Azure to Snowflake. Because this is a source plugin, CloudQuery will never write information to Azure.
Can I rate limit the requests that CloudQuery makes to the Azure API?
If you would like to reduce the number of requests that CloudQuery makes to the Azure API while syncing data from Azure to Snowflake, you can set a rate limit in your account. These limits are managed on a per-subscription basis, and further information can be found here.
Should I load information directly to Snowflake from Azure or load via remote storage?
For testing purposes, loading directly into Snowflake is the fastest way to get started and check you are achieving the desired results. However, we don’t recommend this approach for particularly large datasets or once you move to a production environment. At that stage, it is recommended to sync the data from your source to CSV or JSON files in remote storage such as S3 or Google Cloud Storage, and then run a scheduled Snowflake job or use Snowpipe to load that data into Snowflake via a stage. A sketch of a file-based staging destination is shown below.
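As a rough illustration, you could point the sync at object storage first by swapping the destination for CloudQuery’s S3 plugin configured to write CSV files. The field names below (bucket, region, path, format) and the path placeholders are assumptions to verify against the S3 destination plugin documentation for your version:

kind: destination
spec:
  name: "s3"
  path: "cloudquery/s3"
  registry: "cloudquery"
  version: "vX.Y.Z"  # placeholder: use the latest S3 destination plugin version
  spec:
    bucket: "my-staging-bucket"           # assumption: a bucket Snowflake can read via an external stage
    region: "us-east-1"
    path: "azure/{{TABLE}}/{{UUID}}.csv"  # assumption: per-table file layout placeholders
    format: "csv"

From there, a Snowflake external stage pointing at the same bucket can be loaded on a schedule or with Snowpipe, as described above.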
What authentication options are available for the Snowflake integration?
You can choose between basic authentication with a username and password, or a public/private key pair to authenticate with Snowflake. If you use a private key, you can either place it inline in the spec or reference a separate file where the key is stored, as illustrated below.
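As an illustration, both options live in the Snowflake destination spec. The connection_string form below is standard; treat the key-pair field names (private_key, private_key_path) as assumptions to confirm against the Snowflake destination plugin documentation for your version:

kind: destination
spec:
  name: "snowflake"
  path: "cloudquery/snowflake"
  registry: "cloudquery"
  version: "vX.Y.Z"
  spec:
    # Option 1: basic authentication via the connection string
    connection_string: "user:password@account_identifier/db/schema?warehouse=wh_name"
    # Option 2 (assumed field names): key-pair authentication, either inline...
    # private_key: |-
    #   -----BEGIN PRIVATE KEY-----
    #   ...
    #   -----END PRIVATE KEY-----
    # ...or referencing a file that holds the key:
    # private_key_path: "/path/to/rsa_key.p8"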