Leverage Azure CLI to automate cloud file transfers efficiently
The Azure CLI lets you automate file transfers and manage Azure Storage from your command line. This guide shows you how to streamline your cloud workflows with practical examples.
Installation and setup
Install Azure CLI on your system:
On Windows
winget install -e --id Microsoft.AzureCLI
On macOS
brew install azure-cli
On Linux (Ubuntu/Debian)
# The current recommended way to install on Ubuntu/Debian
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
After installation, authenticate with Azure:
az login
You can also use service principals for automated/headless authentication:
az login --service-principal \
--username APP_ID \
--password PASSWORD \
--tenant TENANT_ID
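If you do not have a service principal yet, you can create one and grant it a role scoped to your resource group. This is a sketch; the app name, role, and subscription ID are placeholders to replace with your own values:
# Create a service principal; the output includes the appId, password,
# and tenant values used for az login above
az ad sp create-for-rbac \
--name myAutomationApp \
--role "Storage Blob Data Contributor" \
--scopes /subscriptions/SUBSCRIPTION_ID/resourceGroups/myResourceGroup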
Verify the installation and check version:
az --version
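To confirm the login worked and see which subscription is active, you can inspect the current account (the subscription ID below is a placeholder):
# Show the signed-in account and active subscription
az account show --output table
# Switch subscriptions if needed
az account set --subscription "SUBSCRIPTION_ID"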
Creating a storage account
Create a resource group and a storage account. Storage account names must be globally unique and consist of 3 to 24 lowercase letters and numbers:
# Create resource group
az group create --name myResourceGroup --location eastus
# Create storage account with security settings
az storage account create \
--name mystorageaccount \
--resource-group myResourceGroup \
--location eastus \
--sku Standard_LRS \
--min-tls-version TLS1_2 \
--allow-blob-public-access false \
--https-only true
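If you want to confirm the account is ready before continuing, one way is to query its provisioning state, which should print Succeeded once deployment finishes:
az storage account show \
--name mystorageaccount \
--resource-group myResourceGroup \
--query provisioningState \
--output tsv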
Using connection strings for security
Retrieve the connection string once and pass it to commands instead of supplying the account name and key separately:
# Get the connection string
connection_string=$(az storage account show-connection-string \
--name mystorageaccount \
--resource-group myResourceGroup \
--query connectionString \
--output tsv)
# Use connection string for operations
az storage blob upload \
--container-name mycontainer \
--file /path/to/local/file.txt \
--name remote-file.txt \
--connection-string "$connection_string"
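To avoid repeating --connection-string on every call, you can export it once; az storage commands read the AZURE_STORAGE_CONNECTION_STRING environment variable automatically:
# Export once, then omit --connection-string on subsequent commands
export AZURE_STORAGE_CONNECTION_STRING="$connection_string"
az storage blob list --container-name mycontainer --output table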
Transferring files using Azure CLI
Uploading files
Upload single or multiple files to Azure Storage:
# Get storage account key
account_key=$(az storage account keys list \
--resource-group myResourceGroup \
--account-name mystorageaccount \
--query '[0].value' -o tsv)
# Create a container
az storage container create \
--name mycontainer \
--account-name mystorageaccount \
--account-key "$account_key"
# Upload a file
az storage blob upload \
--container-name mycontainer \
--file /path/to/local/file.txt \
--name remote-file.txt \
--type block \
--account-name mystorageaccount \
--account-key "$account_key"
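Note that recent CLI versions refuse to replace an existing blob unless you pass --overwrite to az storage blob upload. If you want to check first, az storage blob exists reports whether the name is already taken:
az storage blob exists \
--container-name mycontainer \
--name remote-file.txt \
--account-name mystorageaccount \
--account-key "$account_key"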
Downloading files
Retrieve files from Azure Storage:
az storage blob download \
--container-name mycontainer \
--name remote-file.txt \
--file /path/to/local/destination.txt \
--account-name mystorageaccount \
--account-key "$account_key"
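To retrieve many blobs at once, download-batch mirrors the upload workflow:
# Download every blob in the container to a local directory
az storage blob download-batch \
--source mycontainer \
--destination /path/to/local/directory \
--account-name mystorageaccount \
--account-key "$account_key"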
Automating batch file transfers with Azure CLI
Create a script to handle multiple file transfers:
#!/bin/bash
# Configuration
source_dir="/path/to/local/files"
container_name="mycontainer"
account_name="mystorageaccount"

# Get account key
account_key=$(az storage account keys list \
--resource-group myResourceGroup \
--account-name "$account_name" \
--query '[0].value' -o tsv)

# Upload all regular files in the directory
for file in "$source_dir"/*; do
  [ -f "$file" ] || continue # skip subdirectories and other non-files
  filename=$(basename "$file")
  az storage blob upload \
    --container-name "$container_name" \
    --file "$file" \
    --name "$filename" \
    --account-name "$account_name" \
    --account-key "$account_key"
  echo "Uploaded: $filename"
done
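Transient network failures can interrupt long runs. One way to make the loop more resilient is a small retry wrapper; this is a sketch that reuses the variables from the script above, and the function name and three-attempt backoff policy are our own choices, not an Azure CLI feature:
# Retry an upload up to 3 times with an increasing delay (a sketch)
upload_with_retry() {
  local attempt
  for attempt in 1 2 3; do
    if az storage blob upload \
      --container-name "$container_name" \
      --file "$1" \
      --name "$2" \
      --account-name "$account_name" \
      --account-key "$account_key"; then
      return 0
    fi
    echo "Attempt $attempt failed for $2, retrying..." >&2
    sleep $((attempt * 5))
  done
  return 1
}

# Usage: upload_with_retry <local-path> <blob-name>
upload_with_retry "/path/to/local/file.txt" "remote-file.txt"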
Optimizing transfer performance
For large files or directories with many files, transfer everything in a single command; the --max-connections flag controls how many parallel connections are used per blob:
az storage blob upload-batch \
--source /path/to/source/directory \
--destination mycontainer \
--account-name mystorageaccount \
--account-key "$account_key" \
--max-connections 10
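If you only need a subset of the files, upload-batch also accepts a --pattern filter; for example, to upload only .log files:
az storage blob upload-batch \
--source /path/to/source/directory \
--destination mycontainer \
--account-name mystorageaccount \
--account-key "$account_key" \
--pattern '*.log'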
Managing file access with Azure Storage
Generate SAS tokens for secure file sharing:
Set an expiry time for the SAS token (for example, 30 minutes from now). Note that the date command syntax differs between Linux and macOS.
On Linux:
end_date=$(date -u -d "30 minutes" '+%Y-%m-%dT%H:%MZ')
On macOS:
end_date=$(date -u -v +30M '+%Y-%m-%dT%H:%MZ')
Then generate the SAS token:
az storage blob generate-sas \
--container-name mycontainer \
--name remote-file.txt \
--account-name mystorageaccount \
--account-key "$account_key" \
--permissions r \
--expiry "$end_date"
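The command prints the SAS token by itself. To hand someone a working link, capture the token and append it to the blob URL; this is a sketch, and recent CLI versions can also do it in one step with the --full-uri flag:
# Capture the raw token and build a shareable URL
sas_token=$(az storage blob generate-sas \
--container-name mycontainer \
--name remote-file.txt \
--account-name mystorageaccount \
--account-key "$account_key" \
--permissions r \
--expiry "$end_date" \
--output tsv)
echo "https://mystorageaccount.blob.core.windows.net/mycontainer/remote-file.txt?$sas_token"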
Monitoring and logging
Track file transfer operations:
# List all blobs in a container
az storage blob list \
--container-name mycontainer \
--account-name mystorageaccount \
--account-key "$account_key" \
--output table
# Get properties of a specific blob
az storage blob show \
--container-name mycontainer \
--name remote-file.txt \
--account-name mystorageaccount \
--account-key "$account_key"
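The blob properties include a stored Content-MD5 when one was set at upload time (small uploads via az typically set it; large chunked uploads may not). Assuming the property is present, one way to spot-check integrity is to compare it against a locally computed digest:
# Compare a local MD5 (base64-encoded, as Azure stores it) with the blob's Content-MD5
local_md5=$(openssl dgst -md5 -binary /path/to/local/file.txt | base64)
remote_md5=$(az storage blob show \
--container-name mycontainer \
--name remote-file.txt \
--account-name mystorageaccount \
--account-key "$account_key" \
--query properties.contentSettings.contentMd5 \
--output tsv)
if [ "$local_md5" = "$remote_md5" ]; then
  echo "Checksums match"
else
  echo "Checksum mismatch" >&2
fi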
Best practices
- Store credentials safely: Keep account keys and other sensitive data in environment variables or a secure configuration store rather than hard-coding them in scripts.
- Add retry logic: Handle network interruptions by adding retry mechanisms to your transfer scripts, as in the retry wrapper sketch above.
- Check file integrity: Validate transfers using MD5 or SHA-256 checksums, as in the Content-MD5 comparison shown earlier.
- Implement proper error handling: Include error handling in your scripts to manage exceptions and provide informative messages. This makes troubleshooting easier and improves the reliability of your automation.
- Use managed identities when possible: Instead of account keys, use Azure managed identities for authentication. This improves security by eliminating the need to store and rotate access keys manually.
- Use azcopy for large-scale transfers: For bulk transfers or very large files, consider azcopy instead of the basic Azure CLI commands:
azcopy copy "/path/to/local/file.txt" "https://mystorageaccount.blob.core.windows.net/mycontainer/remote-file.txt"
- Enable soft delete: Enable soft delete for blob data to protect against accidental deletions:
az storage blob service-properties delete-policy update \
--days-retained 7 \
--enable true \
--account-name mystorageaccount \
--account-key "$account_key"
- Use lifecycle management: Set up lifecycle management rules to automatically move or delete blobs based on age (a sample policy file follows this list):
az storage account management-policy create \
--account-name mystorageaccount \
--resource-group myResourceGroup \
--policy @policy.json
- Enable versioning: Enable blob versioning to maintain multiple versions of your files:
az storage account blob-service-properties update \
--account-name mystorageaccount \
--resource-group myResourceGroup \
--enable-versioning true
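A minimal policy.json for the lifecycle rule above might look like this; it is a sketch that only covers deleting stale block blobs, and the rule name and 90-day threshold are examples to adapt:
# Write an example lifecycle policy that deletes block blobs
# not modified for 90 days
cat > policy.json <<'EOF'
{
  "rules": [
    {
      "enabled": true,
      "name": "delete-old-blobs",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "delete": { "daysAfterModificationGreaterThan": 90 }
          }
        },
        "filters": { "blobTypes": ["blockBlob"] }
      }
    }
  ]
}
EOF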
Next steps
- Set up your first automated file transfer
- Configure lifecycle management for your storage account
- Enable versioning for critical files
Need to handle complex file processing workflows? Check out Transloadit for comprehensive file importing and exporting services.
Troubleshooting common issues
- Check connectivity between a source VM and your storage endpoint (requires Azure Network Watcher; the VM name here is a placeholder):
az network watcher test-connectivity \
--resource-group myResourceGroup \
--source-resource myVM \
--dest-address mystorageaccount.blob.core.windows.net \
--dest-port 443
- Check the logging configuration for the Blob service:
az storage logging show \
--account-name mystorageaccount \
--account-key "$account_key" \
--services b
- Check whether a storage account name is valid and available (useful when account creation fails):
az storage account check-name \
--name mystorageaccount