Using Azure Blob Storage with MinIO Azure Gateway
Azure, like other cloud providers, offers an object store called Azure Blob Storage. This solution is similar to S3 or GCS, but unfortunately, most applications out there do not have direct support for Azure Blob Storage.
We can get past this limitation using MinIO Azure Gateway, which provides an S3 interface for Azure Blob Storage.
This tutorial will walk you through how to create Azure Blob Storage, and then use Docker Compose to run a MinIO Azure Gateway along with a client container to access our system.
The Tools
These are the tools required to fully use this tutorial:
- Azure CLI: allows programmatic access to Azure Resource Manager from the command line
- jq: used to parse JSON from the command line
- bash (bourne-again shell): used to run scripts
- Docker and Docker Compose: used to run MinIO docker container
Creating Azure Blob Storage
Creating object storage on Azure will involve the following steps:
- resource group: container to hold related resources
- storage account: container for Azure storage objects (names must be globally unique)
- authorize access: authorize with SAS or Azure Active Directory
- container: contains properties, metadata, and zero or more blobs
- blob: any single entity comprised of binary data, properties, and metadata
The Script
Below is a script that will create the required components and grant your current signed-in account access to the newly created storage account. Download the script below as `create_blob.sh`:
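A minimal sketch of what such a script could look like, built from standard `az` CLI commands (treat this as a starting point rather than the exact script from the source repository; the `objectId` query key may be `id` on newer Azure CLI versions):

```sh
#!/usr/bin/env bash
# Sketch of create_blob.sh: creates a resource group, storage account, and
# container, then grants the signed-in user blob data access.
# Assumes: az CLI installed, logged in, and MY_* variables exported beforehand.
set -euo pipefail

# Create the resource group in the chosen region
az group create \
  --name "${MY_RESOURCE_GROUP}" \
  --location "${MY_LOCATION}"

# Create the storage account (name must be globally unique)
az storage account create \
  --name "${MY_STORAGE_ACCT}" \
  --resource-group "${MY_RESOURCE_GROUP}" \
  --location "${MY_LOCATION}" \
  --sku Standard_LRS

# Grant the current signed-in user data-plane access to blobs
MY_USER_ID=$(az ad signed-in-user show --query objectId --output tsv)
MY_SCOPE=$(az storage account show \
  --name "${MY_STORAGE_ACCT}" \
  --resource-group "${MY_RESOURCE_GROUP}" \
  --query id --output tsv)
az role assignment create \
  --assignee "${MY_USER_ID}" \
  --role "Storage Blob Data Contributor" \
  --scope "${MY_SCOPE}"

# Create the container using Azure AD credentials
az storage container create \
  --name "${MY_CONTAINER_NAME}" \
  --account-name "${MY_STORAGE_ACCT}" \
  --auth-mode login
```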
To use this script you can run the following:
```sh
export MY_RESOURCE_GROUP=my-superfun-resources
export MY_LOCATION=eastus2
export MY_STORAGE_ACCT=my0new0unique0storage
export MY_CONTAINER_NAME=storage-blob-test

bash create_blob.sh
```
Choose variable names that make sense to you. The one you need to pay the most attention to is the storage account name (`MY_STORAGE_ACCT`), which has to be globally unique.
Upload a File
Create a file and upload it:
```sh
touch helloworld

az storage blob upload \
  --account-name ${MY_STORAGE_ACCT} \
  --container-name ${MY_CONTAINER_NAME} \
  --name helloworld \
  --file helloworld \
  --auth-mode login
```
Feel free to upload other files to make this more interesting.
Check the Results
Now we can verify that the file(s) exist:
```sh
az storage blob list \
  --account-name ${MY_STORAGE_ACCT} \
  --container-name ${MY_CONTAINER_NAME} \
  --auth-mode login | jq '.[].name'
```
Launching MinIO Azure Gateway
We will use a Docker environment managed through Docker Compose to run the following:
- MinIO Azure Gateway container configured to use Azure Blob
- Ubuntu client container to test the gateway
Step 1: Build Compose Environment with Storage Credentials
First we need to extract the `AccountName` and `AccountKey` from the connection string. These are the values that MinIO Azure Gateway will use to access Azure Blob Storage.
You can download the script presented below and save it as `create_env.sh`:
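A minimal sketch of what this script could look like, assuming the connection string follows the usual `;`-delimited `Key=Value` format (not necessarily the exact script from the source repository):

```sh
#!/usr/bin/env bash
# Sketch of create_env.sh: extracts AccountName and AccountKey from the
# storage account connection string and writes them into a .env file
# that Docker Compose will read automatically.
# Assumes: az CLI installed, logged in, MY_* variables exported beforehand.
set -euo pipefail

CONN_STR=$(az storage account show-connection-string \
  --name "${MY_STORAGE_ACCT}" \
  --resource-group "${MY_RESOURCE_GROUP}" \
  --query connectionString --output tsv)

# Connection strings look like: ...;AccountName=xxx;AccountKey=yyy;...
ACCOUNT_NAME=$(echo "${CONN_STR}" | tr ';' '\n' | grep '^AccountName=' | cut -d= -f2)
ACCOUNT_KEY=$(echo "${CONN_STR}" | tr ';' '\n' | grep '^AccountKey=' | cut -d= -f2-)

cat > .env <<EOF
MINIO_ACCESS_KEY=${ACCOUNT_NAME}
MINIO_SECRET_KEY=${ACCOUNT_KEY}
EOF
```

Note the `-f2-` on the `AccountKey` extraction: base64-encoded keys can end in `=` characters, so we keep everything after the first `=`.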
Similar to the steps before to create the Azure Blob, and adjusted for alternative values you chose, we can run the script with the following:
```sh
export MY_RESOURCE_GROUP=my-superfun-resources
export MY_STORAGE_ACCT=my0new0unique0storage

bash create_env.sh
```
This will set up a `.env` file that Docker Compose will use.
Step 2: Create Docker Compose Configuration
The following `docker-compose.yml` file is the configuration we can use for this small demo:
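A sketch of what this compose file could look like, assuming the legacy `minio/minio` image with its `gateway azure` subcommand (which reuses `MINIO_ACCESS_KEY`/`MINIO_SECRET_KEY` as the Azure account name and key):

```yaml
version: "3.8"
services:
  azure-gateway:
    image: minio/minio
    container_name: azure-gateway
    # "gateway azure" serves an S3 API backed by Azure Blob Storage
    command: gateway azure
    environment:
      MINIO_ACCESS_KEY: ${MINIO_ACCESS_KEY}
      MINIO_SECRET_KEY: ${MINIO_SECRET_KEY}
    ports:
      - "9000:9000"

  azure-client:
    build: .
    container_name: azure-client
    environment:
      MINIO_ACCESS_KEY: ${MINIO_ACCESS_KEY}
      MINIO_SECRET_KEY: ${MINIO_SECRET_KEY}
    depends_on:
      - azure-gateway
    tty: true
```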
The environment variables `MINIO_ACCESS_KEY` and `MINIO_SECRET_KEY` that are defaults in the `.env` file are used to inject values into both containers at run time. The `azure-gateway` container will use these to configure itself as well as to access Azure Blob Storage. The `azure-client` container will need these to configure access to the Azure Gateway.
Step 3: Create the Client Dockerfile
For the `azure-client`, we want to install the MinIO client tool as well as the `s3cmd` tool. These can be used to access the gateway through the S3 API. Download this `Dockerfile`:
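A sketch of what this `Dockerfile` could look like, assuming an Ubuntu base and the official `mc` download URL (treat it as a starting point, not the exact file from the source repository):

```dockerfile
FROM ubuntu:20.04

# Install s3cmd plus the tools needed to fetch the MinIO client
RUN apt-get update && \
    apt-get install -y --no-install-recommends ca-certificates curl s3cmd && \
    rm -rf /var/lib/apt/lists/*

# Install the MinIO client (mc)
RUN curl -sSL https://dl.min.io/client/mc/release/linux-amd64/mc \
      -o /usr/local/bin/mc && \
    chmod +x /usr/local/bin/mc

# Entrypoint script configures mc and s3cmd against the gateway
COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]
```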
Step 4: Create Entrypoint Script
Because the `azure-client` container needs access to the `azure-gateway` container when initially starting up, we can use an entrypoint script to inject some values. Download this `entrypoint.sh`:
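A sketch of what this entrypoint script could look like, assuming the `myazure` alias used in the test commands later and the compose service name `azure-gateway` as the hostname:

```sh
#!/usr/bin/env bash
# Sketch of entrypoint.sh: configures mc and s3cmd to talk to the
# azure-gateway container, then keeps the container alive.
# Assumes MINIO_ACCESS_KEY/MINIO_SECRET_KEY are injected from the .env file.
set -e

# Register the gateway as an mc alias named "myazure"
mc config host add myazure http://azure-gateway:9000 \
  "${MINIO_ACCESS_KEY}" "${MINIO_SECRET_KEY}"

# Point s3cmd at the gateway instead of AWS
cat > ~/.s3cfg <<EOF
access_key = ${MINIO_ACCESS_KEY}
secret_key = ${MINIO_SECRET_KEY}
host_base = azure-gateway:9000
host_bucket = azure-gateway:9000
use_https = False
EOF

# Keep the container running so we can docker exec into it
exec tail -f /dev/null
```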
Step 5: Build and Run
With all of the files above in place, we should have a directory structure that looks like the following:
```
.
├── create_blob.sh
├── create_env.sh
├── docker-compose.yml
├── Dockerfile
└── entrypoint.sh
```
Now we can build the environment with the following command:
```sh
docker-compose build
```
Once completed, we can bring up the environment with:
```sh
docker-compose up --detach
```
You can verify that the containers are running with `docker-compose ps`. If the client did not come up, just run `docker-compose up --detach` again.
Step 6: Test with MinIO Client and s3cmd
Using the environment variable we set up before for the container name, we can run the `mc` command inside the container with the following:
```sh
export MY_CONTAINER_NAME=storage-blob-test
docker exec --tty azure-client mc ls myazure/$MY_CONTAINER_NAME
```
Similarly, we can do the same with `s3cmd`, run from inside the container:
```sh
export MY_CONTAINER_NAME=storage-blob-test
docker exec --tty azure-client s3cmd ls s3://$MY_CONTAINER_NAME
```
Cleaning Up
Docker Environment
When finished, you can remove the locally running docker containers created with Docker Compose using the following:
```sh
docker-compose stop && docker-compose rm
```
Azure Cloud Resources
To delete all the resources we created, you can save the script below as `delete_blob.sh`:
CAUTION: Only use this script for this tutorial. If you adapt it for other projects, you need to update the logic so it does not delete the storage account or resource group if other resources reside in those containers. This script is tailored to clean up resources created explicitly with the `create_blob.sh` script above.
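A sketch of what this cleanup script could look like, mirroring the create steps in reverse (again, a starting point rather than the exact source script; note it deletes the entire storage account and resource group):

```sh
#!/usr/bin/env bash
# Sketch of delete_blob.sh: removes ONLY resources created by create_blob.sh.
# WARNING: deletes the whole storage account and resource group.
set -euo pipefail

az storage container delete \
  --name "${MY_CONTAINER_NAME}" \
  --account-name "${MY_STORAGE_ACCT}" \
  --auth-mode login

az storage account delete \
  --name "${MY_STORAGE_ACCT}" \
  --resource-group "${MY_RESOURCE_GROUP}" \
  --yes

az group delete --name "${MY_RESOURCE_GROUP}" --yes
```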
Using the same values we used with the creation of these resources (adjusted to your personal preferences):
```sh
export MY_RESOURCE_GROUP=my-superfun-resources
export MY_STORAGE_ACCT=my0new0unique0storage
export MY_CONTAINER_NAME=storage-blob-test

bash delete_blob.sh
```
Resources
Source Code
- Blog Source Code: https://github.com/darkn3rd/blog_tutorials/tree/master/azure/blob/blob_storage/1.azure_cli
Azure Docs
- Resource Manager Overview: https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/overview#resource-groups
- Creating a Storage Account: https://docs.microsoft.com/en-us/azure/storage/common/storage-account-create?toc=%2Fazure%2Fstorage%2Fblobs%2Ftoc.json&tabs=azure-portal
- Azure Blob Storage: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction
MinIO Server
- MinIO Azure Gateway: https://docs.min.io/docs/minio-gateway-for-azure.html
Client Tools
Docker Stuff
- Docker CLI Reference: https://docs.docker.com/engine/reference/commandline/cli/
- Docker Compose: https://docs.docker.com/compose/
- The env file: https://docs.docker.com/compose/environment-variables/#the-env-file
- Entrypoint: https://docs.docker.com/engine/reference/builder/#entrypoint
Articles
- MinIO Azure Gateway: https://docs.gitlab.com/charts/advanced/external-object-storage/azure-minio-gateway.html
- How to use s3cmd and any other Amazon S3-compatible app with Azure Blob Storage: https://withblue.ink/2017/10/29/how-to-use-s3cmd-and-any-other-amazon-s3-compatible-app-with-azure-blob-storage.html
Conclusion
This is my first article on topics related to Azure. I wanted to cover object stores, as they are by far the most often used cloud resource beyond instances (virtual machines) and containers.
I threw in some Docker usage, as Docker is ubiquitous for local development environments and for testing out new technologies, such as MinIO and Azure Blob Storage.