Getting Started with GCloud SDK: Part 2

Automaton Funky Town Time

Joaquín Menchaca (智裕)
4 min read · Apr 21, 2018


In a previous article, I detailed how to get started with the gcloud tool using a simple bash script, with some of the configuration embedded in the code:

Now it's time to get a little insane and enter automaton funky town with a mean mix of bash, jq, and gcloud.

In this article, I show how we can use the IaC (Infrastructure as Code) pattern to craft our systems from a declarative definition file written in JSON. We describe the systems in JSON, rather than hard-coding them in the script.

The Tools

If you do not have the jq tool, you really want to grab it; it is that awesome. On Ubuntu systems you can grab it with apt-get install -y jq, and on macOS with Homebrew you can run brew install jq. These examples have been tested with jq version 1.5, so if you have an earlier version, grab the binary off the site.
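You can verify which version you have with:

jq --version   # these examples assume jq-1.5 or later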

Because these scripts require bash v4, macOS users will need to install it. Using Homebrew, you can type brew install bash. And of course, you should already have the GCloud SDK installed and set up with credentials for your account.
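To check which bash will run your scripts:

echo $BASH_VERSION   # must report 4.x or later for the associative arrays used below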

The Setup

As before, we need to configure the project and service account. In this example we’ll use the defaults.

# grab the default compute service account (filter out the EMAIL header)
export GCP_SERVICE_ACCT=$(gcloud iam service-accounts list \
  --filter='email ~ [0-9]*-compute@.*' \
  --format='table(email)' \
  | grep -v EMAIL
)
# grab the currently configured default project
export GCP_PROJECT=$(gcloud config list \
  --format 'value(core.project)'
)

The Data

The JSON here is an object of objects, where each top-level key represents an instance name. The format looks something like this (abbreviated for brevity):

{
  "es-01": {
    "zone": "us-east1-b",
    "machine-type": "n1-standard-1",
    "scopes": [
      "storage-ro",
      "logging-write",
      "monitoring-write",
      "pubsub",
      "service-management",
      "service-control",
      "compute-rw"
    ],
    "tags": [
      "elasticsearch"
    ],
    "image": "ubuntu-1404-trusty-v20180308",
    "image-project": "ubuntu-os-cloud",
    "boot-disk-size": "10GB",
    "boot-disk-type": "pd-standard",
    "metadata": {
      "block-project-ssh-keys": "FALSE"
    }
  },
  "es-02": {
    "zone": "us-east1-c",
    ...
  },
  "es-03": {
    "zone": "us-east1-d",
    ...
  }
}

This is a schema of my own design that describes a variety of systems with basic options. The full JSON is available:

The Script

The script in the previous article declared some of the construction in the code logic itself, which you generally want to avoid, as it is harder to maintain. This script creates all the systems dynamically, entirely from the JSON definition file.

Part 1: The Check

As with any good script, we want to check that the proper environment variables are configured and the data file is available.
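A minimal sketch of such a check, assuming the definition file is named instances.json:

# verify the environment variables from the setup step are present
[[ -z "$GCP_PROJECT" ]] && { echo "GCP_PROJECT is not set" >&2; exit 1; }
[[ -z "$GCP_SERVICE_ACCT" ]] && { echo "GCP_SERVICE_ACCT is not set" >&2; exit 1; }

# verify the data file and required tools are available
DATAFILE="${1:-instances.json}"
[[ -f "$DATAFILE" ]] || { echo "cannot find data file: $DATAFILE" >&2; exit 1; }
command -v jq > /dev/null || { echo "jq is required" >&2; exit 1; }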

Part 2: Data Structures

We’ll store our data in an associative array, INSTANCES, indexed by instance name, much like the structure of the JSON itself. We’ll also use two arrays, SYSTEMS and KEYS, to index the instance names and the keys within each entry.
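A minimal sketch of those declarations, assuming a composite $SYSTEM.$KEY subscript to flatten the nested JSON into INSTANCES (declare -A is why bash v4 is required):

declare -A INSTANCES   # e.g. INSTANCES[es-01.zone]="us-east1-b"
declare -a SYSTEMS     # the instance names: es-01 es-02 es-03
declare -a KEYS        # the keys within one entry: zone machine-type scopes ...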

Part 3: Read the JSON

This part cycles through our list of systems and builds our structure. Any arrays are reduced to space-delimited strings for easier processing later.

The jq tool is used to do some magic extraction from the JSON. This is the process I used when experimenting with extracting keys and values:

SYSTEM="es-01"   # example instance name
KEYS=($(jq -r --arg name "$SYSTEM" '.[$name] | keys | .[]' file.json))
for KEY in "${KEYS[@]}"; do
  VALUE=$(jq -r --arg name "$SYSTEM" --arg key "$KEY" \
    '.[$name][$key] | if type == "array" then join(" ") else tostring end' file.json)
  echo "KEY=$KEY, VALUE=$VALUE"
done
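Scaling that experiment up to the whole file, the read loop might look something like this sketch, assuming the definition file is named instances.json and the composite $SYSTEM.$KEY layout from Part 2:

DATAFILE="instances.json"
SYSTEMS=($(jq -r 'keys | .[]' "$DATAFILE"))
for SYSTEM in "${SYSTEMS[@]}"; do
  KEYS=($(jq -r --arg name "$SYSTEM" '.[$name] | keys | .[]' "$DATAFILE"))
  for KEY in "${KEYS[@]}"; do
    # arrays become space-delimited strings; objects (like metadata) become
    # comma-separated key=value pairs ready for gcloud
    INSTANCES[$SYSTEM.$KEY]=$(jq -r --arg name "$SYSTEM" --arg key "$KEY" '
      .[$name][$key]
      | if   type == "array"  then join(" ")
        elif type == "object" then to_entries | map("\(.key)=\(.value)") | join(",")
        else tostring
        end' "$DATAFILE")
  done
done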

Part 4: Create Some Systems

Now for the truly funky part: read through our data structure, and create some instances:
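Here is a compressed sketch of how that creation loop might look, assuming the composite $SYSTEM.$KEY layout from Part 2 and an --async create so we can poll the status afterward ($TAGS_OPTION is built with a trick explained below):

for SYSTEM in "${SYSTEMS[@]}"; do
  # split the space-delimited scopes string back into an array, then
  # join it into the comma-separated list that gcloud expects
  SCOPES=($(echo ${INSTANCES[$SYSTEM.scopes]}))
  SCOPES_CSV=$(IFS=,; echo ${SCOPES[*]})
  TAGS_OPTION=""   # either blank or "--tags=tag1,tag2" (built below)

  gcloud compute instances create "$SYSTEM" \
    --zone "${INSTANCES[$SYSTEM.zone]}" \
    --machine-type "${INSTANCES[$SYSTEM.machine-type]}" \
    --image "${INSTANCES[$SYSTEM.image]}" \
    --image-project "${INSTANCES[$SYSTEM.image-project]}" \
    --boot-disk-size "${INSTANCES[$SYSTEM.boot-disk-size]}" \
    --boot-disk-type "${INSTANCES[$SYSTEM.boot-disk-type]}" \
    --service-account "$GCP_SERVICE_ACCT" \
    --scopes "$SCOPES_CSV" \
    --async \
    $TAGS_OPTION
done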

This requires some explanation.

In the first part, we extract the space-delimited strings and turn them into proper arrays. This is a trixy hobbitz way to use bash's word-splitting facility:

ARRAY=(foo bar baz)                   # a literal array of three elements
ARRAY=($(echo $STRING_WITH_SPACES))   # unquoted expansion splits on spaces
ARRAY=($(echo ${ASSOC_ARRAY[$KEY]}))  # same split on an associative array value

Once we have an array, we can use another trick to do a join operation, creating a comma-separated string:

COMMA_SEP_STRING=$(IFS=,; echo ${ARRAY[*]})   # (foo bar baz) -> "foo,bar,baz"

We also need to handle the case where there are no tags. With the gcloud compute instances create command, we cannot pass a --tags parameter without any tags, or the command will error out. So we build a tags option that is either entirely blank or has the flag with some tags.
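A sketch of building that optional argument with the same split and join tricks (the composite INSTANCES key is my assumed layout):

TAGS=($(echo ${INSTANCES[$SYSTEM.tags]}))
if (( ${#TAGS[@]} > 0 )); then
  TAGS_OPTION="--tags=$(IFS=,; echo ${TAGS[*]})"
else
  TAGS_OPTION=""   # omit the flag entirely when there are no tags
fi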

The metadata is handled after the systems are created. We first fetch the status to see if the system (instance) is finished provisioning, and then add the metadata once it is safe.
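A sketch of that post-creation step, polling the instance status with gcloud compute instances describe (again assuming the composite-key layout):

for SYSTEM in "${SYSTEMS[@]}"; do
  # wait until the instance reports RUNNING before touching metadata
  until [[ "$(gcloud compute instances describe "$SYSTEM" \
      --zone "${INSTANCES[$SYSTEM.zone]}" \
      --format='value(status)' 2>/dev/null)" == "RUNNING" ]]; do
    sleep 5
  done
  gcloud compute instances add-metadata "$SYSTEM" \
    --zone "${INSTANCES[$SYSTEM.zone]}" \
    --metadata "${INSTANCES[$SYSTEM.metadata]}"
done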

Where to Go from Here…

As you can imagine, Bash (Bourne Again Shell) is quite powerful. When combined with parsing tools like jq, yq, or xq to slice through data, and with tools that talk to cloud RESTful APIs like gcloud and curl, bash can do some serious kung-fu, though it may, well, not be all that intuitive.

Other languages like Perl, Ruby, and Python either have a built-in facility or many libraries that can convert JSON, YAML, or XML into the language's native data structures.

Instead of a wrapper script that calls the gcloud tool, which is what we are doing here, you can use a library that communicates with Google Cloud directly. With Ruby you have google-api-client or fog-google, and Python has google-api-python-client and Apache libcloud. Google has a documentation guide for these and other languages:

If you do not want to program Google Cloud in the raw, you can use higher-level tools that use a DSL to command Google Cloud. Both Terraform and Ansible have built-in support for Google Cloud, and there's a Puppet gcompute module and a Chef google-gcompute cookbook. Google has this amazing solution guide about such tools:

The adventure continues.
