Deploying the Self-Hosted API Management Gateway to a Docker Container

At Ignite 2019, Microsoft announced a brand-new capability of Azure API Management that lets you self-host the gateway component inside a container, running in your own on-premises environment or in any other cloud, while still managing the gateway from the central API Management instance in your Azure subscription. This is great, as it allows traffic to your APIs to stay in your local environment (which makes security folks happy) and keeps latencies low (which makes users happy), because there is no extra hop to the cloud.

This post demonstrates how to deploy a self-hosted API Management gateway inside a Docker container, hook it up to the API Management instance in Azure, check the gateway logs while accessing a backend service, and finally apply central policy configuration that is picked up by the gateway on the fly, changing its behavior. Last but not least, we will look at monitoring the gateway with Application Insights.

Overview

You might want to get started with an overview of the self-hosted API Management gateway by reading the official Azure documentation (a quick read) or by checking out the more detailed whitepaper (12 pages). The whitepaper in particular does a great job of explaining the architecture and the components involved in this hybrid scenario.

 

Create a local REST Service

In order to demonstrate the new hybrid capabilities of Azure API Management, we will create a backend REST service as a container running locally on our development machine. The service just returns a static JSON string and is based on a simple Docker REST service image (a nice, small Golang image of about 14 MB). In this example, I am using my Windows 10 laptop, running Docker Desktop Community edition and Windows Terminal with a Zsh shell in WSL.

When you run the container with Docker, it listens on a localhost address on port 8080 (http://localhost:8080/test). Curl'ing it returns a simple JSON response. This is the simulated backend service we will access via the self-hosted API gateway running in the same environment. Basically, it could be any REST endpoint reachable in your network environment via HTTP(S).
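As a rough sketch, the two statements below show how such a backend could be started and verified; the image name is just a placeholder for whichever small REST service image you use, as long as it listens on port 8080 and exposes a /test route:

# Start the simulated backend container, mapping port 8080 (image name is a placeholder)
docker run -d --name rest-backend -p 8080:8080 <your-rest-service-image>

# The endpoint should now return the static JSON response
curl http://localhost:8080/test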

 

Configure API Management

Now let's have a look at the API Management instance in Azure that we want to use as the management plane for the self-hosted gateway. As this feature is still in preview at the time of writing, the instance needs to be deployed in the Developer or Premium tier.

First, let's create the API in the Azure portal. Make sure to specify the IP address of your dev box in the web service URL so that the self-hosted gateway container can reach the backend API. Use 8080 as the port and http as the protocol scheme. In my example, I am using hybrid as the URL suffix for my API. Also make sure you associate the API with a product (I chose Unlimited in this example).

Next, we need to add an operation to the API, using /test as the URL path suffix to our backend service.
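If you prefer scripting over clicking through the portal, roughly the same API and operation can be created with the Azure CLI; the resource group, service name, and dev box IP below are placeholders, and you may need a recent CLI version that includes the az apim commands:

# Create the API, pointing the web service URL at the local dev box (placeholders throughout)
az apim api create --resource-group my-rg --service-name my-apim \
  --api-id hybridapi --display-name "HybridAPI" --path hybrid \
  --service-url http://<dev-box-ip>:8080 --protocols http https

# Add the GET operation with /test as the URL template
az apim api operation create --resource-group my-rg --service-name my-apim \
  --api-id hybridapi --operation-id test --display-name "Test" \
  --method GET --url-template /test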

For fun, let's test the API call in the portal and check whether the right backend URL gets hit. Of course, what we get is an HTTP 500 error, as the local service cannot be reached by the standard API Management gateway in the cloud.

When checking the trace of the backend request, we can see that API Management is forwarding the call to the correct local URL:

In order to 'fix' this problem, the next step is to set up the self-hosted gateway in the API Management instance. There is a new 'Gateway' menu item in the 'Settings' section of the portal, allowing you to manage gateways hosted outside of the Azure cloud.

The image below shows creation of a gateway containing our new HybridAPI.

Once the gateway has been created, you can select it in the portal and check out the 'Deployment' blade, which shows the magic of self-hosted gateways: the configuration settings of the gateway, as well as a Docker statement to deploy the instance in your local environment:

Deploy Self-Hosted Gateway

Now that we have successfully configured our API Management instance in Azure, we can deploy the self-hosted gateway. As shown in the portal instructions above, the gateway is deployed as a Docker container together with a small piece of configuration. Before we start the container, let's download the env.conf file from the portal into the current working directory in Windows Terminal.
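The exact contents depend on your instance, but the file essentially carries two settings, the configuration endpoint of your API Management instance and a gateway access token, roughly along these lines (both values are placeholders, copy the real ones from the portal):

# env.conf (values are placeholders)
config.service.endpoint=<configuration endpoint of your API Management instance>
config.service.auth=GatewayKey <gateway access token from the portal>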

We can copy the docker statement from the portal as well, but we need to tweak it a little, as our backend REST service is already listening on port 8080. So we change the mapping to 8081 and also remove the HTTPS mapping for now, as shown below:
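With the tweaked port mapping, the run statement looks roughly like this; take the exact gateway name and image tag from the Deployment blade in the portal:

# Map local port 8081 to the gateway's HTTP port 8080; the HTTPS mapping is removed for now
docker run -d --name <gateway-name> -p 8081:8080 --env-file env.conf mcr.microsoft.com/azure-api-management/gateway:<tag>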

The gateway image is based on Alpine Linux; the image size is just around 280 MB.

After the container has started, you can check its logs via docker logs <containername>. The first thing you will see is some beautiful ASCII art, revealing the project's original codename.
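To keep an eye on what the gateway is doing, you can also follow the log output continuously:

# Stream the gateway logs as new entries arrive
docker logs -f <containername>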

The logs also contain information about the federated API Management control plane in Azure. If you go back to the Azure portal, you should see an indication that the self-hosted gateway is sending its heartbeat every minute or so.

If you don't see any sign of life, make sure the self-hosted gateway has outbound TCP connectivity to Azure on port 443.
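A quick way to verify this is a plain TCP check against the configuration endpoint from env.conf (the hostname is a placeholder):

# Test outbound TCP connectivity to the configuration endpoint on port 443
nc -vz <configuration-endpoint-hostname> 443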

Test API Call

Now it's time to check whether we can actually hit our backend REST service via the self-hosted gateway. We can do so from the Terminal by running the curl statement below. Make sure to hit port 8081, which is the port the self-hosted gateway is listening on. Also note that you have to copy a key from the 'Subscriptions' section of the API Management instance:

curl -H "Ocp-Apim-Subscription-Key: xxx…" http://localhost:8081/hybrid/test

And voilà, the local gateway forwards the call directly to the backend service, without any detour to the cloud! You can also see the execution of the call in the gateway logs:

Configure Gateway Policy

Let’s see if we can actually apply a policy to the self-hosted gateway in the same way we would do it for a cloud gateway in API Management. Let’s add a rate limit policy to the Test operation and restrict it to 2 requests per 10 seconds:

or if you prefer to do it in XML:
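A minimal sketch of such a policy document at the operation scope could look like this, with the rate-limit element carrying the calls and renewal-period values:

<policies>
    <inbound>
        <base />
        <!-- allow 2 calls per 10-second window -->
        <rate-limit calls="2" renewal-period="10" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>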

The self-hosted gateway checks for configuration changes every 10 seconds and applies updates whenever they are available. You can see an entry in the container logs showing that the new configuration has been picked up:

To test the rate limiting, let's run the curl statement 3 times in a row (within 10 seconds) and see if we get an HTTP 429:
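A small shell loop makes this easy; it prints only the HTTP status codes, and the third one should be 429 (the subscription key is the same placeholder as before):

# Fire three requests within the rate-limit window and print the HTTP status codes
for i in 1 2 3; do
  curl -s -o /dev/null -w "%{http_code}\n" -H "Ocp-Apim-Subscription-Key: xxx…" http://localhost:8081/hybrid/test
done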

Indeed, the policy has been picked up immediately and applied to the running instance of the self-hosted gateway!

Monitoring in Application Insights

If you configured an Application Insights workspace when creating your API Management instance, the self-hosted gateway will send telemetry data to the cloud. You can even check telemetry in near real time in the portal by navigating to the App Insights workspace and opening 'Live Metrics Stream' from the menu. When executing a couple of curl statements in Windows Terminal, you can see the requests, including the HTTP 429 errors, right away:

You can also leverage other features of Application Insights, including search and analysis of failures and performance records.

Summary

The self-hosted gateway in Azure API Management expands support for hybrid and multi-cloud scenarios and allows you to manage your distributed API landscape from a central API management instance in Azure. This approach is part of the new Azure Arc set of technologies.

While the self-hosted gateway in API Management is one of the first Azure services to be made available as a container running outside of Azure, don't expect it to be the last. Rather, consider it a harbinger of the future, with many other services to follow.
