Azure Functions running on Kubernetes using Keda

Arthur Ávila
7 min read · Feb 18, 2021


Have you ever heard about Keda?

No? Soooo, just grab a coffee or whatever you want and follow me, I'll show you the goods!

What’s Keda?

Keda is a Kubernetes-based Event Driven Autoscaler. It was developed by Microsoft and Red Hat and is now a Cloud Native Computing Foundation (CNCF) sandbox project.
With Keda you can scale any container in Kubernetes based on the number of events it needs to process.
This means you can create an event-driven application that gets up when something arrives and gets down when there is nothing there. This lowers the cost of your application when running on a cloud provider such as AWS, Azure or GCP.
You can use Keda to scale based on a queue in RabbitMQ, Azure Service Bus or Kafka, on CPU usage, on a cron schedule, on MongoDB queries and many more.

Awesome, isn’t it? 🤓

Great, but how does it work?

I will drop my sample on GitHub here.
In this tutorial, I will show you how simple it is to create an Azure Function using Python ❤️ and deploy it on AKS. I will also show a pipeline that deploys the function automatically to your AKS.

Alright, talk is cheap, so let's code!

Prerequisites

To start our Python function we need a couple of things.

  1. First of all, we need an Azure subscription to create our AKS cluster and the Azure Service Bus. The free trial is just great for this.
  2. We're gonna use the Azure Functions Core Tools to create, start and run our functions.
  3. Docker installed and a Docker Hub account are essential.
  4. kubectl, to watch our beautiful babies getting up.
  5. A repository on GitHub, GitLab, Azure DevOps Repos, …
  6. This one is not a must, but if you want to deploy using the Azure Pipeline from the example, you need to log in to Azure DevOps and run the pipeline. If you have another CI/CD pipeline or want to deploy in a different way, that's fair enough too.

Let’s start!!

First of all, we must create the project on Azure DevOps Repos, GitHub or GitLab. You just have to configure your Azure DevOps to connect to your repository so it can run the pipeline when you want to.
After creating it, clone the project to your machine.

From here on I will assume that you already have an AKS cluster created with Keda 2.1 installed, and a Service Bus namespace with a queue and its connection string.
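If you still need to install Keda on the cluster, the official Helm chart is the quickest way (assuming Helm 3 and that your kubectl context already points at the AKS cluster):

helm repo add kedacore https://kedacore.github.io/charts
helm repo update
helm install keda kedacore/keda --namespace keda --create-namespace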

Starting a function

With the Azure Functions Core Tools it's pretty easy to start a function project. We just need to run

func init . --docker

And that’s all folks! Thanks for reading…
Just kidding 🤣

After running this command, you will see something like this in your terminal:

Select a number for worker runtime:
1. dotnet
2. node
3. python
4. powershell
5. custom
Choose option:

Here you will choose option 3. After that, our project is created with a Dockerfile that is prepared to run a Python function. But it's not done yet. We need to create our function.
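For reference, the Dockerfile that the Core Tools generate for a Python project looks roughly like this (the exact base image tag depends on your Core Tools version, so treat it as a sketch):

FROM mcr.microsoft.com/azure-functions/python:3.0-python3.8

ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

COPY requirements.txt /
RUN pip install -r /requirements.txt

COPY . /home/site/wwwroot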

Creating a function

In the previous step we just initialized our project. Now we're gonna create our function by running the command

func new

And now we choose which template we need for this function. For this tutorial, you must choose option 11.

Select a number for template:
1. Azure Blob Storage trigger
2. Azure Cosmos DB trigger
3. Durable Functions activity
4. Durable Functions HTTP starter
5. Durable Functions orchestrator
6. Azure Event Grid trigger
7. Azure Event Hub trigger
8. HTTP trigger
9. Azure Queue Storage trigger
10. RabbitMQ trigger
11. Azure Service Bus Queue trigger
12. Azure Service Bus Topic trigger
13. Timer trigger
Choose option:

After that, we must choose a name for our function. You can pick any name you want.

Azure Service Bus Queue trigger
Function name: [ServiceBusQueueTrigger]

After creating our function, we must change the local.settings.json file and include the Service Bus connection.
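Mine looks roughly like this; note that, as explained further down, I reuse the AzureWebJobsStorage setting to hold the Service Bus connection string, and the placeholder value is yours to fill in:

{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "<service-bus-connection>"
  }
}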

And in the folder that was created for our function, we need to include the queue name in the function.json file as well.
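The generated function.json ends up looking something like this (the binding parameter name depends on what you picked; if you prefer, queueName can also reference an app setting like %QUEUE_NAME%):

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "msg",
      "type": "serviceBusTrigger",
      "direction": "in",
      "queueName": "<queue-name>",
      "connection": "AzureWebJobsStorage"
    }
  ]
}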

And for now, our function is ready!! 🎉🎊

Testing it locally

To run our function, just run

func start

and it will start. To test it, I created a Python script that is in the GitHub sample I posted above, but it's simple to do and I'll show you.
We need to export AzureWebJobsStorage and QUEUE_NAME as environment variables, like this:

export AzureWebJobsStorage='<service-bus-connection>'
export QUEUE_NAME=<queue-name>

With those exported, you can create or use the script:
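A minimal version of it, assuming the azure-servicebus v7 package (pip install azure-servicebus), looks like this:

import os

from azure.servicebus import ServiceBusClient, ServiceBusMessage

# The connection string and queue name we exported above
connection_str = os.environ["AzureWebJobsStorage"]
queue_name = os.environ["QUEUE_NAME"]

client = ServiceBusClient.from_connection_string(connection_str)
with client:
    sender = client.get_queue_sender(queue_name=queue_name)
    with sender:
        # Send 100 small messages to the queue
        for i in range(100):
            sender.send_messages(ServiceBusMessage(f"Event number {i}"))

print("100 events sent!")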

This script will send 100 events to your Service Bus queue, and your function running locally will consume all of them.

Alright, so far we have created our function and tested it! Now we need to deploy it to our AKS. As I wrote above, I will use an azure-pipelines.yml to deploy.

Manifests files and deploy pipeline

In this project I created a folder called manifests and there we have 2 files: the deployment.yml that we're gonna use to configure our pod on AKS, and the scaledobject.yml, which is the configuration file that Keda uses to understand when to scale the application. Let's see what's in deployment.yml.
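Here is a sketch of mine, with hypothetical names; I'm also assuming the connection string lives in a Kubernetes secret so the pod can expose it as the AzureWebJobsStorage environment variable:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: servicebus-function
  namespace: <your-namespace>
  labels:
    app: servicebus-function
spec:
  selector:
    matchLabels:
      app: servicebus-function
  template:
    metadata:
      labels:
        app: servicebus-function
    spec:
      containers:
        - name: servicebus-function
          image: <your-dockerhub-user>/servicebus-function:latest
          env:
            # The Functions runtime (and the ScaledObject below) read the
            # Service Bus connection string from this variable
            - name: AzureWebJobsStorage
              valueFrom:
                secretKeyRef:
                  name: servicebus-secret
                  key: connection

Note that there is no replicas field: Keda will own the replica count, scaling it between zero and the maximum we define next. The secret can be created with kubectl create secret generic servicebus-secret --from-literal=connection='<service-bus-connection>' -n <your-namespace>.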

This is a simple deployment file that I used to configure my pod. In the container spec I'm using my container image from Docker Hub. Feel free to use it or create your own; it's up to you.

Now, let's see what scaledobject.yml looks like.
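Mine is roughly this (the names match the hypothetical deployment above):

apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: servicebus-function-scaler
  namespace: <your-namespace>
spec:
  scaleTargetRef:
    name: servicebus-function    # the Deployment above
  minReplicaCount: 0             # scale to zero when the queue is empty
  maxReplicaCount: 10
  triggers:
    - type: azure-servicebus
      metadata:
        queueName: <queue-name>
        messageCount: "1"
        connectionFromEnv: AzureWebJobsStorage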

This file is for Keda version 2.x. For version 1.x it may be different; have a look at the documentation.
But explaining this file: in scaleTargetRef we point Keda at our deployment, and minReplicaCount sets the minimum of replicas I want. I set 0 because when there is no event in the queue I assume it's not necessary to have a pod running. maxReplicaCount caps it at 10 replicas.
In the triggers, I use the azure-servicebus type. It changes when you want to scale from another service, but as this tutorial is about Service Bus, we must use it like that.
In the metadata we must give the name of the queue in queueName. I set messageCount to 1 because when an event arrives, a pod gets up to consume it. And finally, in connectionFromEnv I set AzureWebJobsStorage. This variable holds the connection string that we passed in the local.settings.json file.

Cool, now let's talk about the pipeline. This pipeline is pretty simple.
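Here is a sketch of what it looks like; the service connection and environment names in angle brackets are placeholders you will create in Azure DevOps in the next section:

trigger:
  - main

variables:
  imageName: <your-dockerhub-user>/servicebus-function

stages:
  - stage: Build
    jobs:
      - job: BuildAndPush
        pool:
          vmImage: ubuntu-latest
        steps:
          # Build the image from the Dockerfile generated by the Core Tools
          # and push it to the Docker registry
          - task: Docker@2
            inputs:
              containerRegistry: <your-docker-service-connection>
              repository: $(imageName)
              command: buildAndPush
              tags: $(Build.BuildId)

  - stage: Deploy
    dependsOn: Build
    jobs:
      - deployment: DeployToAKS
        pool:
          vmImage: ubuntu-latest
        environment: <your-environment>
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self   # deployment jobs don't check out code by default
                - task: KubernetesManifest@0
                  inputs:
                    action: deploy
                    kubernetesServiceConnection: <your-aks-service-connection>
                    namespace: <your-namespace>
                    manifests: |
                      manifests/deployment.yml
                      manifests/scaledobject.yml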

This pipeline triggers on a merge to the main branch, builds an image with the Dockerfile created by the Azure Functions Core Tools, pushes this image to our Docker registry and deploys to AKS using the deployment.yml and scaledobject.yml files.
But we are not done yet; we must configure our Azure DevOps to run the pipeline.

Configuring Azure DevOps and Deploying to AKS

To run our pipeline and deploy it to our AKS, we must configure a couple things in Azure DevOps.
First of all, we start by configuring Service connections. Here we configure our AKS connection, our repository connection (so the pipeline can trigger on merges to the main branch) and our Docker registry connection.

After configuring these connections, we must configure our environments. As you may have noticed, the Deploy stage of the pipeline references an environment. We create it under Pipelines -> Environments and point it at the AKS cluster, so the stage can deploy to it.

The first time we run a pipeline in Azure DevOps, we must trigger it manually.
To do that, we just go to Pipelines -> New pipeline, choose the repository where the project is, select the azure-pipelines.yml file and run the pipeline.

We are almost done, can you believe that?

After your pipeline runs and deploys, you should see every stage completed successfully in Azure DevOps.

It means that your function has been deployed successfully!! 🎊🥳

Alright, Alright… Let’s see it running mate!

If you run the command

kubectl get pods -n <your-namespace>

You will see something like this:
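Since Keda has scaled the deployment down to zero, there is nothing to list:

No resources found in <your-namespace> namespace.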

It means that your pods are scaled down to zero because there are no events in your queue, so it's not necessary to keep anything up consuming resources.

To watch how beautifully your serverless function gets up with Keda, you can run the test script I showed above again.
After running it, use the command again to see your pods:

kubectl get pods -n <your-namespace>

And you will see all the pods getting up, like this:
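Pod names and ages will differ, but the output looks roughly like this:

NAME                                  READY   STATUS              RESTARTS   AGE
servicebus-function-7d4b9c6b8-4xkzq   1/1     Running             0          15s
servicebus-function-7d4b9c6b8-9pl2w   0/1     ContainerCreating   0          4s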

Pretty nice, huh?
This is just one example of many others you can try out using Keda. You can check more examples in Keda's GitHub samples repository.

Well, I'll finish here… This was huge!
If you have any doubts or feedback, feel free to comment or contact me on LinkedIn. This is the first article I have ever written in my life, so any feedback is welcome!
