What are Durable Azure Functions?

Lately in the Azure Functions world, there has been a lot of hype around Durable Functions. Well, what are they? Durable Functions let us write stateful processes in a serverless environment. That’s a big deal, because a normal Azure Function on the Consumption plan is limited to at most 10 minutes of execution time. Durable Functions unlock a whole new world of possibilities: jobs requiring hours, days, or even weeks can move to the cloud while you still only pay for execution time. For real? YES, for real.

Another key concept with Durable Functions is that they are really orchestrator functions. At first, this didn’t immediately “click” in my head; with the name “durable” I assumed their only use was a single long-running function. This is not the case! A Durable Function can chain together a number of Azure Functions to execute sequentially, wait for human interaction, make async HTTP API calls, monitor long-running work, and fan out to run functions in parallel before waiting on the combined result.
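To make the chaining idea concrete, here is a tiny framework-free Python sketch of the pattern. Every name here is invented for illustration; in a real Durable Function the orchestrator would instead `yield` each step through the SDK’s orchestration context (e.g. `yield context.call_activity(...)` in the Python SDK), and the Azure runtime, not a local loop, would run the activities.

```python
# Plain activity functions -- each could be its own Azure Function.
def collect_data(_):
    return ["raw-1", "raw-2"]

def transform(items):
    return [s.upper() for s in items]

def store(items):
    return f"stored {len(items)} items"

def orchestrator():
    """Chaining pattern: each yield hands an (activity, input) pair to the
    runtime, which runs it and resumes the orchestrator with the result."""
    data = yield (collect_data, None)
    shaped = yield (transform, data)
    result = yield (store, shaped)
    return result

def run(orchestration):
    """Toy driver standing in for the Durable runtime."""
    gen = orchestration()
    try:
        activity, arg = next(gen)          # run to the first yield
        while True:
            activity, arg = gen.send(activity(arg))
    except StopIteration as done:
        return done.value
```

The generator shape is the point: the orchestrator pauses at every `yield`, so the output of one Function flows straight into the next without any intermediate storage.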

A few use cases I can think of:

You have one Function that does some sort of data collection up front (site scraping, CSV parsing, etc.). Instead of having to put this data into some temporary storage (blob, queue, table, etc.), we can just pass it to another Azure Function for the rest of the data manipulation/processing. We can eliminate some storage accounts entirely.
An approval process which requires sign-off from various departments. An event triggers the Durable Function, which can then wait for external events, in this case the approvals, to come in. The approvals can arrive in any order, and the function won’t complete until it has all of the approvals (or rejections).
Image processing. You can have multiple Azure Functions that resize the image, optimize it, tag it, change formats, etc. Each one of those steps can be its own isolated Function. Using a Durable Function lets us pass the data from one Function to the next without writing to temporary storage.
There are so many applications for Durable Functions to take over your long-running, chained, stateful processes.
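The approval use case maps to the “wait for external events” pattern. Below is a plain-Python simulation of that idea only; `approval_orchestrator`, the event tuples, and the `run_with_events` driver are all made-up stand-ins for the real runtime, which delivers the events and persists state between them.

```python
def approval_orchestrator(required):
    """Collect a decision from every required department, in any order.
    In a real orchestrator this yield would be something like
    `yield context.wait_for_external_event("Approval")`."""
    approvals = {}
    while len(approvals) < len(required):
        dept, decision = yield "Approval"
        if dept in required:            # ignore events we didn't ask for
            approvals[dept] = decision
    return all(approvals.values())      # True only if everyone approved

def run_with_events(orchestration, required, events):
    """Toy driver standing in for the Durable runtime delivering events."""
    gen = orchestration(required)
    next(gen)                           # run until the first wait
    try:
        for event in events:
            gen.send(event)
    except StopIteration as done:
        return done.value
    raise RuntimeError("orchestration never completed")
```

Note how an event from a department we never asked about (“Legal”, say) is simply ignored, and the orchestration only finishes once every required department has answered.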

Durable Functions are the orchestrators of your pipeline.

This also brings up some questions though, all of which I will answer!

What happens with deployments? For example, I have a Durable Function running in Production that requires some sort of human interaction which could take a day to complete. What happens when I deploy new code 12 hours in? Will the old function stick around until all of its in-flight work has drained and completed?

Can I run these locally like normal Azure Functions? Are there any caveats with the dev tools/set up?

How does the Function know how to pick up where it left off a day later? Surely there must be some data stored somewhere. Where is that place?
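One way to picture the mechanism: the runtime records each completed step to a history store and, on resume, rebuilds state by replaying that history through the orchestrator code, which is why orchestrator code must be deterministic. Here is a toy replay loop under that assumption; the step names and the `replay` helper are invented for illustration, not the real API.

```python
def replayable_orchestrator():
    """A two-step orchestration whose progress can be reconstructed."""
    a = yield "step_a"
    b = yield "step_b"
    return a + b

def replay(orchestration, history):
    """Re-run the orchestrator, feeding recorded results instead of
    re-executing activities. Returns (finished, value_or_next_step)."""
    gen = orchestration()
    step = next(gen)
    try:
        for recorded_step, result in history:
            # Replay only works if the code asks for the same steps in the
            # same order every time -- the determinism requirement.
            assert recorded_step == step
            step = gen.send(result)
    except StopIteration as done:
        return True, done.value
    return False, step  # not finished: `step` is what still needs to run
```

Replaying a partial history leaves the orchestrator parked at the next pending step, which is exactly how a resumed instance can “pick up where it left off” without keeping a process alive in between.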

Is there still a normal cold start time when the Function picks back up where it left off?

Can I combine the orchestration triggers with other triggers such as HTTP, queue, blob, etc?

How will Application Insights report Durable Functions? Process IDs, consumption times, logging?

If you can think of any other questions, please comment here and I will do my best to get you an answer/example!