In the last mini-series inside the series (🙃), we will go through how to build dynamic pipelines in Azure Data Factory. In this post, we will look at parameters, expressions, and functions. Later, we will look at variables, loops, and lookups. Fun!
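To give you a little taste of what we're working toward, here is the kind of expression we'll be building. (The parameter names SchemaName and TableName are just illustrative, not from any particular demo.)

```
@concat(pipeline().parameters.SchemaName, '.', pipeline().parameters.TableName)
```

That one line uses parameters (pipeline().parameters.SchemaName and pipeline().parameters.TableName) inside a function (concat) to build a value at runtime instead of hardcoding it. That's really all there is to it, conceptually!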
But first, let’s take a step back and discuss why we want to build dynamic pipelines at all.
(Pssst! There are now also global parameters, woohoo! They didn’t exist when I first wrote this blog post. I’ll be adding this shortly!)
Hardcoded Solutions
Back in the post about the copy data activity, we looked at our demo datasets. The LEGO data from Rebrickable consists of nine CSV files. So far, we have hardcoded the values for each of these files in our example datasets and pipelines.
Now imagine that you want to copy all the files from Rebrickable to your Azure Data Lake Storage account, and then copy all the data from Azure Data Lake Storage into your Azure SQL Database. What will it look like if you have to create individual datasets and pipelines for every single file?
It will look like this:
Hooboy! I don’t know about you, but I do not want to create all of those resources! 🤯
(And I mean, I have created all of those resources, and then some. I currently have 56 hardcoded datasets and 72 hardcoded pipelines in my demo environment, because I have demos of everything. And I don’t know about you, but I never want to create all of those resources again! 😂)
So! What can we do instead?
Dynamic Solutions
We can build dynamic solutions!
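As a teaser, here's a rough sketch of what a parameterized dataset definition can look like under the hood, in JSON. The names here (LegoFileDynamic, the lego file system, the AzureDataLakeStorage linked service) are made up for illustration; the interesting parts are the FileName parameter and the @dataset().FileName expression that replaces the hardcoded file name:

```json
{
    "name": "LegoFileDynamic",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureDataLakeStorage",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "FileName": { "type": "string" }
        },
        "type": "DelimitedText",
        "typeProperties": {
            "location": {
                "type": "AzureBlobFSLocation",
                "fileSystem": "lego",
                "fileName": {
                    "value": "@dataset().FileName",
                    "type": "Expression"
                }
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        }
    }
}
```

With one dataset like this, a single pipeline can handle all nine files by passing in a different file name each time, instead of us creating nine hardcoded copies of everything. We'll build this out step by step in the rest of the series.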