
Creating an Azure Data Factory

This post is part 2 of 10 in the series Beginner's Guide to Azure Data Factory

In the introduction to Azure Data Factory, we learned a little bit about the history of Azure Data Factory and what you can use it for. In this post, we will be creating an Azure Data Factory and getting familiar with the user interface.

Spoiler alert! Creating an Azure Data Factory is a fairly quick click-click-click process, and you’re done. But! Before you can do that, you need an Azure Subscription, and the right permissions on that subscription. Let’s get that sorted out first.

Azure Subscription and Permissions

If you don’t already have an Azure Subscription, you can create a free account on azure.microsoft.com/free. (Woohoo! Free! Yay!) Some of the Azure services will always be free, while some are free for the first 12 months. You also get $200 worth of credits that last for 30 days, so you can test and explore the paid Azure services. One tip: Time your free account wisely :)

If you already have an Azure subscription, make sure that you have the permissions you need. To create an Azure Data Factory, you need to either:

  • Be a member of the Owner or Contributor role
  • Be a classic Service Administrator (but I totally recommend using the new roles above instead 👆🏻)

You can view your permissions in the Azure Portal:

  1. Click on your username in the top-right corner
  2. Click the more options ellipsis (…) next to “Switch directory”
  3. Select My permissions
Screenshot of the Azure Portal, highlighting the username in the top-right corner
Screenshot of the Azure Portal, showing the more options ellipsis after clicking on the username in the top-right corner
Screenshot of the Azure Portal, showing the My permissions link after clicking on the more options ellipsis
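If you prefer checking this from code instead of clicking through the portal, here’s a minimal sketch using the Python azure-identity and azure-mgmt-authorization packages. This is just an illustration (it assumes you have the packages installed and are signed in, and the subscription ID is a placeholder), not something you need for the rest of this post:

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

subscription_id = "<your-subscription-id>"  # placeholder, use your own

credential = DefaultAzureCredential()
client = AuthorizationManagementClient(credential, subscription_id)

# List the role assignments at the subscription scope.
# Each role_definition_id points to a role definition like Owner or Contributor.
scope = f"/subscriptions/{subscription_id}"
for assignment in client.role_assignments.list_for_scope(scope):
    print(assignment.principal_id, assignment.role_definition_id)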

Creating an Azure Data Factory

Alrighty! From this point, I assume that you have an Azure Subscription, and that you have the permissions you need on it. All set? Perfect! Let’s do this click-click-click thingy.

In the Azure Portal, click around until you get to the New Data Factory page. Look for “create a resource” or “Data Factories” and keep clicking until you get there. No, seriously, there are at least six different ways that I know of to get to that page. They also keep changing the portal, and I’m sure you hate outdated descriptions and screenshots as much as I do. You could also use this fancy direct link: New Data Factory. That might be easier :)

On the New Data Factory page, fill out the required information:

Screenshot of the New Data Factory page in the Azure Portal for creating an Azure Data Factory

The name of your Azure Data Factory has to be unique across all of Azure. That means that you may have to try a few options before finding something that hasn’t already been taken. I can tell you right away that “adf” is unavailable. So is “asdfasdf” :) I used cathrinew-adf.

Use version v2. (I’m still pretending v1 doesn’t exist.)

A resource group is a logical container for your resources. You can choose an existing one or create a new one. I created cathrinew-rg.

Choose the location closest to you. I chose West Europe.

Disable Git for now. We’ll get back to this :)

Click create, and wait for the deployment to finish. Once finished, click go to resource:

Screenshot of the deployment overview page in the Azure Portal, which includes a button for navigating to the deployed resource
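By the way, clicking in the portal is not the only way to do this. If you ever want to script the same steps, here’s a rough sketch using the Python azure-identity and azure-mgmt-datafactory packages. It’s just an illustration: it assumes the resource group already exists, the subscription ID is a placeholder, and the names and location are the example values from this post:

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import Factory

subscription_id = "<your-subscription-id>"  # placeholder, use your own

credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, subscription_id)

# Same settings as in the portal: globally unique name, resource group, and location.
factory = adf_client.factories.create_or_update(
    resource_group_name="cathrinew-rg",
    factory_name="cathrinew-adf",
    factory=Factory(location="westeurope"),
)
print(factory.provisioning_state)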

Navigating to Azure Data Factory

From the Azure Data Factory resource page, you can click on author & monitor to open the Azure Data Factory:

Screenshot of the Azure Data Factory resource page in the Azure Portal, which includes a button for navigating to the Azure Data Factory

This will open the Author page in Azure Data Factory. (More about this shortly.)

You can also access Azure Data Factory directly by navigating to adf.azure.com:

Screenshot of the Select Data Factory page

Here, you have to choose the Azure Data Factory you want to open. This will open the Home page in Azure Data Factory. (Again, more about this shortly.)

Navigating inside Azure Data Factory

Depending on how you open your Azure Data Factory, you will either open the Home page, or the Author page. There is also a third Monitor page. You can navigate between these three pages by using the left navigation menu. If you click the expand icon, you will also see a link to select another Azure Data Factory. This is a shortcut that will take you back to adf.azure.com:

Screenshot of the Home page inside Azure Data Factory, with the left navigation menu expanded and highlighted

Home Page

The Home page is your dashboard. From here, you can do some of the most common tasks, like creating a pipeline, creating a data flow, or copying data:

Screenshot of the entire Home page inside Azure Data Factory

You can also view Azure Data Factory videos, read Azure Data Factory tutorials, and find links to more resources. The videos highlight new features and announcements, so I recommend watching them when they get added.

Author Page

The Author page is your main development environment:

Screenshot of an empty Author page in Azure Data Factory

This is where you will be spending most of your time during development. On the left side, you will see all your factory resources. (Well, uh, after we create some factory resources, that is. We’ll get back to that soon!) You also have a search bar at the very top of this page. From there, you can search for all your factory resources.

Monitor Page

The Monitor page is where you can… uh… monitor stuff:

Screenshot of an empty Monitor page in Azure Data Factory

From here, you can view the overview dashboard, monitor your pipeline and trigger runs, and set up alerts. It’s completely empty right now, but we’ll get back to that as well in later posts!
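Just to give you an idea: once you do have pipeline runs, the same information the Monitor page shows can also be queried programmatically. Here’s a small sketch using the azure-mgmt-datafactory Python package that lists the pipeline runs from the last 24 hours (the factory and resource group names are the example values from this post, and the subscription ID is a placeholder):

from datetime import datetime, timedelta

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

subscription_id = "<your-subscription-id>"  # placeholder, use your own

credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, subscription_id)

# Query the pipeline runs from the last 24 hours - the same runs the
# Monitor page shows in its dashboard.
filter_params = RunFilterParameters(
    last_updated_after=datetime.utcnow() - timedelta(days=1),
    last_updated_before=datetime.utcnow(),
)
runs = adf_client.pipeline_runs.query_by_factory(
    resource_group_name="cathrinew-rg",
    factory_name="cathrinew-adf",
    filter_parameters=filter_params,
)
for run in runs.value:
    print(run.pipeline_name, run.status)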

Summary

In this post, we started by creating an Azure Data Factory. Then we navigated to it, and got familiar with the user interface and the three main Azure Data Factory pages.

In the next post, we will go through the Author page in more detail. Let’s look at the different Azure Data Factory components!

🤓

About the Author

Cathrine Wilhelmsen is a Microsoft Data Platform MVP, BimlHero Certified Expert, Microsoft Certified Solutions Expert, international speaker, author, blogger, and chronic volunteer who loves teaching and sharing knowledge. She works as a Senior Business Intelligence Consultant at Inmeta, focusing on Azure Data and the Microsoft Data Platform. She loves sci-fi, chocolate, coffee, craft beers, ciders, cat gifs and smilies :)
