Azure Data Factory CI/CD with DevOps Pipelines

Continuous Integration and Deployment (CI/CD) for Azure Data Factory allows you to keep the state of your Data Factories consistent across environments. It minimizes the dependency on manual changes and enables a stable architecture.

A few months ago, I found myself troubleshooting an issue in our PROD Data Factory that was not present in our DEV Data Factory.

I had manually applied the changes from DEV to PROD and was 100% sure that the two were identical. Nevertheless, each day the pipeline in PROD would take a few minutes longer to run. Within a week, that additional time had accumulated to an extra hour.

After taking some time to troubleshoot, I realized that the PROD pipeline was missing a pre-copy script in a copy activity. Without our knowledge, a table was growing daily, which had a waterfall effect on the loading of other tables and steadily increased the pipeline's runtime.

If I had had some form of Continuous Integration and Deployment in place, I would never have experienced this issue.

I will be walking you through how to set up an Azure DevOps repository for your Data Factory and utilize Azure DevOps pipelines to push changes in DEV to your other environments. We will create all the necessary resources required for three environments: DEV, UAT, and PROD. These environments will be accompanied by a basic DevOps pipeline.

Prerequisites for Azure and DevOps CI/CD

There are several things to consider before starting. You will need access to an Azure Subscription with the ability to create resource groups as well as resources with the "Owner" role assignment. Without "Owner" privileges, you will not be able to create the service principal that gives DevOps access to the Data Factories within your Resource Groups.

It is also beneficial to have some exposure to creating basic Azure Data Factory pipelines.

Step 1: Setting up the Azure environment

We will start by creating three resource groups as well as three data factories. Each pair will represent one of the three environments. This can be done through the Azure Portal. Keep in mind, Resource Group names only have to be unique within your subscription, whereas Data Factory names must be unique across all of Azure.

For this walkthrough, we will use the following naming scheme for our resources:

<Your Initials>-<Project>-<Environment>-<Resource>

Microsoft provides some great documentation on naming conventions. I highly recommend having a look at it.

*Important* If you are already familiar with the Azure portal, you can skip Step 1 entirely by running a PowerShell script in this GitHub Repository. Make sure to adjust the variables within the script to match your environment.
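
For reference, here is a minimal sketch of what such a setup script might look like using the Az PowerShell module. The initials and location below are placeholders; adjust them to your own naming scheme:

# Requires the Az module (Install-Module Az) and a signed-in session
Connect-AzAccount

$initials = "abc"            # your initials, plus a random number if needed
$location = "Canada Central" # pick the region closest to you

foreach ($stage in @("dev", "uat", "prod")) {
    # One Resource Group per environment
    $rg = "$initials-warehouse-$stage-rg"
    New-AzResourceGroup -Name $rg -Location $location

    # One V2 Data Factory inside each Resource Group
    Set-AzDataFactoryV2 -ResourceGroupName $rg `
        -Name "$initials-warehouse-$stage-df" `
        -Location $location
}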

1.1 Creating Resource Groups

With the naming scheme in mind, we will create three Resource Groups with the following names (do not forget to use your initials):

"<Initials>-warehouse-dev-rg"

"<Initials>-warehouse-uat-rg"

"<Initials>-warehouse-prod-rg"

1.1.1 Creating the DEV Resource Group

Once logged into Azure, at the top of the home page click on "Create a resource"

create a resource

Using the search bar, search for "Resource Group" and select "Resource Group" from the search results.

Once in the "Resource Group" resource information page, click "Create".

On the "Create a resource group" page there will be three fields that need to be filled out:

  • Subscription – Choose your subscription from the list
  • Resource Group – "<Initials>-warehouse-dev-rg"
  • Region – Select the Region that is most appropriate to your current location.

The result should look something like this:

create a resource group

Click on "Review + Create".

You should see a green bar at the top of the page that says, "Validation passed".

Click "Create" at the bottom of the page.

1.1.2 Creating the UAT and PROD Resource Groups

Now that we have created our first resource group, follow the same steps to create the UAT and PROD Resource Groups:

"<Initials>-warehouse-uat-rg"

"<Initials>-warehouse-prod-rg"

Once you have created all three Resource Groups, you should see them in your Azure portal:

resource groups in azure

1.2 Creating Azure Data Factories

Now we will start creating the necessary Data Factories in each respective Resource Group that we created.

With the naming scheme in mind, we will create three Data Factories with the following names (do not forget to use your initials):

"<Initials>-warehouse-dev-df"

"<Initials>-warehouse-uat-df"

"<Initials>-warehouse-prod-df"

*Important* Since Azure Data Factory names must be unique across all of Azure, you might need to add a random number or two to the end of your initials to make the name unique. This will not cause any issues going forward.

1.2.1 Creating the DEV Data Factory

At the top of the Azure home page click on "Create a resource"

select the create a resource button

Using the search bar, search for "Data Factory" and select "Data Factory" from the search results.

Once in the "Data Factory" resource information page, click "Create".

On the "Create Data Factory" page there will be five fields that need to be filled out:

  • Subscription – Choose your subscription from the list
  • Resource Group – Select "<Initials>-warehouse-dev-rg" from the drop-down menu.
  • Region – Select the Region that is most appropriate to your current location.
  • Name – <Initials>-warehouse-dev-df
  • Version – V2

The result should look something like this:

create an azure data factory

At the bottom of the page click on "Next: Git configuration >"

We will set up the Git configuration later in this walkthrough. For now, make sure the "Configure Git later" checkbox is checked.

Click on "Review + Create".

You should see a green bar at the top of the page that says, "Validation passed".

Click "Create" at the bottom of the page.

1.2.2 Creating UAT and PROD Data Factories

Now that we have created our first Data Factory, follow the same steps to create the UAT and PROD Data Factories in their corresponding Resource Group:

"<Initials>-warehouse-uat-df"

"<Initials>-warehouse-prod-df"

Once completed, you will be able to see one Data Factory in each Resource Group. The environment of the Data Factory should be identical to that of the Resource Group.

Step 2: Setting up the DevOps Environment

2.1 Creating a DevOps organization

In this section, we will be creating an Organization and Repo that will contain our Azure Data Factory code and our pipelines.

Go to the Azure DevOps website and click on "Sign in to Azure DevOps" below the blue "Start for free" button.

azure devops sign up

Use the same credentials that were used to sign in to Azure.

You will be taken to a page confirming the directory. DO NOT CLICK CONTINUE. Follow the steps below based on the type of account you are currently using.

2.1.1 Personal Account

If you are using a personal account for both Azure and DevOps you will need to change your directory when logging in. This will allow you to connect Azure Services to DevOps and vice versa.

Once logged in, you will see the following screen:

get started with azure devops

Click on "Switch Directory" next to your e-mail address and make sure "Default Directory" is selected. The directory name might be different if you have made changes to your Azure Active Directory.

default directory with azure devops

Click โ€œContinueโ€.

2.1.2 Organizational Account

If you are using an Organizational account you will already be associated with a directory.

Click "Continue"

2.2 Creating Your Project

You should currently be on the "Create a project to get started" screen. You will also notice the organization name that was automatically created for you in the top left-hand corner.

Before we create our project, let's first check that the DevOps organization is indeed connected to the correct Azure Active Directory.

At the bottom-left of the "Create a project to get started" page, click on "Organization Settings"

In the left pane select "Azure Active Directory" and make sure that this is the same tenant that was used when your Azure services were created.

azure active directory

Keep in mind your Directory name might be different compared to what is shown in the screenshot.

This is not required, but now is the best time to change the organization name within DevOps, as doing so later on could cause issues.

While staying on the "Organization Settings" page, click on "Overview" in the left pane.

Change the name of the organization and click "Save".

azure organization settings

Go ahead and click on the Azure DevOps button in the top left-hand corner.

azure devops

We can now start creating our project within DevOps. Our project will be named "Azure Data Factory".

Leave the visibility as "Private", select Git for "Version control", and Basic for "Work item process".

create a project window

Microsoft has documentation that goes over the different types of version controls and work item processes.

Click on "Create project"

We have now created an organization in DevOps as well as a project that will contain our repository for Azure Data Factory. You should be loaded into your project welcome page after creating it.

azure data factory UI

If this is your first time using Azure DevOps, take the next few minutes to explore the options within the project. Our focus will be on the "Repos" and "Pipelines" services visible in the left menu.

Our next step is to connect our DEV Data Factory to the repository we have just created. This will be the only Data Factory that will be added to the repository. It will be the DevOps release pipeline's duty to push that code into UAT and PROD.

In the Azure portal go to your DEV Resource Group and click on your DEV Data Factory, then click "Author & Monitor".

On the DEV Data Factory home page, click on "Set up code repository"

azure data factory lets get started

There will be several options for getting the DEV Data Factory connected to the repository.

Once the "Configure a repository" pane opens, select the appropriate values in the settings:

  • Repository Type – Azure DevOps Git
  • Azure Active Directory – Select your Azure Active Directory from the list
  • Azure DevOps Account – Select the organization name that was created during the "Creating a DevOps organization" step.
  • Project Name – Azure Data Factory
  • Repository name – Select "Use existing", and from the drop-down select Azure Data Factory
  • Collaboration branch – master
  • Publish branch – adf_publish
  • Root folder – Leave it as the default "/"
  • Import existing resource – Since this Data Factory is new and contains nothing, we will not be selecting "Import existing resources to repository"

Your values for certain fields will be different, but this is what you should expect:

configure a repository window

Click โ€œApplyโ€.

Now, let us check to make sure that our Data Factory is indeed connected to our repository.

On the left-hand side click on the pencil icon ("Author"). If you are prompted to select a working branch, make sure "Use Existing" and "master" are selected and click Save at the bottom.

select a working branch window

Under "Factory Resources" click on the + sign and select "New Pipeline". Name it "Wait Pipeline 1" and insert a "Wait" activity into the pipeline canvas. Since a Wait activity does not require any inputs or outputs, we can continue without any validation issues.

data factory resources

At the top of the screen click on "Save all" and then "Publish". When the Pending changes pane comes up, click "OK"

Go to the Azure Data Factory project in DevOps and select "Repos". You will see that our recently created pipeline is now there.

azure data factory pipeline

Now is a good time to go over the two branches that are currently present in our repository. The master branch, which was created along with the repository, will contain each asset of our Data Factory, such as datasets, integrationRuntimes, linkedServices, pipelines, and triggers. Each item will have a .json file with its properties.

So, where did adf_publish come from? Well, it was created automatically when we first published our "Wait Pipeline 1". This branch will contain two extremely important files that will allow us to pass on what we develop in the DEV Data Factory to UAT and PROD.

ARMTemplateForFactory.json will contain all the assets within our Data Factory and all their properties.

ARMTemplateParametersForFactory.json will contain all the parameters used in any of the assets within our DEV Data Factory.
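
If you are curious about its contents, you can inspect the parameters file from a local clone of the adf_publish branch. A quick sketch using PowerShell (the folder name below is a placeholder for your DEV Data Factory folder):

# List the parameter names exposed by the publish branch
# (run from a local clone of adf_publish; adjust the folder name)
$paramsFile = ".\<Initials>-warehouse-dev-df\ARMTemplateParametersForFactory.json"
$params = Get-Content $paramsFile -Raw | ConvertFrom-Json
$params.parameters.PSObject.Properties.Name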

Now that our resources are created and connected, we can go ahead and create our Pipeline. Our pipeline will essentially grab the contents of the adf_publish branch in their current state and make them available for further releases into UAT and PROD.


Step 3: Creating an Azure DevOps Pipeline

The main responsibility of our pipeline will be to grab the contents of the adf_publish branch. This branch will contain the most recent version of our DEV Data Factory which we can later use to deploy to our UAT and PROD Data Factories.

3.1 Pre- and post-deployment PowerShell Script

There is an extremely important "pre- and post-deployment" PowerShell script that needs to be run during each release. Without this script, if you were to delete a pipeline in your DEV Data Factory and deploy the current state of DEV into UAT and PROD, that deletion would NOT happen in UAT and PROD.

You can find this script in the Azure Data Factory documentation at the bottom of the page. Copy it over to your favorite code editor and save it as a PowerShell file. I called mine "pre- and post-deployment".
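
To give you an idea of what the script is responsible for, here is a heavily simplified sketch of its pre-deployment behavior. The real script in the Microsoft documentation is the source of truth; this illustration only shows the trigger handling:

# Simplified illustration only - not a replacement for the documented script
param(
    [string] $armTemplate,
    [string] $ResourceGroupName,
    [string] $DataFactoryName,
    [bool] $predeployment = $true,
    [bool] $deleteDeployment = $false
)

if ($predeployment) {
    # Stop all running triggers so the ARM deployment can update the factory safely
    Get-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName |
        Where-Object { $_.RuntimeState -eq "Started" } |
        ForEach-Object {
            Stop-AzDataFactoryV2Trigger -ResourceGroupName $ResourceGroupName `
                -DataFactoryName $DataFactoryName -Name $_.Name -Force
        }
}
else {
    # Post-deployment, the real script compares the ARM template against the live
    # factory, deletes resources that no longer exist in the template (when
    # -deleteDeployment is $true), and restarts the triggers.
}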

Go to Azure DevOps and click on "Repos" on the left-hand side. Make sure you are in the adf_publish branch and click on the ellipsis while hovering over your DEV Data Factory folder.

Click "Upload file(s)", upload the file from the location where it was saved, and click "Commit".

azure devops pipeline

You should now see the "pre- and post-deployment.ps1" script in the adf_publish branch.

azure data factory files

3.2 Creating the Azure Pipeline for CI/CD

Within the DevOps page, on the left-hand side click on "Pipelines" and select "Create Pipeline"

create your first pipeline for CI/CD

On the next page select "Use the classic editor". We will use the classic editor as it allows us to visually see the steps that take place. Once everything is completed, you can have a look at the YAML code that was generated.

use the classic editor

We will be connecting this pipeline to the adf_publish branch as it will contain the ARMTemplateForFactory.json and ARMTemplateParametersForFactory.json files.

On the "Select a source" pane, make sure you have the following selected:

select azure repo git button

Click "Continue"

At the top of "Select a template" click on "Empty job"

select a template and empty job

The following task will grab the contents of the DEV Data Factory folder under the adf_publish branch that is in Repos > Files.

To the right of "Agent job 1" click on the + sign, search for "Publish build artifacts", and click "Add"

add tasks to azure pipeline

Once you click on the "Publish Artifact: drop" task underneath Agent job 1, you will see additional settings in the pane on the right.

We will only be changing the "Path to Publish".

Click on the ellipsis button to the right of "Path to Publish" and select the <Initials>-warehouse-dev-df folder:

select a path for the pipeline

Click "OK"

Your task properties should look like this:

publish artifact window

At the top of the pipeline, click on the "Triggers" tab. Make sure you have "Enable continuous integration" checked and adf_publish set under "Branch specification"

enable continuous integration

With this setting enabled, anytime a user publishes new assets in the DEV Data Factory, this pipeline will run and grab the most recent state of that Data Factory.

Click "Save & queue" at the top and then "Save and run"

You can click on "Agent job 1" to get a more in-depth look into what is going on in the background.

jobs in pipelines

The job should have completed successfully.

On the left-hand side, click on "Pipelines"

You will be able to see the pipeline that we created as well as its state. You will also notice that the most recent change to adf_publish was when we added the PowerShell script.

pipeline in azure data factory

Step 4: Creating a DevOps Release Pipeline for Continuous Integration And Deployment

The release pipeline will utilize the output of the pipeline we created above and pass the contents of that artifact on to our UAT and PROD Data Factories. Keep in mind, we have only created the Wait Pipeline in the DEV Data Factory.

4.1 Creating the Release Pipeline

On the DevOps page, click on "Pipelines" and then on "Releases"

Click on "New pipeline"

create a new release pipeline

When the "Select a template" pane comes up, click on "Empty job"

Rename the "Stage name" to "UAT"

stage window

Click โ€œSaveโ€ In the top right-hand corner. Select โ€œOKโ€ when the Save pop-up shows up.

4.2 Adding the Artifact

Now that we have the UAT stage created, we will add the artifact, which is the output of the pipeline we created earlier. This artifact will contain the necessary configurations that will be passed through to each stage.

Within the release pipeline, click on "Add an artifact"

In the "Add an artifact" pane, make sure the following are selected:

add an artifact window

Click "Add"

Our Release Pipeline should look like this:

view of the release pipeline

4.3 Building the Release Pipeline UAT Stage

Now we can start adding tasks to our UAT stage within our "New release pipeline".

At the top of our Release Pipeline click on "Variables". We will be adding ResourceGroup, DataFactory, and Location variables so that they can be passed through automatically to different tasks.

Click on the "Add" button and create the following pipeline variables:

view of the pipeline variables
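
Based on our naming scheme, the UAT-scoped values would presumably be:

  • ResourceGroup – <Initials>-warehouse-uat-rg
  • DataFactory – <Initials>-warehouse-uat-df
  • Location – the region your Resource Groups and Data Factories were created in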

*Important* Make sure your location variable contains the same location as your Resource Groups and Data Factories.

Note that these variables are currently scoped to the UAT stage and will not work in any other stage.

Make sure to save the changes and go to the pipeline view.

Click "1 job, 0 task" on the UAT box, then click on the + to the right of "Agent job".

Search for "Azure PowerShell" and add it from the list.

add tasks view

We will give this task a display name of "Azure PowerShell script: Pre-Deployment".

When it comes to selecting the Azure Subscription, we will create a Service Principal within the UAT Resource Group. This Service Principal will be added to the PROD Resource Group later.

From the Azure Subscription drop-down list, select your subscription. We want to limit what this service principal has access to, so instead of just clicking Authorize, click on the down arrow, and select "Advanced options".

azure powershell window

In the pop-up window, we should already be in the "Service Principal Authentication" tab.

Make sure that the "Resource Group" option has the UAT Resource Group selected. We can leave all the other options as they are.

azure resource manager window

Click "OK"

Keep the "Script Type" as "Script File Path".

Click on the ellipsis to the right of "Script Path" and select the PowerShell script we uploaded earlier. It will be in the Artifact drop.

select a file window

Click "OK"

We will be using the "Script Arguments" provided in the Azure Data Factory documentation. For Pre-Deployment, we will use the following arguments:

-armTemplate "$(System.DefaultWorkingDirectory)/<your-arm-template-location>" 
-ResourceGroupName <your-resource-group-name> -DataFactoryName <your-data-factory-name> -predeployment $true -deleteDeployment $false

There are several adjustments we must make to this:

  • ARM template location, which will be within the artifact drop
  • Resource Group name for our UAT environment
  • Data Factory name within our UAT Resource Group

The easiest way to find the location of your ARM template is to click on the ellipsis to the right of "Script Path". Select "ARMTemplateForFactory.json" and copy the location portion of that window. Click "Cancel", as we do not want to select the template.

select ARMTemplateForFactory.json

Replace <your-arm-template-location> with the location that we copied.

For the Resource Group and Data Factory, we will utilize the variables that we created earlier.

Replace <your-resource-group-name> with $(ResourceGroup) and <your-data-factory-name> with $(DataFactory)

The arguments should look like this:

-armTemplate "$(System.DefaultWorkingDirectory)/_Azure Data Factory-CI/drop/ARMTemplateForFactory.json" -ResourceGroupName $(ResourceGroup)
-DataFactoryName $(DataFactory) -predeployment $true -deleteDeployment $false

Select "Latest installed version" for the "Azure PowerShell Version"

azure powershell window

Click "Save"

Let us add another task. Click on the + button, search for "ARM Template", and add "ARM template deployment" from the list.

ARM template deployment add task

Change the "Display Name" to "ARM Template deployment: Data Factory"

Select the Service Principal that we created earlier under "Azure Resource Manager connection"

azure resource manager connection select microsoft partner credits

Adjust the rest of the options as follows:

  • Subscription – Select your subscription
  • Action – Create or update resource group
  • Resource group – We will use the $(ResourceGroup) variable we created. Manually type in "$(ResourceGroup)". This will recall the correct value when the pipeline runs.
  • Location – Manually type in "$(Location)"
  • Template location – Linked Artifact
  • Template – Using the ellipsis, select the ARMTemplateForFactory.json
select ARMTemplateForFactory.json
  • Template parameters – Using the ellipsis, select the ARMTemplateParametersForFactory.json
select ARMTemplateParametersForFactory.json
  • Override template parameters – Click on the ellipsis and change the "factoryName" parameter to "$(DataFactory)"
override template parameters
  • Deployment mode – Make sure this is set to Incremental, as Complete will delete any resources that are not present in our ARM template.

The ARM template task should look something like this:

ARM template deployment
Template

Click "Save"
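
As an aside, everything this task does can be reproduced with a single Az cmdlet. A hedged sketch, assuming you have downloaded the drop artifact locally (paths and names are placeholders):

# Local equivalent of the "ARM Template deployment: Data Factory" task
New-AzResourceGroupDeployment `
    -ResourceGroupName "<Initials>-warehouse-uat-rg" `
    -TemplateFile ".\drop\ARMTemplateForFactory.json" `
    -TemplateParameterFile ".\drop\ARMTemplateParametersForFactory.json" `
    -factoryName "<Initials>-warehouse-uat-df" `
    -Mode Incremental

The -factoryName argument is the same override we set in the task; New-AzResourceGroupDeployment exposes template parameters as dynamic cmdlet parameters.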

Let us add another Azure PowerShell task and call it "Azure PowerShell script: Post-Deployment"

The only thing that will differ compared to our initial PowerShell task is the Script Arguments. Our Pre-Deployment arguments contained "-predeployment $true" and "-deleteDeployment $false", but for the Post-Deployment script, we will change these to "-predeployment $false" and "-deleteDeployment $true"

-armTemplate "$(System.DefaultWorkingDirectory)/_Azure Data Factory-CI/drop/ARMTemplateForFactory.json" -ResourceGroupName $(ResourceGroup) 
-DataFactoryName $(DataFactory) -predeployment $false -deleteDeployment $true

The UAT Stage should now look like this:

UAT Stage view in Azure PowerShell

Click "Save"

4.4 Adding our Service Principal to the PROD Resource Group

Now that our UAT stage is complete, let us create the PROD stage. During the development of the UAT stage, we created a service principal with access to our UAT Resource Group so that DevOps could make the intended changes to our UAT Data Factory. We must now give this service principal the same permissions in our PROD Resource Group before we begin creating the PROD stage within our Release Pipeline.

Go to your Azure portal, navigate to your UAT Resource Group, and select "Access control (IAM)"

Click on "View" under "View access to this resource"

resource group UI

You will notice that there is an entity under the Contributor role whose name contains your DevOps organization name as well as the project name. Copy as much of the name as you can:

Access Control

Now go to your PROD Resource Group and navigate to "Access control (IAM)"

Click on "View" under "View access to this resource"

You will see that the service principal is not present.

At the top of the window, click on the "+ Add" button and select "Add role assignment"

Select the Contributor role from the drop-down and leave "Assign access to" as "User, group, or service principal"

Paste the recently copied service principal name into the "Select" search bar and select the service principal.

add role assignment

Click "Save"

You should now see the service principal in your PROD Resource Group:

Access control view
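
If you prefer to script this step, the same role assignment can be granted with the Az module. A hedged sketch, assuming the service principal can be found by the display name you copied (the filter below is a placeholder):

# Find the service principal behind the DevOps service connection
# (its display name contains the organization and project names)
$sp = Get-AzADServicePrincipal -DisplayNameBeginsWith "<organization>-Azure Data Factory"

# Grant it Contributor on the PROD Resource Group
New-AzRoleAssignment -ObjectId $sp.Id -RoleDefinitionName "Contributor" `
    -ResourceGroupName "<Initials>-warehouse-prod-rg"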

Now that we have completed this step, we can go back to DevOps and complete the PROD stage within our Release Pipeline.

4.5 Building the Release Pipeline PROD Stage

Navigate back to our "New release pipeline" and hover over the UAT stage. Once visible, click on "Clone"

UAT Stage view, click on clone

Rename "Copy of UAT" to "PROD". Your Release Pipeline should look like this:

artifacts and stages view

Since we created UAT with the use of variables, we can easily have the PROD stage function correctly without a lot of effort.

Let us have a look at the pipeline variables.

pipeline variables view

We can see that the ResourceGroup and DataFactory variables were duplicated but are under the PROD scope. We must change their values to reflect our PROD resources as follows:

pipelines variables view
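
With our naming scheme, the PROD-scoped values would presumably be:

  • ResourceGroup – <Initials>-warehouse-prod-rg
  • DataFactory – <Initials>-warehouse-prod-df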

Click "Save"

We initially set up our Azure Data Factory-CI pipeline to run whenever there was a change in adf_publish.

We can now adjust how the Release Pipeline runs. I want a new release to occur every time the Azure Data Factory-CI pipeline changes, essentially whenever there is a change in adf_publish. Afterwards, I want that change to apply automatically to our UAT Data Factory. When it comes to the PROD Data Factory, I will require a Pre-Deployment approval before the changes in DEV and UAT are present in PROD.

Within our "New release pipeline" click on the "Continuous deployment trigger" button

artifacts, select continuous deployment trigger

Enable the "Continuous deployment trigger"

continuous deployment trigger enabled

Next, we will adjust the pre-deployment conditions of the UAT stage so that it runs automatically.

Click on the "Pre-deployment conditions" button on the UAT stage.

pre-deployment conditions for UAT stage

Make sure the "After release" option is selected

Click on the "Pre-deployment conditions" for the PROD stage and set the trigger to "After stage", with the stage set to UAT

triggers select after stage

We will also enable "Pre-deployment approvals". Enter your e-mail address, as we will be the ones approving this trigger.

Click "Save"

Now that we have completed the crucial parts of our Release Pipeline, we can test it. We will manually initiate the Azure Data Factory-CI pipeline, which will trigger our Release Pipeline.

Keep in mind, we only have the "Wait Pipeline 1" in our DEV Data Factory. Our UAT and PROD Data Factories are currently empty.

On the left-hand side, go to Pipelines and select the Azure Data Factory-CI.

Click on "Run pipeline" in the top left-hand corner. Click "Run" once more.

On the left-hand side of the screen, navigate to "Releases". You should now be able to see our first release. We can also tell that the UAT stage of the Release Pipeline is currently running.

new release pipeline view

Our Release Pipeline run should now show that the UAT stage completed successfully and that the PROD stage is ready for approval.

new release pipeline view with PROD pending approval

Go to your UAT Data Factory and see if the "Wait Pipeline 1" is present. If everything ran correctly, you will be able to see it.

factory resources
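
You can also confirm this without opening the UI. A quick sketch using the Az module (names follow our naming scheme; adjust them to yours):

# List the pipelines in the UAT Data Factory to confirm the deployment landed
Get-AzDataFactoryV2Pipeline -ResourceGroupName "<Initials>-warehouse-uat-rg" `
    -DataFactoryName "<Initials>-warehouse-uat-df" | Select-Object Name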

Now we can approve the PROD stage and have the "Wait Pipeline 1" deploy to our PROD Data Factory.

Navigate to our "New release pipeline" and click on the PROD stage. Once the approval pane comes up, click on "Approve"

PROD stage approval form

We can now see that the PROD stage is in progress.

new release pipeline view

Feel free to click on the PROD stage as you will see all the tasks and their progression:

agent job view

Upon checking our PROD Data Factory, we can see that the "Wait Pipeline 1" is present.

factory resources

4.6 Running the Release Pipeline

Now, we will make some changes in our DEV Data Factory and see how the entire process works from start to finish.

Navigate to your DEV Data Factory and do the following:

  • Create a pipeline called "Wait Pipeline 2"
    • Insert a Wait activity called "Wait 2"
  • Delete the "Wait Pipeline 1" pipeline

The result should look like this:

factory resources view

Click "Save all" and "Publish" all the changes.

When we navigate to "Pipelines" in DevOps, we can see that the Azure Data Factory-CI pipeline has already been triggered because adf_publish has just changed.

azure devops pipeline view

You will also notice that in "Releases", Release-2 has been created and the UAT stage is currently deploying.

new release pipeline view with release-1 and release-2

From the following image, we can tell that our UAT Data Factory is on Release-2, as the stage has a green background. Our PROD Data Factory is currently on Release-1.

new release pipeline view

Approve the PROD Stage and let us check the state of the UAT and PROD Data Factories.

Your UAT Data Factory should have the same changes that we had made in DEV:

factory resource activities view

Your PROD Data Factory should have the same changes that we had made in DEV:

factory activities view

We have now confirmed that all our resources are connected properly and that our Pipeline and Release Pipeline run correctly. Any future development in the DEV Data Factory can be replicated in UAT and PROD. You will be able to avoid having small differences between your environments and save time troubleshooting issues that should not have been there in the first place.

I highly recommend going through each task we created and viewing the YAML code that was generated. This will give you a good idea as to how it should be structured and what it should contain.
