The Microsoft Fabric REST APIs allow you to automate Fabric procedures and processes. In this blog post, we’ll focus on one specific API: “Run On Demand Item Job”.

To dynamically call another data pipeline, we’ll utilise this API within a data pipeline web activity. By passing in the itemId of the data pipeline we want to invoke, we can trigger its execution. You can learn more here:

https://learn.microsoft.com/en-us/rest/api/fabric/core/job-scheduler/run-on-demand-item-job?tabs=HTTP

This API uses the POST method and accepts three parameters:

  • itemId
  • workspaceId
  • jobType

The constructed URL for invoking the API will be:

https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items/{itemId}/jobs/instances?jobType={jobType}
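As a rough sketch of what the web activity does under the hood, the same call can be made directly in Python. This assumes you have already obtained an OAuth access token for the service account; the helper names here are my own, not part of any Fabric SDK:

```python
import json
import urllib.request

BASE_URL = "https://api.fabric.microsoft.com/v1"

def run_item_job_url(workspace_id: str, item_id: str, job_type: str = "Pipeline") -> str:
    """Build the Run On Demand Item Job endpoint URL."""
    return (f"{BASE_URL}/workspaces/{workspace_id}"
            f"/items/{item_id}/jobs/instances?jobType={job_type}")

def run_pipeline(workspace_id: str, item_id: str, token: str):
    """POST an empty executionData payload to trigger the pipeline run."""
    req = urllib.request.Request(
        run_item_job_url(workspace_id, item_id),
        data=json.dumps({"executionData": {}}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    # A successful request is acknowledged asynchronously (202 Accepted),
    # with a Location header pointing at the new job instance.
    return urllib.request.urlopen(req)
```

Note the body mirrors the `{"executionData": {}}` payload used in the web activity later in this post.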

Note: As of the time of writing (14/05/2024), service principal support for Fabric APIs is inconsistent: some Fabric APIs work with service principals, while others do not. For this example, we’ll use a service account and authenticate using OAuth 2.0.

Implementation Steps:

Step 1: Create a Cloud Connection

  1. Connection Name: Fabric API
  2. Connection Type: Web V2
  3. Base URL: https://api.fabric.microsoft.com/v1
  4. Token Audience URI: https://api.fabric.microsoft.com
  5. Authentication: OAuth 2.0 (enter your service account credentials)

Note: The service account used must have at least the Viewer role in the workspace where the pipeline(s) you want to call reside.

Step 2: Create a new data pipeline in your Fabric environment called “Execute Pipeline Dynamically”.

  1. Add a new web activity
  2. Add a new string parameter to the data pipeline called “itemId”
  3. In the settings pane of the activity select the newly created connection “Fabric API”
  4. In the “Relative URL” option box add the following dynamic content (in Fabric, pipeline().DataFactory returns the current workspace ID): @concat('/workspaces/',pipeline().DataFactory,'/items/',pipeline().parameters.itemId,'/jobs/instances?jobType=Pipeline')
  5. In the “Method” option box select “POST”
  6. In the “Body” option box add the following JSON dynamic content: @concat('{"executionData": {}}')
  7. In the “Headers”, select “+ New” and add a header “Name” of “content-type” and a header “Value” of “application/json”

The finished activity within the pipeline should now look like this:

Step 3: Run the “Execute Pipeline Dynamically” pipeline

To do this, you’ll need the itemId of the pipeline you want to call. The itemId serves as the unique identifier for any item in your Fabric environment. It is also referred to as the objectId when viewing the JSON code of a pipeline.

Of course, the itemId could also be obtained by calling another API and subsequently passed into this activity.
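As a hedged sketch of that approach: the List Items API (GET /workspaces/{workspaceId}/items) returns each item’s id, displayName, and type, so the itemId can be resolved from a pipeline’s name. The helper functions below are my own, and for brevity this ignores pagination of large workspaces:

```python
import json
import urllib.request

def list_workspace_items(workspace_id: str, token: str) -> list[dict]:
    """Call the List Items API and return its 'value' array of items."""
    url = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items"
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]

def find_item_id(items: list[dict], display_name: str,
                 item_type: str = "DataPipeline"):
    """Return the id of the first item matching name and type, else None."""
    for item in items:
        if item.get("type") == item_type and item.get("displayName") == display_name:
            return item.get("id")
    return None
```

The returned id could then be passed straight into the itemId parameter of the “Execute Pipeline Dynamically” pipeline.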

Below, I have copied the itemId/objectId, pasted it into the itemId parameter of my pipeline, and clicked “OK”.

Finally, in the monitoring tab, you can now see that we have automatically called a “DimensionDate” pipeline dynamically from our “Execute Pipeline Dynamically” pipeline.

At Purple Frog, we have extended the capabilities of this API and leveraged other Fabric APIs to make our Fabric ETL framework completely metadata driven.

Of course, it would be nice if the Fabric team enabled dynamic content in the standard “Execute Pipeline” activity, but hopefully this offers a workaround in the meantime.

Hopefully this helps you further develop your Fabric ETL solutions.
