May 15, 2025

We’re excited to announce that Azure Data Factory now supports the orchestration of Databricks Jobs!
Databricks Jobs let you schedule and orchestrate one or more tasks as a workflow in your Databricks workspace. Because any Databricks operation can be a task, you can now run anything in Databricks via ADF, such as serverless jobs, SQL tasks, Delta Live Tables, batch inferencing with model serving endpoints, or automatically publishing and refreshing semantic models in the Power BI service.
And with this new update, you’ll be able to trigger these workflows from your Azure Data Factory pipelines.
To use this new capability, you’ll find a new activity called Job under the Databricks activity group. Once you’ve added the Job activity (Preview) to your pipeline canvas, you can connect to your Databricks workspace and configure the settings to select the Databricks job you want to run from your pipeline.
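Under the hood, an ADF pipeline is defined in JSON, so the Job activity you configure on the canvas corresponds to an activity entry in the pipeline definition. The sketch below is illustrative only: the activity type name (`DatabricksJob`), the linked service name, the job ID, and the `typeProperties` field names are assumptions for this preview feature, not confirmed schema.

```json
{
  "name": "RunDatabricksJob",
  "type": "DatabricksJob",
  "linkedServiceName": {
    "referenceName": "AzureDatabricksLinkedService",
    "type": "LinkedServiceReference"
  },
  "typeProperties": {
    "jobId": "1234567890"
  }
}
```

In practice you would configure all of this through the pipeline canvas UI; the JSON is what gets stored in your factory’s definition and source control.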
We also know that parameterization is important, as it lets you build generic, reusable pipeline models.
ADF continues to provide support for these patterns and is excited to extend this capability to the new Databricks Job activity.
Under the Job activity’s settings, you’ll also be able to configure parameters to send to your Databricks job, giving you maximum flexibility and power for your orchestration jobs.
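For example, a pipeline parameter can be passed through to the Databricks job using ADF’s expression language (`@pipeline().parameters.<name>` is standard ADF syntax). In this sketch, the `jobParameters` property name and the parameter names are illustrative assumptions:

```json
"typeProperties": {
  "jobId": "1234567890",
  "jobParameters": {
    "run_date": "@pipeline().parameters.runDate",
    "environment": "prod"
  }
}
```

This pattern lets one pipeline definition drive the same Databricks job with different inputs per trigger or per environment.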
To learn more, read Azure Databricks activity – Microsoft Fabric | Microsoft Learn.
Have any questions or feedback? Leave a comment below!