
L22-Automate an Azure Databricks Notebook with Azure Data Factory

Writer: Harini Mallawaarachchi

You can use notebooks in Azure Databricks to perform data engineering tasks, such as processing data files and loading data into tables. When you need to orchestrate these tasks as part of a data engineering pipeline, you can use Azure Data Factory.
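As a rough illustration of the kind of notebook such a pipeline might run, the sketch below reads a CSV file and loads it into a table. The widget name, file path, and table name are illustrative placeholders, not values from this lab, and dbutils and spark are provided by the Databricks notebook runtime.

```python
# Minimal sketch of a Databricks notebook cell that a pipeline could run.
# The "folder" widget, file path, and table name are illustrative assumptions.
dbutils.widgets.text("folder", "product_data")   # parameter a pipeline can supply
folder = dbutils.widgets.get("folder")

# Read a raw CSV file and load it into a table
df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv(f"/{folder}/products.csv"))

df.write.mode("overwrite").saveAsTable("products")
```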



Before you start

You'll need an Azure subscription in which you have administrative-level access.

Review the What is Azure Data Factory? article in the Azure Data Factory documentation.



Import a notebook
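Importing the notebook is normally done through the Databricks workspace UI, but as a sketch of what the step amounts to, the same import can be scripted against the Databricks Workspace Import REST API. The workspace URL, token, file name, and target path below are placeholders.

```python
# Sketch: import a notebook into a Databricks workspace via the Workspace API.
# DATABRICKS_HOST, DATABRICKS_TOKEN, and the paths are placeholder values.
import base64
import requests

DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
DATABRICKS_TOKEN = "<personal-access-token>"                            # placeholder

with open("notebook.ipynb", "rb") as f:
    content = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
    json={
        "path": "/Users/me@example.com/Process Data",  # placeholder workspace path
        "format": "JUPYTER",                           # the uploaded file is an .ipynb
        "content": content,
        "overwrite": True,
    },
)
resp.raise_for_status()
```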

Enable Azure Databricks integration with Azure Data Factory
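Enabling the integration comes down to giving Azure Data Factory an Azure Databricks linked service, authenticated with a Databricks access token. A rough sketch of creating such a linked service with the azure-mgmt-datafactory Python SDK follows; the names, IDs, token, and cluster settings are placeholders, and a recent SDK version (2.x or later) plus azure-identity is assumed.

```python
# Sketch: create an Azure Databricks linked service in a data factory.
# Assumes azure-mgmt-datafactory v2+ and azure-identity; all names/IDs are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureDatabricksLinkedService,
    LinkedServiceResource,
    SecureString,
)

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<data-factory-name>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

databricks_ls = AzureDatabricksLinkedService(
    domain="https://adb-1234567890123456.7.azuredatabricks.net",   # workspace URL
    access_token=SecureString(value="<databricks-access-token>"),  # token from Databricks
    new_cluster_version="13.3.x-scala2.12",     # job cluster created per pipeline run
    new_cluster_node_type="Standard_DS3_v2",
    new_cluster_num_of_worker="1",
)

adf_client.linked_services.create_or_update(
    resource_group, factory_name, "AzureDatabricksLinkedService",
    LinkedServiceResource(properties=databricks_ls),
)
```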

Use a pipeline to run the Azure Databricks notebook
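Continuing the same SDK sketch, a pipeline that runs the imported notebook uses a Databricks Notebook activity that references the linked service. Again, the names, paths, and parameter values are placeholders.

```python
# Sketch: define a pipeline with a Databricks Notebook activity and trigger a run.
# Continues from the client above; names, paths, and parameters are placeholders.
from azure.mgmt.datafactory.models import (
    DatabricksNotebookActivity,
    LinkedServiceReference,
    PipelineResource,
)

notebook_activity = DatabricksNotebookActivity(
    name="ProcessDataNotebook",
    notebook_path="/Users/me@example.com/Process Data",  # notebook imported earlier
    base_parameters={"folder": "product_data"},          # surfaced as notebook widgets
    linked_service_name=LinkedServiceReference(
        type="LinkedServiceReference",
        reference_name="AzureDatabricksLinkedService",
    ),
)

adf_client.pipelines.create_or_update(
    resource_group, factory_name, "Process Data with Databricks",
    PipelineResource(activities=[notebook_activity]),
)

run = adf_client.pipelines.create_run(
    resource_group, factory_name, "Process Data with Databricks", parameters={}
)
print(f"Pipeline run started: {run.run_id}")
```

From there, adf_client.pipeline_runs.get(resource_group, factory_name, run.run_id) can be polled to check the run status, which mirrors watching the pipeline on the Monitor page in Data Factory Studio.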





