Harini Mallawaarachchi

L22-Automate an Azure Databricks Notebook with Azure Data Factory


You can use notebooks in Azure Databricks to perform data engineering tasks, such as processing data files and loading data into tables. When you need to orchestrate these tasks as part of a data engineering pipeline, you can use Azure Data Factory.
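For context, the kind of notebook cell that Data Factory would orchestrate might look like the following PySpark sketch. The file path and table name are placeholders rather than values from the lab.

# Minimal PySpark sketch of a notebook cell that processes a data file
# and loads it into a table. In a Databricks notebook, `spark` is the
# SparkSession the workspace provides automatically.
df = (spark.read
      .option("header", "true")
      .csv("dbfs:/mnt/data/products.csv"))   # read a raw data file

clean_df = df.dropna()                        # simple cleanup step

clean_df.write.mode("overwrite").saveAsTable("products")  # load into a table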



Before you start

You'll need an Azure subscription in which you have administrative-level access.

Review the What is Azure Data Factory? article in the Azure Data Factory documentation.



Import a notebook
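In the exercise this is done interactively in the Azure Databricks portal. If you prefer to script the import, a rough sketch using the Databricks Workspace REST API looks like this; the workspace URL, token, and file paths are placeholders.

import base64
import requests

WORKSPACE_URL = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
ACCESS_TOKEN = "<databricks-personal-access-token>"                    # placeholder

# Read the notebook source and base64-encode it, as the import API expects.
with open("Process-Data.py", "rb") as f:
    content = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    f"{WORKSPACE_URL}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        "path": "/Users/me@example.com/Process-Data",  # target path in the workspace
        "format": "SOURCE",
        "language": "PYTHON",
        "overwrite": True,
        "content": content,
    },
)
response.raise_for_status()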

Enable Azure Databricks integration with Azure Data Factory
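This step amounts to letting Data Factory authenticate to the Databricks workspace: you generate a Databricks access token and register an Azure Databricks linked service in the data factory. The lab does this in the Azure Data Factory Studio UI; as a rough scripted equivalent, here is a sketch using the azure-mgmt-datafactory Python SDK, where every name, ID, and token is a placeholder.

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureDatabricksLinkedService,
    LinkedServiceResource,
    SecureString,
)

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Linked service that tells Data Factory how to reach the Databricks workspace.
databricks_ls = LinkedServiceResource(
    properties=AzureDatabricksLinkedService(
        domain="https://adb-1234567890123456.7.azuredatabricks.net",
        access_token=SecureString(value="<databricks-personal-access-token>"),
        # Spin up a small job cluster for each pipeline run.
        new_cluster_version="13.3.x-scala2.12",
        new_cluster_node_type="Standard_DS3_v2",
        new_cluster_num_of_worker="1",
    )
)

adf_client.linked_services.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "AzureDatabricksLinkedService", databricks_ls
)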

Use a pipeline to run the Azure Databricks notebook
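In Azure Data Factory, a Databricks Notebook activity inside a pipeline points at the notebook path and the linked service created in the previous step; running the pipeline then runs the notebook on Databricks. A sketch of the same idea with the azure-mgmt-datafactory SDK follows, again with placeholder names and assuming the linked service above already exists.

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    DatabricksNotebookActivity,
    LinkedServiceReference,
    PipelineResource,
)

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Pipeline with a single activity that runs the imported notebook.
notebook_activity = DatabricksNotebookActivity(
    name="RunProcessDataNotebook",
    notebook_path="/Users/me@example.com/Process-Data",
    linked_service_name=LinkedServiceReference(
        type="LinkedServiceReference",
        reference_name="AzureDatabricksLinkedService",
    ),
)

adf_client.pipelines.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "ProcessDataPipeline",
    PipelineResource(activities=[notebook_activity]),
)

# Trigger the pipeline and check the run's status.
run = adf_client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, "ProcessDataPipeline")
print(adf_client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status)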



