L22-Automate an Azure Databricks Notebook with Azure Data Factory

Harini Mallawaarachchi

You can use notebooks in Azure Databricks to perform data engineering tasks, such as processing data files and loading data into tables. When you need to orchestrate these tasks as part of a data engineering pipeline, you can use Azure Data Factory.
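As a sketch of how the pieces fit together, Data Factory orchestrates a notebook through a Databricks Notebook activity inside a pipeline. The activity name, linked service name, notebook path, and parameter below are illustrative placeholders, not values from this lab:

```json
{
  "name": "RunProcessingNotebook",
  "type": "DatabricksNotebook",
  "linkedServiceName": {
    "referenceName": "AzureDatabricksLinkedService",
    "type": "LinkedServiceReference"
  },
  "typeProperties": {
    "notebookPath": "/Shared/process-data",
    "baseParameters": {
      "folder": "landing"
    }
  }
}
```

The `baseParameters` values are surfaced inside the notebook as widget parameters, which is how the pipeline passes runtime context to the data engineering logic.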



Before you start

You'll need an Azure subscription in which you have administrative-level access.

Review the What is Azure Data Factory? article in the Azure Data Factory documentation.



Import a notebook

Enable Azure Databricks integration with Azure Data Factory
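Integration is typically enabled by creating an Azure Databricks linked service in Data Factory, authenticated with a Databricks personal access token. A minimal sketch, assuming a new job cluster is spun up per run (the workspace URL, token placeholder, and cluster settings below are illustrative):

```json
{
  "name": "AzureDatabricksLinkedService",
  "properties": {
    "type": "AzureDatabricks",
    "typeProperties": {
      "domain": "https://adb-1234567890123456.7.azuredatabricks.net",
      "accessToken": {
        "type": "SecureString",
        "value": "<databricks-personal-access-token>"
      },
      "newClusterNodeType": "Standard_DS3_v2",
      "newClusterNumOfWorker": "1",
      "newClusterVersion": "13.3.x-scala2.12"
    }
  }
}
```

In practice the access token is better stored in Azure Key Vault and referenced from the linked service rather than embedded as a secure string.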

Use a pipeline to run the Azure Databricks notebook
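Once the pipeline is published, it can be triggered from the Data Factory portal, or as a sketch from the Azure CLI (this assumes the `datafactory` CLI extension is installed; the resource group, factory, and pipeline names below are placeholders):

```shell
# Start a run of the pipeline that wraps the Databricks notebook
az datafactory pipeline create-run \
  --resource-group my-resource-group \
  --factory-name my-data-factory \
  --name ProcessDataPipeline

# Check the status of a run using the run ID returned above
az datafactory pipeline-run show \
  --resource-group my-resource-group \
  --factory-name my-data-factory \
  --run-id <run-id>
```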



