DatabricksTaskOperator
Use the DatabricksTaskOperator to launch and monitor task runs on Databricks as Airflow tasks. It can be used either as a standalone operator in a DAG or as part of a Databricks Workflow, by nesting it as a task within a DatabricksWorkflowTaskGroup.
Examples
Running a notebook in Databricks using DatabricksTaskOperator
from airflow.providers.databricks.operators.databricks import DatabricksTaskOperator

task_operator_nb_1 = DatabricksTaskOperator(
    task_id="nb_1",
    databricks_conn_id="databricks_conn",
    # Reuse a named job cluster defined elsewhere in the workflow
    job_cluster_key="Shared_job_cluster",
    task_config={
        "notebook_task": {
            "notebook_path": "/Shared/Notebook_1",
            "source": "WORKSPACE",
        },
        # PyPI libraries installed on the cluster for this task
        "libraries": [
            {"pypi": {"package": "Faker"}},
            {"pypi": {"package": "simplejson"}},
        ],
    },
)
Running a SQL query in Databricks using DatabricksTaskOperator
task_operator_sql_query = DatabricksTaskOperator(
    task_id="sql_query",
    databricks_conn_id="databricks_conn",
    task_config={
        "sql_task": {
            # IDs of an existing Databricks SQL query and SQL warehouse
            "query": {
                "query_id": QUERY_ID,
            },
            "warehouse_id": WAREHOUSE_ID,
        }
    },
)