airflow.providers.microsoft.azure.transfers.oracle_to_azure_data_lake

Module Contents

Classes

OracleToAzureDataLakeOperator

Runs the query against Oracle and stores the file locally before loading it into Azure Data Lake.

class airflow.providers.microsoft.azure.transfers.oracle_to_azure_data_lake.OracleToAzureDataLakeOperator(*, filename, azure_data_lake_conn_id, azure_data_lake_path, oracle_conn_id, sql, sql_params=None, delimiter=',', encoding='utf-8', quotechar='"', quoting=csv.QUOTE_MINIMAL, **kwargs)[source]

Bases: airflow.models.BaseOperator

Runs the query against Oracle and stores the file locally before loading it into Azure Data Lake.

Parameters
  • filename (str) – file name of the CSV file to write locally and upload.

  • azure_data_lake_conn_id (str) – connection ID for the destination Azure Data Lake.

  • azure_data_lake_path (str) – destination path in Azure Data Lake for the file.

  • oracle_conn_id (str) – connection ID for the source Oracle database.

  • sql (str) – SQL query to execute against the Oracle database. (templated)

  • sql_params (dict | None) – parameters to use in the SQL query. (templated)

  • delimiter (str) – field delimiter in the file.

  • encoding (str) – encoding type for the file.

  • quotechar (str) – character to use for quoting fields.

  • quoting (int) – quoting strategy. See the csv library for more information.
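The delimiter, quotechar, and quoting parameters above are passed straight to Python's csv module when the operator writes the query results to the local file before upload. A minimal sketch of that CSV-writing step, using the stdlib only (the field names and rows below are hypothetical stand-ins for what an Oracle cursor would return):

```python
import csv
import io

# Hypothetical cursor output: a header tuple followed by data rows.
field_names = ("id", "name", "note")
rows = [
    (1, "alpha", "plain"),
    (2, "beta", 'has "quotes", and a comma'),
]

# Write the rows the way the operator writes its local file:
# csv.writer configured with the operator's delimiter, quotechar,
# and quoting arguments (the defaults are shown here).
buf = io.StringIO()
writer = csv.writer(buf, delimiter=",", quotechar='"', quoting=csv.QUOTE_MINIMAL)
writer.writerow(field_names)
writer.writerows(rows)
print(buf.getvalue())
```

With QUOTE_MINIMAL, only fields containing the delimiter or quotechar are quoted, and embedded quote characters are doubled; plain fields are written unquoted.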

template_fields: Sequence[str] = ('filename', 'sql', 'sql_params')[source]
template_fields_renderers[source]
ui_color = '#e08c8c'[source]
execute(context)[source]

This is the main method to derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.
