airflow.providers.microsoft.azure.transfers.oracle_to_azure_data_lake

Module Contents

Classes

OracleToAzureDataLakeOperator

Moves data from Oracle to Azure Data Lake. The operator runs the query against Oracle and stores the file locally before loading it into Azure Data Lake.

class airflow.providers.microsoft.azure.transfers.oracle_to_azure_data_lake.OracleToAzureDataLakeOperator(*, filename, azure_data_lake_conn_id, azure_data_lake_path, oracle_conn_id, sql, sql_params=None, delimiter=',', encoding='utf-8', quotechar='"', quoting=csv.QUOTE_MINIMAL, **kwargs)[source]

Bases: airflow.models.BaseOperator

Moves data from Oracle to Azure Data Lake. The operator runs the query against Oracle and stores the file locally before loading it into Azure Data Lake.

Parameters
  • filename (str) -- name of the CSV file to create.

  • azure_data_lake_conn_id (str) -- destination Azure Data Lake connection.

  • azure_data_lake_path (str) -- destination path in Azure Data Lake where the file is uploaded.

  • oracle_conn_id (str) -- source Oracle connection.

  • sql (str) -- SQL query to execute against the Oracle database. (templated)

  • sql_params (Optional[dict]) -- parameters to use in the SQL query. (templated)

  • delimiter (str) -- field delimiter in the file.

  • encoding (str) -- encoding type for the file.

  • quotechar (str) -- character used to quote fields.

  • quoting (str) -- quoting strategy. See the csv module's quoting constants for more information.
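The delimiter, quotechar, and quoting parameters are passed through to the CSV writer when the query results are written to the local file. A standalone sketch of how the default QUOTE_MINIMAL strategy behaves, using only Python's csv module (no Airflow required):

```python
import csv
import io

# Rows shaped like what the operator might receive from an Oracle cursor.
rows = [("id", "note"), (1, "plain"), (2, "has, comma")]

buf = io.StringIO()
writer = csv.writer(buf, delimiter=",", quotechar='"', quoting=csv.QUOTE_MINIMAL)
writer.writerows(rows)

# QUOTE_MINIMAL quotes only fields that contain the delimiter,
# the quotechar, or a newline; all other fields are written bare.
print(buf.getvalue())
```

Passing csv.QUOTE_ALL instead would wrap every field in the quote character, which some downstream parsers require.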

template_fields :Sequence[str] = ['filename', 'sql', 'sql_params'][source]
template_fields_renderers[source]
ui_color = '#e08c8c'[source]
execute(self, context)[source]

This is the main method to derive when creating an operator. Context is the same dictionary used when rendering Jinja templates.

Refer to get_template_context for more context.
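A minimal usage sketch of the operator inside a DAG. The connection IDs, table, and Data Lake path are placeholders, and the snippet assumes the Microsoft Azure and Oracle provider packages are installed; filename, sql, and sql_params are templated, so Jinja macros such as {{ ds }} can be used:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.microsoft.azure.transfers.oracle_to_azure_data_lake import (
    OracleToAzureDataLakeOperator,
)

with DAG(
    dag_id="oracle_to_adl_example",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
) as dag:
    # Hypothetical connection IDs and destination path; replace with your own.
    extract = OracleToAzureDataLakeOperator(
        task_id="oracle_to_adl",
        filename="orders_{{ ds }}.csv",
        oracle_conn_id="oracle_default",
        azure_data_lake_conn_id="azure_data_lake_default",
        azure_data_lake_path="raw/orders",
        sql="SELECT * FROM orders WHERE order_date = :order_date",
        sql_params={"order_date": "{{ ds }}"},
    )
```

At runtime the operator executes the query against the Oracle connection, writes the results to a local CSV with the configured delimiter and quoting, then uploads that file to the given Azure Data Lake path.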
