airflow.providers.microsoft.azure.transfers.oracle_to_azure_data_lake

Module Contents

class airflow.providers.microsoft.azure.transfers.oracle_to_azure_data_lake.OracleToAzureDataLakeOperator(*, filename: str, azure_data_lake_conn_id: str, azure_data_lake_path: str, oracle_conn_id: str, sql: str, sql_params: Optional[dict] = None, delimiter: str = ',', encoding: str = 'utf-8', quotechar: str = '"', quoting: int = csv.QUOTE_MINIMAL, **kwargs)

Bases: airflow.models.BaseOperator

Moves data from Oracle to Azure Data Lake. The operator runs the given query against Oracle, writes the result to a local CSV file, and then uploads that file to the given Azure Data Lake path.

Parameters
  • filename (str) -- file name to use for the CSV file.

  • azure_data_lake_conn_id (str) -- destination Azure Data Lake connection.

  • azure_data_lake_path (str) -- destination path in Azure Data Lake where the file is placed.

  • oracle_conn_id (str) -- source Oracle connection.

  • sql (str) -- SQL query to execute against the Oracle database. (templated)

  • sql_params (Optional[dict]) -- parameters to use in the SQL query. (templated)

  • delimiter (str) -- field delimiter in the file.

  • encoding (str) -- encoding type for the file.

  • quotechar (str) -- character to use for quoting fields.

  • quoting (int) -- quoting strategy. See the unicodecsv quoting documentation for more information.

template_fields = ['filename', 'sql', 'sql_params']
ui_color = '#e08c8c'
_write_temp_file(self, cursor: Any, path_to_save: Union[str, bytes, int])
execute(self, context: dict)
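
Example

A minimal usage sketch of the operator in a DAG. The DAG id, task id, table name, and destination path are hypothetical placeholders, and the connection ids ("my_oracle_conn", "my_adl_conn") are assumed to already be configured in Airflow.

    import csv
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.microsoft.azure.transfers.oracle_to_azure_data_lake import (
        OracleToAzureDataLakeOperator,
    )

    with DAG(
        dag_id="oracle_to_adl_example",  # hypothetical DAG id
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        transfer = OracleToAzureDataLakeOperator(
            task_id="oracle_to_adl",
            filename="orders_{{ ds_nodash }}.csv",  # templated field
            oracle_conn_id="my_oracle_conn",  # placeholder connection id
            azure_data_lake_conn_id="my_adl_conn",  # placeholder connection id
            azure_data_lake_path="raw/orders",  # placeholder destination path
            sql="SELECT * FROM orders WHERE order_date = :order_date",  # hypothetical table
            sql_params={"order_date": "{{ ds }}"},  # templated bind parameter
            quoting=csv.QUOTE_MINIMAL,
        )

Because filename, sql, and sql_params are template fields, macros such as {{ ds }} and {{ ds_nodash }} are rendered at run time, so each run queries for its own execution date and produces a correspondingly named file.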
