airflow.contrib.operators.bigquery_to_bigquery¶
Module Contents¶
class airflow.contrib.operators.bigquery_to_bigquery.BigQueryToBigQueryOperator(source_project_dataset_tables, destination_project_dataset_table, write_disposition='WRITE_EMPTY', create_disposition='CREATE_IF_NEEDED', bigquery_conn_id='bigquery_default', delegate_to=None, labels=None, encryption_configuration=None, *args, **kwargs)[source]¶
Bases: airflow.models.BaseOperator

Copies data from one BigQuery table to another.

See also

For more details about these parameters: https://cloud.google.com/bigquery/docs/reference/v2/jobs#configuration.copy

Parameters
- source_project_dataset_tables (list|str) – One or more dotted (project:|project.)<dataset>.<table> BigQuery tables to use as the source data. If <project> is not included, the project defined in the connection JSON is used. Use a list if there are multiple source tables. (templated)
- destination_project_dataset_table (str) – The destination BigQuery table. Format is: (project:|project.)<dataset>.<table> (templated)
- write_disposition (str) – The write disposition if the destination table already exists (e.g. WRITE_EMPTY, WRITE_TRUNCATE, or WRITE_APPEND).
- create_disposition (str) – The create disposition if the destination table doesn't exist (e.g. CREATE_IF_NEEDED or CREATE_NEVER).
- bigquery_conn_id (str) – Reference to a specific BigQuery hook.
- delegate_to (str) – The account to impersonate, if any. For this to work, the service account making the request must have domain-wide delegation enabled. 
- labels (dict) – A dictionary containing labels for the job/query, passed to BigQuery.
- encryption_configuration (dict) – [Optional] Custom encryption configuration (e.g., Cloud KMS keys). Example:

    encryption_configuration = {
        "kmsKeyName": "projects/testp/locations/us/keyRings/test-kr/cryptoKeys/test-key"
    }
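Below is a minimal usage sketch of the operator inside a DAG. The DAG id, project, dataset, and table names, the labels value, and the schedule are illustrative placeholders rather than part of the documented API; the operator arguments themselves follow the signature above.

    from datetime import datetime

    from airflow import DAG
    from airflow.contrib.operators.bigquery_to_bigquery import BigQueryToBigQueryOperator

    # Hypothetical DAG and table names, used purely for illustration.
    with DAG(
        dag_id="example_bq_to_bq_copy",
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
    ) as dag:
        copy_tables = BigQueryToBigQueryOperator(
            task_id="copy_orders_tables",
            # Multiple dotted source tables may be passed as a list.
            source_project_dataset_tables=[
                "my-project.sales_dataset.orders_2020",
                "my-project.sales_dataset.orders_2021",
            ],
            destination_project_dataset_table="my-project.reporting_dataset.orders_all",
            # Overwrite the destination if it already holds data.
            write_disposition="WRITE_TRUNCATE",
            create_disposition="CREATE_IF_NEEDED",
            bigquery_conn_id="bigquery_default",
            labels={"team": "analytics"},
        )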