airflow.contrib.operators.bigquery_to_gcs

Module Contents
class airflow.contrib.operators.bigquery_to_gcs.BigQueryToCloudStorageOperator(source_project_dataset_table, destination_cloud_storage_uris, compression='NONE', export_format='CSV', field_delimiter=',', print_header=True, bigquery_conn_id='bigquery_default', delegate_to=None, labels=None, *args, **kwargs)
Bases: airflow.models.BaseOperator

Transfers a BigQuery table to a Google Cloud Storage bucket. A usage sketch follows the parameter list below.

See also: For more details about these parameters, see https://cloud.google.com/bigquery/docs/reference/v2/jobs

Parameters
- source_project_dataset_table (str) – The dotted (<project>.|<project>:)<dataset>.<table> BigQuery table to use as the source data. If <project> is not included, the project defined in the connection JSON is used. (templated)
- destination_cloud_storage_uris (list) – The destination Google Cloud Storage URI(s) (e.g. gs://some-bucket/some-file.txt). (templated) Follows the convention defined here: https://cloud.google.com/bigquery/exporting-data-from-bigquery#exportingmultiple
- compression (str) – Type of compression to use (e.g. NONE, the default, or GZIP).
- export_format (str) – File format to export (e.g. CSV, NEWLINE_DELIMITED_JSON, or AVRO).
- field_delimiter (str) – The delimiter to use when extracting to a CSV. 
- print_header (bool) – Whether to print a header for a CSV file extract. 
- bigquery_conn_id (str) – Reference to a specific BigQuery hook.
- delegate_to (str) – The account to impersonate, if any. For this to work, the service account making the request must have domain-wide delegation enabled. 
- labels (dict) – A dictionary containing labels for the job/query, passed to BigQuery.
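Below is a minimal usage sketch (not from the original documentation) showing how this operator might be wired into a DAG. The DAG id, project, dataset, table, and bucket names are placeholders, and the default bigquery_default connection is assumed to exist:

```python
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.bigquery_to_gcs import BigQueryToCloudStorageOperator

with DAG(
    dag_id="example_bigquery_to_gcs",  # hypothetical DAG id
    schedule_interval=None,
    start_date=datetime(2019, 1, 1),
) as dag:
    # Export a BigQuery table to GCS as gzipped CSV. All resource
    # names below are placeholders for illustration only.
    export_table = BigQueryToCloudStorageOperator(
        task_id="export_table_to_gcs",
        # Dotted <project>.<dataset>.<table>; the project part may be
        # omitted to fall back to the connection's default project.
        source_project_dataset_table="my-project.my_dataset.my_table",
        # A wildcard URI lets BigQuery shard large exports into
        # multiple files, per the "exporting multiple files" docs.
        destination_cloud_storage_uris=["gs://my-bucket/exports/my_table-*.csv.gz"],
        export_format="CSV",
        field_delimiter=",",
        print_header=True,
        compression="GZIP",
    )
```

Because both source_project_dataset_table and destination_cloud_storage_uris are templated, Jinja expressions such as {{ ds_nodash }} can be embedded in either to produce per-run export paths.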