airflow.contrib.operators.bigquery_to_mysql_operator
This module contains a Google BigQuery to MySQL operator.
Module Contents

class airflow.contrib.operators.bigquery_to_mysql_operator.BigQueryToMySqlOperator(dataset_table, mysql_table, selected_fields=None, gcp_conn_id='google_cloud_default', mysql_conn_id='mysql_default', database=None, delegate_to=None, replace=False, batch_size=1000, *args, **kwargs)

Bases: airflow.models.BaseOperator
Fetches data from a BigQuery table (or data for selected columns) and inserts it into a MySQL table.
Note

If you pass fields to selected_fields in a different order than the order of the columns already in the BigQuery table, the data will still be in the order of the BigQuery table. For example, if the BigQuery table has three columns [A, B, C] and you pass 'B,A' in selected_fields, the data would still be of the form 'A,B' and passed in this form to MySQL.

Example:

transfer_data = BigQueryToMySqlOperator(
    task_id='task_id',
    dataset_table='origin_bq_table',
    mysql_table='dest_table_name',
    replace=True,
)
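The column-ordering behaviour described in the note can be sketched in plain Python. The helper below is an illustration of the documented behaviour only, not the operator's internals; its name and logic are assumptions:

```python
# Illustrative sketch (not the operator's actual code): regardless of the
# order given in selected_fields, fetched rows keep the BigQuery table's
# own column order.
def fetch_in_table_order(table_columns, selected_fields):
    """Return the selected columns in the order they appear in the BQ table."""
    selected = set(selected_fields.split(','))
    return [col for col in table_columns if col in selected]

# BQ table columns are [A, B, C]; requesting 'B,A' still yields ['A', 'B'].
print(fetch_in_table_order(['A', 'B', 'C'], 'B,A'))
```

This is why consumers of the MySQL table should match the BigQuery column order, not the order written in selected_fields.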
- Parameters
dataset_table (str) – A dotted <dataset>.<table>: the BigQuery table of origin
max_results (str) – The maximum number of records (rows) to be fetched from the table. (templated)
selected_fields (str) – List of fields to return (comma-separated). If unspecified, all fields are returned.
gcp_conn_id (str) – reference to a specific GCP hook.
delegate_to (str) – The account to impersonate, if any. For this to work, the service account making the request must have domain-wide delegation enabled.
mysql_conn_id (str) – reference to a specific MySQL hook
database (str) – name of the database which overrides the one defined in the connection
replace (bool) – Whether to replace instead of insert
batch_size (int) – The number of rows to take in each batch
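The effect of batch_size can be sketched as simple chunking: rows fetched from BigQuery are written to MySQL in groups of at most batch_size rows per insert. The helper below is an illustrative sketch under that assumption, not the operator's internals:

```python
# Illustrative sketch of how batch_size shapes the inserts: rows are
# grouped into chunks of at most batch_size, one insert per chunk.
# (Function name is an assumption for illustration.)
def iter_batches(rows, batch_size=1000):
    """Yield successive chunks of `rows`, each at most `batch_size` long."""
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

rows = list(range(2500))
batch_lengths = [len(batch) for batch in iter_batches(rows, batch_size=1000)]
print(batch_lengths)  # [1000, 1000, 500]
```

Smaller batches reduce peak memory and the size of each MySQL transaction at the cost of more round trips.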