This module contains a Google BigQuery to MySQL operator.

Module Contents

class airflow.contrib.operators.bigquery_to_mysql_operator.BigQueryToMySqlOperator(dataset_table, mysql_table, selected_fields=None, gcp_conn_id='google_cloud_default', mysql_conn_id='mysql_default', database=None, delegate_to=None, replace=False, batch_size=1000, *args, **kwargs)[source]

Bases: airflow.models.BaseOperator

Fetches data from a BigQuery table (or, optionally, only selected columns) and inserts it into a MySQL table.


If the fields passed to selected_fields are in a different order than the columns in the BigQuery table, the data is still returned in the table's column order. For example, if the BigQuery table has three columns [A, B, C] and you pass 'B,A' in selected_fields, the data still arrives in the form 'A,B' and is passed in that order to MySQL.
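This ordering rule can be illustrated with a short plain-Python sketch (not the operator itself): the columns that come back follow the table schema's order, regardless of how they were listed in selected_fields.

```python
# Column order in a BigQuery result follows the table schema,
# not the order in which fields are listed in selected_fields.
table_columns = ["A", "B", "C"]   # column order in the BQ table
selected_fields = "B,A"           # user-specified selection

requested = set(selected_fields.split(","))
# Columns come back in table order, regardless of request order.
returned_order = [col for col in table_columns if col in requested]

print(returned_order)  # ['A', 'B'] -- table order, not 'B,A'
```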


Example:

transfer_data = BigQueryToMySqlOperator(
    task_id='example_bigquery_to_mysql',
    dataset_table='dataset.table_of_origin',
    mysql_table='destination_table',
    replace=True,
)

Parameters
  • dataset_table (str) – A dotted <dataset>.<table>: the BigQuery table of origin

  • max_results (int) – The maximum number of records (rows) to be fetched from the table. (templated)

  • selected_fields (str) – List of fields to return (comma-separated). If unspecified, all fields are returned.

  • gcp_conn_id (str) – reference to a specific GCP connection

  • delegate_to (str) – The account to impersonate, if any. For this to work, the service account making the request must have domain-wide delegation enabled.

  • mysql_conn_id (str) – reference to a specific MySQL connection

  • database (str) – name of the database which, if set, overrides the one defined in the connection

  • replace (bool) – Whether to use REPLACE instead of INSERT statements

  • batch_size (int) – The number of rows fetched and inserted per batch
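The batch_size parameter drives a copy loop of the following shape. This is an illustrative sketch only, with an in-memory stand-in for the MySQL side; copy_in_batches and insert_rows are hypothetical names, not the operator's actual internals or hook API.

```python
# Sketch of a batched copy loop like the one batch_size controls.
# copy_in_batches / insert_rows are hypothetical stand-ins, not Airflow APIs.
def copy_in_batches(source_rows, insert_rows, batch_size=1000, replace=False):
    """Pull rows from an iterator and hand them off in fixed-size batches."""
    batch = []
    for row in source_rows:
        batch.append(row)
        if len(batch) >= batch_size:
            insert_rows(batch, replace=replace)  # REPLACE vs INSERT semantics
            batch = []
    if batch:                                    # flush the final partial batch
        insert_rows(batch, replace=replace)

# Usage with an in-memory stand-in for the MySQL hook:
inserted = []
copy_in_batches(iter(range(2500)),
                lambda rows, replace: inserted.append(len(rows)),
                batch_size=1000)
print(inserted)  # [1000, 1000, 500]
```

Each batch is sent as soon as it fills, so memory use is bounded by batch_size rather than by the full result set.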

template_fields = ['dataset_id', 'table_id', 'mysql_table'][source]
execute(self, context)[source]
