airflow.providers.google.cloud.transfers.mysql_to_gcs

MySQL to GCS operator.

Module Contents

Classes

MySQLToGCSOperator

Copy data from MySQL to Google Cloud Storage in JSON or CSV format.

class airflow.providers.google.cloud.transfers.mysql_to_gcs.MySQLToGCSOperator(*, mysql_conn_id='mysql_default', ensure_utc=False, **kwargs)[source]

Bases: airflow.providers.google.cloud.transfers.sql_to_gcs.BaseSQLToGCSOperator

Copy data from MySQL to Google Cloud Storage in JSON or CSV format.

See also

For more information on how to use this operator, take a look at the guide: MySQLToGCSOperator

Parameters
  • mysql_conn_id -- Reference to the MySQL connection ID.

  • ensure_utc -- Ensure TIMESTAMP columns are exported as UTC. If set to False, TIMESTAMP columns will be exported using the MySQL server's default time zone.
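
A minimal usage sketch, written against a recent Airflow 2.x release. The bucket and table names are hypothetical, and sql, bucket, filename, and export_format are parameters inherited from BaseSQLToGCSOperator:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.mysql_to_gcs import MySQLToGCSOperator

    with DAG(
        dag_id="example_mysql_to_gcs",  # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule=None,
    ) as dag:
        upload_orders = MySQLToGCSOperator(
            task_id="mysql_to_gcs",
            mysql_conn_id="mysql_default",
            sql="SELECT * FROM orders",         # hypothetical table
            bucket="my-export-bucket",          # hypothetical bucket
            filename="exports/orders_{}.json",  # {} lets large results split into numbered files
            export_format="json",
            ensure_utc=True,
        )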

ui_color = '#a0e08c'[source]
type_map[source]
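
type_map translates MySQLdb column type codes into BigQuery column types. A rough sketch of its shape, assuming the FIELD_TYPE constants shipped with the MySQLdb (mysqlclient) driver; the exact entries may differ between provider versions:

    from MySQLdb.constants import FIELD_TYPE

    # Illustrative only; consult the provider source for the authoritative mapping.
    type_map = {
        FIELD_TYPE.BIT: "INTEGER",
        FIELD_TYPE.DATETIME: "TIMESTAMP",
        FIELD_TYPE.DECIMAL: "FLOAT",
        FIELD_TYPE.NEWDECIMAL: "FLOAT",
        FIELD_TYPE.DOUBLE: "FLOAT",
        FIELD_TYPE.FLOAT: "FLOAT",
        FIELD_TYPE.INT24: "INTEGER",
        FIELD_TYPE.LONG: "INTEGER",
        FIELD_TYPE.LONGLONG: "INTEGER",
        FIELD_TYPE.SHORT: "INTEGER",
        FIELD_TYPE.TIME: "TIME",
        FIELD_TYPE.TIMESTAMP: "TIMESTAMP",
        FIELD_TYPE.TINY: "INTEGER",
        FIELD_TYPE.YEAR: "INTEGER",
    }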
query(self)[source]

Queries MySQL and returns a cursor to the results.
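
A sketch of what this method does, assuming MySqlHook from the apache-airflow-providers-mysql package; note how ensure_utc pins the session time zone before the export query runs:

    from airflow.providers.mysql.hooks.mysql import MySqlHook

    def query(self):
        # Connect through the configured MySQL connection.
        mysql = MySqlHook(mysql_conn_id=self.mysql_conn_id)
        conn = mysql.get_conn()
        cursor = conn.cursor()
        if self.ensure_utc:
            # Force the session to UTC so TIMESTAMP columns are
            # exported as UTC instead of the server's default zone.
            cursor.execute("SET time_zone = '+00:00'")
        cursor.execute(self.sql)
        return cursor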

field_to_bigquery(self, field)[source]

Convert a DBAPI field to BigQuery schema format.
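
A plausible sketch, assuming field is one entry of the DB-API cursor.description sequence and type_map is the mapping shown above:

    def field_to_bigquery(self, field):
        # A DB-API description entry is
        # (name, type_code, display_size, internal_size, precision, scale, null_ok).
        field_type = self.type_map.get(field[1], "STRING")
        # null_ok (field[6]) drives the BigQuery mode; TIMESTAMP stays
        # NULLABLE because some MySQL timestamps (e.g. zero dates)
        # cannot be represented as Python datetimes.
        field_mode = "NULLABLE" if field[6] or field_type == "TIMESTAMP" else "REQUIRED"
        return {"name": field[0], "type": field_type, "mode": field_mode}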

convert_type(self, value, schema_type, **kwargs)[source]

Takes a value from MySQLdb and converts it to a value that is safe for JSON, Google Cloud Storage, and BigQuery.

  • Datetimes are converted to str(value) (datetime.isoformat(' ')) strings.

  • Times are converted to str((datetime.min + value).time()) strings.

  • Decimals are converted to floats.

  • Dates are converted to ISO-formatted strings if the given schema_type is DATE, or datetime.isoformat(' ') strings otherwise.

  • Binary type fields are converted to an integer if the given schema_type is INTEGER, or base64-encoded otherwise. Imported BYTES data must be base64-encoded according to the BigQuery documentation: https://cloud.google.com/bigquery/data-types

Parameters
  • value -- MySQLdb column value

  • schema_type (str) -- BigQuery data type
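
Putting those rules together, a sketch of the conversion logic; it assumes MySQLdb hands back TIME columns as datetime.timedelta values:

    import base64
    from datetime import date, datetime, time, timedelta
    from decimal import Decimal

    def convert_type(self, value, schema_type, **kwargs):
        if value is None:
            return value
        if isinstance(value, datetime):
            # str() of a datetime is its isoformat(' ') form.
            return str(value)
        if isinstance(value, timedelta):
            # MySQL TIME arrives as a timedelta; rebase onto datetime.min
            # to get a plain time-of-day string.
            return str((datetime.min + value).time())
        if isinstance(value, Decimal):
            return float(value)
        if isinstance(value, date):
            # Checked after datetime, since datetime subclasses date.
            if schema_type == "DATE":
                return value.isoformat()
            return str(datetime.combine(value, time.min))
        if isinstance(value, bytes):
            if schema_type == "INTEGER":
                return int.from_bytes(value, "big")
            # BigQuery requires imported BYTES to be base64-encoded.
            return base64.standard_b64encode(value).decode("ascii")
        return value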
