airflow.contrib.operators.mysql_to_gcs

MySQL to GCS operator.

Module Contents

airflow.contrib.operators.mysql_to_gcs.PY3[source]
class airflow.contrib.operators.mysql_to_gcs.MySqlToGoogleCloudStorageOperator(mysql_conn_id='mysql_default', ensure_utc=False, *args, **kwargs)[source]

Bases: airflow.contrib.operators.sql_to_gcs.BaseSQLToGoogleCloudStorageOperator

Copy data from MySQL to Google Cloud Storage in JSON or CSV format.

Parameters
  • mysql_conn_id (str) – Reference to a specific MySQL hook.

  • ensure_utc (bool) – Ensure TIMESTAMP columns are exported as UTC. If set to False, TIMESTAMP columns will be exported using the MySQL server’s default timezone.
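
A minimal usage sketch follows. The connection ID, bucket, table, and file paths are placeholders; the sql, bucket, filename, and schema_filename parameters are inherited from the BaseSQLToGoogleCloudStorageOperator base class, so consult that class for their full semantics.

    from datetime import datetime

    from airflow import DAG
    from airflow.contrib.operators.mysql_to_gcs import MySqlToGoogleCloudStorageOperator

    with DAG(
        dag_id="example_mysql_to_gcs",
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
    ) as dag:
        export_orders = MySqlToGoogleCloudStorageOperator(
            task_id="export_orders",
            sql="SELECT * FROM orders",             # query run against the MySQL connection
            bucket="my-gcs-bucket",                 # destination GCS bucket (placeholder)
            filename="exports/orders_{}.json",      # '{}' is replaced with a file counter
            schema_filename="schemas/orders.json",  # optional BigQuery schema file to upload
            mysql_conn_id="mysql_default",
            ensure_utc=True,                        # export TIMESTAMP columns as UTC
        )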

ui_color = #a0e08c[source]
type_map[source]
query(self)[source]

Queries MySQL and returns a cursor to the results.

field_to_bigquery(self, field)[source]
convert_type(self, value, schema_type)[source]

Takes a value from MySQLdb and converts it to a value that’s safe for JSON, Google Cloud Storage, and BigQuery. Dates are converted to UTC seconds. Decimals are converted to floats. Binary type fields are encoded with base64, since imported BYTES data must be base64-encoded according to the BigQuery data type documentation: https://cloud.google.com/bigquery/data-types

Parameters
  • value (Any) – MySQLdb column value

  • schema_type (str) – BigQuery data type
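
The following is an illustrative sketch of the conversion rules described above, not the operator’s actual implementation; the function name convert_value and its exact handling of edge cases are assumptions for demonstration only.

    import base64
    import calendar
    from datetime import date, datetime, timedelta
    from decimal import Decimal

    def convert_value(value, schema_type):
        """Sketch: map a MySQLdb column value to a JSON/BigQuery-safe value."""
        if value is None:
            return value
        if isinstance(value, datetime):
            # Dates/datetimes become POSIX seconds, treating the value as UTC
            return calendar.timegm(value.timetuple())
        if isinstance(value, date):
            return calendar.timegm(value.timetuple())
        if isinstance(value, timedelta):
            # MySQL TIME columns arrive as timedelta; keep total seconds
            return value.total_seconds()
        if isinstance(value, Decimal):
            # Decimals are converted to floats
            return float(value)
        if isinstance(value, bytes):
            # BigQuery requires imported BYTES data to be base64-encoded
            return base64.standard_b64encode(value).decode("ascii")
        return value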
