airflow.providers.apache.spark.hooks.spark_jdbc_script

Module Contents

Functions

set_common_options(spark_source[, url, jdbc_table, ...])

Set common JDBC connection options on a Spark reader or writer.

spark_write_to_jdbc(spark_session, url, user, ...)

Transfer data from Spark to JDBC source.

spark_read_from_jdbc(spark_session, url, user, ...)

Transfer data from JDBC source to Spark.

Attributes

SPARK_WRITE_TO_JDBC

SPARK_READ_FROM_JDBC

airflow.providers.apache.spark.hooks.spark_jdbc_script.SPARK_WRITE_TO_JDBC: str = 'spark_to_jdbc'[source]
airflow.providers.apache.spark.hooks.spark_jdbc_script.SPARK_READ_FROM_JDBC: str = 'jdbc_to_spark'[source]
airflow.providers.apache.spark.hooks.spark_jdbc_script.set_common_options(spark_source, url='localhost:5432', jdbc_table='default.default', user='root', password='root', driver='driver')[source]

Set common JDBC connection options on a Spark reader or writer and return it.

Parameters
  • spark_source (Any) – Spark source: a Spark DataFrameReader or DataFrameWriter

  • url (str) – JDBC resource URL

  • jdbc_table (str) – JDBC resource table name

  • user (str) – JDBC resource user name

  • password (str) – JDBC resource password

  • driver (str) – JDBC resource driver class name
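The parameters above are applied as chained `.option()` calls on the reader or writer. A minimal sketch of that pattern, runnable without a Spark installation: the `FakeSource` class is hypothetical, standing in for pyspark's `DataFrameReader`/`DataFrameWriter`, and the body is a sketch consistent with the documented signature, not the exact provider implementation.

```python
def set_common_options(
    spark_source,
    url="localhost:5432",
    jdbc_table="default.default",
    user="root",
    password="root",
    driver="driver",
):
    # Chain the shared JDBC connection options onto the reader/writer
    # and return the configured source.
    return (
        spark_source.format("jdbc")
        .option("url", url)
        .option("dbtable", jdbc_table)
        .option("user", user)
        .option("password", password)
        .option("driver", driver)
    )


class FakeSource:
    """Hypothetical stand-in for a pyspark DataFrameReader/Writer; records calls."""

    def __init__(self):
        self.fmt = None
        self.options_set = {}

    def format(self, fmt):
        self.fmt = fmt
        return self

    def option(self, key, value):
        self.options_set[key] = value
        return self


src = set_common_options(
    FakeSource(),
    url="jdbc:postgresql://db:5432/app",
    jdbc_table="public.users",
    user="etl",
    password="s3cret",
    driver="org.postgresql.Driver",
)
print(src.fmt, src.options_set["dbtable"])  # jdbc public.users
```

With a real Spark session, `spark_source` would be e.g. `spark.read` or `df.write`; the builder-style chaining works the same way because each `.option()` call returns the source.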

airflow.providers.apache.spark.hooks.spark_jdbc_script.spark_write_to_jdbc(spark_session, url, user, password, metastore_table, jdbc_table, driver, truncate, save_mode, batch_size, num_partitions, create_table_column_types)[source]

Transfer data from Spark to JDBC source.
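The write path takes a DataFrame from the metastore table and saves it to the JDBC table, forwarding only the write options the caller actually set. A sketch of that conditional option handling, runnable without Spark: `FakeWriter` and `apply_write_options` are hypothetical names for illustration, and the option keys (`truncate`, `batchsize`, `numPartitions`, `createTableColumnTypes`) follow Spark's JDBC data source options.

```python
class FakeWriter:
    """Hypothetical stand-in for pyspark's DataFrameWriter; records calls."""

    def __init__(self):
        self.opts = {}
        self.saved_mode = None

    def option(self, key, value):
        self.opts[key] = value
        return self

    def save(self, mode=None):
        self.saved_mode = mode
        return self


def apply_write_options(writer, truncate=False, batch_size=None,
                        num_partitions=None, create_table_column_types=None,
                        save_mode="append"):
    # Forward only the options the caller set, then save with the
    # requested mode (a sketch, not the exact provider implementation).
    if truncate:
        writer = writer.option("truncate", truncate)
    if batch_size:
        writer = writer.option("batchsize", batch_size)
    if num_partitions:
        writer = writer.option("numPartitions", num_partitions)
    if create_table_column_types:
        writer = writer.option("createTableColumnTypes", create_table_column_types)
    writer.save(mode=save_mode)
    return writer


w = apply_write_options(FakeWriter(), truncate=True, batch_size=1000,
                        save_mode="overwrite")
print(w.opts, w.saved_mode)
```

With a real session the writer would come from `spark_session.table(metastore_table).write`, configured with the common JDBC options first.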

airflow.providers.apache.spark.hooks.spark_jdbc_script.spark_read_from_jdbc(spark_session, url, user, password, metastore_table, jdbc_table, driver, save_mode, save_format, fetch_size, num_partitions, partition_column, lower_bound, upper_bound)[source]

Transfer data from JDBC source to Spark.
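The read path loads the JDBC table into a DataFrame and saves it as the metastore table. Parallel reads require `partition_column`, `lower_bound`, and `upper_bound` together; without them Spark reads over a single connection. A sketch of that logic, runnable without Spark: `FakeReader` and `apply_read_options` are hypothetical names for illustration, and the option keys (`fetchsize`, `numPartitions`, `partitionColumn`, `lowerBound`, `upperBound`) follow Spark's JDBC data source options.

```python
class FakeReader:
    """Hypothetical stand-in for pyspark's DataFrameReader; records calls."""

    def __init__(self):
        self.opts = {}
        self.loaded = False

    def option(self, key, value):
        self.opts[key] = value
        return self

    def load(self):
        self.loaded = True
        return self


def apply_read_options(reader, fetch_size=None, num_partitions=None,
                       partition_column=None, lower_bound=None, upper_bound=None):
    # Set the read-tuning options, enabling partitioned reads only when
    # all three bounds are given (a sketch, not the exact implementation).
    if fetch_size:
        reader = reader.option("fetchsize", fetch_size)
    if num_partitions:
        reader = reader.option("numPartitions", num_partitions)
    if partition_column and lower_bound and upper_bound:
        reader = (reader.option("partitionColumn", partition_column)
                        .option("lowerBound", lower_bound)
                        .option("upperBound", upper_bound))
    return reader.load()


df = apply_read_options(FakeReader(), fetch_size=500, num_partitions=4,
                        partition_column="id", lower_bound=1, upper_bound=100000)
print(sorted(df.opts))
```

With a real session the resulting DataFrame would then be persisted via `df.write.saveAsTable(metastore_table, format=save_format, mode=save_mode)`.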
