airflow.providers.google.cloud.triggers.bigquery

Module Contents

Classes

BigQueryInsertJobTrigger

BigQueryInsertJobTrigger runs on the trigger worker to perform the insert operation.

BigQueryCheckTrigger

BigQueryCheckTrigger runs on the trigger worker.

BigQueryGetDataTrigger

BigQueryGetDataTrigger runs on the trigger worker and inherits from the BigQueryInsertJobTrigger class.

BigQueryIntervalCheckTrigger

BigQueryIntervalCheckTrigger runs on the trigger worker and inherits from the BigQueryInsertJobTrigger class.

BigQueryValueCheckTrigger

BigQueryValueCheckTrigger runs on the trigger worker and inherits from the BigQueryInsertJobTrigger class.

BigQueryTableExistenceTrigger

Initialize the BigQuery Table Existence Trigger with needed parameters.

BigQueryTablePartitionExistenceTrigger

Initialize the BigQuery Table Partition Existence Trigger with needed parameters.

class airflow.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger(conn_id, job_id, project_id, location, dataset_id=None, table_id=None, poll_interval=4.0, impersonation_chain=None, cancel_on_kill=True)[source]

Bases: airflow.triggers.base.BaseTrigger

BigQueryInsertJobTrigger runs on the trigger worker to perform the insert operation.

Parameters
  • conn_id (str) – Reference to the Google Cloud connection ID

  • job_id (str | None) – The ID of the job. It will be suffixed with a hash of the job configuration

  • project_id (str) – Google Cloud Project where the job is running

  • location (str | None) – The dataset location.

  • dataset_id (str | None) – The dataset ID of the requested table. (templated)

  • table_id (str | None) – The table ID of the requested table. (templated)

  • poll_interval (float) – polling period in seconds to check for the status. (templated)

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account. (templated)

serialize()[source]

Serialize BigQueryInsertJobTrigger arguments and classpath.

get_task_instance(session)[source]

safe_to_cancel()[source]

Whether it is safe to cancel the external job being executed by this trigger.

This guards against the case where asyncio.CancelledError is raised because the trigger itself is being stopped; in that case the external job should NOT be cancelled.

async run()[source]

Get the current job execution status and yield a TriggerEvent.
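
A minimal sketch of how this trigger is typically wired into a deferrable operator. The operator class, the handler method name, and the event payload keys ("status", "message") are illustrative assumptions, not guarantees from this module:

    from airflow.models.baseoperator import BaseOperator
    from airflow.providers.google.cloud.triggers.bigquery import BigQueryInsertJobTrigger


    class MyDeferringBigQueryOperator(BaseOperator):
        """Hypothetical operator that defers to BigQueryInsertJobTrigger."""

        def execute(self, context):
            # Hand control to the triggerer; the trigger polls the job status
            # every poll_interval seconds until the job reaches a terminal state.
            self.defer(
                trigger=BigQueryInsertJobTrigger(
                    conn_id="google_cloud_default",
                    job_id="example-job-id",        # assumed job ID
                    project_id="example-project",   # assumed project
                    location="US",
                    poll_interval=10.0,
                ),
                method_name="execute_complete",
            )

        def execute_complete(self, context, event):
            # The "status" and "message" keys are assumptions about the event payload.
            if event.get("status") != "success":
                raise RuntimeError(f"BigQuery job failed: {event.get('message')}")
            return event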

class airflow.providers.google.cloud.triggers.bigquery.BigQueryCheckTrigger(conn_id, job_id, project_id, location, dataset_id=None, table_id=None, poll_interval=4.0, impersonation_chain=None, cancel_on_kill=True)[source]

Bases: BigQueryInsertJobTrigger

BigQueryCheckTrigger runs on the trigger worker.

serialize()[source]

Serialize BigQueryCheckTrigger arguments and classpath.

async run()[source]

Get the current job execution status and yield a TriggerEvent.

class airflow.providers.google.cloud.triggers.bigquery.BigQueryGetDataTrigger(as_dict=False, selected_fields=None, **kwargs)[source]

Bases: BigQueryInsertJobTrigger

BigQueryGetDataTrigger runs on the trigger worker and inherits from the BigQueryInsertJobTrigger class.

Parameters
  • as_dict (bool) – If True, return the result as a list of dictionaries; otherwise as a list of lists (default: False).

serialize()[source]

Serialize BigQueryGetDataTrigger arguments and classpath.

async run()[source]

Get the current job execution status and yield a TriggerEvent with response data.
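
A construction sketch: keyword arguments other than as_dict and selected_fields are presumably forwarded to BigQueryInsertJobTrigger, and all concrete values below are placeholder assumptions:

    from airflow.providers.google.cloud.triggers.bigquery import BigQueryGetDataTrigger

    trigger = BigQueryGetDataTrigger(
        as_dict=True,                     # rows returned as dictionaries
        selected_fields="name,total",     # assumed comma-separated field list
        conn_id="google_cloud_default",   # forwarded to BigQueryInsertJobTrigger
        job_id="example-job-id",          # assumed job ID
        project_id="example-project",     # assumed project
        location="US",
        dataset_id="example_dataset",
        table_id="example_table",
    )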

class airflow.providers.google.cloud.triggers.bigquery.BigQueryIntervalCheckTrigger(conn_id, first_job_id, second_job_id, project_id, table, metrics_thresholds, location=None, date_filter_column='ds', days_back=-7, ratio_formula='max_over_min', ignore_zero=True, dataset_id=None, table_id=None, poll_interval=4.0, impersonation_chain=None)[source]

Bases: BigQueryInsertJobTrigger

BigQueryIntervalCheckTrigger runs on the trigger worker and inherits from the BigQueryInsertJobTrigger class.

Parameters
  • conn_id (str) – Reference to the Google Cloud connection ID

  • first_job_id (str) – The ID of the first job performed

  • second_job_id (str) – The ID of the second job performed

  • project_id (str) – Google Cloud Project where the job is running

  • dataset_id (str | None) – The dataset ID of the requested table. (templated)

  • table (str) – table name

  • metrics_thresholds (dict[str, int]) – a dictionary of ratios indexed by metric name

  • location (str | None) – The dataset location.

  • date_filter_column (str | None) – the date filter column name. (templated)

  • days_back (SupportsAbs[int]) – number of days between ds and the ds we want to check against. (templated)

  • ratio_formula (str) – the formula used to compute the ratio between the two metrics (e.g. max_over_min). (templated)

  • ignore_zero (bool) – whether to ignore zero values in the comparison. (templated)

  • table_id (str | None) – The table ID of the requested table. (templated)

  • poll_interval (float) – polling period in seconds to check for the status. (templated)

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account. (templated)

serialize()[source]

Serialize BigQueryIntervalCheckTrigger arguments and classpath.

async run()[source]

Get the current job execution status and yield a TriggerEvent.
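
A construction sketch using the documented signature; the project, job IDs, and threshold values are placeholder assumptions:

    from airflow.providers.google.cloud.triggers.bigquery import BigQueryIntervalCheckTrigger

    trigger = BigQueryIntervalCheckTrigger(
        conn_id="google_cloud_default",
        first_job_id="example-job-current",     # assumed job IDs
        second_job_id="example-job-prior",
        project_id="example-project",           # assumed project
        table="example_dataset.example_table",
        metrics_thresholds={"row_count": 1.5},  # assumed: fail if the ratio exceeds 1.5
        date_filter_column="ds",
        days_back=-7,
        ratio_formula="max_over_min",
        ignore_zero=True,
        poll_interval=10.0,
    )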

class airflow.providers.google.cloud.triggers.bigquery.BigQueryValueCheckTrigger(conn_id, sql, pass_value, job_id, project_id, tolerance=None, dataset_id=None, table_id=None, location=None, poll_interval=4.0, impersonation_chain=None)[source]

Bases: BigQueryInsertJobTrigger

BigQueryValueCheckTrigger runs on the trigger worker and inherits from the BigQueryInsertJobTrigger class.

Parameters
  • conn_id (str) – Reference to the Google Cloud connection ID

  • sql (str) – the SQL query to be executed

  • pass_value (int | float | str) – the expected value against which the query result is checked

  • job_id (str | None) – The ID of the job

  • project_id (str) – Google Cloud Project where the job is running

  • tolerance (Any) – the allowed tolerance around pass_value. (templated)

  • dataset_id (str | None) – The dataset ID of the requested table. (templated)

  • table_id (str | None) – The table ID of the requested table. (templated)

  • location (str | None) – The dataset location

  • poll_interval (float) – polling period in seconds to check for the status. (templated)

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

serialize()[source]

Serialize BigQueryValueCheckTrigger arguments and classpath.

async run()[source]

Get the current job execution status and yield a TriggerEvent.
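
A construction sketch with placeholder values; the tolerance semantics shown in the comment are an assumption based on Airflow's generic value-check behaviour:

    from airflow.providers.google.cloud.triggers.bigquery import BigQueryValueCheckTrigger

    trigger = BigQueryValueCheckTrigger(
        conn_id="google_cloud_default",
        sql="SELECT COUNT(*) FROM example_dataset.example_table",  # assumed check query
        pass_value=1000,
        job_id="example-job-id",       # assumed job ID
        project_id="example-project",  # assumed project
        tolerance=0.1,                 # assumed: accept results within ±10% of pass_value
        location="US",
        poll_interval=10.0,
    )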

class airflow.providers.google.cloud.triggers.bigquery.BigQueryTableExistenceTrigger(project_id, dataset_id, table_id, gcp_conn_id, hook_params, poll_interval=4.0, impersonation_chain=None)[source]

Bases: airflow.triggers.base.BaseTrigger

Initialize the BigQuery Table Existence Trigger with needed parameters.

Parameters
  • project_id (str) – Google Cloud Project where the job is running

  • dataset_id (str) – The dataset ID of the requested table.

  • table_id (str) – The table ID of the requested table.

  • gcp_conn_id (str) – Reference to the Google Cloud connection ID

  • hook_params (dict[str, Any]) – additional parameters passed to the hook

  • poll_interval (float) – polling period in seconds to check for the status

  • impersonation_chain (str | collections.abc.Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account. (templated)

serialize()[source]

Serialize BigQueryTableExistenceTrigger arguments and classpath.

async run()[source]

Run until the table exists in Google BigQuery.
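
A construction sketch with placeholder values; hook_params is passed through to the hook, and an empty dict is assumed to be acceptable when no extra hook arguments are needed:

    from airflow.providers.google.cloud.triggers.bigquery import BigQueryTableExistenceTrigger

    trigger = BigQueryTableExistenceTrigger(
        project_id="example-project",        # assumed project
        dataset_id="example_dataset",
        table_id="example_table",
        gcp_conn_id="google_cloud_default",
        hook_params={},                      # assumed: no extra hook parameters
        poll_interval=10.0,
    )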

class airflow.providers.google.cloud.triggers.bigquery.BigQueryTablePartitionExistenceTrigger(partition_id, **kwargs)[source]

Bases: BigQueryTableExistenceTrigger

Initialize the BigQuery Table Partition Existence Trigger with needed parameters.

Parameters
  • partition_id (str) – The name of the partition to check the existence of.

  • project_id – Google Cloud Project where the job is running

  • dataset_id – The dataset ID of the requested table.

  • table_id – The table ID of the requested table.

  • gcp_conn_id – Reference to the Google Cloud connection ID

  • hook_params – additional parameters passed to the hook

  • poll_interval – polling period in seconds to check for the status

  • impersonation_chain – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account. (templated)

serialize()[source]

Serialize BigQueryTablePartitionExistenceTrigger arguments and classpath.

async run()[source]

Run until the partition exists in the Google BigQuery table.
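
A construction sketch; partition_id is specific to this subclass and the remaining keyword arguments are presumably forwarded to BigQueryTableExistenceTrigger. All concrete values are placeholder assumptions:

    from airflow.providers.google.cloud.triggers.bigquery import (
        BigQueryTablePartitionExistenceTrigger,
    )

    trigger = BigQueryTablePartitionExistenceTrigger(
        partition_id="20240101",             # assumed ingestion-time partition ID
        project_id="example-project",        # assumed project
        dataset_id="example_dataset",
        table_id="example_table",
        gcp_conn_id="google_cloud_default",
        hook_params={},
        poll_interval=10.0,
    )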
