airflow.providers.google.cloud.sensors.gcs

This module contains Google Cloud Storage sensors.

Module Contents

Classes

GCSObjectExistenceSensor

Checks for the existence of a file in Google Cloud Storage.

GCSObjectExistenceAsyncSensor

Checks for the existence of a file in Google Cloud Storage.

GCSObjectUpdateSensor

Checks if an object is updated in Google Cloud Storage.

GCSObjectsWithPrefixExistenceSensor

Checks for the existence of GCS objects at a given prefix, passing matches via XCom.

GCSUploadSessionCompleteSensor

Return True if the inactivity period has passed with no increase in the number of objects in the bucket.

Functions

ts_function(context)

Act as the default callback for GCSObjectUpdateSensor.

get_time()

Act as a wrapper around datetime.datetime.now to simplify mocking in unit tests.

class airflow.providers.google.cloud.sensors.gcs.GCSObjectExistenceSensor(*, bucket, object, use_glob=False, google_cloud_conn_id='google_cloud_default', impersonation_chain=None, retry=DEFAULT_RETRY, deferrable=conf.getboolean('operators', 'default_deferrable', fallback=False), **kwargs)[source]

Bases: airflow.sensors.base.BaseSensorOperator

Checks for the existence of a file in Google Cloud Storage.

Parameters
  • bucket (str) – The Google Cloud Storage bucket where the object is.

  • object (str) – The name of the object to check in the Google Cloud Storage bucket.

  • use_glob (bool) – When set to True, the object parameter is interpreted as a glob pattern.

  • google_cloud_conn_id (str) – The connection ID to use when connecting to Google Cloud Storage.

  • impersonation_chain (str | Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

  • retry (google.api_core.retry.Retry) – (Optional) How to retry the RPC.

template_fields: Sequence[str] = ('bucket', 'object', 'impersonation_chain')[source]
ui_color = '#f0eee4'[source]
poke(context)[source]

Override when deriving this class.

execute(context)[source]

Airflow runs this method on the worker and defers using the trigger.

execute_complete(context, event)[source]

Act as a callback for when the trigger fires; returns immediately.

Relies on the trigger to throw an exception; otherwise it assumes execution was successful.
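
A minimal usage sketch, assuming hypothetical DAG, bucket, and object names (the schedule argument follows the Airflow 2.4+ spelling):

    import pendulum

    from airflow import DAG
    from airflow.providers.google.cloud.sensors.gcs import GCSObjectExistenceSensor

    with DAG(
        dag_id="gcs_object_existence_example",   # hypothetical DAG id
        start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
        schedule=None,
        catchup=False,
    ):
        wait_for_object = GCSObjectExistenceSensor(
            task_id="wait_for_object",
            bucket="my-bucket",                  # hypothetical bucket name
            object="data/{{ ds }}/report.csv",   # templated, see template_fields
            google_cloud_conn_id="google_cloud_default",
            deferrable=True,                     # free the worker slot while waiting
        )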

class airflow.providers.google.cloud.sensors.gcs.GCSObjectExistenceAsyncSensor(**kwargs)[source]

Bases: GCSObjectExistenceSensor

Checks for the existence of a file in Google Cloud Storage.

This class is deprecated and will be removed in a future release.

Please use airflow.providers.google.cloud.sensors.gcs.GCSObjectExistenceSensor with the deferrable attribute set to True instead, as sketched below.

Parameters
  • bucket – The Google Cloud Storage bucket where the object is.

  • object – The name of the object to check in the Google Cloud Storage bucket.

  • google_cloud_conn_id – The connection ID to use when connecting to Google Cloud Storage.

  • impersonation_chain – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).
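
A migration sketch per the deprecation note above, using hypothetical bucket and object names; the sensor is declared inside a DAG as usual:

    from airflow.providers.google.cloud.sensors.gcs import GCSObjectExistenceSensor

    # Instead of GCSObjectExistenceAsyncSensor(...), use the base sensor
    # with deferrable=True.
    wait_for_object = GCSObjectExistenceSensor(
        task_id="wait_for_object",
        bucket="my-bucket",        # hypothetical
        object="data/report.csv",  # hypothetical
        deferrable=True,           # replaces the dedicated async sensor
    )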

airflow.providers.google.cloud.sensors.gcs.ts_function(context)[source]

Act as the default callback for GCSObjectUpdateSensor.

The default behaviour is to check for the object being updated after the data interval’s end, or execution_date + interval on Airflow versions prior to 2.2 (before the AIP-39 implementation).
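
A minimal sketch of a custom callback with the same shape as ts_function; the name my_ts_func is hypothetical, and data_interval_end is the Airflow 2.2+ context key:

    def my_ts_func(context):
        # Return the timestamp the object must have been updated after;
        # here the end of the data interval, mirroring the documented default.
        return context["data_interval_end"]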

class airflow.providers.google.cloud.sensors.gcs.GCSObjectUpdateSensor(bucket, object, ts_func=ts_function, google_cloud_conn_id='google_cloud_default', impersonation_chain=None, deferrable=conf.getboolean('operators', 'default_deferrable', fallback=False), **kwargs)[source]

Bases: airflow.sensors.base.BaseSensorOperator

Checks if an object is updated in Google Cloud Storage.

Parameters
  • bucket (str) – The Google Cloud Storage bucket where the object is.

  • object (str) – The name of the object to check in the Google Cloud Storage bucket.

  • ts_func (Callable) – Callback for defining the update condition. The default callback returns the data interval’s end (execution_date + schedule_interval on Airflow versions prior to 2.2). The callback takes the context as a parameter.

  • google_cloud_conn_id (str) – The connection ID to use when connecting to Google Cloud Storage.

  • impersonation_chain (str | Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

  • deferrable (bool) – Run sensor in deferrable mode.

template_fields: Sequence[str] = ('bucket', 'object', 'impersonation_chain')[source]
ui_color = '#f0eee4'[source]
poke(context)[source]

Override when deriving this class.

execute(context)[source]

Airflow runs this method on the worker and defers using the trigger.

execute_complete(context, event=None)[source]

Act as a callback for the trigger; returns immediately and relies on the trigger to throw a success event.
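
A minimal usage sketch, assuming hypothetical bucket and object names and declared inside a DAG; my_ts_func is the hypothetical callback sketched under ts_function above:

    from airflow.providers.google.cloud.sensors.gcs import GCSObjectUpdateSensor

    wait_for_update = GCSObjectUpdateSensor(
        task_id="wait_for_update",
        bucket="my-bucket",        # hypothetical
        object="data/report.csv",  # hypothetical
        ts_func=my_ts_func,        # optional; omit to use the default ts_function
        deferrable=True,
    )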

class airflow.providers.google.cloud.sensors.gcs.GCSObjectsWithPrefixExistenceSensor(bucket, prefix, google_cloud_conn_id='google_cloud_default', impersonation_chain=None, deferrable=conf.getboolean('operators', 'default_deferrable', fallback=False), **kwargs)[source]

Bases: airflow.sensors.base.BaseSensorOperator

Checks for the existence of GCS objects at a given prefix, passing matches via XCom.

When files matching the given prefix are found, the poke method’s criteria will be fulfilled and the matching objects will be returned from the operator and passed through XCom for downstream tasks.

Parameters
  • bucket (str) – The Google Cloud Storage bucket where the object is.

  • prefix (str) – The name of the prefix to check in the Google Cloud Storage bucket.

  • google_cloud_conn_id (str) – The connection ID to use when connecting to Google Cloud Storage.

  • impersonation_chain (str | Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

  • deferrable (bool) – Run sensor in deferrable mode.

template_fields: Sequence[str] = ('bucket', 'prefix', 'impersonation_chain')[source]
ui_color = '#f0eee4'[source]
poke(context)[source]

Override when deriving this class.

execute(context)[source]

Overridden to allow matches to be passed.

execute_complete(context, event)[source]

Act as a callback for the trigger; returns immediately and relies on the trigger to throw a success event.
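
A minimal sketch pairing the sensor with a downstream task that pulls the matched object names from XCom; bucket, prefix, and task ids are hypothetical, and both tasks are declared inside a DAG:

    from airflow.operators.python import PythonOperator
    from airflow.providers.google.cloud.sensors.gcs import GCSObjectsWithPrefixExistenceSensor

    wait_for_prefix = GCSObjectsWithPrefixExistenceSensor(
        task_id="wait_for_prefix",
        bucket="my-bucket",           # hypothetical
        prefix="incoming/{{ ds }}/",  # templated, see template_fields
    )

    def process_matches(ti):
        # The sensor returns the matching object names, so downstream
        # tasks can read them from XCom.
        matches = ti.xcom_pull(task_ids="wait_for_prefix")
        print(matches)

    handle_matches = PythonOperator(task_id="handle_matches", python_callable=process_matches)
    wait_for_prefix >> handle_matches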

airflow.providers.google.cloud.sensors.gcs.get_time()[source]

Act as a wrapper around datetime.datetime.now to simplify mocking in unit tests.

class airflow.providers.google.cloud.sensors.gcs.GCSUploadSessionCompleteSensor(bucket, prefix, inactivity_period=60 * 60, min_objects=1, previous_objects=None, allow_delete=True, google_cloud_conn_id='google_cloud_default', impersonation_chain=None, deferrable=conf.getboolean('operators', 'default_deferrable', fallback=False), **kwargs)[source]

Bases: airflow.sensors.base.BaseSensorOperator

Return True if the inactivity period has passed with no increase in the number of objects in the bucket.

Checks for changes in the number of objects at prefix in Google Cloud Storage bucket and returns True if the inactivity period has passed with no increase in the number of objects. Note, this sensor will not behave correctly in reschedule mode, as the state of the listed objects in the GCS bucket will be lost between rescheduled invocations.

Parameters
  • bucket (str) – The Google Cloud Storage bucket where the objects are expected.

  • prefix (str) – The name of the prefix to check in the Google Cloud Storage bucket.

  • inactivity_period (float) – The total number of seconds of inactivity required to designate that an upload session is over. Note that this mechanism is not real-time; this operator may not return until one poke_interval after this period has passed with no additional objects sensed.

  • min_objects (int) – The minimum number of objects needed for the upload session to be considered valid.

  • previous_objects (set[str] | None) – The set of object ids found during the last poke.

  • allow_delete (bool) – Whether this sensor should consider objects being deleted between pokes as valid behavior. If true, a warning message will be logged when this happens; if false, an error will be raised.

  • google_cloud_conn_id (str) – The connection ID to use when connecting to Google Cloud Storage.

  • impersonation_chain (str | Sequence[str] | None) – Optional service account to impersonate using short-term credentials, or chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities from the list must grant Service Account Token Creator IAM role to the directly preceding identity, with first account from the list granting this role to the originating account (templated).

  • deferrable (bool) – Run sensor in deferrable mode.

template_fields: Sequence[str] = ('bucket', 'prefix', 'impersonation_chain')[source]
ui_color = '#f0eee4'[source]
is_bucket_updated(current_objects)[source]

Check whether new objects have been added and the inactivity_period has passed, and update the state.

Parameters

current_objects (set[str]) – The set of object ids in the bucket during the last poke.

poke(context)[source]

Override when deriving this class.

execute(context)[source]

Airflow runs this method on the worker and defers using the trigger.

execute_complete(context, event=None)[source]

Act as a callback for when the trigger fires; returns immediately.

Relies on the trigger to throw an exception; otherwise it assumes execution was successful.
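
A minimal usage sketch, assuming hypothetical bucket, prefix, and thresholds; the sensor is declared inside a DAG and, per the note above, should not be run in reschedule mode:

    from airflow.providers.google.cloud.sensors.gcs import GCSUploadSessionCompleteSensor

    wait_for_upload_session = GCSUploadSessionCompleteSensor(
        task_id="wait_for_upload_session",
        bucket="my-bucket",          # hypothetical
        prefix="uploads/batch-1/",   # hypothetical
        inactivity_period=15 * 60,   # succeed after 15 minutes with no new objects
        min_objects=5,               # require at least 5 objects under the prefix
        allow_delete=True,           # deletions between pokes only log a warning
        poke_interval=60,            # BaseSensorOperator argument
    )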
