airflow.contrib.sensors.gcs_sensor

Module Contents

class airflow.contrib.sensors.gcs_sensor.GoogleCloudStorageObjectSensor(bucket, object, google_cloud_conn_id='google_cloud_default', delegate_to=None, *args, **kwargs)[source]

Bases: airflow.sensors.base_sensor_operator.BaseSensorOperator

Checks for the existence of a file in Google Cloud Storage.

Parameters
  • bucket (str) – The Google Cloud Storage bucket where the object is.

  • object (str) – The name of the object to check in the Google Cloud Storage bucket.

  • google_cloud_conn_id (str) – The connection ID to use when connecting to Google Cloud Storage.

  • delegate_to (str) – The account to impersonate, if any. For this to work, the service account making the request must have domain-wide delegation enabled.

template_fields = ['bucket', 'object'][source]
ui_color = '#f0eee4'[source]
poke(self, context)[source]
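The poke logic of this sensor amounts to a single existence check against the bucket. The sketch below illustrates that contract with a `FakeGCSHook` stand-in (an assumption for illustration; the real `GoogleCloudStorageHook` queries the GCS API):

```python
# FakeGCSHook is a hypothetical stand-in for GoogleCloudStorageHook,
# used here only to illustrate the sensor's poke contract.
class FakeGCSHook:
    def __init__(self, objects):
        self._objects = objects  # set of (bucket, object_name) pairs "in GCS"

    def exists(self, bucket, object_name):
        return (bucket, object_name) in self._objects

def poke(hook, bucket, object_name):
    # Sensor contract: return True to succeed, False to poke again later.
    return hook.exists(bucket, object_name)

hook = FakeGCSHook({('my-bucket', 'data/file.csv')})
print(poke(hook, 'my-bucket', 'data/file.csv'))  # True
print(poke(hook, 'my-bucket', 'missing.csv'))    # False
```

The sensor keeps poking (or rescheduling) until `poke` returns True, so downstream tasks only run once the object exists.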
airflow.contrib.sensors.gcs_sensor.ts_function(context)[source]
Default callback for the GoogleCloudStorageObjectUpdatedSensor. The default
behaviour is to check whether the object was updated after execution_date +
schedule_interval.
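The default condition described above can be sketched as follows. The `_FakeDag` class and the hand-built context dict are illustrative assumptions; in Airflow, the template context supplies execution_date and the DAG:

```python
from datetime import datetime, timedelta

# Sketch of the documented default: the object must have been updated
# after execution_date + schedule_interval.
def default_ts_function(context):
    return context['execution_date'] + context['dag'].schedule_interval

class _FakeDag:
    # Hypothetical stand-in for a DAG with an hourly schedule.
    schedule_interval = timedelta(hours=1)

context = {'execution_date': datetime(2019, 1, 1, 12, 0), 'dag': _FakeDag()}
print(default_ts_function(context))  # 2019-01-01 13:00:00
```

A custom ts_func with the same signature (taking the context, returning a datetime) can be passed to the sensor to change the update condition.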
class airflow.contrib.sensors.gcs_sensor.GoogleCloudStorageObjectUpdatedSensor(bucket, object, ts_func=ts_function, google_cloud_conn_id='google_cloud_default', delegate_to=None, *args, **kwargs)[source]

Bases: airflow.sensors.base_sensor_operator.BaseSensorOperator

Checks if an object is updated in Google Cloud Storage.

Parameters
  • bucket (str) – The Google Cloud Storage bucket where the object is.

  • object (str) – The name of the object to check in the Google Cloud Storage bucket.

  • ts_func (function) – Callback for defining the update condition. The default callback returns execution_date + schedule_interval. The callback takes the context as its parameter.

  • google_cloud_conn_id (str) – The connection ID to use when connecting to Google Cloud Storage.

  • delegate_to (str) – The account to impersonate, if any. For this to work, the service account making the request must have domain-wide delegation enabled.

template_fields = ['bucket', 'object'][source]
ui_color = '#f0eee4'[source]
poke(self, context)[source]
class airflow.contrib.sensors.gcs_sensor.GoogleCloudStoragePrefixSensor(bucket, prefix, google_cloud_conn_id='google_cloud_default', delegate_to=None, *args, **kwargs)[source]

Bases: airflow.sensors.base_sensor_operator.BaseSensorOperator

Checks for the existence of objects at a given prefix in a Google Cloud Storage bucket.

Parameters
  • bucket (str) – The Google Cloud Storage bucket where the objects are.

  • prefix (str) – The name of the prefix to check in the Google Cloud Storage bucket.

  • google_cloud_conn_id (str) – The connection ID to use when connecting to Google Cloud Storage.

  • delegate_to (str) – The account to impersonate, if any. For this to work, the service account making the request must have domain-wide delegation enabled.

template_fields = ['bucket', 'prefix'][source]
ui_color = '#f0eee4'[source]
poke(self, context)[source]
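The prefix sensor succeeds as soon as at least one object matches the prefix. A minimal sketch, again using a hypothetical `FakeGCSHook` in place of the real `GoogleCloudStorageHook` (whose list call queries the GCS API):

```python
# FakeGCSHook is an illustrative stand-in; only the listing behaviour
# needed for the sketch is modelled.
class FakeGCSHook:
    def __init__(self, objects):
        self._objects = objects  # iterable of (bucket, object_name) pairs

    def list(self, bucket, prefix=''):
        return [name for b, name in self._objects
                if b == bucket and name.startswith(prefix)]

def poke_prefix(hook, bucket, prefix):
    # Succeed once the listing under the prefix is non-empty.
    return bool(hook.list(bucket, prefix=prefix))

hook = FakeGCSHook([('my-bucket', 'exports/part-0001.csv')])
print(poke_prefix(hook, 'my-bucket', 'exports/'))  # True
print(poke_prefix(hook, 'my-bucket', 'staging/'))  # False
```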
airflow.contrib.sensors.gcs_sensor.get_time()[source]
This is just a wrapper around datetime.datetime.now to simplify mocking in unit tests.
class airflow.contrib.sensors.gcs_sensor.GoogleCloudStorageUploadSessionCompleteSensor(bucket, prefix, inactivity_period=60 * 60, min_objects=1, previous_num_objects=0, allow_delete=True, google_cloud_conn_id='google_cloud_default', delegate_to=None, *args, **kwargs)[source]

Bases: airflow.sensors.base_sensor_operator.BaseSensorOperator

Checks for changes in the number of objects at a prefix in a Google Cloud Storage bucket and returns True if the inactivity period has passed with no increase in the number of objects. Note: reschedule mode is recommended if you expect this sensor to run for hours.

Parameters
  • bucket (str) – The Google Cloud Storage bucket where the objects are expected.

  • prefix (str) – The name of the prefix to check in the Google Cloud Storage bucket.

  • inactivity_period (int) – The total seconds of inactivity required to consider the upload session over. Note: this mechanism is not real time, so the sensor may not return until a poke_interval after this period has passed with no additional objects sensed.

  • min_objects (int) – The minimum number of objects needed for the upload session to be considered valid.

  • previous_num_objects (int) – The number of objects found during the last poke.

  • inactivity_seconds (int) – The current seconds of the inactivity period.

  • allow_delete (bool) – Whether this sensor should consider objects deleted between pokes to be valid behavior. If true, a warning is logged when this happens; if false, an error is raised.

  • google_cloud_conn_id (str) – The connection ID to use when connecting to Google Cloud Storage.

  • delegate_to (str) – The account to impersonate, if any. For this to work, the service account making the request must have domain-wide delegation enabled.

template_fields = ['bucket', 'prefix'][source]
ui_color = '#f0eee4'[source]
is_bucket_updated(self, current_num_objects)[source]

Checks whether new objects have been uploaded and whether the inactivity_period has passed, and updates the state of the sensor accordingly.

Parameters

current_num_objects (int) – The number of objects found in the bucket during the last poke.

poke(self, context)[source]
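The inactivity bookkeeping described above can be sketched in plain Python. The class name, the explicit `now` parameter, and the internal attribute names are illustrative assumptions, not the sensor's actual internals; the sketch only demonstrates the documented behaviour of inactivity_period, min_objects, and allow_delete:

```python
class UploadSessionTracker:
    """Sketch of the upload-session inactivity logic (illustrative only)."""

    def __init__(self, inactivity_period=60 * 60, min_objects=1, allow_delete=True):
        self.inactivity_period = inactivity_period
        self.min_objects = min_objects
        self.allow_delete = allow_delete
        self.previous_num_objects = 0
        self.last_activity_time = None  # timestamp of the last observed change

    def is_bucket_updated(self, current_num_objects, now):
        # Returns True once the session is considered over: no change in
        # object count for inactivity_period seconds, with enough objects.
        if current_num_objects > self.previous_num_objects:
            # New uploads arrived: reset the inactivity clock.
            self.previous_num_objects = current_num_objects
            self.last_activity_time = now
            return False
        if current_num_objects < self.previous_num_objects:
            # Objects disappeared between pokes.
            if not self.allow_delete:
                raise RuntimeError("Objects were deleted during the upload session.")
            self.previous_num_objects = current_num_objects
            self.last_activity_time = now
            return False
        if self.last_activity_time is None:
            self.last_activity_time = now
            return False
        inactivity_seconds = now - self.last_activity_time
        return (inactivity_seconds >= self.inactivity_period
                and current_num_objects >= self.min_objects)

tracker = UploadSessionTracker(inactivity_period=60, min_objects=1)
print(tracker.is_bucket_updated(3, now=0))   # False: new objects reset the clock
print(tracker.is_bucket_updated(3, now=30))  # False: still inside inactivity_period
print(tracker.is_bucket_updated(3, now=61))  # True: 61s of inactivity >= 60s
```

Because each poke only advances this state machine, the real sensor cannot detect the end of a session any sooner than the first poke after the inactivity period elapses, which is why the docstring notes the mechanism is not real time.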
