airflow.providers.amazon.aws.sensors.s3_prefix

Module Contents

class airflow.providers.amazon.aws.sensors.s3_prefix.S3PrefixSensor(*, bucket_name: str, prefix: Union[str, List[str]], delimiter: str = '/', aws_conn_id: str = 'aws_default', verify: Optional[Union[str, bool]] = None, **kwargs)[source]

Bases: airflow.sensors.base.BaseSensorOperator

Waits for a prefix or all prefixes to exist. A prefix is the first part of a key, enabling checks similar to the glob airfl* or SQL LIKE 'airfl%'. You can specify a delimiter to indicate the hierarchy of keys, meaning that the match stops at that delimiter. The current code accepts sane delimiters, i.e. characters that are NOT special characters in the Python regex engine.
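
The delimiter-bounded matching described above can be sketched as a regex, mirroring the glob airfl* / SQL LIKE 'airfl%' analogy. The helper below is illustrative only (it is not part of Airflow's API), and it deliberately does not escape the delimiter, consistent with the note that delimiters must not be regex special characters:

```python
import re

def prefix_pattern(prefix: str, delimiter: str = "/") -> "re.Pattern":
    # Hypothetical helper (not part of Airflow): build a regex that matches
    # keys beginning with `prefix`, where the match stops at the first
    # `delimiter` after the prefix.
    return re.compile(re.escape(prefix) + "[^" + delimiter + "]*" + delimiter)

pat = prefix_pattern("airfl")
print(bool(pat.match("airflow/dags/a.py")))  # key under the "airflow" hierarchy
print(bool(pat.match("data/airflow/a.py")))  # no match: prefix is anchored at the start
```

As with the sensor itself, the prefix is relative to the bucket root, so a key only matches when it starts with the prefix.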

Parameters
  • bucket_name (str) -- Name of the S3 bucket

  • prefix (str or list of str) -- The prefix being waited on. Relative path from bucket root level.

  • delimiter (str) -- The delimiter intended to show hierarchy. Defaults to '/'.

  • aws_conn_id (str) -- a reference to the S3 connection

  • verify (bool or str) --

    Whether or not to verify SSL certificates for S3 connection. By default SSL certificates are verified. You can provide the following values:

    • False: do not validate SSL certificates. SSL will still be used

      (unless use_ssl is False), but SSL certificates will not be verified.

    • path/to/cert/bundle.pem: A filename of the CA cert bundle to use.

      You can specify this argument if you want to use a different CA cert bundle than the one used by botocore.

template_fields = ['prefix', 'bucket_name'][source]
poke(self, context: Dict[str, Any])[source]
get_hook(self)[source]

Create and return an S3Hook
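
Conceptually, each poke asks S3 whether every requested prefix has at least one matching key, and returns True only when all do (otherwise the sensor keeps waiting). A minimal pure-Python sketch of that success condition, with a plain list of keys standing in for the hook's bucket listing (the real sensor delegates the lookup to the hook, not to a list like this):

```python
def poke_sketch(existing_keys, prefixes):
    # Illustrative only -- stands in for the sensor's poke() logic.
    # Succeeds only when every requested prefix has at least one matching key.
    return all(
        any(key.startswith(prefix) for key in existing_keys)
        for prefix in prefixes
    )

keys = ["airflow/dags/a.py", "data/2021/part-0.csv"]
print(poke_sketch(keys, ["airflow/", "data/2021/"]))  # True: both prefixes exist
print(poke_sketch(keys, ["airflow/", "logs/"]))       # False: sensor keeps waiting
```

This matches the "a prefix or all prefixes" behaviour documented above: passing a list of prefixes requires all of them to exist before the sensor succeeds.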
