airflow.providers.amazon.aws.sensors.bedrock

Module Contents

Classes

BedrockBaseSensor

General sensor behavior for Amazon Bedrock.

BedrockCustomizeModelCompletedSensor

Poll the state of the model customization job until it reaches a terminal state; fails if the job fails.

BedrockProvisionModelThroughputCompletedSensor

Poll the provisioned model throughput job until it reaches a terminal state; fails if the job fails.

class airflow.providers.amazon.aws.sensors.bedrock.BedrockBaseSensor(deferrable=conf.getboolean('operators', 'default_deferrable', fallback=False), **kwargs)[source]

Bases: airflow.providers.amazon.aws.sensors.base_aws.AwsBaseSensor[airflow.providers.amazon.aws.hooks.bedrock.BedrockHook]

General sensor behavior for Amazon Bedrock.

Subclasses must implement the following methods:
  • get_state()

Subclasses must set the following fields:
  • INTERMEDIATE_STATES

  • FAILURE_STATES

  • SUCCESS_STATES

  • FAILURE_MESSAGE

Parameters

deferrable (bool) – If True, the sensor will operate in deferrable mode. This mode requires aiobotocore module to be installed. (default: False, but can be overridden in config file by setting default_deferrable to True)

INTERMEDIATE_STATES: tuple[str, ...] = ()[source]
FAILURE_STATES: tuple[str, ...] = ()[source]
SUCCESS_STATES: tuple[str, ...] = ()[source]
FAILURE_MESSAGE = ''[source]
aws_hook_class[source]
ui_color = '#66c3ff'[source]
poke(context)[source]

Override when deriving this class.

abstract get_state()[source]

Implement in subclasses.
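The contract above can be sketched in standalone Python (hypothetical class names, no Airflow imports): a subclass supplies the state tuples and get_state(), and poke() returns True on a terminal success state, False while the state is intermediate, and raises on a failure state.

```python
# Standalone sketch of the BedrockBaseSensor contract. All names here are
# illustrative; the real sensor inherits from AwsBaseSensor and raises an
# Airflow exception instead of SensorFailure.

class SensorFailure(Exception):
    """Stands in for the exception the real sensor raises on a failure state."""


class BaseSensorSketch:
    INTERMEDIATE_STATES: tuple[str, ...] = ()
    FAILURE_STATES: tuple[str, ...] = ()
    SUCCESS_STATES: tuple[str, ...] = ()
    FAILURE_MESSAGE = ""

    def get_state(self) -> str:
        raise NotImplementedError("Implement in subclasses.")

    def poke(self) -> bool:
        # True once the job left the intermediate states; raise if it failed.
        state = self.get_state()
        if state in self.FAILURE_STATES:
            raise SensorFailure(self.FAILURE_MESSAGE)
        return state not in self.INTERMEDIATE_STATES


class CustomizeModelSketch(BaseSensorSketch):
    INTERMEDIATE_STATES = ("InProgress",)
    FAILURE_STATES = ("Failed", "Stopping", "Stopped")
    SUCCESS_STATES = ("Completed",)
    FAILURE_MESSAGE = "Bedrock model customization job sensor failed."

    def __init__(self, states):
        # Canned states for the sketch; the real sensor queries Bedrock
        # through BedrockHook on every poke.
        self._states = iter(states)

    def get_state(self) -> str:
        return next(self._states)


sensor = CustomizeModelSketch(["InProgress", "InProgress", "Completed"])
results = [sensor.poke() for _ in range(3)]
print(results)  # [False, False, True]
```

In the real base class, the scheduler (or the triggerer, in deferrable mode) calls poke() every poke_interval seconds until it returns True or raises.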

class airflow.providers.amazon.aws.sensors.bedrock.BedrockCustomizeModelCompletedSensor(*, job_name, max_retries=75, poke_interval=120, **kwargs)[source]

Bases: BedrockBaseSensor

Poll the state of the model customization job until it reaches a terminal state; fails if the job fails.

See also

For more information on how to use this sensor, take a look at the guide: Wait for an Amazon Bedrock customize model job

Parameters
  • job_name (str) – The name of the Bedrock model customization job.

  • deferrable – If True, the sensor will operate in deferrable mode. This mode requires aiobotocore module to be installed. (default: False, but can be overridden in config file by setting default_deferrable to True)

  • poke_interval (int) – Polling period in seconds to check for the status of the job. (default: 120)

  • max_retries (int) – Maximum number of polling attempts before the sensor gives up and returns the current state. (default: 75)

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html

INTERMEDIATE_STATES: tuple[str, ...] = ('InProgress',)[source]
FAILURE_STATES: tuple[str, ...] = ('Failed', 'Stopping', 'Stopped')[source]
SUCCESS_STATES: tuple[str, ...] = ('Completed',)[source]
FAILURE_MESSAGE = 'Bedrock model customization job sensor failed.'[source]
template_fields: Sequence[str][source]
execute(context)[source]

Derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.

get_state()[source]

Implement in subclasses.

class airflow.providers.amazon.aws.sensors.bedrock.BedrockProvisionModelThroughputCompletedSensor(*, model_id, poke_interval=60, max_retries=20, **kwargs)[source]

Bases: BedrockBaseSensor

Poll the provisioned model throughput job until it reaches a terminal state; fails if the job fails.

See also

For more information on how to use this sensor, take a look at the guide: Wait for an Amazon Bedrock provision model throughput job

Parameters
  • model_id (str) – The ARN or name of the provisioned throughput.

  • deferrable – If True, the sensor will operate in deferrable mode. This mode requires aiobotocore module to be installed. (default: False, but can be overridden in config file by setting default_deferrable to True)

  • poke_interval (int) – Polling period in seconds to check for the status of the job. (default: 60)

  • max_retries (int) – Maximum number of polling attempts before the sensor gives up and returns the current state. (default: 20)

  • aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then default boto3 configuration would be used (and must be maintained on each worker node).

  • region_name – AWS region_name. If not specified then the default boto3 behaviour is used.

  • verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html

  • botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html

INTERMEDIATE_STATES: tuple[str, ...] = ('Creating', 'Updating')[source]
FAILURE_STATES: tuple[str, ...] = ('Failed',)[source]
SUCCESS_STATES: tuple[str, ...] = ('InService',)[source]
FAILURE_MESSAGE = 'Bedrock provision model throughput sensor failed.'[source]
template_fields: Sequence[str][source]
get_state()[source]

Implement in subclasses.

execute(context)[source]

Derive when creating an operator.

Context is the same dictionary used as when rendering jinja templates.

Refer to get_template_context for more context.
