airflow.providers.amazon.aws.operators.bedrock¶
Module Contents¶
Classes¶
BedrockInvokeModelOperator: Invoke the specified Bedrock model to run inference using the input provided.
BedrockCustomizeModelOperator: Create a fine-tuning job to customize a base model.
- class airflow.providers.amazon.aws.operators.bedrock.BedrockInvokeModelOperator(model_id, input_data, content_type=None, accept_type=None, **kwargs)[source]¶
Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.bedrock.BedrockRuntimeHook]
Invoke the specified Bedrock model to run inference using the input provided.
Use InvokeModel to run inference for text models, image models, and embedding models. To see the format and content of the input_data field for different models, refer to the Inference parameters documentation.
See also
For more information on how to use this operator, take a look at the guide: Invoke an existing Amazon Bedrock Model
- Parameters
model_id (str) – The ID of the Bedrock model. (templated)
input_data (dict[str, Any]) – Input data in the format specified in the content-type request header. (templated)
content_type (str | None) – The MIME type of the input data in the request. (templated) Default: application/json
accept_type (str | None) – The desired MIME type of the inference body in the response. (templated) Default: application/json
aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then the default boto3 configuration would be used (and must be maintained on each worker node).
region_name – AWS region_name. If not specified then the default boto3 behaviour is used.
verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html
botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html
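As a minimal sketch of how this operator might be used, the snippet below builds an input_data payload for a Titan Text model. The model_id, task_id, and payload field names are illustrative assumptions: each model family expects a different request schema, so check the Inference parameters documentation for your model.

```python
import json

# Hypothetical payload for a Titan Text model; field names ("inputText",
# "textGenerationConfig") follow that model family's schema and will differ
# for other models.
prompt = "Summarize the benefits of task retries in Airflow."
input_data = {
    "inputText": prompt,
    "textGenerationConfig": {"maxTokenCount": 512, "temperature": 0.2},
}

# Inside a DAG, the task definition might look like this (assumes the
# amazon provider is installed and an AWS connection is configured):
#
#   invoke_model = BedrockInvokeModelOperator(
#       task_id="invoke_titan",
#       model_id="amazon.titan-text-express-v1",
#       input_data=input_data,
#   )

# The operator sends input_data as the request body; with content_type and
# accept_type left as None, both default to application/json.
body = json.dumps(input_data)
```

Because input_data is templated, the prompt could also be rendered from Jinja context (for example, from a previous task's XCom) rather than hard-coded as above.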
- class airflow.providers.amazon.aws.operators.bedrock.BedrockCustomizeModelOperator(job_name, custom_model_name, role_arn, base_model_id, training_data_uri, output_data_uri, hyperparameters, ensure_unique_job_name=True, customization_job_kwargs=None, wait_for_completion=True, waiter_delay=120, waiter_max_attempts=75, deferrable=conf.getboolean('operators', 'default_deferrable', fallback=False), **kwargs)[source]¶
Bases: airflow.providers.amazon.aws.operators.base_aws.AwsBaseOperator[airflow.providers.amazon.aws.hooks.bedrock.BedrockHook]
Create a fine-tuning job to customize a base model.
See also
For more information on how to use this operator, take a look at the guide: Customize an existing Amazon Bedrock Model
- Parameters
job_name (str) – A unique name for the fine-tuning job.
custom_model_name (str) – A name for the custom model being created.
role_arn (str) – The Amazon Resource Name (ARN) of an IAM role that Amazon Bedrock can assume to perform tasks on your behalf.
base_model_id (str) – Name of the base model.
training_data_uri (str) – The S3 URI where the training data is stored.
output_data_uri (str) – The S3 URI where the output data is stored.
hyperparameters (dict[str, str]) – Parameters related to tuning the model.
ensure_unique_job_name (bool) – If set to true, the operator will check whether a model customization job already exists with the given name and append the current timestamp if there is a name conflict. (Default: True)
customization_job_kwargs (dict[str, Any] | None) – Any optional parameters to pass to the API.
wait_for_completion (bool) – Whether to wait for the customization job to complete. (default: True)
waiter_delay (int) – Time in seconds to wait between status checks. (default: 120)
waiter_max_attempts (int) – Maximum number of attempts to check for job completion. (default: 75)
deferrable (bool) – If True, the operator will wait asynchronously for the customization job to complete. This implies waiting for completion. This mode requires the aiobotocore module to be installed. (default: False)
aws_conn_id – The Airflow connection used for AWS credentials. If this is None or empty then the default boto3 behaviour is used. If running Airflow in a distributed manner and aws_conn_id is None or empty, then the default boto3 configuration would be used (and must be maintained on each worker node).
region_name – AWS region_name. If not specified then the default boto3 behaviour is used.
verify – Whether or not to verify SSL certificates. See: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html
botocore_config – Configuration dictionary (key-values) for botocore client. See: https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html
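As a hedged sketch of a fine-tuning task, the snippet below assembles illustrative hyperparameters and shows how the operator call might look. The job name, role ARN, S3 URIs, and hyperparameter names are all assumptions for illustration: valid hyperparameter names and ranges depend on the chosen base model, so consult the Bedrock model customization documentation.

```python
# Hypothetical fine-tuning settings; Bedrock expects hyperparameter values as
# strings (dict[str, str]), not numbers.
hyperparameters = {
    "epochCount": "1",
    "batchSize": "1",
    "learningRate": "0.00005",
}

# A task definition with these settings might look like this (all names and
# ARNs below are placeholders):
#
#   customize_model = BedrockCustomizeModelOperator(
#       task_id="customize_model",
#       job_name="my-finetune-job",
#       custom_model_name="my-custom-model",
#       role_arn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",
#       base_model_id="amazon.titan-text-express-v1",
#       training_data_uri="s3://my-bucket/train.jsonl",
#       output_data_uri="s3://my-bucket/output/",
#       hyperparameters=hyperparameters,
#   )

# With the defaults above, the operator polls every waiter_delay=120 seconds
# for up to waiter_max_attempts=75 attempts, i.e. roughly 2.5 hours.
max_wait_seconds = 120 * 75
```

For long-running jobs, setting deferrable=True frees the worker slot while waiting; ensure_unique_job_name=True (the default) avoids failures from rerunning the task with the same job_name.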