airflow.providers.amazon.aws.example_dags.example_sagemaker

Module Contents

Functions

upload_dataset_to_s3()

Uploads the provided dataset to a designated Amazon S3 bucket.

build_and_upload_docker_image()

Builds and uploads a Docker image with the required dependencies and entry point.

cleanup()

Attributes

PROJECT_NAME

TIMESTAMP

S3_BUCKET

RAW_DATA_S3_KEY

INPUT_DATA_S3_KEY

TRAINING_OUTPUT_S3_KEY

PREDICTION_OUTPUT_S3_KEY

PROCESSING_LOCAL_INPUT_PATH

PROCESSING_LOCAL_OUTPUT_PATH

MODEL_NAME

PROCESSING_JOB_NAME

TRAINING_JOB_NAME

TRANSFORM_JOB_NAME

TUNING_JOB_NAME

ROLE_ARN

ECR_REPOSITORY

REGION

DATASET

SAMPLE_SIZE

KNN_IMAGE_URI

TASK_TIMEOUT

RESOURCE_CONFIG

TRAINING_DATA_SOURCE

SAGEMAKER_PROCESSING_JOB_CONFIG

TRAINING_CONFIG

MODEL_CONFIG

TRANSFORM_CONFIG

TUNING_CONFIG

PREPROCESS_SCRIPT

preprocess_raw_data

airflow.providers.amazon.aws.example_dags.example_sagemaker.PROJECT_NAME = iris[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.TIMESTAMP = {{ ts_nodash }}[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.S3_BUCKET[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.RAW_DATA_S3_KEY[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.INPUT_DATA_S3_KEY[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.TRAINING_OUTPUT_S3_KEY[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.PREDICTION_OUTPUT_S3_KEY[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.PROCESSING_LOCAL_INPUT_PATH = /opt/ml/processing/input[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.PROCESSING_LOCAL_OUTPUT_PATH = /opt/ml/processing/output[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.MODEL_NAME[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.PROCESSING_JOB_NAME[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.TRAINING_JOB_NAME[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.TRANSFORM_JOB_NAME[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.TUNING_JOB_NAME[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.ROLE_ARN[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.ECR_REPOSITORY[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.REGION[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.DATASET = Multiline-String[source]
5.1,3.5,1.4,0.2,Iris-setosa
4.9,3.0,1.4,0.2,Iris-setosa
7.0,3.2,4.7,1.4,Iris-versicolor
6.4,3.2,4.5,1.5,Iris-versicolor
4.9,2.5,4.5,1.7,Iris-virginica
7.3,2.9,6.3,1.8,Iris-virginica
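For illustration, the sample rows above can be parsed with Python's csv module. This is a minimal sketch; the raw dataset has no header row, so the field names used here are assumptions, not part of the module.

```python
import csv
import io

# Sample rows mirroring the DATASET attribute above: four sepal/petal
# measurements followed by the species label, with no header row.
DATASET = """5.1,3.5,1.4,0.2,Iris-setosa
4.9,3.0,1.4,0.2,Iris-setosa
7.0,3.2,4.7,1.4,Iris-versicolor
6.4,3.2,4.5,1.5,Iris-versicolor
4.9,2.5,4.5,1.7,Iris-virginica
7.3,2.9,6.3,1.8,Iris-virginica"""

# Hypothetical column names chosen for readability.
FIELDS = ["sepal_length", "sepal_width", "petal_length", "petal_width", "species"]

rows = [dict(zip(FIELDS, row)) for row in csv.reader(io.StringIO(DATASET))]
species = {row["species"] for row in rows}
```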
airflow.providers.amazon.aws.example_dags.example_sagemaker.SAMPLE_SIZE[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.KNN_IMAGE_URI = 174872318107.dkr.ecr.us-west-2.amazonaws.com/knn[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.TASK_TIMEOUT[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.RESOURCE_CONFIG[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.TRAINING_DATA_SOURCE[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.SAGEMAKER_PROCESSING_JOB_CONFIG[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.TRAINING_CONFIG[source]
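TRAINING_CONFIG follows the request shape of the SageMaker CreateTrainingJob API. A hedged sketch of what such a config looks like follows; the concrete values (job name, role ARN, bucket, instance type, hyperparameters) are placeholders for illustration, not the example DAG's actual settings.

```python
# Sketch of a SageMaker CreateTrainingJob request body. All values below
# are placeholders; only the key structure matches the AWS API.
TRAINING_CONFIG = {
    "TrainingJobName": "iris-train-job",                   # placeholder
    "RoleArn": "arn:aws:iam::123456789012:role/example",   # placeholder
    "AlgorithmSpecification": {
        "TrainingImage": "174872318107.dkr.ecr.us-west-2.amazonaws.com/knn",
        "TrainingInputMode": "File",
    },
    "HyperParameters": {
        # knn requires k, sample_size, and predictor_type (values as strings).
        "k": "3",
        "sample_size": "6",
        "predictor_type": "classifier",
    },
    "InputDataConfig": [
        {
            "ChannelName": "train",
            "DataSource": {
                "S3DataSource": {
                    "S3DataType": "S3Prefix",
                    "S3Uri": "s3://example-bucket/iris/input-data",  # placeholder
                    "S3DataDistributionType": "FullyReplicated",
                }
            },
        }
    ],
    "OutputDataConfig": {"S3OutputPath": "s3://example-bucket/iris/training-output"},
    "ResourceConfig": {
        "InstanceCount": 1,
        "InstanceType": "ml.m5.large",
        "VolumeSizeInGB": 1,
    },
    "StoppingCondition": {"MaxRuntimeInSeconds": 600},
}
```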
airflow.providers.amazon.aws.example_dags.example_sagemaker.MODEL_CONFIG[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.TRANSFORM_CONFIG[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.TUNING_CONFIG[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.PREPROCESS_SCRIPT[source]
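PREPROCESS_SCRIPT holds the script that the processing job runs against PROCESSING_LOCAL_INPUT_PATH and PROCESSING_LOCAL_OUTPUT_PATH. The sketch below shows the kind of work such a script might do, numeric-encoding the class label and moving it to the first column; it is not the module's actual script, and the label mapping is an assumption.

```python
import csv
from pathlib import Path

# Hypothetical label encoding chosen for illustration.
CLASSES = {"Iris-setosa": 0, "Iris-versicolor": 1, "Iris-virginica": 2}


def preprocess(input_path: str, output_path: str) -> int:
    """Read raw CSV rows, replace the trailing species label with an
    integer class id moved to the first column, and write the result.
    Returns the number of rows written."""
    out_rows = []
    with open(input_path, newline="") as f:
        for row in csv.reader(f):
            *features, label = row
            out_rows.append([CLASSES[label], *features])
    Path(output_path).parent.mkdir(parents=True, exist_ok=True)
    with open(output_path, "w", newline="") as f:
        csv.writer(f).writerows(out_rows)
    return len(out_rows)
```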
airflow.providers.amazon.aws.example_dags.example_sagemaker.upload_dataset_to_s3()[source]

Uploads the provided dataset to a designated Amazon S3 bucket.

airflow.providers.amazon.aws.example_dags.example_sagemaker.build_and_upload_docker_image()[source]

Builds and uploads a Docker image with the following requirements:

  • Has numpy, pandas, requests, and boto3 installed

  • Has the data preprocessing script included and set as the entry point
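The requirements above could be captured in a Dockerfile along these lines, shown here as a string a build helper might write out. The base image, script name, and paths are assumptions for illustration, not what the module actually builds.

```python
# Hypothetical Dockerfile text satisfying the two requirements listed
# above: the Python dependencies, plus the preprocessing script as the
# container's entry point.
DOCKERFILE = """\
FROM public.ecr.aws/docker/library/python:3.9-slim
RUN pip install --no-cache-dir numpy pandas requests boto3
COPY preprocessing.py /preprocessing.py
ENTRYPOINT ["python", "/preprocessing.py"]
"""
```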

airflow.providers.amazon.aws.example_dags.example_sagemaker.cleanup()[source]
airflow.providers.amazon.aws.example_dags.example_sagemaker.preprocess_raw_data[source]
