airflow.providers.amazon.aws.example_dags.example_sagemaker¶
Module Contents¶
Functions¶

upload_dataset_to_s3()
    Uploads the provided dataset to a designated Amazon S3 bucket.

build_and_upload_docker_image()
    We need a Docker image with the following requirements; a build-and-push sketch follows this summary.
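The summary above keeps only the first line of the Docker-image helper's docstring. For orientation, here is a minimal sketch of how such a helper could authenticate against Amazon ECR with boto3 and drive the docker CLI through subprocess; the signature, the repository_uri parameter, and the local Dockerfile directory are assumptions for illustration, not the example DAG's actual implementation.

    import base64
    import subprocess

    import boto3


    def build_and_upload_docker_image(repository_uri: str, dockerfile_dir: str = ".") -> None:
        """Build a local Docker image and push it to an Amazon ECR repository (illustrative only)."""
        # Obtain a temporary ECR credential; the token decodes to "<username>:<password>".
        auth = boto3.client("ecr").get_authorization_token()["authorizationData"][0]
        username, password = base64.b64decode(auth["authorizationToken"]).decode().split(":")
        registry = auth["proxyEndpoint"]

        # Log in, build, and push; this requires a running local Docker daemon.
        subprocess.run(
            ["docker", "login", "--username", username, "--password-stdin", registry],
            input=password.encode(),
            check=True,
        )
        subprocess.run(["docker", "build", "-t", repository_uri, dockerfile_dir], check=True)
        subprocess.run(["docker", "push", repository_uri], check=True)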
Attributes¶
- airflow.providers.amazon.aws.example_dags.example_sagemaker.PROCESSING_LOCAL_INPUT_PATH = /opt/ml/processing/input[source]¶
- airflow.providers.amazon.aws.example_dags.example_sagemaker.PROCESSING_LOCAL_OUTPUT_PATH = /opt/ml/processing/output[source]¶
- airflow.providers.amazon.aws.example_dags.example_sagemaker.DATASET = Multiline-String[source]¶
  5.1,3.5,1.4,0.2,Iris-setosa
  4.9,3.0,1.4,0.2,Iris-setosa
  7.0,3.2,4.7,1.4,Iris-versicolor
  6.4,3.2,4.5,1.5,Iris-versicolor
  4.9,2.5,4.5,1.7,Iris-virginica
  7.3,2.9,6.3,1.8,Iris-virginica
- airflow.providers.amazon.aws.example_dags.example_sagemaker.KNN_IMAGE_URI = 174872318107.dkr.ecr.us-west-2.amazonaws.com/knn[source]¶
- airflow.providers.amazon.aws.example_dags.example_sagemaker.SAGEMAKER_PROCESSING_JOB_CONFIG[source]¶
  Configuration for the SageMaker processing job (an illustrative sketch of its shape follows this attribute list).
- airflow.providers.amazon.aws.example_dags.example_sagemaker.upload_dataset_to_s3()[source]¶
Uploads the provided dataset to a designated Amazon S3 bucket.
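The rendered page does not expand SAGEMAKER_PROCESSING_JOB_CONFIG. For orientation, a processing-job config passed to the SageMaker processing operator typically follows the shape of the boto3 create_processing_job request; the dictionary below is an illustrative sketch whose bucket name, role ARN, image URI, and job name are placeholders, not the DAG's actual values.

    # Placeholders for illustration only.
    BUCKET_NAME = "example-sagemaker-bucket"
    ROLE_ARN = "arn:aws:iam::123456789012:role/example-sagemaker-role"
    PROCESSING_IMAGE_URI = "123456789012.dkr.ecr.us-west-2.amazonaws.com/example-preprocessing"

    SAGEMAKER_PROCESSING_JOB_CONFIG = {
        "ProcessingJobName": "example-preprocessing-job",
        "RoleArn": ROLE_ARN,
        "AppSpecification": {"ImageUri": PROCESSING_IMAGE_URI},
        "ProcessingInputs": [
            {
                "InputName": "input",
                "S3Input": {
                    "S3Uri": f"s3://{BUCKET_NAME}/input",
                    # PROCESSING_LOCAL_INPUT_PATH / PROCESSING_LOCAL_OUTPUT_PATH are the
                    # module attributes documented above.
                    "LocalPath": PROCESSING_LOCAL_INPUT_PATH,
                    "S3DataType": "S3Prefix",
                    "S3InputMode": "File",
                },
            }
        ],
        "ProcessingOutputConfig": {
            "Outputs": [
                {
                    "OutputName": "output",
                    "S3Output": {
                        "S3Uri": f"s3://{BUCKET_NAME}/processed",
                        "LocalPath": PROCESSING_LOCAL_OUTPUT_PATH,
                        "S3UploadMode": "EndOfJob",
                    },
                }
            ]
        },
        "ProcessingResources": {
            "ClusterConfig": {"InstanceCount": 1, "InstanceType": "ml.m5.large", "VolumeSizeInGB": 5}
        },
        "StoppingCondition": {"MaxRuntimeInSeconds": 3600},
    }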
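The body of upload_dataset_to_s3 is not shown on this page. A plausible implementation, sketched below, writes the DATASET string straight to an S3 object with the provider's S3Hook.load_string; the bucket and key are placeholders and the real example may derive them differently.

    from airflow.providers.amazon.aws.hooks.s3 import S3Hook


    def upload_dataset_to_s3() -> None:
        """Upload the in-memory DATASET string to S3 (bucket and key are placeholders)."""
        S3Hook().load_string(
            string_data=DATASET,
            key="input/raw_data.csv",
            bucket_name="example-sagemaker-bucket",
            replace=True,
        )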