airflow.providers.amazon.aws.hooks.glue

Module Contents

Classes

GlueJobHook

Interact with AWS Glue.

Attributes

DEFAULT_LOG_SUFFIX

FAILURE_LOG_SUFFIX

DEFAULT_LOG_FILTER

FAILURE_LOG_FILTER

airflow.providers.amazon.aws.hooks.glue.DEFAULT_LOG_SUFFIX = 'output'[source]
airflow.providers.amazon.aws.hooks.glue.FAILURE_LOG_SUFFIX = 'error'[source]
airflow.providers.amazon.aws.hooks.glue.DEFAULT_LOG_FILTER = ' '[source]
airflow.providers.amazon.aws.hooks.glue.FAILURE_LOG_FILTER = '?ERROR ?Exception'[source]
class airflow.providers.amazon.aws.hooks.glue.GlueJobHook(s3_bucket=None, job_name=None, desc=None, concurrent_run_limit=1, script_location=None, retry_limit=0, num_of_dpus=None, iam_role_name=None, create_job_kwargs=None, update_config=False, *args, **kwargs)[source]

Bases: airflow.providers.amazon.aws.hooks.base_aws.AwsBaseHook

Interact with AWS Glue. Provides a thick wrapper around boto3.client("glue").

Parameters
  • s3_bucket (str | None) – S3 bucket where logs and the local ETL script will be uploaded

  • job_name (str | None) – unique job name per AWS account

  • desc (str | None) – job description

  • concurrent_run_limit (int) – The maximum number of concurrent runs allowed for a job

  • script_location (str | None) – path to the ETL script on S3

  • retry_limit (int) – Maximum number of times to retry this job if it fails

  • num_of_dpus (int | float | None) – Number of AWS Glue DPUs to allocate to this Job

  • region_name – AWS region name (example: us-east-1)

  • iam_role_name (str | None) – AWS IAM Role for Glue Job Execution

  • create_job_kwargs (dict | None) – Extra arguments for Glue Job Creation

  • update_config (bool) – Update job configuration on Glue (default: False)

Additional arguments (such as aws_conn_id) may be specified and are passed down to the underlying AwsBaseHook.
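
For example, a hook for an existing ETL script might be constructed as follows. This is a minimal sketch: the connection ID, bucket, role name, script path, and job settings below are illustrative placeholders, not values defined by this reference.

    from airflow.providers.amazon.aws.hooks.glue import GlueJobHook

    # All values below are placeholders; substitute your own connection, bucket, role and script.
    hook = GlueJobHook(
        aws_conn_id="aws_default",          # forwarded to AwsBaseHook
        region_name="us-east-1",            # forwarded to AwsBaseHook
        job_name="my_glue_job",
        desc="Example nightly ETL job",
        s3_bucket="my-etl-bucket",
        script_location="s3://my-etl-bucket/scripts/etl.py",
        iam_role_name="GlueJobExecutionRole",
        concurrent_run_limit=1,
        retry_limit=0,
        num_of_dpus=10,
        create_job_kwargs={"Timeout": 60},  # extra create_job arguments, e.g. timeout in minutes
        update_config=False,
    )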

JOB_POLL_INTERVAL = 6[source]
create_glue_job_config()[source]
list_jobs()[source]

Get list of Jobs.

get_iam_execution_role()[source]

Get IAM Role for job execution.
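
Both helpers take no arguments and read their inputs from the hook configuration. A short sketch, assuming the hook object constructed above:

    # List the Glue jobs visible to the configured account.
    jobs = hook.list_jobs()

    # Fetch the IAM role configured via iam_role_name for job execution.
    role = hook.get_iam_execution_role()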

initialize_job(script_arguments=None, run_kwargs=None)[source]

Initializes the connection with AWS Glue to run the job.
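
A sketch of starting a run with the hook from above; the argument keys and the JobRunId field in the response are assumptions based on the underlying start_job_run API:

    # script_arguments become the Glue job run arguments.
    response = hook.initialize_job(
        script_arguments={"--input_path": "s3://my-etl-bucket/input/"},
    )
    run_id = response["JobRunId"]  # assumes the boto3 start_job_run response shape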

get_job_state(job_name, run_id)[source]

Get the state of the Glue job. The job state can be running, finished, failed, stopped, or timeout.

Parameters
  • job_name (str) – unique job name per AWS account

  • run_id (str) – The job-run ID of the Glue job run to check

Returns

State of the Glue job

Return type

str
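
A single status check, as a sketch reusing run_id from the initialize_job example above:

    state = hook.get_job_state(job_name="my_glue_job", run_id=run_id)
    if state == "RUNNING":
        print("Glue job run is still in progress")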

print_job_logs(job_name, run_id, job_failed=False, next_token=None)[source]

Prints the batch of logs to the Airflow task log and returns nextToken.
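
A hedged sketch of fetching one batch of logs for a run, assuming the method returns the continuation token as described above:

    token = hook.print_job_logs(
        job_name="my_glue_job",
        run_id=run_id,
        job_failed=False,
        next_token=None,  # pass the previous return value to continue paging
    )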

job_completion(job_name, run_id, verbose=False)[source]

Waits until the Glue job with job_name completes or fails and returns the final state if finished. Raises AirflowException if the job fails.

Parameters
  • job_name (str) – unique job name per AWS account

  • run_id (str) – The job-run ID of the Glue job run to wait for

  • verbose (bool) – If True, more Glue Job Run logs show in the Airflow Task Logs. (default: False)

Returns

Dict of JobRunState and JobRunId

Return type

dict[str, str]
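
A blocking wait, as a sketch; verbose=True streams Glue run logs into the Airflow task log:

    result = hook.job_completion(job_name="my_glue_job", run_id=run_id, verbose=True)
    # result contains JobRunState and JobRunId, e.g. {"JobRunState": "SUCCEEDED", ...}
    print(result["JobRunState"])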

has_job(job_name)[source]

Checks if the job already exists.

Parameters

job_name – unique job name per AWS account

Returns

Returns True if the job already exists and False if not.

Return type

bool
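
For example, an existence check before creating or updating the job (a sketch):

    if not hook.has_job("my_glue_job"):
        # The job does not exist yet; get_or_create_glue_job() below would create it.
        hook.get_or_create_glue_job()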

update_job(**job_kwargs)[source]

Updates job configurations.

Parameters

job_kwargs – Keyword args that define the configurations used for the job

Returns

True if the job was updated and False otherwise

Return type

bool
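
A hedged sketch; it assumes the keyword arguments mirror the Glue job definition, including the job Name:

    updated = hook.update_job(
        Name="my_glue_job",                 # assumed to identify the job to update
        Description="Nightly ETL job",      # fields that differ trigger an update
    )
    if updated:
        print("Glue job configuration was changed")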

get_or_create_glue_job()[source]

Gets (or creates) the Glue job and returns the job name.

Returns

Name of the Job
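
A short sketch; the job definition comes from the arguments passed to the hook constructor:

    job_name = hook.get_or_create_glue_job()  # creates the Glue job only if it is missing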

create_or_update_glue_job()[source]

Creates (or updates) the Glue job and returns the job name.

Returns

Name of the Job
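
A short sketch; with update_config=True on the hook, an existing job definition is updated before the name is returned:

    job_name = hook.create_or_update_glue_job()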
