airflow.providers.google.cloud.operators.looker

This module contains Google Cloud Looker operators.

Module Contents

Classes

LookerStartPdtBuildOperator

Submits a PDT materialization job to Looker.

class airflow.providers.google.cloud.operators.looker.LookerStartPdtBuildOperator(looker_conn_id, model, view, query_params=None, asynchronous=False, cancel_on_kill=True, wait_time=10, wait_timeout=None, **kwargs)[source]

Bases: airflow.models.BaseOperator

Submits a PDT materialization job to Looker.

Parameters
  • looker_conn_id (str) -- Required. The connection ID to use connecting to Looker.

  • model (str) -- Required. The model of the PDT to start building.

  • view (str) -- Required. The view of the PDT to start building.

  • query_params (Optional[Dict]) -- Optional. Additional materialization parameters.

  • asynchronous (bool) -- Optional. Flag indicating whether to wait for the job to finish or return immediately. This is useful for submitting long-running jobs and waiting on them asynchronously with the LookerCheckPdtBuildSensor (see the asynchronous usage sketch under execute() below).

  • cancel_on_kill (bool) -- Optional. Flag which indicates whether to cancel the hook's job when on_kill is called.

  • wait_time (int) -- Optional. Number of seconds between checks for the job to be ready. Used only if asynchronous is False.

  • wait_timeout (Optional[int]) -- Optional. How many seconds to wait for the job to be ready. Used only if asynchronous is False.
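A minimal synchronous usage sketch. The connection ID, model, and view names below are placeholders, and the DAG scaffolding is illustrative only:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.looker import LookerStartPdtBuildOperator

with DAG(
    dag_id="looker_pdt_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Submit the PDT materialization job and block until it finishes,
    # polling Looker every `wait_time` seconds (up to `wait_timeout` seconds).
    build_pdt = LookerStartPdtBuildOperator(
        task_id="build_pdt",
        looker_conn_id="looker_conn",  # hypothetical connection ID
        model="your_lookml_model",     # placeholder model name
        view="your_pdt_view",          # placeholder view name
        wait_time=30,
        wait_timeout=3600,
    )
```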

execute(self, context)[source]

This is the main method to derive when creating an operator. Context is the same dictionary used as when rendering Jinja templates.

Refer to get_template_context for more context.
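With asynchronous=True, execute submits the build and returns without waiting, so completion is typically checked separately with the LookerCheckPdtBuildSensor. The sketch below assumes the operator's XCom return value is the materialization ID of the submitted job (so it can be referenced via .output) and that the sensor lives under airflow.providers.google.cloud.sensors.looker; verify both against your provider version:

```python
from airflow.providers.google.cloud.operators.looker import LookerStartPdtBuildOperator
from airflow.providers.google.cloud.sensors.looker import LookerCheckPdtBuildSensor

# Submit the build and return immediately instead of polling in the operator.
start_pdt_async = LookerStartPdtBuildOperator(
    task_id="start_pdt_async",
    looker_conn_id="looker_conn",  # hypothetical connection ID
    model="your_lookml_model",
    view="your_pdt_view",
    asynchronous=True,
)

# Poll for job completion separately; assumes the operator's XCom return
# value is the materialization ID of the submitted job.
check_pdt = LookerCheckPdtBuildSensor(
    task_id="check_pdt",
    looker_conn_id="looker_conn",
    materialization_id=start_pdt_async.output,
    poke_interval=15,
)

start_pdt_async >> check_pdt
```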

on_kill(self)[source]

Override this method to clean up subprocesses when a task instance gets killed. Any use of the threading, subprocess, or multiprocessing module within an operator needs to be cleaned up, or it will leave ghost processes behind.
