DataflowJobStatusSensor

Google

Checks for the status of a job in Google Cloud Dataflow.

Last Updated: Jan. 21, 2021

Access Instructions

Install the Google provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.
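As a quick sketch (assuming a pip-managed environment and the apache-airflow-providers-google package name), installation and import look like this:

# Install the provider into the Airflow environment, e.g.:
#   pip install apache-airflow-providers-google
# Then import the sensor in your DAG file:
from airflow.providers.google.cloud.sensors.dataflow import DataflowJobStatusSensor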

Parameters

job_id (str): ID of the job to be checked.

expected_statuses (Union[Set[str], str]): The expected state(s) of the operation. See: https://cloud.google.com/dataflow/docs/reference/rest/v1b3/projects.jobs#Job.JobState

project_id (str): Optional. The Google Cloud project ID of the job. If set to None or missing, the default project_id from the Google Cloud connection is used.

location (str): The location of the Dataflow job (for example, europe-west1). See: https://cloud.google.com/dataflow/docs/concepts/regional-endpoints

gcp_conn_id (str): The connection ID to use when connecting to Google Cloud.

delegate_to (str): The account to impersonate using domain-wide delegation of authority, if any. For this to work, the service account making the request must have domain-wide delegation enabled. See: https://developers.google.com/identity/protocols/oauth2/service-account#delegatingauthority

impersonation_chain (Union[str, Sequence[str]]): Optional service account to impersonate using short-term credentials, or a chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, the identities in the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account in the list granting this role to the originating account (templated).
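Putting these parameters together, here is a minimal sketch of instantiating the sensor in a DAG. The DAG ID, project ID, job ID, and task ID below are placeholder values, and expected_statuses takes Job.JobState strings such as JOB_STATE_DONE from the reference linked above.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.sensors.dataflow import DataflowJobStatusSensor

with DAG(
    dag_id="example_dataflow_job_status",   # placeholder DAG ID
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Wait until the Dataflow job reaches JOB_STATE_DONE.
    wait_for_job = DataflowJobStatusSensor(
        task_id="wait_for_dataflow_job",
        job_id="my-dataflow-job-id",           # placeholder; typically pulled via XCom from the task that started the job
        expected_statuses={"JOB_STATE_DONE"},  # one or more Job.JobState values
        project_id="my-gcp-project",           # placeholder; falls back to the connection's default project if omitted
        location="europe-west1",
        gcp_conn_id="google_cloud_default",
    )

In practice the sensor is usually placed downstream of the operator that launches the Dataflow job, so that it blocks the rest of the DAG until the job reaches one of the expected statuses.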
