BigQueryInsertJobOperator

Google

Executes a BigQuery job, waits for it to complete, and returns the job ID.


Last Updated: May 7, 2021

Access Instructions

Install the Google provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.
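
For example, in a standard Airflow 2 environment (a minimal sketch; the package name and import path below are those of the Google provider distribution):

    # Install the provider package in your Airflow environment:
    #   pip install apache-airflow-providers-google

    # Import the operator in your DAG file:
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator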

Parameters

configuration (Dict[str, Any]): The configuration parameter maps directly to BigQuery's configuration field in the job object. For more details see https://cloud.google.com/bigquery/docs/reference/v2/jobs
job_id (str): The ID of the job. It is suffixed with a hash of the job configuration unless force_rerun is True. The ID must contain only letters (a-z, A-Z), numbers (0-9), underscores (_), or dashes (-). The maximum length is 1,024 characters. If not provided, a UUID is generated.
force_rerun (bool): If True, the operator uses a hash of a UUID as the job ID suffix.
reattach_states (Set[str]): The set of BigQuery job states in which the operator should reattach to an existing job. These should be non-final states.
project_id (str): The Google Cloud project where the job runs.
location (str): The location where the job runs.
gcp_conn_id (str): The connection ID used to connect to Google Cloud.
delegate_to (str): The account to impersonate using domain-wide delegation of authority, if any. For this to work, the service account making the request must have domain-wide delegation enabled.
impersonation_chain (Union[str, Sequence[str]]): Optional service account to impersonate using short-term credentials, or a chained list of accounts required to get the access_token of the last account in the list, which will be impersonated in the request. If set as a string, the account must grant the originating account the Service Account Token Creator IAM role. If set as a sequence, each identity in the list must grant the Service Account Token Creator IAM role to the directly preceding identity, with the first account in the list granting this role to the originating account (templated).
cancel_on_kill (bool): Whether to cancel the hook's job when on_kill is called.
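
As an illustration of the configuration parameter above, a minimal query-job configuration might look like the following sketch (the project, dataset, and table names are hypothetical placeholders):

    # The keys map directly to BigQuery's job configuration resource;
    # "query" and "useLegacySql" are standard fields of a query job.
    configuration = {
        "query": {
            "query": "SELECT COUNT(*) FROM `my-project.my_dataset.my_table`",
            "useLegacySql": False,
        }
    }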

Documentation

Executes a BigQuery job, waits for it to complete, and returns the job ID. The operator works in the following way:

  • it calculates a unique hash of the job using the job's configuration, or a UUID if force_rerun is True

  • creates a job_id of the form

    [provided_job_id | airflow_{dag_id}_{task_id}_{exec_date}]_{uniqueness_suffix}

  • submits a BigQuery job with that job_id

  • if a job with the given ID already exists, it tries to reattach to the job, provided the job is not done and its state is in reattach_states; if the job is already done, the operator raises an AirflowException

Setting force_rerun to True will submit a new job every time instead of reattaching to an existing one.
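
As a sketch of typical usage, the operator might be instantiated in a DAG file as follows (the task ID, project, location, and SQL are hypothetical placeholders):

    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    insert_query_job = BigQueryInsertJobOperator(
        task_id="insert_query_job",  # hypothetical task name
        project_id="my-project",     # hypothetical project
        location="US",
        configuration={
            "query": {
                "query": "SELECT 1",
                "useLegacySql": False,
            }
        },
        force_rerun=True,  # always submit a new job instead of reattaching
    )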

For the job definition, see https://cloud.google.com/bigquery/docs/reference/v2/jobs

See also

For more information on how to use this operator, take a look at the guide: Execute BigQuery jobs
