SalesforceToGcsOperator

Google

Submits Salesforce query and uploads results to Google Cloud Storage


Last Updated: Mar. 22, 2021

Access Instructions

Install the Google provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.
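For example, the provider can be installed with pip. Since this transfer operator queries Salesforce, the Salesforce provider package is typically needed as well (package names shown as commonly published on PyPI):

```shell
# Install the Google provider (contains SalesforceToGcsOperator)
# and the Salesforce provider (supplies the Salesforce hook it uses).
pip install apache-airflow-providers-google apache-airflow-providers-salesforce
```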

Parameters

query (str): The query to make to Salesforce.
bucket_name (str): The bucket to upload to.
object_name (str): The object name to set when uploading the file.
salesforce_conn_id (str): The name of the connection that has the parameters we need to connect to Salesforce.
include_deleted (bool): True if the query should include deleted records.
query_params (dict): Additional optional arguments.
export_format (str): Desired format of files to be exported.
coerce_to_timestamp (bool): True if you want all datetime fields to be converted into Unix timestamps. False if you want them to be left in the same format as they were in Salesforce. Leaving the value as False will result in datetimes being strings. Default: False
record_time_added (bool): True if you want to add a Unix timestamp field to the resulting data that marks when the data was fetched from Salesforce. Default: False
gzip (bool): Option to compress the local file or file data for upload.
gcp_conn_id (str): The name of the connection that has the parameters we need to connect to GCS.

Documentation

Submits Salesforce query and uploads results to Google Cloud Storage

See also

For more information on how to use this operator, take a look at the guide: SalesforceToGcsOperator
