Submits a Salesforce query and uploads the results to AWS S3.


Last Updated: Sep. 15, 2021

Access Instructions

Install the Amazon provider package into your Airflow environment.

Import the operator into your DAG file and instantiate it with your desired parameters.


salesforce_query (str, required): The query to send to Salesforce.

s3_bucket_name (str, required): The S3 bucket to upload the results to.

s3_key (str, required): The object key to set when uploading the file.

salesforce_conn_id (str, required): The name of the connection that has the parameters needed to connect to Salesforce.

export_format (str): Desired format of the exported file.

query_params (dict): Additional optional arguments to be passed to the HTTP request querying Salesforce.

include_deleted (bool): True if the query should include deleted records.

coerce_to_timestamp (bool): True if you want all datetime fields converted to Unix timestamps; False if you want them left in the format Salesforce returned them, in which case datetimes arrive as strings. Default: False.

record_time_added (bool): True if you want a Unix timestamp field added to the resulting data marking when it was fetched from Salesforce. Default: False.

aws_conn_id (str): The name of the connection that has the parameters needed to connect to S3.

encrypt (bool): If True, the file will be encrypted server-side by S3 and stored in encrypted form at rest.

gzip (bool): If True, the file will be compressed locally before upload.

acl_policy (str): The canned ACL policy to apply to the file being uploaded to the S3 bucket.



See also

For more information on how to use this operator, take a look at the guide: Overview
