SSHHook

SSH

Hook for SSH remote execution using Paramiko (ref: https://github.com/paramiko/paramiko). This hook also lets you create an SSH tunnel, and it serves as the basis for SFTP file transfers.


Last Updated: Apr. 25, 2021

Access Instructions

Install the SSH provider package into your Airflow environment.

Import the module into your DAG file and instantiate it with your desired params.

Parameters

ssh_conn_id (str): ID of the Airflow connection from which required parameters such as username, password, or key_file can be fetched. Priority is given to parameters passed during init.
remote_host (str): remote host to connect to
username (str): username to connect to the remote_host with
password (str): password of the username used to connect to the remote_host
key_file (str): path to the key file to use to connect to the remote_host
port (int): port of the remote host to connect to (default is Paramiko's SSH_PORT)
timeout (int): timeout for the attempt to connect to the remote_host
keepalive_interval (int): send a keepalive packet to the remote host every keepalive_interval seconds


Example DAGs

Improve this module by creating an example DAG.

  1. Add an `example_dags` directory to the top-level source of the provider package, with an empty `__init__.py` file.
  2. Add your DAG to this directory. Be sure to include a well-written and descriptive docstring.
  3. Create a pull request against the source code. Once the package is released, your DAG will show up on the Registry.
