
Safe, Deterministic SFTP / FTP Deploys: 'gsupload-python'


If you’ve ever done “just upload these files to the server” often enough, you know what happens next: manual steps, inconsistent excludes, someone overwrites the wrong thing, or a deploy goes out with node_modules/ by accident.

This post introduces gsupload, a small Python CLI I built to make FTP/SFTP uploads repeatable:

Repository: https://github.com/guspatagonico/gsupload-python


Who this is for

If you need a full artifact pipeline with rollbacks, canaries and immutable releases: keep using your CD platform. If you need a fast, predictable “sync these files into that remote base path” tool: this fits.


Mental model (what gsupload actually does)

At a high level, gsupload does five things:

  1. Loads configuration (global + layered .gsupload.json files)
  2. Selects a binding (explicit -b or auto-detected from your current directory)
  3. Expands patterns (recursive globbing by default) and applies excludes
  4. Optionally compares remote vs local and asks for confirmation
  5. Uploads matching files in parallel (threads)

Architecture overview


Installation

gsupload is packaged as a Python CLI. The cleanest way to install it globally is with uv tool:

uv tool install --editable /path/to/gsupload-python
uv tool update-shell

# new shell session

gsupload --help

For local development:

uv pip install -e ".[dev]"
python src/gsupload.py --help

Configuration: layered, mergeable, and portable

gsupload’s killer feature is that configuration is discovered and merged.
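To make "discovered and merged" concrete, here's a rough sketch of the idea in plain Python: walk up from the current directory, collect every .gsupload.json, and fold the layers together additively. This is only an illustration of the concept (helper names, precedence direction, and merge details are assumptions, not gsupload's actual loader), and it skips the global config file for brevity:

import json
from pathlib import Path

def discover_configs(start=None):
    """Collect .gsupload.json files from the current directory up to the filesystem root."""
    start = Path(start or Path.cwd()).resolve()
    found = [d / ".gsupload.json" for d in [start, *start.parents] if (d / ".gsupload.json").is_file()]
    # Assumption: files closer to cwd are merged last, so they win on conflicting scalar keys.
    return list(reversed(found))

def merge_configs(paths):
    """Additive merge sketch: lists are concatenated, dicts merged, scalars overridden."""
    merged = {}
    for path in paths:
        layer = json.loads(path.read_text())
        for key, value in layer.items():
            if isinstance(value, list):
                merged[key] = merged.get(key, []) + value
            elif isinstance(value, dict):
                merged.setdefault(key, {}).update(value)
            else:
                merged[key] = value
    return merged

Whatever the real rules are, gsupload --show-config (below) is the authoritative view of the merged result.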

Config discovery order

Merge rules (the “additive config” part)

Example config (one binding)

Create a .gsupload.json at your repo root:

{
  "global_excludes": [".DS_Store", "*.log", ".git", "node_modules"],
  "bindings": {
    "frontend": {
      "protocol": "sftp",
      "hostname": "example.com",
      "port": 22,
      "username": "deploy",
      "key_filename": "~/.ssh/id_ed25519",
      "max_workers": 10,
      "local_basepath": ".",
      "remote_basepath": "/var/www/html"
    }
  }
}

Notes:

Inspect the merged config

This is your “what will happen if I run this here?” command:

gsupload --show-config

It prints the merge order and which file contributed each key.


Excludes: config + ignore files (additive)

Excludes come from three places, combined together:

  1. global_excludes in config
  2. excludes inside a binding
  3. .gsupload_ignore files (collected walking up from cwd to local_basepath)

This is intentionally close to .gitignore ergonomics.
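As a sketch of what "additive" means in practice, here is one way to combine the three sources with fnmatch-style matching. The comment handling and per-component matching are assumptions for illustration; gsupload's real semantics may differ in the details:

import fnmatch
from pathlib import Path

def collect_ignore_patterns(cwd, local_basepath):
    """Gather patterns from every .gsupload_ignore between cwd and local_basepath."""
    patterns = []
    directory = Path(cwd).resolve()
    base = Path(local_basepath).resolve()
    while True:
        ignore_file = directory / ".gsupload_ignore"
        if ignore_file.is_file():
            lines = ignore_file.read_text().splitlines()
            patterns += [ln.strip() for ln in lines if ln.strip() and not ln.strip().startswith("#")]
        if directory == base or directory.parent == directory:
            break
        directory = directory.parent
    return patterns

def is_excluded(rel_path, global_excludes, binding_excludes, ignore_patterns):
    """A file is skipped if any of its path components matches any pattern from the three sources."""
    combined = list(global_excludes) + list(binding_excludes) + list(ignore_patterns)
    return any(fnmatch.fnmatch(part, pattern)
               for pattern in combined
               for part in Path(rel_path).parts)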

User flow for ignores

To debug ignores:

# auto-detect binding
gsupload --show-ignored

# current dir only
gsupload --show-ignored -nr

# explicit binding
gsupload --show-ignored -b=frontend

CLI usage you’ll actually use

The CLI shape is intentionally small:

gsupload [OPTIONS] PATTERNS...

Key defaults:

Useful inspection flags:

Always quote patterns

You must quote globs so your shell doesn’t expand them before gsupload sees them:

# correct
gsupload "*.css"

gsupload -b=frontend "src/**/*.js"

# wrong (shell expands before gsupload runs)
gsupload *.css
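The difference matters because gsupload expands patterns itself (recursively, per the defaults above), while the shell only expands against whatever happens to sit in the current directory. A rough illustration with Python's glob module:

import glob

# What a quoted pattern can become once the tool expands it (recursive matching):
recursive_matches = glob.glob("**/*.css", recursive=True)

# What an unquoted *.css turns into before gsupload even starts: the shell has
# already replaced it with literal filenames from the current directory only.
shell_style_matches = glob.glob("*.css")

print(len(recursive_matches), "recursive vs", len(shell_style_matches), "top-level")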

Typical workflows

Pre-flight review (default behavior):

# recursive + complete tree comparison + confirmation
gsupload "*.css"

Changes-only visual check (faster scan output; doesn't list remote-only files):

gsupload -vc "*.css"

Fast mode for automation / CI (no remote scan, no prompt):

gsupload -f -b=frontend "dist/**/*"

Alternative “no pre-flight” mode (still expands patterns and applies excludes, but skips remote listing and confirmation):

gsupload -nvcc -b=frontend "dist/**/*"

Tune parallelism:

# override per run
gsupload --max-workers=10 "*.css"

# debug / conservative
gsupload --max-workers=1 "*.css"
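Conceptually, --max-workers is the size of a thread pool. Below is a minimal sketch of that pattern using paramiko for SFTP, with one connection per worker thread. It is illustrative only: gsupload's actual connection handling, retries, and remote directory creation are not shown, and auto-accepting host keys is a shortcut you would not want in production.

import os
import threading
from concurrent.futures import ThreadPoolExecutor

import paramiko  # assumed SFTP backend for this sketch

_worker = threading.local()

def _sftp(binding):
    """Lazily open one SSH/SFTP connection per worker thread."""
    if not hasattr(_worker, "sftp"):
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # sketch only; verify host keys for real
        key = binding.get("key_filename")
        client.connect(
            binding["hostname"],
            port=binding.get("port", 22),
            username=binding["username"],
            key_filename=os.path.expanduser(key) if key else None,
        )
        _worker.client = client          # keep the SSH connection alive for this thread
        _worker.sftp = client.open_sftp()
    return _worker.sftp

def upload_all(pairs, binding, max_workers=10):
    """pairs: list of (local_path, remote_path); assumes remote directories already exist."""
    def upload_one(pair):
        local, remote = pair
        _sftp(binding).put(local, remote)
        return remote

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        for remote in pool.map(upload_one, pairs):
            print("uploaded", remote)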

FTP active mode (only when you know you need it):

gsupload --ftp-active "*.html"
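For context on what that flag changes: in Python's standard ftplib, active vs passive is a single switch on the connection. A minimal illustrative upload (not gsupload's code, and remember that FTP sends credentials in cleartext):

from ftplib import FTP

def ftp_upload_active(host, user, password, local_path, remote_path, port=21):
    ftp = FTP()
    ftp.connect(host, port)
    ftp.login(user, password)
    ftp.set_pasv(False)  # active mode: the server opens the data connection back to the client
    with open(local_path, "rb") as fh:
        ftp.storbinary(f"STOR {remote_path}", fh)
    ftp.quit()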

Data flow: from patterns to remote paths

Understanding the data flow helps you reason about safety and predictability.

Remote paths are computed as:

remote_path = remote_basepath + "/" + relative(local_file, local_basepath)

That means you can safely move between machines as long as your local_basepath matches the local project root.
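In Python terms, that mapping is roughly the following (illustrative; posixpath keeps the remote side on forward slashes regardless of the local OS):

import os
import posixpath

def remote_path_for(local_file, local_basepath, remote_basepath):
    """Map a local file to its destination under remote_basepath."""
    rel = os.path.relpath(local_file, local_basepath)
    return posixpath.join(remote_basepath, rel.replace(os.sep, "/"))

# remote_path_for("src/app.js", ".", "/var/www/html") -> "/var/www/html/src/app.js"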


The visual check: practical “diff before deploy”

This is what makes gsupload feel safer than “just upload and hope”:

Three modes:


Performance notes (why it’s fast enough)

gsupload is optimized for the common “many small web assets” case:

Rule of thumb:


Security posture: don’t commit secrets

SFTP is encrypted; FTP is not.

Recommended practices:

A nice split for teams:


Using it in automation (CI / scripts)

For non-interactive runs, use -f:

# Example: deploy compiled assets
cd /repo
npm run build

# Upload build output without prompting
gsupload -f -b=frontend "dist/**/*"

Pair this with a dedicated deploy user and least-privilege remote paths.


Troubleshooting checklist


Closing

gsupload is intentionally small: one CLI, layered config, predictable ignore rules, a safety pre-flight diff, and parallel uploads. It’s the kind of tool that pays for itself the third time you avoid uploading the wrong directory.

If you try it and you want improvements, the best next steps are usually:

I’m also open to collaborations and networking around this kind of tooling. If you’re using gsupload (or building something similar) and you want to discuss workflows, edge cases, or potential improvements:


Star the repo and contribute: guspatagonico/gsupload-python


