How to Manage SQLite Databases on a VPS (with Coolify)
The Problem
Keeping a SQLite database consistent between a Docker‑based VPS and local development requires a shared persistent volume, correct path configuration in the app, and a reliable sync mechanism.
Services and Libraries
Coolify – manages persistent storage volumes for Docker services.
Docker – mounts the volume into the container at /app/data.
scp/ssh – transfers the .db file between host and local machine.
sqlite3 – lightweight embedded database engine.
import.meta.env – injects environment variables into a Node/Nixpacks build.
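On the app side, the only requirement is to open the database from the mounted path. Here is a minimal sketch in Python, assuming the path is injected through a hypothetical DATABASE_PATH environment variable (the same idea applies to a Node/Nixpacks build reading it via import.meta.env):

import os
import sqlite3

# DATABASE_PATH is an illustrative variable name; point it at the persistent volume
db_path = os.environ.get("DATABASE_PATH", "/app/data/my_db.db")

conn = sqlite3.connect(db_path)
conn.execute("PRAGMA journal_mode=WAL")  # optional: friendlier to concurrent readers
conn.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")
conn.commit()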
Implementation Details
All of these steps are done from the configuration page of the resource where the database is deployed.
Add Persistent Volume
Create a volume (e.g. samperalabsvolume) in Coolify’s Persistent Storage tab: leave the source path empty and set the destination path to /app/data.

In the Dockerfile, add the following before the dependencies are installed:
# Create non-root user for security
RUN useradd --create-home --shell /bin/bash app && chown -R app:app /app

# Create database directory with proper permissions
RUN mkdir -p /app/data && chown -R app:app /app/data

# Create volume mount point for database persistence
VOLUME ["/app/data"]

USER app
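Because the source path was left empty, Coolify creates a named Docker volume on the host, and its host-side directory is what the sync script below uses as REMOTE_DB_PATH. A small sketch to find that directory, assuming the volume is named samperalabsvolume (Coolify may prefix the actual name, so check docker volume ls first):

import json
import subprocess

# Ask Docker where the named volume lives on the host filesystem
result = subprocess.run(
    ["docker", "volume", "inspect", "samperalabsvolume"],
    capture_output=True, text=True, check=True,
)
mountpoint = json.loads(result.stdout)[0]["Mountpoint"]
print(mountpoint)  # typically /var/lib/docker/volumes/<volume name>/_data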
Sync Database with a Python Script
The script uses scp over SSH to pull or push the SQLite file.
import subprocess
import sys
import os

# Config
SSH_USER = "default"
SSH_HOST = "XXX.XXX.XXX.XXX"
SSH_KEY = os.path.expanduser("~/.ssh/my_ssh_key")
REMOTE_DB_PATH = "/var/lib/docker/volumes/my_volume/_data/my_db.db"

current_dir = os.path.dirname(os.path.abspath(__file__))
LOCAL_DB_PATH = os.path.join(current_dir, "my_db.db")
def pull_db():
    """Download the DB from the VPS to the local machine."""
    print(f"Downloading {REMOTE_DB_PATH} from {SSH_HOST} → {LOCAL_DB_PATH}")
    subprocess.run([
        "scp", "-i", SSH_KEY,
        f"{SSH_USER}@{SSH_HOST}:{REMOTE_DB_PATH}",
        LOCAL_DB_PATH
    ], check=True)
    print("✅ Download complete.")
def push_db():
    """Upload the local DB back to the VPS."""
    if not os.path.exists(LOCAL_DB_PATH):
        print(f"❌ Local DB not found: {LOCAL_DB_PATH}")
        sys.exit(1)
    print(f"Uploading {LOCAL_DB_PATH} → {REMOTE_DB_PATH} on {SSH_HOST}")
    subprocess.run([
        "scp", "-i", SSH_KEY,
        LOCAL_DB_PATH,
        f"{SSH_USER}@{SSH_HOST}:{REMOTE_DB_PATH}"
    ], check=True)
    print("✅ Upload complete.")
if __name__ == "__main__":
    if len(sys.argv) != 2 or sys.argv[1] not in ["pull", "push"]:
        print("Usage: python index.py [pull|push]")
        sys.exit(1)

    if sys.argv[1] == "pull":
        pull_db()
    else:
        push_db()
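Run python index.py pull before working locally and python index.py push to send changes back. One caveat: scp copies the file byte for byte, so pulling or pushing while the app is actively writing can produce an inconsistent copy. A minimal sketch of a safer snapshot step, using sqlite3's built-in backup API (run it on whichever side holds the live database; the snapshot_db name and paths here are illustrative, not part of the script above):

import sqlite3

def snapshot_db(src_path: str, dest_path: str) -> None:
    """Copy a live SQLite database into dest_path as a consistent snapshot."""
    src = sqlite3.connect(src_path)
    dest = sqlite3.connect(dest_path)
    try:
        src.backup(dest)  # SQLite handles locking, so the copy stays consistent
    finally:
        dest.close()
        src.close()

# Example: snapshot first, then scp the snapshot instead of the live file
snapshot_db("my_db.db", "my_db.snapshot.db")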