In case it's useful to anyone, I've been running the cheapest possible implementation of Git-based revision history for my blog's PostgreSQL database for a few years now.
I run a GitHub Actions workflow every two hours which grabs the latest snapshot of the database, writes the key tables out as newline-delimited JSON and commits them to a Git repository.
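For illustration, the dump step could look roughly like this. This is a minimal sketch rather than the actual workflow: it assumes psycopg2 is installed, DATABASE_URL points at the database, and the table names are made-up placeholders. The Actions job then just git-adds and commits the resulting files.

    import json
    import os

    import psycopg2

    # Hypothetical list of "key tables"; a real workflow would name its own.
    TABLES = ["blog_entry", "blog_blogmark", "blog_quotation"]

    def dump_table(conn, table, out_dir="dump"):
        os.makedirs(out_dir, exist_ok=True)
        with conn.cursor() as cur:
            # Stable ordering keeps the Git diffs small between snapshots.
            cur.execute(f"SELECT * FROM {table} ORDER BY id")
            columns = [desc[0] for desc in cur.description]
            with open(os.path.join(out_dir, f"{table}.json"), "w") as f:
                for row in cur:
                    # default=str keeps dates and timestamps JSON-serializable.
                    f.write(json.dumps(dict(zip(columns, row)), default=str) + "\n")

    if __name__ == "__main__":
        conn = psycopg2.connect(os.environ["DATABASE_URL"])
        for table in TABLES:
            dump_table(conn, table)
        conn.close()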
This gives me a full revision history (1,500+ commits at this point) for all of my content and I didn't have to do anything extra in my PostgreSQL or Django app to get it.
If you need version tracking for audit purposes or to give you the ability to manually revert a mistake, and you're dealing with tens of thousands of rows, I think this is actually a pretty solid, simple way to get that.
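A manual revert could then be as simple as pulling an older copy of a table's NDJSON out of the Git history and loading selected columns back, roughly along these lines (a sketch under the same assumptions as above; the commit hash, table, and column names are placeholders):

    import json
    import os
    import subprocess

    import psycopg2

    # Grab the blog_entry snapshot as it existed at a (placeholder) commit.
    old = subprocess.run(
        ["git", "show", "abc1234:dump/blog_entry.json"],
        capture_output=True, text=True, check=True,
    ).stdout

    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    with conn.cursor() as cur:
        for line in old.splitlines():
            row = json.loads(line)
            # Restore just the body column here; extend to whatever was damaged.
            cur.execute(
                "UPDATE blog_entry SET body = %s WHERE id = %s",
                (row["body"], row["id"]),
            )
    conn.commit()
    conn.close()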
That's not a full revision history, just periodic snapshots (basically a backup solution): multiple changes made within a two-hour window get collapsed into a single snapshot.
pgreplay parses not the WAL (Write-Ahead Log) but the regular log file: it "extracts the SQL statements and executes them in the same order and relative time against a PostgreSQL database cluster": https://github.com/laurenz/pgreplay
Are you overwriting the previous version of the JSON when you commit and merge? Or are you just archiving a new copy of the JSON alongside all the old ones?
https://github.com/simonw/simonwillisonblog-backup