In case it's useful to anyone, I've been running the cheapest possible implementation of Git-based revision history for my blog's PostgreSQL database for a few years now.

I run a GitHub Actions workflow every two hours which grabs the latest snapshot of the database, writes the key tables out as newline-delimited JSON and commits them to a Git repository.

https://github.com/simonw/simonwillisonblog-backup

This gives me a full revision history (1,500+ commits at this point) for all of my content and I didn't have to do anything extra in my PostgreSQL or Django app to get it.
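Per table, the dump step is roughly this (a sketch, not the exact workflow in the repo; the table name, file name and DATABASE_URL are just placeholders):

  # dump one table as newline-delimited JSON, one object per row
  psql "$DATABASE_URL" -At \
    -c "SELECT row_to_json(t) FROM blog_entry AS t ORDER BY id" \
    > blog_entry.json
  # commit only if something actually changed since the last run
  git add blog_entry.json
  git diff --cached --quiet || git commit -m "Snapshot $(date -u +%Y-%m-%dT%H:%MZ)"
  git push

In a sketch like this the same file is overwritten each run, so the Git history holds the old versions and a diff between any two commits shows which rows changed.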

If you need version tracking for audit purposes, or the ability to manually revert a mistake, and you're dealing with tens of thousands of rows, I think this is actually a pretty solid, simple way to get that.




That's not a full revision history, just periodic snapshots (basically a backup solution). Multiple changes within a 2-hour window get conflated into a single commit.


pgreplay parses not the WAL (Write-Ahead Log) but the regular PostgreSQL log file; it "extracts the SQL statements and executes them in the same order and relative time against a PostgreSQL database cluster": https://github.com/laurenz/pgreplay
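For reference, a replay run looks roughly like this (sketch only; the log must have been written with statement logging enabled, and the connection flags should be checked against the pgreplay README):

  # parse a stderr-format PostgreSQL log and replay its statements
  # against a test server, preserving order and relative timing
  pgreplay -h localhost -p 5432 postgresql.log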

"A PostgreSQL Docker container that automatically upgrades your database" (2023) https://news.ycombinator.com/item?id=36748041 :

pgkit wraps Postgres point-in-time recovery (PITR) backup and recovery (https://github.com/SadeghHayeri/pgkit#pitr):

  $ sudo pgkit pitr backup <name> <delay>
  
  $ sudo pgkit pitr recover <name> <time>
  $ sudo pgkit pitr recover <name> latest


Are you overwriting the previous version of the JSON when committing then merging? Or are you just archiving a new copy of the JSON alongside all the old ones?




