If you're serving static content, installing Apache, nginx, or any other web server will do just fine. Make sure to set the document root to a directory you're fine being public.
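For the nginx case, a minimal server block looks roughly like this (the domain and the `/var/www/example` path are placeholders for your own):

```nginx
# Minimal static-site server block. Only files under `root`
# are ever served; anything else gets a 404.
server {
    listen 80;
    listen [::]:80;
    server_name example.org;

    root /var/www/example;
    index index.html;

    location / {
        # Serve the file or directory if it exists, else 404 --
        # don't fall through to any other handler.
        try_files $uri $uri/ =404;
    }
}
```

(You'd still want to add TLS on top, e.g. via certbot, but that's the shape of it.)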
If you're running something dynamic like WordPress, stay extremely on top of patches, unfortunately, and be super cautious about what plugins you use. (This is one of the better reasons to use a static website.)
If you want to run Postgres for your dynamic website, configure it to listen only on localhost or only via UNIX sockets.
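Concretely, that's one line in `postgresql.conf` (localhost-only is the default on most distros, but it's worth verifying):

```
# postgresql.conf -- accept TCP connections on the loopback
# interface only:
listen_addresses = 'localhost'

# Or disable TCP entirely and accept UNIX-socket connections only:
# listen_addresses = ''
```

Either way, nothing on the public interface, so there's no Postgres port for the internet to poke at.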
Make sure you keep your software up-to-date. unattended-upgrades is a great idea for OS-provided software.
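On Debian/Ubuntu, enabling it is a couple of commands plus a small config file (paths below are the stock Debian ones):

```
# Install and enable unattended-upgrades:
#   sudo apt install unattended-upgrades
#   sudo dpkg-reconfigure -plow unattended-upgrades
#
# The reconfigure step writes /etc/apt/apt.conf.d/20auto-upgrades,
# which should contain:
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
```

By default it only pulls from the security sources, which is usually what you want on a server.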
Be careful about where you get software from. More than just "get it from somewhere trustworthy," the big concern here is to get it from someone who is applying software updates. For most OS-ish things, you want to get them from your distro; try to avoid downloading e.g. PHP from some random website, because you won't get automatic updates. For a few things - especially things like WordPress - I wouldn't trust the distro to keep up, largely because the common practice is to release security fixes by releasing new versions, and distros are going to want to backport the fixes, which is slower and not always guaranteed to work.
As another commenter mentioned, turn off remote password logins and set up SSH keys. (Most VPS providers will have some form of console / emergency access if you lose access to your SSH keys.)
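Once you've confirmed key-based login works (test it in a second terminal before closing your current session), disabling passwords is two lines in `sshd_config`:

```
# /etc/ssh/sshd_config -- disable password-based logins:
PasswordAuthentication no
KbdInteractiveAuthentication no

# Then reload sshd, e.g.: sudo systemctl reload ssh
```

After that, password-guessing bots can hammer the port all day without getting anywhere.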
I run my sites, all static, on a VPS, but I do the authoring in a single multi-site WordPress install and use the 'Simply Static' plugin to publish the result. The benefits are pretty awesome:
heaps of templates (because I'm often lazy), a one-stop shop for patches, locked-down plugins (child sites can't install plugins, only enable/disable them), and only one place to look for problems (and you can lock the WordPress admin to a single IP if you always use it from the same place).
FWIW, I never grokked AWS, and its ping times in my country are about half as good as local providers' (15-30ms for local vs 50-100ms for the nearest AWS region). Speed matters.
Also, my use case can tolerate 'falling over' (meaning: failing/stopping/becoming unresponsive) under DDoS, whereas I know many here are 'must not fail' (with varying levels of acceptability). So I write concise, low-bandwidth websites that appear instantly (to my local-market users).
Thank you for the advice. I've tried out passwordless login and found it more convenient, so that's not a problem. I'd want to be deploying a Python app I wrote myself, and some static files.