The joys and sorrows of maintaining a personal website (cheapskatesguide.org)
196 points by airhangerf15 on May 6, 2022 | 115 comments



> The enemy possesses every advantage. Personal website owners are nearly always just ordinary individuals. They often have glacially slow upload bandwidths, [...]. The enemy has seemingly unlimited funds, massive resources, acres of server farms,

One example of this is Googlebot "attacks". I have a personal, mostly static website that I've been serving from a home connection for over 25 years, upgrading the hardware and moving it with me as I relocated. Out of those 25 years, Google has been the #1 user of bandwidth and CPU time, and as a consequence the #1 generator of headaches.

During the mid-2000s I had numerous problems keeping the site online, since the Google crawler would effectively DDoS it, sending tens of concurrent requests and completely ignoring the Crawl-delay in robots.txt (and most of robots.txt, fwiw). Mind you, this was not the most powerful hardware you could find, since it had to be quite low-power to be left on 24/7, but it was perfectly able to serve the handful of connections per hour I expect out of my website.

Obviously your only alternative is to just ban the entire Google IP range and, as a consequence, disappear from the Internet. Nowadays the problem is less noticeable because my 1-2W server can handle the onslaught brought by Googlebot just fine, but they still ignore the Crawl-delay.


From what I've been told, Googlebot absolutely tries to respect robots.txt. The advice I got was to double-check that my syntax and expressions were really correct. A lot of people screw it up, and it has to be basically perfect, otherwise it won't be parsed correctly. Are you willing to share your robots.txt file?


Googlebot supports its own documented set of robots.txt rules: https://developers.google.com/search/docs/advanced/robots/cr...

Crawl Delay is not part of it.
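
For reference, Crawl-delay is just a non-standard extra line in an otherwise ordinary robots.txt. A minimal, made-up example (the path and delay value are placeholders); some crawlers such as Bing honor the delay, Googlebot simply ignores it:

  User-agent: *
  Disallow: /private/
  Crawl-delay: 10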


The technical solution is to apply a rate limit on your web server. Then Google's bots will receive an HTTP 429 "Too Many Requests" status code, which will force them to slow down.

Nginx has `limit_req_zone` that allows fine-tuning the rate limits.
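
For anyone who wants to try it, a minimal sketch of what that can look like (the zone name, rate, burst, and the 429 override are placeholder choices, not a recommendation):

  # Throttle each client IP and answer excess requests with 429 so
  # well-behaved crawlers back off instead of hammering the server.
  http {
      limit_req_zone $binary_remote_addr zone=perip:10m rate=1r/s;
      limit_req_status 429;   # nginx defaults to 503 for rejected requests

      server {
          location / {
              limit_req zone=perip burst=10 nodelay;
              # ...the usual static-file or proxy configuration...
          }
      }
  }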


Why not use something like Cloudflare as a caching layer between you and Google?


Quote from the article:

"From time to time, I consider shielding my website with Cloudflare's services. I resist the temptation, because I strongly suspect that, despite Cloudflare's current good intentions, in the long-run, this is playing into the hands of the corporate-controlled web. Relying on commercial web services limits the freedom we have as personal website owners. Once they have us relying on them, we are forced to play by their rules, and at that point the battle is over. We may not have realized it yet, but we have been captured. Unless we manage to free ourselves, in a short time, we will be working for the enemy. [...]"


Just within the last few days, Cloudflare's DDoS protection has started blocking older and/or niche browsers, so all I can say about them is #*'@~?!§%$§#@!!


What are we blocking? Can you email details to me? (jgc@cloudflare.com)


Thank you, reply sent.


Unrelated, but that is a hilarious username.


Because Google is the one that needs to change.


While that may very well be true, it's also wildly impractical. I would imagine the outcome of that stance will be the skeleton meme "still waiting and google still DDoSes my site."


That company did not exist until several years after the events in the story occurred.


Google does not support crawl delay.

You can set your crawl rate lower every 90 days here https://www.google.com/webmasters/tools/settings

Other possibilities are here: https://support.google.com/webmasters/answer/48620?hl=en - there is a way to reduce crawling without blocking it entirely (returning HTTP 503 if there is too much load).


I know it amounts to the same thing, but from the user's perspective it would be more accurate to say that Google doesn't respect crawl delay.


Crawl-delay was an extension of /robots.txt and was never part of the original "recommendation".

It also never made it into the standard draft: https://datatracker.ietf.org/doc/html/draft-rep-wg-topic-00#...


I really wonder if this is an area that would be best served by creating a cooperative, allowing you to spread some of this knowledge and troubleshooting across a larger number of humans.

But that's fundamentally a social problem, and the field of software is nothing if not consistent in its resolve to try every other solution before resorting to emotional work. Co-ops that try to come up with a set of rules to handle all situations end up failing, because rules-lawyering turns into a game to be won by some people, at the expense of everyone else (see also, HOAs).


Odd, I have always seen the google bot respect the robots.txt. If I disallowed it, it disappeared from my traffic.

Of course, generally people do not want to do this, as they want to appear in google.


There was a brief time when I thought I could write regularly and make money off a commercialized personal site. It turned out that it wasn't for me. I pay the bills every year and I write somewhere between 0 and 5 posts a year. They're usually write-ups about projects I have completed, or interests I'm pursuing.

It's really nice to not have to worry about how popular something is or isn't on my site. And occasionally I'll have someone send me a note saying thank you for some helpful piece of information they found there.


> And occasionally I'll have someone send me a note saying thank you for some helpful piece of information they found there.

This is related to one of the reasons I like having a personal website. Countless times I’ve found answers or learned something cool from someone else’s personal website, and I hope that I can help others through my efforts.

The original article mentioned something about not being able to change the world through a personal website. You can have an impact on a person’s life, though, and I think that makes it worth it.


This is the reason that I maintain my website. About once per year, someone will contact me regarding something I posted. I don't get very many hits on it, but it's quite gratifying if it saves someone time.


Same for me!

I don't quite understand the author's framing of "brave knight fighting a battle against the commercial/corporate web"... Personal sites are what they are (usually more pleasant than the corporate ones), everyone is free to do what they want, I'm not forcing anyone to read my blog, nor is anyone forcing me to read corporate websites. Where is the battle?


The cost of a VPS where you can host stuff like this is far less than a Netflix subscription. You can just do it for the entertainment or satisfaction that it provides without worrying about what it costs or "monetizing" it.


> And occasionally I'll have someone send me a note saying thank you for some helpful piece of information they found there.

I do exactly the same thing! And for exactly the same reasons!

I just occasionally write to crystallise my thoughts. Sometimes I get insights just by writing up a project, in the same way I sometimes think of an answer while writing a question into StackExchange.


This is what I do with my personal blog. I actually kind of use it as a personal documentation guide. If I can write it up for someone else to use, I can re-implement it myself.


A few years ago I flirted with the idea of doing a more serious and focused travel-oriented site and went so far as to get a WordPress site for it, but the ambition lasted for about a month.

My site is mostly pretty static. There's a fairly nice templated front end, but the actual content is just some fairly simple, mostly hand-done HTML. My actual blog is on Blogger, but a lot more of my content goes in other places these days. That may well change in the future (again).


I have run my own site, on my own domain (with email!), since 1994. The domain was lost and recaptured in 1998, so I suppose, as far as the registrar is concerned, it's been since 1998...

I go through spurts of updating and then not updating it. Some parts of this article resonate with me... is it worth it? Does anyone care? Will anyone ever see it? At the same time, if you view it as an item for oneself, then all of those pressures go away.

I think people do not realize the power they have (or they always make it out to be a much larger problem than it is) to run their own server (perhaps both web and email). Before you flame, I have been setting up web/email for decades, and there are plenty of online tutorials (I wrote a few). The real point is that even with a little effort, the rewards are interesting. It is not for everyone, but I think the DIY spirit has a place here (in web pages and personal systems on the net).

Sorry for rambling, I feel compelled to say something... so many comments in other threads turn into "you can never keep up with a web or email server". I feel this person deserves kudos for showing that it is really not too difficult.


Many things tech people do are basically equivalent to having a little garden in the back yard. Gardening isn't for everyone either. Especially if weeds keep you up at night, or if you lose sleep over the idea that an Apache 0-day might drop any second and bots might take over your old LAMP site before you get around to ssh'ing in and updating it, these sorts of pastimes are probably not for you...


Great analogy!


Hi readingnews, I see the benefits of creating and running your own site/blog/bliki/digital-garden on a hosted or even your own server. You can tweak it and make it into whatever you want. But what's the joy or reward of running your own mail server? I can imagine the feeling of being independent. But I imagine it also creates a bit of stress in keeping the email server running, spam-free and secure?


> I can imagine the feeling of being independent.

Seriously, this is a total stress reliever. A ton of things are tied to email. What am I paying gmail or fastmail or whom ever? Nothing? Oh, so they can turn it off whenever, or lock me out or whatever, for no reason. Not with my own server.

> But I imagine it also creates a bit of stress in keeping the email server running, spam free and secure?

Again, I used to maintain the Gentoo HOWTO on postfix/amavisd-new/spamassassin before the wiki crashed and someone else redid it. I have set up so many corporate-wide spam filters... there is no stress in it.


> Seriously, this is a total stress reliever. A ton of things are tied to email. What am I paying gmail or fastmail or whom ever? Nothing? Oh, so they can turn it off whenever, or lock me out or whatever, for no reason. Not with my own server.

Automate occasional backups and use a domain you own outside of that service and they can't lock you out of anything except, potentially, any email you've received since you last took a backup. If Google or Fastmail went up in smoke tomorrow, you'd have all the hassle of... creating an account at one of the 1500 other services and changing your MX records.

You're avoiding an unlikely, low-impact situation, and in exchange taking on managing your own mail deliverability, where having issues delivering mail is very likely and the impact is your mail simply not being received, likely with no notice of the failure.

I self host pretty much everything. I don't mess with email. I have several times in the past. It's just not worth it to me at all.


You trade the stress of being locked out by Fastmail for the stress of being blacklisted by the mail providers of the people you send mail to, and more importantly of the people and governments and banks you receive mail from.


I can’t speak for everyone but I make enough money that messing around with e-mail spam is not worth it, when I’d rather be playing my instruments or whatever.

I do my email through my registrar (gandi) so I don’t have to worry about free email going away, but before that I paid for zoho.

Basically the reason I selfhost is to save time, not to have to muck around. Luckily I knew ansible from work so it wasn’t that much work in the beginning to set that up for my services.


> But what's the joy or reward of running your own mail server?

Not having to deal with rate limits or weirdness in regards to delivering mail from software in an automated manner, which would be there with any of the other "big" hosts.

For example, currently I have the following hooked up to a self-hosted e-mail server:

  - Nextcloud/ownCloud
  - Mattermost/Rocket.Chat
  - Drone CI/GitLab
  - OpenProject
  - PeerTube
  - SonarQube
  - Zabbix
Or any number of other pieces of software that could send lots of other e-mails.

If I use my own self-hosted mail server, then there are no odd spam filters or whatever to deal with (unless I introduce my own) - I send as many messages as I need and I receive all of them as well, plus all of the data stays wherever I want it to. This is also relevant for older software that might have problems with the more secure methods of delivering mail.

Oh, and there's not a lot to think about, pricing-wise, either.

Lastly, running a basic mail server is exceedingly simple with https://github.com/docker-mailserver/docker-mailserver - no need to manually configure mail delivery agents or whatever. In my eyes, a mail server with everything it needs should be a single executable/container, with toggleable bits of functionality, rather than some overcomplicated amalgamation of mixed and matched pieces that would need lots of your time to maintain.


No stress and hardly any work once you get the configuration right - which usually comes down to configuring a smarthost for outgoing mail, one or more domains for incoming mail, and some form of spam filtering - Spamassassin being a good candidate. Make sure your users' passwords are up to snuff and the software is up to date (which can largely be automated) and the total maintenance time is usually quite a bit less than 8 hours per year.

* Exim or Postfix as MTA

* Dovecot as MDA, add managesieve and you'll be able to sort/filter/file mail based on just about any combination of criteria

* Spamassassin to keep out all those offers you can't refuse

* Roundcube or Rainloop for web mail, looking at the same IMAP account as you use on your other devices

I've never had any of my servers 'hacked' in those 26+ years.


One person’s stress is another one’s bliss


I've been running my own services since around 1996 and have never looked back. The scare stories told by those who either cannot or do not want to run their own services, and for some reason feel the need to dissuade others from doing so, are just that: scare stories. Running your own services may not be for everyone but then again, what is?

From the 486DX2-66 under the stairs via the Pentium-90-under-the-stairs, the BP-6 with dual overclocked Celeron-333s, the re-purposed Virgin Webplayer when I emigrated and needed something small and light while looking for a house, the Intel SS4200 and now a DL380G7 (under the stairs, of course), from Slackware via Redhat to Debian, from a hand-crafted Sendmail config to Exim, from a time when spam was non-existent to a time when spam is something which seems to bother others but which is stopped in its tracks at the gates, it has been one unbroken chain of enjoying the freedom of being in control.


I think your power bill exceeds my Digital Ocean bill by quite a bit, but you’re having more fun.


I tried to hide one of my handguns under your stairs and, yeah, no room at all


Plenty of space, just get bigger stairs.

Having said that, the total amount of space taken up by the server rack is about 80x110x170cm, half of which is used as a produce drying rack which uses the heat and air flow from the fan in the bottom of the rack to dry fruit/mushrooms/herbs and more of such. This is what it looks like:

https://rimgo.esmailelbob.xyz/a/M4Lbf1K

By now the D-Link switch has been replaced with an HP, a NetApp DS4243 with a varying number of drives in it sits two slots under the server, and the remaining empty slots are covered with blanks to improve air flow. Total power consumption lies around 250-300W depending on load, about 170W of which is taken by the server, which idles at 150W. Power use is no issue given the 15kW solar installation on the roof of the new barn I built a few years ago; the residual heat also helps to warm the 2nd floor - which, being well-insulated, does not need any extra heating even in the Swedish winter.


Dammit I think I went to the wrong house


Put it in the server, plenty of room in those old boxen.


And that’s why they pay you the big bucks, sensei


I was born in 1997, so I wasn't really around or aware of the whole "hacker" scene. But this to me seems to be some of the most genuine expression of the "hacker" ethos. Doing things out of passion, stoically, knowing that your efforts will most likely fall upon deaf ears. But still putting out a signal into the vast ocean of information on the internet, hoping that someone with a similar mind will find you and connect. It is beautiful, and it is also deeply tragic. The society we find ourselves living in has stripped away the ability for humans to fully embrace their humanity. Oftentimes doing so is stigmatized, or even a crime. Personal websites and blog posts like this are like matches being lit in a pitch-black forest. I feel lost and alone in this modern-day life, but I see, even if just an ephemeral glimpse, that there is someone out there, somewhere, struggling, like me, living. And this gives me hope. Thank you for this blog post, it has brought me joy today.


> After I had written my first 30 articles, I thought I had said everything I wanted to say. That was three years ago, and yet, somehow I have continued putting out fairly regular content. This amazes me, because I do not know how I am doing it. Ideas just come, and many of them are actually good, meaning they have value to readers. Though most of my best writing does not attract exceptionally large numbers of readers, at times when I have been very lucky, a few of my articles have brought tens of thousands each to my website.

This is an under-appreciated part of regular writing. If you want new ideas, one of the best ways to generate them is to write. One way to uncover progressively deeper ideas is to stick to the same topic. This is not easy, because in the back of your head there may be a voice telling you to try to get more readers. It's a diversion. Go deep instead. The Internet is filled with surface-level discussions of just about every topic. What it will never have enough of is deep dives written by experts.


There's a writer (I want to say Annie Dillard, but I'm not finding it) who said -- don't store up all your best ideas for a rainy day, because invariably when you do open that box, you'll find it empty. Only and always work on the best you have at the moment.


I maintain a couple of personal websites that aren't just a "portfolio of work"; they're my day-to-day stream-of-consciousness thoughts about things.

I've maintained at least one diary-like personal website since 1992-ish or 1993-ish when domains could only be registered with one company, and cost a couple of hundred dollars, and you had to talk to someone on the telephone to get one. I post about one thought a day, sometimes two!

And I've taken a lot of my "one thought a day" notes from physical notebooks and transcribed them to my website, along with photos of the notebook page. My collected thoughts go back to 1973 or so.

As a kid I didn't read or write too well, but I had a tape recorder where I thought my thoughts (I got the idea from a Dr. Seuss book, as I recall). I can listen to six-year-old me thinking about things like how airplanes fly and why a cup made of stone would float in water, and realise just how dumb I was as I thought about how these things work. But it is also interesting to look back and go "wow! that's me! That's my thoughts and my voice and my notes!"

What helps with the writing is that I am under no illusions about ever making any kind of profit or being popular. It truly is "just for me."

Welcome to the end of the age of forgetting.


> Over the last three years, I have learned two valuable lessons about writing for my blog. The first is to keep a list of ideas for articles. The reason for this is that sometimes ideas come as if from a fire hose, and at other times I feel like I am dying of thirst in the driest desert on earth. Having a list of ideas evens things out. When I am in the desert, I have a canteen of ideas from which to drink. This is how I survive to put out fairly regular content.

Yup, I don’t know how other people do it but lists work great for me.

I’ve got a big list of blog ideas (I write outlines when the ideas strike) and an even bigger list of startup/SaaS ideas that I’ve just made a newsletter for (since I’ll never get to executing on all of them of course)

Lists are the biggest tool in my tool belt, even at work


The destruction of RSS greatly damaged the Blogging ecosystem.

It used to be that if I followed a /. link to an article on a blog that I enjoyed, I'd immediately subscribe to their RSS feed. I'd made a collection of a few hundred blogs related to computer science alone this way.

At the time I was on a mac, and NetNewsWire was THE RSS reader of choice (backed by Google Reader for syncing feed info) and every morning I'd click on the folder which housed those 100s of blogs, and would probably find 10-20 posts, skim through them in minutes, only reading those that seemed interesting to me in depth. NNN also made it extremely easy to open the post on the original website in a background tab, so I'd keep opening them in the background if I wanted to read it, and then just go through all the background tabs.

It was an amazingly efficient system of following hundreds of independent blogs that were offering uncurated first person information.

I haven't found anything today that comes close to recreating that system, and I can't imagine even finding the kinds of articles I did through this process (an example...I had an HP MediaSmart server, and one day someone randomly posted a How to on replacing the CPU on it with a newer CPU...it was an easy process that cost me about $7 to buy the compatible faster chip on eBay, and a couple of dollars for the heatsink gel). And lo and behold, a week later, everything was running so much better.

Changing the CPU on the Mediasmart server was something I wouldn't even have considered searching for, and even if I did, I'm not sure Google would have turned up anything useful.


RSS didn't die, though? I still do this and it works fine?

This is optional - but I use Miniflux as a locally hosted RSS server, so that I can keep the feed list etc on there and access it from my phone, desktop etc... using my client of choice.


> RSS didn't die, though? I still do this and it works fine?

If your publishing platform supports it, RSS is still pretty good. On the other hand, I'm building my current blog with NextJS (both to learn it and because I like React a lot more than I do the Wordpress loop) and I am making sure to do RSS...but it's a lot harder than it used to be.
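
For what it's worth, hand-rolling the feed is only a screenful of code. A minimal sketch in TypeScript; the Post shape, site constants, and output path are my own placeholders, not anything from the parent's setup:

  // Build an RSS 2.0 document from a list of posts at build time.
  type Post = { title: string; url: string; date: Date; summary: string };

  const SITE = {
    title: "Example Blog",
    url: "https://example.com",
    description: "Personal blog",
  };

  // Escape the handful of characters that would break the XML.
  const esc = (s: string) =>
    s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");

  export function renderRss(posts: Post[]): string {
    const items = posts.map(
      (p) =>
        `<item><title>${esc(p.title)}</title><link>${p.url}</link>` +
        `<guid>${p.url}</guid><pubDate>${p.date.toUTCString()}</pubDate>` +
        `<description>${esc(p.summary)}</description></item>`
    );

    return [
      `<?xml version="1.0" encoding="UTF-8"?>`,
      `<rss version="2.0"><channel>`,
      `<title>${esc(SITE.title)}</title>`,
      `<link>${SITE.url}</link>`,
      `<description>${esc(SITE.description)}</description>`,
      ...items,
      `</channel></rss>`,
    ].join("\n");
  }

Writing the returned string to a static file (say, public/rss.xml) as part of the build is usually all a feed reader needs.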


Why is it a lot harder? It seems like if you used just about any ready-to-go blogging platform it would be solved for you. And if you implement your own platform you need to generate a feed which is just as difficult as it was 20 years ago.


It is a lot harder because you have to do it.

If you don't care about it, you're not going to do it.

If you don't do it, there won't be RSS support.

And so, the circle shrinks.


Which client do you use? I'm searching for one for Android.


I'm currently using [Focus Reader](https://play.google.com/store/apps/details?id=allen.town.foc...), which seems pretty great, so far.


I installed miniflux recently and for now I'm using its web interface, shortcut to which I pinned to my launcher. It's very fast, so possibly I'll stay just with it.


I'm pretty happy with Handy News Reader, gets used every day. Appears to be on the play store as well as on f-droid.


> The destruction of RSS greatly damaged the Blogging ecosystem.

Since when is RSS "destroyed"? I still use it every day.

Okay Google Reader specifically died. But honestly it wasn't all that great. There are dozens of other RSS readers that still work fine.


What bugged me was when Apple removed RSS from Safari. I used to have a menubar list of sites that I would read regularly and I could go to the bottom and select open all in RSS and read everything. Having to use a dedicated RSS reader really feels like a step backwards.


> Having to use a dedicated RSS reader really feels like a step backwards.

Not if you use a web-based RSS reader (I use Nextcloud News for this purpose) since that is always just one link away on any browser you might happen to use. This is far more convenient than having your 'feeds' in a specific browser on a specific device.


OK, but I've noticed that there are now fewer posts than 10 years ago, and I follow the same feeds or more.

People get older and stop writing new posts on their blogs.

Most young people don't use blogs; they use FB, IG, ...


I do find it plausible that there are fewer things being posted in RSS-able blogs than 10 years ago, but if so, you wouldn't be able to tell that way. Especially if what you're following is a lot of individual blogs, it's pretty normal that at least some people get tired of blogging for any number of reasons over the course of a decade and stop or slow down. If you want to keep the same amount of total content, you have to constantly look for new blogs to follow.


It's noteworthy that this blogger actually tells me that if I don't change the settings on my RSS feed reader, he will ban me from reading his site. I decided to subscribe, but this is the first time I've ever had to change the update interval in my feed reader.

This isn't really a shining example of good encouragement for people to use RSS feeds.


What did he want changed? Were you polling too frequently?


Well, I'm polling at my feed reader's default, which is every 15 minutes. Many RSS feeds for large sites post more than ten stories a day but only show the last ten stories in their feed, so checking regularly is critical.

However, this author specifies I may get IP banned if I check his feed more than once a day: https://cheapskatesguide.org/policy-on-robots.html


Wow that's just... special. WTF?


Dunno why so much of this is in past tense – everything you named still works the same. I use NNN every day (though my reading list is smaller: around 30). If there's a blog that doesn't have a feed, I'll just use a scraper to make one for me. Works like a charm!


I don't know if you're aware that NetNewsWire is being developed again. It has been a joy to use it both on macOS and iOS.


First, you shouldn’t blog for others. I did it to index my knowledge, practice writing and document my thoughts. I almost consider them my public notes.

I’m in the category of a casual personal blogger. That said, I’ll have articles get >100k unique views once or twice a year, and I average 25k views a month. I share the articles and people like them. That’s about it. Some of my niche stuff ranks on Google and DuckDuckGo because I did something people found interesting (hence the regular traffic).

My last update was Feb 7, 2022 so it’s not as if I’m active. And my most detailed analysis https://austingwalters.com/firearms-by-the-numbers/ actually receives relatively low traffic (it’s a censored / touchy subject and took me six weeks to write). However, it was something I did to understand a topic myself and have shared with others as it’s come up.

I took joy in writing it and frankly assumed it was a purely intellectual exercise. If you go for clicks, you don’t pick censored / deranked content, but imo that’s where interesting topics actually are.

Anyway, a bit of a ramble. I just find it interesting people start blogs for traffic. Instead it should be intellectual exploration first, then sharing second. Those are the blogs I’ve always enjoyed anyway.

If you want traffic you need a regular topic you cover. Most personal blogs are life events, which is covered by social media now. Blogs are for long-form content and can cover more practically what I would call essays or research topics.


Downvoted because the blog post you linked triggered me. (Ba dum dum.) Kidding! I like the data-based approach to 2A in that article.


> From time to time, I consider shielding my website with Cloudflare's services. I resist the temptation, because I strongly suspect that, despite Cloudflare's current good intentions, in the long-run, this is playing into the hands of the corporate-controlled web. Relying on commercial web services limits the freedom we have as personal website owners.

> Once they have us relying on them, we are forced to play by their rules, and at that point the battle is over.

These are the perfect words to convey the feelings I've long felt all the way down to the core of my being.


My main deterrents to publishing on the web continue to be:

The less obvious and not already widely covered stuff is very situation-dependent. I.e., most advice I could share is incredibly nuanced, and some of it may flip 180 degrees depending on the particular context of the person looking for advice or help. And I don’t necessarily pre-think all possible situations, which is also a big human problem when making software or laws and everything in between.

The “right” advice changes over time, because everything from technology to human value systems evolves ever more quickly. So today’s wisdom may end up as tomorrow’s deadly sin. And since the Internet doesn’t forget, a well-meaning post today may become a big mortgage on the future.


Advice rots. As you describe, it’s context-specific.

Stories, on the other hand, are ageless. When well written, they take on a life of their own.


My path was Perl -> PHP -> WordPress -> static generator + Disqus, and dealing with spam pre-Disqus (for me, WordPress's anti-spam features didn't work: Akismet, etc...)

I'm sure there are solutions, but the problem I have with a static generator is that, unlike WordPress, I have no online preview. Sure, I can run the generator on my local machine, even set up some watch build so I can update and refresh, but I use 7 machines regularly. With WordPress, any machine let me draft a new post, pick it up elsewhere, and preview it, but ATM I've lost that feature.

Maybe (it's hosted on GitHub) I could set up a staging repo and have it build PRs to a staging site, but a full PR is way more work than just typing and clicking 'preview' in WordPress. On top of that, my site does some processing that would not be previewed just by using GitHub's markdown preview in the editor.

I don't know if that's why I blog less than I used to, but it is arguably friction. If I think of something to write I need to be in front of the correct computer with stuff synced and set up, and if I don't finish I need to check in the unfinished article in some draft position, then later check it out on some other machine, and finally, when it's done, move it to a folder that is used in the build.

It also takes me serious time. A typical post would take 4-8hrs. Write, proofread, edit, add pictures or diagrams. Just barfing thoughts out is twitter/facebook.

Otherwise, I think I just have less to write. I started blogging in the 90s selfishly, it was way more fun when it felt special. Now everyone writes (twitter, facebook, medium, etc...) so not nearly as special.


I've solved it for my own needs by making https://webide.se/ which is a web-based editor with a Linux shell and my homegrown static site generator with live preview. If you host your repo on GitHub, try https://webide.se/?github=/githubuser/repo/


Why not go back to WordPress? It sounds like it was working well for you. It may not be the shiniest tech anymore, but often, boring tech is good [1].

[1] https://mcfunley.com/choose-boring-technology


The reason I stopped using wordpress was it required constant maintenance. There are new exploits found monthly so you have to always upgrade. I needed a couple of addons to match my previously hand-coded blog as well and those plugins also needed to always be updated. Updating wp or the plugins would break stuff regularly.

So, wp is not really an example of "choose-boring-technology". The point of boring tech is it works and is low maintenance. wp is not low maintenance.

A static site mostly is. There's nothing to hack. No accounts. No way to "login"


The timing of this post is quite impeccable - just a few days ago I finally went ahead and paid for a domain, to set up a personal blog (despite having decided to make one several years back). I would like to ask fellow HN bloggers, what were some of the unexpected things you encountered in your process? (I'm more curious about the experience, though I am also interested about more technical details.)


Well, I can say, as a person who's been running a blog since 2003 or so, if you're planning a long-running online presence, building a site based on technology that makes it relatively easy to move to different technology platforms is really important. a) it future-proofs your site so you can move as the technology moves, but also, b) the odds are you'll end up with content that's easier to back up, manipulate, etc.

The first iteration of my blog was based on an old wiki engine, and the beauty of that engine was that, while the markup format was unique, the data was stored in flat text files that I could easily manipulate.

Some number of years back, now, I moved to a static site generator that uses Markdown as the source format, and since I was dealing with flat files, I actually had a fighting chance of making the switch fairly easily.

And now that everything is just Markdown, I can version control it with git, back it up easily, etc. And again, if I decide to switch technologies in the future, I can do that because the content is stored in a format that's very easy to manipulate.

And thinking very long-term, this kind of approach ensures that archiving your digital legacy will be a lot easier, since the content isn't tightly bound to a specific technology.


Thanks for your comment, that (tech getting deprecated) is something I hadn't consciously considered. I also think Markdown is a really solid idea for anything that doesn't need to be very fancy (like most blogs).


Well put - and maybe also worth noting that the same issue applies to media formats.


I've had my personal domain since 1999. It was originally hand-coded HTML, but has been WordPress for at least the last eight years.

Back in 2014 I wrote a beginner's guide to C#, with the lessons building a very simple (non-graphical) role playing game. It was mostly to show the thinking behind starting out very simple, with the basics of objects, and eventually build a program that is larger and "complete".

It got a little popular and I've received quite a few messages/comments from people who've told me the lessons helped them understand things better in their programming courses at college or code camps. Those messages have been a lot more fulfilling than being coder #12 on $BigCo's multimillion-dollar, multi-year project.

It's also a nice thing to point to when interviewing. Just like a public GitHub repo, I doubt most interviewers take more than a cursory glance at it, but it is a way to stand out from the crowd of candidates who don't have a technical blog.

I've had times when I've burnt out and haven't posted for a year or more. Other times, I get a burst of energy and write every day. There is a bit of pressure to feel like I should be writing and posting. And, since I have programming guides, there are occasional support questions to answer (especially when Microsoft changes Visual Studio or moves from .NET Framework to .NET Core then to .NET 5/6). But, it usually doesn't take too long to deal with that.

On the technical side, it has been a bit of a pain to go through web hosts every few years. The hosting service eventually gets bought out, service quality goes down, or the site gets slow (and support says, "It looks OK to me"), etc.


Thank you for your insights. I think the variability in writing is something I'm also going to personally experience. That feeling of satisfaction and happiness sounds great - good for you, and also thank you for offering such useful content for free!


Congratulations on your new blog - if you already have content up consider posting it here.

I wrote [0] and [1] with people like you in mind:

[0] https://sheep.horse/2017/4/so_you_want_to_start_an_unpopular...

[1] https://sheep.horse/2017/4/some_additional_advice_for_those_...


Thanks! I've yet to set up the website (planning to go for a simple static site), and I mainly intend to post to HN haha (partly because of the nature of topics and discussion).

Thanks for your links, the point of doing something once in a while no matter how small really makes sense and was something I hadn't considered.


I note that many blog on their personal web site. I decided against that back in the early 2000s, when it got popular - I just didn't want to face not having a clue what to write about next. The site's always been hosted with a commercial hosting provider. I've travelled a bit, and that meant my site could live on even if I moved yet again.

So I just use it to write down stuff I thought interesting. I add new stuff very occasionally, and refer to it often, to this day. It's expanded (and sometimes contracted) with non-work stuff (hobbies, photos for the overseas family - although I could do more on that front).

And so I have no real expectation of myself to continually provide new content, or keep up with the latest hosting software and [at home] infrastructure. It receives lots of traffic for the 4x4 guide I wrote in 1998, and the consulting reference pages. I don't care about the traffic, but I'm glad people find it useful. As long as it remains useful to me I'll keep it going, and call it a success.


This makes me smile. I have been maintaining a personal (and sometimes business) website for over twenty years. I feel sort of responsible for keeping that 90's vibe alive.


I've been doing the same, and have found myself really attracted to the ideas of the indie web. The internet is so incredibly corporatized these days, and I feel like independent blogs and websites are a critical bulwark that helps preserve open spaces on the internet.


I agree 100%. I just closed my business, and a recent article on HN (~ "my personal website is a single binary") inspired me to start a personal website again, done in the same fashion. I think it's important to keep some level of noncommercial, passion-driven activity alive, lest the internet becomes a really sad place.


I struggled for some time with maintaining my own site (in profile), as it felt so disconnected from the conversation. Eventually I realized what I really wanted was a more customizable/flexible version of Twitter that could do things like host WASM - so I recreated my site to be more of a complement to Twitter threads rather than a standalone experience. The article author seems to be taking a strong stand against corporate web but I think a highly integrated yet fully owned personal site could be a good alternative approach.


> From time to time, I consider shielding my website with Cloudflare's services. I resist the temptation, because I strongly suspect that, despite Cloudflare's current good intentions, in the long-run, this is playing into the hands of the corporate-controlled web. Relying on commercial web services limits the freedom we have as personal website owners. Once they have us relying on them, we are forced to play by their rules, and at that point the battle is over. We may not have realized it yet, but we have been captured. Unless we manage to free ourselves, in a short time, we will be working for the enemy. Soon, we will have their advertisements on our websites. Soon, we will be rewriting our articles and modifying our websites to attract more traffic from their search engines. Soon, we will be throwing out our best content, because it is too controversial for their teams of often ethically-challenged marketers. Soon, we will no longer be knights. We will be their lowly serfs clad in filthy rags begging for scraps. They will own us.

OP conflates two different issues: the difficulty of running infrastructure that you own, and the difficulty of running content that you own. They are not the same!

Concerns about ownership of content because you no longer own the infrastructure are, I think, irrational. I can take my compiled heap of HTML, CSS, and JS to Cloudflare, and if they try to "make me a serf", I can move it to CloudFront, or Netlify, or GitHub Pages (which, to the point, even works in Iran from what I understand, because GitHub went through the pains of getting approved by export control), or any number of similar services.

Concerns about ownership of infrastructure are valid, but that war is largely lost. I still can't get IPv6 at home, and even if I could, leaving it open to ::/0 is playing with fire. Somewhat literally, depending on whether your hardware's automatic temperature shutoffs are working or not, something that is increasingly unlikely in an age of cheap connected kitchen appliances with heating elements. Denial of Service is real, and the fact of the matter is, even if IPv6 were to be offered to every household on the planet, virtually no Mom-and-Pop consumers want to screw around with network firewalls to get anything working.

Yes, own your content. Write what you want. Stop running long-commoditized hardware at home.


I wish more of the people posting about how they've had blogs for decades would link the blogs


I’ve had a continuously updated record of my reading since 1995. It began as manually updated HTML and then I made a PHP script to back it with a database. I never got around to writing the admin page so I still update it with phpMyAdmin after writing the entries in TextEdit. Some day, when I have time, I may fix some of the CSS and other formatting issues, but I think most of the traffic it gets is me using it to refresh my memory about something I’ve read in the past.

https://don.dream-in-color.net/books/


I can recommend BookWyrm for this!


With nearly 2000 entries in my system, I’m not eager to move to something new. I do add new reads to my Goodreads and I have a catalog of the books I own and have read on LibraryThing, but part of what I like about this is the continuity of the pages where there’s a long history that goes back to when my personal site was at ftp://cyberg8t.com/~dhosek


I've traced archeological layers remaining in my blog all the way back to at least 1999. It's been on worldmaker.net and/or blog.worldmaker.net since 2002 as I recall. For a period in the 90s there was a "Zine" component, and the word "Blog" wasn't even invented yet. There are some big lost layers from database drops in the middle there. The most contiguous history seems to start around November 2004, with the most recent migration from databases to Git a "recent" 7 years ago in 2015.

It's been a weird journey.


Does 1.5 decades count? [0]

I even have a similar article about blogging (old but still relevant, I think)[1]

[0] https://sheep.horse/everything.html

[1] https://sheep.horse/2017/4/so_you_want_to_start_an_unpopular...


Imagine going back to the nineties and saying "check out my blog at sheep dot horse."


Reminds me of that old SNL sketch: https://snltranscripts.jt.org/99/99adillon.phtml/


Part of the joy is finding them serendipitously on your own though. Still, if you're not too fixated on the age and more interested in a place being a personal site, here's a thread with a random collection from a couple years ago: https://news.ycombinator.com/item?id=22156868


Maybe because it feels self-promotional? So along with mine (https://maya.land/posts/) I'll share my blogroll (https://maya.land/blogroll.opml) of the many, many RSS feeds I follow. :)


Like so many others I also have created my own website to show my artwork (photography, printmaking), recipes for the benefit of others with dietary restrictions (low-sodium), and software projects.

But to me it's much more than a website. Behind it is a set of software tools I've been developing for website generation. The public-facing website is in effect a major demo of what the tools can do. The programming challenge is what makes it worthwhile; stretching the site's capabilities is highly rewarding. Improving the site's performance and efficiency is also a benefit of "roll-your-own" site creation. Polishing it to excellent results contributes to a sense of accomplishment.

Anyway, it's pretty easy and cheap to use a modest VPS to host the site. If a person has the knowledge, running a self-managed host is quite practical and allows maximum freedom, with minimal risk of the "deplatforming" that was going on a while back (even if the risk is near zero).

I never expected to get a lot of traffic, though there is some. One surprising aspect has been the number of attempted "break-ins" (e.g., via ssh). Properly configuring a firewall is a necessity. But that stuff is just part of normal administrative responsibility.


WP and SSH seem to be the most common bot attacks. SSH is easily mitigated by changing the default port from 22 to something else, and adding in fail2ban.
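
A hedged sketch of both mitigations; the port number and ban thresholds below are arbitrary examples, adjust to taste:

  # /etc/ssh/sshd_config: move sshd off the default port
  Port 2222

  # /etc/fail2ban/jail.local: ban IPs that repeatedly fail SSH logins
  [sshd]
  enabled  = true
  port     = 2222
  maxretry = 5
  findtime = 600
  bantime  = 3600

Moving the port mostly just cuts the log noise from drive-by scanners; fail2ban (or key-only auth) is what actually blunts the brute forcing.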

As for WP, I like to use a static site generator like Hugo, unless you have heavy requirements for dynamic server-side resources (versus a site with 1000s of pages, albeit static and highly cacheable).


Yup, I changed the SSH port from 22 to another; the attempts to barge in continue but don't succeed. The motivation for trying to intrude escapes me. I mean, if a site is well-known it would make more sense. For an obscure and obviously personal site, it's hard to fathom what breaking in would accomplish, unless some really bored or antisocial people out there consider it a sport of some weird kind.

My site is currently >100 pages, so it's modest in scale. Like many sites it consists mainly of static pages, but there are dynamic features too, such as basic user comments and messaging. The generator program outputs the html, css and js needed. It is written primarily in object-oriented Tcl, and pages are generated from source files expressed in a generator-specific DSL. In practice it's way simpler and less exotic than this description suggests. To me the "proof of the pudding" is that I can easily pick up where I left off even after being away from the project for a few months.

A goal for the generator is to finish packaging and documentation, then releasing as open-source. Not that there aren't already a million ways to create websites, but I figure one more can't hurt! ;-)


I just made the first new post on my personal blog in almost two years. Because it's a self-hosted site that I step away from for long periods of time, I accidentally rsynced my entire .git/ repo and some supporting scripts before finding the simple deploy.sh file I wrote last time I messed things up. :)


I wrote myself a tiny deploy script for Hugo [0], but it could easily be adapted to different systems as needed. Maybe someone else can find some utility in it?

0: https://git.sr.ht/~rsolvang/pu.sh-hugo


I'd be interested to see an HN poll on how many of you roll your own (that is, you're serving all HTTP traffic on the back end of a load balancer) versus using a website hosting service. But I don't know which hosting services are popular, so I don't know what to add to the poll.

I have a happy little node server that's been running on an AWS micro instance since 2013 and I periodically stick a new SPA into the public area with auth through the express middleware. It might look a bit frankensteinish, but it works just ducky and costs like US$10/month with 20GB of SSD (most of that is the elastic IP + load balancer).


> After I had written my first 30 articles, I thought I had said everything I wanted to say. That was three years ago, and yet, somehow I have continued putting out fairly regular content.

Finding things to write about is not the bottleneck (for me). I'm willing to bet I could make a list of 1000 things to write about if I cared to make such a list. The reason I don't post more on my website is because it's rarely the best use of my time. I would create some value with another post, but not a lot, so I focus on higher-value tasks.


For me it is just hard to make a consistent effort to constantly promote my own content. Generally what I am motivated to put out there is deliberately contrarian, and that makes it even less likely for people to be interested. But whatever the reasons, after doing a couple of blog articles or pages and having almost 0 interest or comments, or maybe one shallow comment, I usually give up. There are just more interesting things to do with my time and energy.


For people who run their own blogs, how do you decide when to write about something? I feel like I should write once I have a project done to a satisfying extent, but of course that never actually happens. As a result, despite having a blog setup, I've never actually ended up writing anything.


I draw inspiration from my day to day work (software engineer). Sometimes it’s an idea that I’d like to implement at work, sometimes it’s something I just learned. What used to hold me back was the thought that what I know is nothing new. However someone told me there’s always a handful of other people who don’t know what you learned and might benefit from your own explanation.


Most of the server woes can be solved with static site hosting.

As for comments, I think the best option would be some comment service. However, everything would feel a lot less personal.

My blog uses surge.sh for static file hosting, in front of CF and CF dns for $10 per year. Not as crazy as the op makes out.


Web hosting is an "economy of scale" business, which means hosting one site costs a lot, but hosting one more site costs very little extra, so if you are technical enough to run your own server, host other people's web sites too!


I think this is really different in other countries.

For example, here in the Czech Republic we have a lot of reasonably cheap hosting companies and a whole cottage industry of small companies and individuals willing to create a website for you.


