Hacker News | sfRattan's comments

As I've been deliberately moving toward self-hosted computing, under my control, on my home network, I've felt more and more that we're on the cusp of something transformative... for those who want it and those who care. There's an ecosystem of mostly FOSS software now designed to run on a home network and replace big, centralized cloud providers. That software is right on the edge of being easy enough for everyone to use and for sufficient numbers of people to deploy and administer. News like Immich (a replacement for Google Photos) getting a major investment thanks to Louis Rossmann and FUTO [1] is encouraging. The ecosystem of software you can now run on a commodity-built NAS or homelab is, for me, the most exciting thing in computing since I first used the Internet in the late 90s.

The rollout and transformation, if it happens, won't look like all this stuff becoming so easy that every individual can run a server. But it is possible that every extended family will have at least one member who can run a server or administer a private network for the whole clan. And that's where tech like Tailscale's offering will come in. That's where I see the author's vision being a believable moonshot:

Each extended family, and some small communities, with their own little interconnected, distributed network-citadels, behind the firewalls of which they do their computing, their sharing, and their work. Most family members won't need to understand it any more than they understand the centralized clouds they use now. And most networks won't be as well secured as a massive company can make its cloud offering, but a patchwork heterogeneity of network-citadels creates its own sort of security, and significantly lowers the value of any one "citadel" to even motivated adversaries.

[1]: https://www.youtube.com/watch?v=uyTPqxgqgjU


Totally this. I am old enough to remember LAN fun times, and I've been writing software since the 1970s, starting in high school.

And Tailscale works for me to create my own network of phones, laptops, desktops, and a remote node at DO. It works brilliantly across geographic boundaries, borders, and wifi networks (home has multiple), and it moves seamlessly between mobile networks and wired connections.

Not sure whether it will create a new internet, but it has at least created a new intranet where all my devices are reachable and controllable.


> But it is possible that every extended family will have at least one member who can run a server

That's as may be; but many, many people have no access to an "extended family". And extended families are not necessarily warm, safe spaces where everyone trusts everyone else; extended families are more likely to be "broken" than nuclear families.


> And extended families are not necessarily warm, safe spaces where everyone trusts everyone else; extended families are more likely to be "broken" than nuclear families.

It is a good thing to promote and advance privacy, security, and freedom to isolated, atomized individuals; but it is important for all of humanity to promote and advance those same ideals to extended families. People who have no access to an extended family will ultimately either join a different one or disappear into the mists of ages past. In 100 years, the Earth will be populated mostly by the descendants of people in extended families today, however imperfect or even broken those extended families may be. If those people today don't see privacy, security, and freedom as both possible and worthy, their descendants may not value or even possess any of those ideals.


> As I've been deliberately moving toward self-hosted computing, under my control, on my home network

Funnily enough, I was once like this but now I have deliberately moved everything to the big cloud providers as I don’t want to deal with the toil of running my own homelab anymore. This is coming from someone who used to have a FreeBSD server with ZFS disks and using jails to run various things like pf, samba, etc. Eventually things would fail and it felt like I was back at work again when all I want to do is drink a cold beer and watch YouTube.

Perhaps I will try again one day as things get easier. For now I am content with having my photos and videos automatically synced up to iCloud/Google Photos.


I tried once or twice in the early 2010s to set up a home server and had a similar experience to what you describe. Stuff would break and I wouldn't want to spend time fixing it.

I think part of the excitement I'm feeling is that the ecosystem today feels way more stable and mature than it did a decade to a decade and a half ago. Home Assistant, Jellyfin, TrueNAS, and a few other things have all pretty much run themselves for me with almost no downtime (other than one blackout that happened while I was traveling and drained my UPS) for the past nine months. There's tinkering to get it all up and running, but way less maintenance than I remember in the past.


I am curious about what your setup was, I have several systems, not sure I would call it a homelab but I rarely have to do anything. I am using Truenas for my ZFS storage and I have a few NUCs to run extra QoL services.

The only time I do anything with this stuff is when I want to upgrade (which is very rare) or add something. My NAS solution is a custom mini-ITX I built 8 years ago which I feel has more than paid for itself. I have long stopped chasing the latest and greatest because most of what has been produced in the last decade is very usable.

Very wary of going cloud, as I can't as easily control costs.


What are the hard parts of hosting from home? What solutions have you been using?


>at least one member who can run a server

It may be highly illogical, but maybe by shooting for zero it would be possible to bat 1.000?

I do everything it takes so that the "extended family" site just works after I leave, as long as the "operator" can keep track of their USB sticks.

Scrap PCs being used as media servers have no internal drive.

Boots to the stick containing the server app.

Accesses media on a second stick containing the files.

Hotplug the media stick to emulate game-cartridge/VCR-cassette convenience.

Upon server failure or massive update, replace that particular stick with a backup or later version, or in the worst case get another scrap PC.

I know, easier said than done :/
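For the hotplug media stick, one common approach is a single fstab line that mounts whatever stick carries a known filesystem label. This is only a sketch: the label name, mount point, and filesystem type here are assumptions, and details vary by distro.

```ini
# /etc/fstab (sketch): mount whichever stick carries the LABEL=MEDIA filesystem.
# "nofail" lets the machine boot without the stick present;
# "x-systemd.automount" defers the mount until /media/media is first accessed,
# which plays well with hotplugging.
LABEL=MEDIA  /media/media  ext4  ro,nofail,x-systemd.automount  0  0
```

Label the stick once (e.g. `e2label /dev/sdX1 MEDIA`, device name being an example), and any backup stick carrying the same label becomes a drop-in replacement, which matches the cartridge/cassette idea above.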


A medieval castle could be defended by surprisingly few people, but not by zero people. And a castle full of people who don't know how to fortify and maintain its defenses eventually becomes someone else's castle.

Even if you aim for zero required sysadmins in the short term after your own passing, I think the computers you leave behind will run into a similar case of the same general problem in the long term: there's no such thing as an entropy-proof system. Castle walls erode and weapons rust if there are no skilled people to maintain them. Computer components slowly break down from ordinary wear and tear. Software configurations become obsolete, lose the ability to talk with other software, and grow less secure as vulnerabilities are discovered over time. If there are no skilled people at all to maintain a familial network-citadel, it will eventually break down and fall into disuse.


You have hit the nail on the head.

Especially after one's passing: eventually it's like the siege of the Alamo. When the walls finally are breached, there's not a soldier left who can do any good.

It's shoestrings anyway and amazing it's working for now :)


There are tabletop wargames for the consumer/hobby market that do try to include various kinds of friction in the gameplay. Both the classic Memoir 44 [1] and the Undaunted series [2] have you issue orders from a hand of cards drawn from a deck.

Memoir 44 divides the board into three segments (a center and two flanks), and your cards to issue orders always apply to a specific segment (e.g. the right flank). Lacking the cards in your hand to issue the orders you might want simulates those orders not making it to the front lines.

Undaunted explicitly has Fog of War cards which you can't do anything with. They gum up your deck and simulate that same friction of imperfect comms.

Atlantic Chase [3], a more complex game, uses a system of trajectories to obscure the exact position of ships and force you to reason about where they might be on any given turn. The Hunt [4] is a more accessible take on the same scenario (the Hunt for the Graf Spee) that uses hidden movement for its friction.

I don't know how many of these ideas leap across to computer games, but designing friction into the experience has been a part of tabletop wargames for a long time.
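The hand-of-cards friction described above is easy to sketch in code. The deck composition and hand size below are made up for illustration, not taken from either game:

```python
# Toy model of order friction: orders are drawn blind from a deck keyed to
# board sections, so you may simply hold no card for the flank you need.
import random

SECTIONS = ["left flank", "center", "right flank"]

def deal_hand(hand_size=5, deck_size=40, rng=random):
    # Simplified deck: each card commands exactly one section (an assumption).
    deck = [rng.choice(SECTIONS) for _ in range(deck_size)]
    return rng.sample(deck, hand_size)

def can_order(section, hand):
    # Friction: without a matching card, the order "never reaches the line."
    return section in hand
```

The interesting design consequence is that the friction is statistical rather than scripted: over many turns you'll usually get the card you need, but never reliably on the turn it matters most.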

[1]: https://boardgamegeek.com/boardgame/10630/memoir-44

[2]: https://boardgamegeek.com/boardgame/268864/undaunted-normand...

[3]: https://boardgamegeek.com/boardgame/251747/atlantic-chase-th...

[4]: https://boardgamegeek.com/boardgame/376223/the-hunt


A license for a live event stream makes sense, but libraries having to buy special licenses to loan out physical books seems completely nuts to me. Does the UK not have an equivalent to first sale doctrine for physical goods?


Going by the Wikipedia article and included map, use of comma versus period as decimal separators is roughly an even split:

https://en.wikipedia.org/wiki/Decimal_separator

https://commons.wikimedia.org/wiki/File:DecimalSeparator.svg


Seems geographically split, but I wonder what the actual population split is. Most of the top 10 countries by population use the period as the decimal separator; only Brazil, Russia, and Indonesia don't.

Maybe someone with a CSV of the world populations and a CSV of the countries broken down by their separator can do that comparison.
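A minimal sketch of that comparison in Python. The file names and column headers ("country", "population", "separator") are assumptions about what the hypothetical CSVs would contain:

```python
# Sum world population by decimal-separator convention, given two CSVs:
# populations.csv (country,population) and separators.csv (country,separator).
import csv
from collections import defaultdict

def population_by_separator(pop_csv, sep_csv):
    with open(pop_csv, newline="") as f:
        pop = {r["country"]: int(r["population"]) for r in csv.DictReader(f)}
    totals = defaultdict(int)
    with open(sep_csv, newline="") as f:
        for r in csv.DictReader(f):
            totals[r["separator"]] += pop.get(r["country"], 0)
    return dict(totals)
```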


There's definitely a big distribution disparity. 11 of the 15 most populous countries use the period for decimals.


Most of Europe is in the fat long tail though, as those countries are counted individually.


My recollection is that health care and other, similar benefits were incentives employers could offer to attract and retain talent during a period when FDR had wage and price controls in place. That is: healthcare was exempt from wage controls, so you could offer a healthcare plan or an improved plan in lieu of a raise.

Then, employers noticed, "you mean offering them healthcare makes it harder for our employees to leave? And public perception of healthcare means it will probably always be a tax advantaged way to allocate our expenses?"

And here we are, eight decades and change later.

Note: I think public perception of healthcare as a moral good in principle is correct. It's just that there's a wicked mal-alignment of incentives which leverages that perception to cement a bad solution in place.


That's basically my understanding of how we got into that mess. There are still significant tax subsidies for employer-provided health insurance, and most people don't realize that the COBRA price is the actual cost of their "cheap" or "free" employer-provided insurance, for the same reason they think their "income tax refund" is "free money" from the IRS rather than their own money that the government held without interest.

The average person is not very smart and half of the population isn't even that smart so of course they fall for these intelligence insulting tricks invented by politicians.


Most people are otherwise able to navigate their life just fine, but they struggle with systems that are ridiculously complex and opaque. Industrial society reduces your direct agency and forces you to rely on more sophisticated forms of influence.


> Note: I think public perception of healthcare as a moral good in principle is correct. It's just that there's a wicked mal-alignment of incentives which leverages that perception to cement a bad solution in place.

That is everywhere now. Birth, education, courtship/marriage, employment, ... all the way to death - everything is analyzed, pessimized, and stripmined for every last penny. "Enshittification" is all over HN and with good reason.


> The proposed solution that at least 54% of this village have recognized is that the internet has really eroded societal bonds.

Just under 11% of the village support the measure strongly enough to vote for it, not 54%. Only 20% of the local electorate voted at all, and 54% of that 20% is about 10.8%.

On that basis, it's probably a good thing the measure is unenforceable.


I think that as long as a) the effects of the measure were well communicated and b) there were no significant impediments to actually voting, then the percent that vote is immaterial and the result of the vote should stand for everyone.

After all this is the way the US conducts its elections. Many elections have very low turnout, yet the results are binding.


I think there's a difference between the result being binding, and that the vote stands for everyone.

Abstaining leads to your opinion not being represented, which is to be expected. It doesn't follow that other people's votes therefore represent you.

For the vote mentioned in the article, which is non-binding, unenforceable, and ridiculous in its scope, I'd guess that most abstainers' real opinion on the matter is that it's a waste of their time to participate.


People have their own reasons for silence, even when a referendum is nonbinding. Interpreting that silence as consent or approval is the moral equivalent of a bulldozer.


> After all this is the way the US conducts its elections.

Democracy is working great here and everybody is really happy about it.


> If you don’t want to have what an AI generates then don’t use it.

The author is writing about the sort of AI outputs that other people and organizations push at him: chatbots, generated emails, phony presence at a meeting, and so on. Those use cases are a bit like relatives who send their DNA to untrustworthy companies for analysis: you personally saying no for your own use doesn't actually mitigate, or even affect, the negative externalities imposed on you by widespread general use.

I do agree that, for many use cases, personally opting out of junk generative AI is sufficient. But I'm not looking forward to the world flooded by low quality AI outputs that become impossible to avoid sifting through in all areas of life.


I had a similar realization early after first picking up vim in college: customization of any tool eventually hits a point of diminishing returns beyond which further alterations reduce your ability to use the tool in its default state. It's an insight I've found applies to almost any tool... From software to hardware and beyond.

Master the default behavior of a tool, and then improve your effectiveness with customization, but not so much customization that you can no longer use the tool effectively in its unmodified state. Sometimes you have to use other people's tools, and it's important to still work effectively when you do.


I agree with this very much! I learnt it the hard way: my experience with many powerful tools, Vim included, was to learn the basics first, customize the tool to the point where I couldn't recognize it anymore, and then finally strip it back down to the basics, keeping only the configuration that doesn't prevent me from using the raw features.

I used to manage complex "dotfiles" and scripts to configure a new computer, etc. I still technically do, but they are much simpler now. I just don't want to spend my time configuring stuff anymore, and I appreciate the out-of-the-box experience much more. This by itself became a criterion when choosing new tools, frameworks, etc.


I used to think like that, but 10 years ago I decided to create a git repo for my .emacs.d and started configuring beyond the most trivial settings (including all dependencies in the repo too). Diminishing returns? Not sure how to measure them. Editor configuration was never something I did because I thought it would save time, as some seem to imply, but rather to remove sharp edges and make the experience of editing files nicer. With everything a quick git clone away, it is rare that I have to work without my configuration.


There are various third-party and sometimes first-party clipboard managers that give you more visual access to the things in the clipboard.

But you hit a steep uphill conceptual climb for the average user pretty quickly: the better solutions for moving data between applications end up resembling type-agnostic virtual 'registers' or Unix-style pipes for more than just text, but these abstractions seem to be too complicated in practice for anyone who isn't a power user.

PowerShell actually implements the latter solution, a kind of pipes-with-objects IIRC. And of all things the late Terry Davis' TempleOS has an ability to treat all kinds of things as text, render them, and pass them around.
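The "type-agnostic registers" idea can be sketched in a few lines. This is a toy model of the abstraction, not any real clipboard API:

```python
# Named registers that carry a MIME type alongside the payload, so images,
# text, and structured data all move through the same mechanism.
class Registers:
    def __init__(self):
        self._regs = {}

    def yank(self, name, mime, payload):
        # Store (type, data) under a register name, vim-style.
        self._regs[name] = (mime, payload)

    def put(self, name):
        # Return both the type and the data; the consumer decides how to render.
        return self._regs[name]

regs = Registers()
regs.yank("a", "text/plain", "hello")
regs.yank("b", "application/json", {"answer": 42})
```

The hard part isn't the mechanism; it's exactly what the comment above says: surfacing multiple typed registers to non-power-users without overwhelming them.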


My experience is that self-checkout works extremely well until it... Doesn't. And then you need a human anyway. The "doesn't" only happens to me rarely.

But when something goes wrong it becomes immediately clear that---whether at a grocery store, convenience store, or anywhere else---almost no business deploying self-checkout has a real plan for handling the edge and corner cases where it fails, freezes up, or spits out an error.

Most stores want self-checkout to replace five to ten employees with just one to supervise, but they don't then make sure that the one remaining employee can actually handle the self-checkout's failures and errors.


I had a relative who had a summer job at Ikea monitoring self-checkout:

1) The system and software were unfriendly to both users and staff, especially in error states, as you suggest.

2) Many people needed rudimentary help.

3) There was no time to look for theft due to 1) and 2).

4) A surprising number of couples were arguing with each other.

