
One-man-shop pro fotog here. My macOS Server 5.6.3 on High Sierra is awesome, the last complete version. File server with literally decades and multiple terabytes of raw photos… the ease of the email server is unmatched. Websites, Nextcloud, and InvoiceNinja all just work with minimal PHP tweaking, and all here IN MY OFFICE. I have Linux droplets and am more or less ready to make the move, but I won't until the hardware actually dies. I hope the machine (from 2010) holds out forever, but I do have a spare ready, daily CCC backups, and disks rotated to a fire safe. At least for me, it just works. Knock on wood.



Are you exposing any ports to the internet? The older library versions in High Sierra have numerous security issues of varying severity. You might be able to reduce your risk by disabling server features, but that's a dangerous game of whack-a-mole. (Also, as a fellow self-hosting fotog who never went pro: you should try PhotoStructure! Details in my profile.)


Yeah, no kidding. I wish Apple would open source Server.app so some of the old components could be updated without hacking at it. It was just a perfect all-in-one solution. I will check out PhotoStructure.


Unfortunately there's no data checksumming with APFS. Last time I checked, ages ago, CCC wasn't using a version of rsync that supports checksums. Newish versions of DNG (for a few years now) compute separate checksums for the DNG metadata and the raw image data, but the degree to which applications actually use those to verify file integrity varies.

Silent data corruption via bitrot can be frustrating. Without a regime to prevent it, it spreads into all your backups: typical workflows depend on those backups for eventual migration to new storage, so any corruption gets replicated right along with the good data. You end up backing up the corruption, unwittingly.
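
A minimal way to catch it before it spreads is a checksum manifest you re-verify before each backup run. A sketch in Python (the manifest filename and paths here are just examples, not from any particular tool):

    # bitrot_check.py -- keep a SHA-256 manifest and flag files whose
    # content changed since the last scan, before they hit the backup.
    import hashlib, json, os, sys

    MANIFEST = "checksums.json"  # illustrative name

    def sha256(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def main(root):
        manifest = {}
        if os.path.exists(MANIFEST):
            with open(MANIFEST) as f:
                manifest = json.load(f)
        for dirpath, _, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                digest = sha256(path)
                if path in manifest and manifest[path] != digest:
                    # Could be an edit or corruption -- see below.
                    print(f"CHANGED: {path}")
                manifest[path] = digest
        with open(MANIFEST, "w") as f:
            json.dump(manifest, f, indent=2)

    if __name__ == "__main__":
        main(sys.argv[1])

Of course a bare "CHANGED" can't tell a deliberate edit from bitrot on its own, which is the hard part.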


Few Macs have ECC.

Which was one of the big problems with the original G4 Xserve. Apple added ECC later in the G5 model, but I suspect much of the damage had already been done: most technical people just ignored it after discovering that it didn't have ECC and that the disks were low-end ATA rather than SCSI. With the cluster/HPC version, the stories changed to how unreliable the machines were.

At that point it seems only the diehards were willing to trust their data to Apple server hardware, and it died a couple of years later.


My solution is to create par2 files. It's a manual process, though, and doesn't help when you go back and edit the original pictures. But what's great about par2 is that it lets you actually correct errors, not just detect them.
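
For anyone who hasn't used it, the whole round trip is just create/verify/repair. A sketch of driving par2cmdline from Python (the 10% redundancy level and filenames are only examples):

    # par2_roundtrip.py -- wrap par2cmdline with subprocess.
    import subprocess

    # Create recovery data alongside the originals ("-r10" = 10% redundancy):
    subprocess.run(["par2", "create", "-r10", "photos.par2", "IMG_0001.dng"],
                   check=True)

    # Later, verify integrity:
    subprocess.run(["par2", "verify", "photos.par2"], check=True)

    # If verification reports damage, attempt repair from the recovery blocks:
    subprocess.run(["par2", "repair", "photos.par2"], check=True)

The redundancy level is the trade-off knob: more recovery blocks cost more disk but survive more damage.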


I don’t use APFS on the big spinning drives where the pix live, but it's definitely on my iMacs' and MacBook's SSDs. Yeah, Lightroom has a Validate DNG feature... which of course locks you into Adobe.


PhotoStructure maintains SHAs of all files, but I currently assume the user has updated the file if the SHA changes. PhotoStructure validates files before it imports them to keep corrupted images out of your library.

How do you think I could discriminate between file corruption and the user making an edit to a file?

(Perhaps if the mtime and file size don't change, but the SHA does?)
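
Something like this heuristic, maybe (a sketch; the stored record fields are hypothetical, not PhotoStructure's actual schema):

    # Same mtime + size but a different SHA smells like corruption;
    # a changed mtime suggests a deliberate edit.
    import hashlib, os

    def classify(path, stored):
        """stored: dict with 'sha', 'mtime', 'size' from the last scan."""
        st = os.stat(path)
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        if h.hexdigest() == stored["sha"]:
            return "unchanged"
        if st.st_mtime == stored["mtime"] and st.st_size == stored["size"]:
            return "suspected corruption"  # bytes changed behind the OS's back
        return "user edit"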


In a non-destructive workflow, the image data is never modified. The image data checksum should never change. Rather, the edits are a kind of "edit list" stored in metadata, which can itself be optionally (separately) checksummed. If the metadata is corrupt, it can be discarded, effectively resetting the image back to its original pre-edit state. Yes, you'd lose the edits.

The location of the metadata depends on the application. For DNG workflows, the metadata lives in a separate section of the DNG file, with separate metadata and image-data checksums. For other workflows, the metadata is in a sidecar (a per-image file) or stored in a database managed by the application.
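
For sidecar workflows you can approximate the DNG scheme yourself by checksumming the image file and its sidecar independently. A sketch (the .xmp naming convention varies by application):

    # Mimic DNG's split checksums for a RAW + XMP sidecar pair.
    # If the sidecar digest mismatches, discard just the edits;
    # the original image data stays verifiable on its own.
    import hashlib, os

    def digest(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def pair_digests(raw_path):
        sidecar = os.path.splitext(raw_path)[0] + ".xmp"  # convention varies
        return {
            "image_data": digest(raw_path),
            "metadata": digest(sidecar) if os.path.exists(sidecar) else None,
        }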



