For hard links to work you still need to know that the brand-new layer you just downloaded is the same as something you already have, i.e. you still need a deduplication step.
How? Well, the simplest way is to compute the digest of the content and look it up, oh wait :thinking:
I’m not sure what point you’re trying to make. Are you assuming that a layer would be transferred in its entirety even when the majority of its contents are already available locally? The purpose of bringing up hard links was to point out that when de-duplication is done at per-file granularity rather than per-layer granularity, it doesn’t introduce a run-time overhead.
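For concreteness, here’s a minimal sketch of that per-file approach in Go (not from the thread; `storeRoot`, `dedupFile`, and the store layout are all hypothetical): compute the content digest, look it up in a local content-addressed store, and hard-link instead of keeping a second copy, so reads afterwards go through an ordinary inode with no extra indirection.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"io"
	"os"
	"path/filepath"
)

// storeRoot is a hypothetical content-addressed store: one canonical
// file per unique SHA-256 digest. Assumes it lives on the same
// filesystem as the unpacked layers, since hard links can't cross
// filesystem boundaries.
const storeRoot = "/var/lib/example-store"

// dedupFile replaces path with a hard link into the store. If the
// digest is already present, the freshly downloaded copy is discarded
// and the existing inode is reused; otherwise this file becomes the
// canonical copy.
func dedupFile(path string) error {
	f, err := os.Open(path)
	if err != nil {
		return err
	}
	h := sha256.New()
	if _, err := io.Copy(h, f); err != nil {
		f.Close()
		return err
	}
	f.Close()

	canonical := filepath.Join(storeRoot, hex.EncodeToString(h.Sum(nil)))
	if _, err := os.Stat(canonical); os.IsNotExist(err) {
		// First time we see this content: move it into the store.
		if err := os.Rename(path, canonical); err != nil {
			return err
		}
	} else if err != nil {
		return err
	} else {
		// Duplicate content: drop the new copy, reuse the stored one.
		if err := os.Remove(path); err != nil {
			return err
		}
	}
	// Either way, path ends up as a hard link to the canonical copy.
	return os.Link(canonical, path)
}

func main() {
	if err := dedupFile("layer/usr/bin/example"); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}
```

The digest computation here is the deduplication step from the earlier reply; the point of contention is only whether that step has to run over a whole layer you already mostly have, or over individual files as they land.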