I think it's both things. Netflix and other platforms don't send lossless streams to you, even at 4K. Plus, you're re-encoding it.
It's like making a VHS copy from another VHS, or creating a new JPEG from another JPEG. There is always a loss of quality.
Well, an uncompressed 1080p stream at 30 fps would be about 1.5 Gbit/s -- a little outside the spec of most people's internet tubes. And uncompressed 4K UHD at 30 fps would be around 6 Gbit/s.
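Quick back-of-the-envelope check of those numbers, assuming 8 bits per channel, 3 channels, and no chroma subsampling (just a sketch; real capture formats often use 4:2:2 or 4:2:0, which lowers this):

```python
# Uncompressed video bitrate: pixels per frame * bits per pixel * frames per second.
# Assumes 8-bit RGB (24 bits/pixel), no chroma subsampling.
def raw_bitrate_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * bits_per_pixel * fps / 1e9

print(round(raw_bitrate_gbps(1920, 1080, 30), 2))  # 1.49 (1080p30)
print(round(raw_bitrate_gbps(3840, 2160, 30), 2))  # 5.97 (4K30)
```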
It helps to capture the Netflix stream uncompressed to remove the extra compression step you'd otherwise get at capture time, and modern encoders are pretty good, I don't think most people would notice on a laptop screen.
On a 40+ inch 4K TV, though, it can be quite noticeable.
The signal remains digital, but it's decompressed into raw frames at several Gbit/s (think how big lossless file sizes get). So it has to be re-encoded, which stacks a second layer of compression artefacts on top of the first, and you can only avoid them by really dialing up the bitrate.
Maybe someone made a video encoder algorithm that’s tuned toward already compressed and decompressed video.
Though I’m in the camp of watching for quality of the story etc. rather than the crispness of the video. If it’s not worth watching in 480p, it’s not worth watching in 4K either.
Capture is not lossless. Think about a photocopy machine, every copy loses a small bit of information. Recapturing video output is a similar situation.
Why? Photocopy is obviously lossy since there is a very noisy digital-analog-digital conversion going on. But a capture card is capturing a digital signal. There should be no loss except for video decoding/encoding artifacts.
You're not understanding how lossy compression encoders work. Try recompressing a JPEG a few dozen times. Or take an MP3, export it from Audacity, open the export, export it as MP3 again a dozen times, and see what it sounds like.
All those artifacts keep getting amplified every time you re-encode, until it's practically just the artifacts. Every time you render and recompress you're losing information; it's lossy compression, after all.
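You can see the effect with a toy model. This is not how JPEG or MP3 work internally (they quantize transform coefficients, not raw samples); alternating quantizer grids just stands in for re-encoding with different codec settings, and shows why the error compounds instead of staying at the single-pass level:

```python
# Toy generation-loss demo: each "encode" snaps samples to a grid, which is
# where a lossy codec throws information away. Alternating grid sizes (8, 7)
# mimics re-encoding with mismatched codec settings.
def lossy_roundtrip(samples, step):
    return [round(s / step) * step for s in samples]

original = [i * 1.37 % 100 for i in range(64)]
gen = original
errors = []
for n in range(12):
    gen = lossy_roundtrip(gen, 7 if n % 2 else 8)
    # track the worst-case drift from the original after each generation
    errors.append(max(abs(a - b) for a, b in zip(original, gen)))

# After one pass the error is bounded by half the grid step (<= 4 here);
# after several mismatched passes it has grown past that bound.
```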
A lot of the time, capture cards return a compressed video stream instead of raw frame data, at least outside professional environments. I don't know many amateur streamers running SDI around their house.
In practice, that difference doesn’t really matter because almost no one is going to store their captured, already-lossy material in a lossless format.
If you know you'll have to recompress and want to avoid unnecessary artifacts, capturing uncompressed makes sense. But beware that uncompressed 8 bpc (non-HDR) 1080p video at 30 fps is about 1.5 Gbit/s, so you'll need roughly 1.3 TB to store your 2-hour capture :)
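The storage figure checks out, again assuming 8-bit RGB with no chroma subsampling (a simplification; real uncompressed capture formats vary):

```python
# 2-hour uncompressed capture at 1080p30, 8-bit RGB (24 bits/pixel)
bits_per_frame = 1920 * 1080 * 24
bitrate_bps = bits_per_frame * 30        # ~1.49 Gbit/s
seconds = 2 * 60 * 60
total_tb = bitrate_bps * seconds / 8 / 1e12  # bits -> bytes -> terabytes
print(round(total_tb, 2))  # 1.34
```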
Right: the stream is lossily encoded to get it from Netflix to you, decompressed for the capture card (so a lossless capture of a lossy source), and then pushed back through x264/x265, which is lossy compression on top of lossy compression. Just because there's a capture card in the middle doesn't stop it going through multiple lossy steps.