In Spack [1] we can express all these constraints for the dependency solver, and we also try to always re-cythonize sources [2]. The latter is because bundled cythonized files are sometimes forward-incompatible with newer Pythons, so it's better to just regenerate them with an up-to-date Cython.
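For a concrete picture, here's a minimal sketch of how such a build-time bound is expressed in a Spack package recipe; the version ranges are illustrative assumptions, not copied from Spack's real py-pyyaml recipe:

    # Sketch of a Spack package recipe (package.py); bounds are illustrative.
    from spack.package import *


    class PyPyyaml(PythonPackage):
        """YAML parser and emitter for Python."""

        homepage = "https://pyyaml.org"
        pypi = "PyYAML/PyYAML-6.0.tar.gz"

        # Build-only dependency with explicit bounds. Spack ranges are
        # inclusive, so @0.29:2 admits Cython 0.29.x through 2.x and
        # excludes 3.x; the solver can never pick an incompatible release.
        depends_on("py-cython@0.29:2", type="build")
        depends_on("py-setuptools", type="build")

Because the bound lives in the recipe rather than in a bundled .c file, bumping it is a one-line change once a newer Cython is verified to work.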
That's what PyYAML does as well. It uses PEP 518 [1] to specify the build dependencies, which for PyYAML include Cython [2]. It's just that for previous releases there was no upper bound, so pip and other tools simply selected the latest version, which was incompatible. In the past PyYAML shipped the cythonized .c files in the sdist, but as of 5.4.0 it went the PEP 518 route, which ensures the client cythonizes them when installing from the sdist.
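A sketch of what a bounded PEP 518 build-system table looks like; the exact pins here are assumptions for illustration, not copied from PyYAML's actual pyproject.toml:

    [build-system]
    requires = [
        "setuptools",
        "wheel",
        # The upper bound keeps pip from grabbing a Cython release the
        # sources were never tested against (e.g. Cython 3.x).
        "Cython >= 0.29.4, < 3.0",
    ]
    build-backend = "setuptools.build_meta"

With this in place, pip builds the sdist in an isolated environment, resolving Cython within the stated range before cythonizing.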
My impression is that the Python ecosystem rarely specifies upper bounds on dependencies, even for packages that follow semver.
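For comparison, a hypothetical PEP 621 metadata snippet with semver-style caps on runtime dependencies (package names and versions made up for illustration):

    [project]
    name = "example"
    version = "0.1.0"
    dependencies = [
        "requests ~= 2.28",    # compatible-release operator: >= 2.28, < 3
        "numpy >= 1.21, < 2",  # explicit cap at the next major version
    ]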
In the Julia ecosystem it's the default, and you basically have to release a new patch version to update the compat bounds on your dependencies. This is much more stable. It works because the process is mostly automated: a dependency releases a new version, a bot opens a PR on your repo updating the compat bound, and you just merge it.
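A sketch of the [compat] section of a Julia Project.toml (names and bounds illustrative); entries default to semver caret semantics, and the bot's PR is just a bump to one of these lines:

    [compat]
    # "0.21, 1" allows any 0.21.x or 1.x release and nothing newer.
    JSON = "0.21, 1"
    julia = "1.6"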
Neat! I had no idea Spack could be used for cythonizing dependencies. I use it to manage my local compiler/CUDA/Trilinos stack but never considered it for non-C++ things.
[1] https://github.com/spack/spack/
[2] https://github.com/spack/spack/pull/35995