Games are a bit different – how latency is measured depends on the game, and the quoted figure typically doesn't account for the rendering or local state update required to display correct data, may not account for the server's computation of the new state, won't account for input latency, etc. I agree that it's very noticeable in games, but a quoted 20 ms network latency could correspond to anywhere from 40 to 200 ms of effective latency, so I don't think they're a great example to use here.
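To make that concrete, here's a rough back-of-the-envelope sketch (Python, with made-up component values) of why the quoted network number is just one term in what the player actually feels:

```python
# Illustrative only: why a quoted "20 ms network latency" understates the
# latency the player experiences. All component values are hypothetical.

components_ms = {
    "input sampling / OS":        2,   # hypothetical
    "client simulation + render": 16,  # hypothetical, ~1 frame at 60 Hz
    "network (quoted)":           20,
    "server tick / processing":   15,  # hypothetical
    "display scanout":            8,   # hypothetical
}

effective_ms = sum(components_ms.values())
print(f"effective latency ~ {effective_ms} ms")  # ~61 ms vs the quoted 20 ms
```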
What absolutely everyone is measuring today in games is motion-to-photon latency, with the help of a high-speed camera. This is the only thing that actually matters, along with motion-to-photon over the network (with RTT measured separately), so you can tell how well both the client-side and server-side parts are optimized, how high the server tick rate is, etc. On the server side, different actions such as movement or attacking could even have wildly different latencies, depending on how they're implemented in the code.
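For what it's worth, the camera method boils down to counting frames between the physical input and the first on-screen reaction. A rough sketch, assuming you've already classified each captured frame; the function name, flags, and capture rate here are all hypothetical:

```python
# Rough sketch: estimating motion-to-photon latency from high-speed camera
# footage. Assumes per-frame booleans for "input visible" (e.g. an LED wired
# to the mouse button) and "screen reacted" have already been extracted.

CAMERA_FPS = 1000  # hypothetical 1000 fps capture

def motion_to_photon_ms(input_flags, screen_flags, fps=CAMERA_FPS):
    """Latency in ms between the first frame where the input is visible
    and the first frame where the screen reacts, or None if not found."""
    try:
        input_frame = input_flags.index(True)
        # Only look for the screen reaction at or after the input frame.
        screen_frame = screen_flags.index(True, input_frame)
    except ValueError:
        return None
    return (screen_frame - input_frame) * 1000.0 / fps

# Example: input appears at frame 120, screen reacts at frame 145 -> 25 ms at 1000 fps.
inputs = [False] * 120 + [True] * 80
screens = [False] * 145 + [True] * 55
print(motion_to_photon_ms(inputs, screens))  # 25.0
```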
The most responsive games achieve sub-30 ms motion-to-photon latency locally, with proper hardware.