The names for the various bit rates vary between authors and standards.
I believe that the least confusing names would be:
Data bit rate = the rate at which the data bits provided by the user are sent
Signaling bit rate = the rate at which bits are sent over the physical communication medium
The two rates are not the same, because the user data bits are encoded in some way before being sent. The signaling bit rate has no importance except to those who design communication equipment; for the users of that equipment, only the data bit rate matters.
The data bit rate is equal to the signaling bit rate multiplied by the ratio between data bits and the corresponding encoded bits.
5 Gb/s corresponds to 625 MB/s, but for a signaling bit rate it is completely useless to convert bits to bytes, because groups of 8 bits on the physical communication medium do not normally correspond to bytes of the data provided by the user. Only the data bit rate is meaningful to convert to a data byte rate.
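The relation can be sketched in a few lines of Python (the helper name is mine; the example uses USB 3.0's 8b/10b line coding, where every 8 data bits are sent as 10 encoded bits):

```python
# Data bit rate = signaling bit rate * (data bits / encoded bits).
def data_bit_rate(signaling_bit_rate, data_bits, encoded_bits):
    return signaling_bit_rate * data_bits / encoded_bits

# USB 3.0 example: 8b/10b line coding, 5 Gb/s signaling on the wire.
signaling = 5_000_000_000
data_rate = data_bit_rate(signaling, 8, 10)
print(data_rate / 1e9)    # 4.0 Gb/s of user data
print(data_rate / 8e6)    # 500.0 MB/s as a data byte rate
```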
> Data bit rate = the rate at which the data bits provided by the user are sent
> Signaling bit rate = the rate at which bits are sent over the physical communication medium
There is a third one: in addition to the line coding, there's the message framing (at the logical level). E.g. USB 3 has a signalling rate of 5 Gb/s and a raw data rate of 4 Gb/s, but it has a theoretical effective data rate of around 3.2 Gb/s (400 MB/s).
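A quick sanity check of those three numbers (the ~20% framing overhead is the rough figure quoted above, not an exact constant):

```python
signaling = 5.0              # Gb/s on the wire
raw = signaling * 8 / 10     # 8b/10b line coding removes 20% -> 4.0 Gb/s
effective = raw * 0.8        # ~20% framing/protocol overhead -> ~3.2 Gb/s
print(effective)             # ~3.2 Gb/s
print(effective / 8 * 1000)  # ~400 MB/s
```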
Can you confirm the rule to be used?
Raw Speed = Nominal / Encoding
UMS Speed = Raw / UMS overhead
In the case of 3.0 that would be:
Nominal = 625 MB/s
Raw = 625 MB/s - 20% = 500 MB/s
UMS = 500 MB/s - 20% = 400 MB/s
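Working that rule through in code (assuming the two ~20% overheads above; note the units come out in decimal MB/s, not MiB/s, since 5 Gb/s ÷ 8 is exactly 625 million bytes per second):

```python
nominal = 5e9 / 8 / 1e6  # 5 Gb/s nominal -> 625.0 MB/s
raw = nominal * 0.8      # minus 20% 8b/10b encoding overhead -> ~500 MB/s
ums = raw * 0.8          # minus ~20% UMS protocol overhead -> ~400 MB/s
print(nominal, raw, ums)
```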