"In TCP connections, the large bandwidth-delay product of high latency connections, combined with relatively small TCP window sizes on many devices, effectively causes the throughput of a high latency connection to drop sharply with latency" --Wikipedia
It does equal bandwidth until everything is jumbo frames UDP
If you had pasted a larger part of the quote, you would have included the part about increasing the window size (e.g. window scaling and selective ACKs, which almost everything out there supports) and the mention of satellite links as the example of high-latency conditions. And notwithstanding that, we are talking about additional latency of microseconds against a typical delay of at least 30 milliseconds. "drop sharply" just doesn't apply here.
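To put rough numbers on that (my own back-of-the-envelope sketch, not from the quote): a single window-limited TCP flow tops out at roughly window_size / RTT, so window scaling matters far more than a few extra microseconds of inspection delay.

    # Back-of-the-envelope only: throughput ceiling of one window-limited TCP flow.
    def tcp_ceiling_mbps(window_bytes, rtt_ms):
        return window_bytes * 8 / (rtt_ms / 1000) / 1e6

    print(tcp_ceiling_mbps(65_535, 30))            # legacy 64 KiB window: ~17 Mbps
    print(tcp_ceiling_mbps(4 * 1024 * 1024, 30))   # 4 MiB scaled window: ~1.1 Gbps
    print(tcp_ceiling_mbps(4 * 1024 * 1024, 30.5)) # +0.5 ms of DPI delay: still ~1.1 Gbps

The last two lines are the point: once window scaling is in play, adding fractions of a millisecond barely moves the ceiling.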
If you want to argue against DPI, that's fine, but "it'll make the Internet slower" comes off as whining, and you're fighting against the exponential effects of Moore's law. A quick Google search tells me there are several products that'll do line-rate DPI at 10 Gbps.
There are much better arguments to be made against DPI, such as privacy, the slippery slope to a walled garden, or just plain unfairness.