Yes. I’m not blaming the TV shows and movies for portraying what they do rather than bleak reality, and I’m not blaming people for watching what they want to watch.
I’m just arguing (purely descriptively) that the prominent depictions in US movies and TV shows may well have given many people an impression of the US that is more glorious than reality, and that this discrepancy is likely greater for the US than for most other countries.