With distributed version control systems such as Git or Mercurial, when you clone a repository you get the entire history of that repository (or of a selected branch). This means that if you place large media files directly in the repository, every clone will contain each and every revision of those files. Over time, this causes an enormous amount of bloat and slows work on the repository down to a crawl. Cloning a repository several dozen gigabytes in size is no fun, I can tell you.
Centralized version control systems such as Subversion don't have this problem (or at least have it to a much lesser extent), because as a user you only download a single revision of each file when you check out the repository.
Extensions like git-media, git-fat and now git-lfs solve this issue by storing only references to large media files inside the Git repository, while storing the actual files elsewhere. With this, you only download the revision of a large file that you actually need, when you need it. It's a sort of hybrid solution, somewhere in between centralized and decentralized version control.
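To give a rough idea of what this looks like in practice with git-lfs (assuming you have the git-lfs client installed; the file pattern below is just an example):

```
# Set up the Git LFS hooks for your user/repository (once)
git lfs install

# Tell git-lfs to manage files matching this pattern;
# the rule is recorded in .gitattributes, which is versioned as usual
git lfs track "*.mp4"

# From here on, matching files are committed like any other file
git add .gitattributes
git add intro.mp4
git commit -m "Add intro video via Git LFS"
```

What actually ends up in the Git history for `intro.mp4` is a small text pointer (a version line referencing the LFS spec, a content hash and a file size), while the real file content is uploaded to a separate LFS store and fetched on demand.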