- Working with large files
- Getting only the needed files from the server, not all of them
- Not wasting space with .svn/.git copies of the files
- Reasonably fast.
They have quite a few customers in the video game industry because of their support for huge projects and large files. We are using it for a different kind of application with similar requirements (hundreds of thousands of files, sizes up to the GB range per file). Perforce is well suited for the job and has good 24/7 support.
Is there a reasonable hosted option for Perforce? A quick search turned up Assembla[1]; I haven't heard of them, so I don't know how reputable they are.
GitHub has just become so ubiquitous, you start to forget other source control systems are out there. If I wasn't still using SVN at work, I'd be in git all the time.
Aside from what was mentioned, you can also self-host Perforce at a relatively low cost (provided you don't need more than 20 users, as that's all Perforce's free license supports).
I personally host a Perforce server on Google Compute Engine for around $15 a month (g1-small + whatever size persistent disk you want). It's relatively easy to set up (they provide package repositories for Ubuntu and Fedora, IIRC), and configuration is largely interactive, either at setup time or through the P4Admin tool (which lets you very easily set up permissions, depots, etc).
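For anyone curious, the setup on a fresh Ubuntu VM is roughly the following (package names and paths are from memory, so double-check against Perforce's current docs, and substitute your release codename for "xenial"):

    wget -qO - https://package.perforce.com/perforce.pubkey | sudo apt-key add -
    echo "deb http://package.perforce.com/apt/ubuntu xenial release" | \
        sudo tee /etc/apt/sources.list.d/perforce.list
    sudo apt-get update && sudo apt-get install helix-p4d
    sudo /opt/perforce/sbin/configure-helix-p4d.sh   # interactive: server name, port, admin user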
I know that isn't fully managed like GitHub and doesn't provide issue tracking, wikis, etc, but I've found it fairly easy to deal with if you know a bit about setting up a Linux VPS.
Assembla indeed offers hosted Perforce (and hosted Git, and hosted Subversion, and even all three in the same workspace), but only as part of their collaboration suite. To get your money's worth, you'd also have to buy into their wiki and ticketing system (which does integrate with whatever repository you choose) and planning tools.
Which aren't bad, but may not be what you're looking for.
Git Fusion is worth checking out for programmers who want to use Git against a shared Perforce repo. I only discovered it this week, but it lets me pull from a Perforce repo as a Git origin :)
Yup, Perforce is the de facto solution to this. We had a similar setup where the art sync was ~600 GB (once again, views are fantastic for keeping on-disk repo size down).
I haven't had a chance to use git-fusion "in anger" yet, but I hear lots of good things about treating a p4 server as a git endpoint, letting devs work in a DVCS manner while keeping all the good parts of Perforce.
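The day-to-day workflow is supposed to look like plain Git (host and repo names here are made up; Git Fusion maps repos to depot paths via per-repo config on the server):

    git clone git@gitfusion.example.com:my-game   # clone a Git view of the depot
    cd my-game
    # ...edit and commit locally as usual, then:
    git push origin master   # Git Fusion replays the commits as Perforce changelists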
Google has talked extensively about how they do Source Control, and it's freaking brilliant:
Perforce for source control, plus a FUSE file system. You configure it to point at a revision (or at head) of a branch (almost everyone is always on the main branch). It then makes a drive (or folder) look like the Perforce repository, but read-only, caching files as you read them. If you then create, modify, or delete a file, it stores those changes locally, but makes them appear to be in the same directory as the rest of the repository.
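The core idea is just a read-through cache with a local copy-on-write overlay. A toy sketch of that bookkeeping in Python (emphatically not Google's code, and not a real FUSE driver):

    import os, shutil

    class OverlayView:
        """Reads fall through to a cached copy of the depot; writes and
        deletes stay local but appear in place."""
        def __init__(self, remote_root, cache_root, overlay_root):
            self.remote = remote_root    # stand-in for the Perforce server
            self.cache = cache_root      # files land here on first read
            self.overlay = overlay_root  # local edits live here
            self.deleted = set()         # paths "deleted" locally

        def read(self, path):
            if path in self.deleted:
                raise FileNotFoundError(path)
            # Local edits shadow the cached depot copy.
            for root in (self.overlay, self.cache):
                local = os.path.join(root, path)
                if os.path.exists(local):
                    with open(local, "rb") as f:
                        return f.read()
            # First read: fetch from the depot and cache it.
            dst = os.path.join(self.cache, path)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copyfile(os.path.join(self.remote, path), dst)
            with open(dst, "rb") as f:
                return f.read()

        def write(self, path, data):
            # Writes never touch the depot; they go to the overlay.
            dst = os.path.join(self.overlay, path)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            with open(dst, "wb") as f:
                f.write(data)
            self.deleted.discard(path)

        def delete(self, path):
            self.deleted.add(path)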
You're looking for "bup". It's basically a special flavor of git that works at the block level rather than the file level, so incremental revisions only need to store the blocks that changed. It also has a bunch of other optimizations for storing a large number of potentially large files.
Because of the nature of bup's solution, there are some operations that haven't been implemented because they would be prohibitively expensive. Pruning old files for example -- the equivalent of git's "filter-branch" just doesn't work.
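Basic usage looks like this (the directory name is just an example):

    bup init                      # creates the ~/.bup repository
    bup index ~/assets            # scan the tree for changes
    bup save -n assets ~/assets   # snapshot; only new/changed blocks are stored
    bup ls assets/latest          # browse the most recent snapshot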
The thing everybody glosses over is the usability issue: Git is a nightmare for anybody who's not a programmer to actually use day-to-day.
(I am a programmer, and I still find Git to be nightmarish. Sadly our company adopted it because it's trendy, not because they did any actual research or study about our source control needs-- the lack of centralized file locking bites us in the ass every day.)
The reason they stuck with SVN that most interests me is that SVN has some GUI clients that don't completely suck. Sadly, not true of Git. At least not on Windows.
Git definitely has a higher learning curve, but there are some very obvious advantages compared to SVN. I'd recommend SourceTree for a Windows GUI that doesn't suck. https://www.sourcetreeapp.com/
If you think SourceTree doesn't suck, I really don't know what to say. It is the laziest kind of GUI: they literally just wrapped the CLI application in this monstrosity, then made a button in the UI for literally every commandline switch.
No thought or organization, and certainly no usability study, went into SourceTree.
I suppose I shouldn't really be recommending git GUIs, as I mostly use the command line. I use SourceTree on occasion to get a better perspective when browsing the history.
Git is definitely a tool of the sort that you need to learn before you can use it well. Once you do, it's not really nightmarish at all, it's quite convenient.
I would never argue that it is user friendly though.
But for non-programmers, I would say that user-friendliness is very nearly everything. They are never going to keep the "simple" underlying model of a directed acyclic graph in their heads to let them intuit how to work with it. They need an interface that baby-steps them through the process and makes it obvious what to do at every turn.
git-annex, git-lfs, and Mercurial's largefiles all approach this problem in more or less the way you want. GitHub's fairly new git-lfs might be the easiest to use.
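For git-lfs, the setup is pleasantly small (the file patterns and names here are just examples):

    git lfs install                  # one-time: wires up the git hooks
    git lfs track "*.psd" "*.wav"    # records patterns in .gitattributes
    git add .gitattributes
    git add hero.psd
    git commit -m "add hero art"     # the blob goes to LFS storage; the repo keeps a small pointer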
It's strange. We have so many widely available tools to keep a collaborative code history, but when it comes to art assets (images/music/sound/3D models/videos) the general consensus is just dumping everything into a central repository (svn/perforce) or use some kind of large file extension which does basically the same thing.
Visualising changes, multi-checkouts, and merging are all impossible.
Seems ripe for disruption? Or will this stay an unsolved problem forever?
My startup tried to solve exactly this problem in the Electronic Design Automation domain. We provided VCS and compare functionality focused on the needs of this industry.
We pivoted. The main issue we faced is the way electrical engineers work, and how difficult it is to change that. It was incredibly hard to explain to these people how to properly use a VCS. We focused on a simple interface and tried to optimize for the way electrical engineers work. No chance!
That depends on the file type. Several version control clients can already detect image files for example and show a visual diff between revisions. There is still a lot of room for improvement and standardization however.
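Git, for instance, lets you plug in a diff driver per file type; a common recipe (assuming exiftool is installed) is to diff image metadata as text:

    # .gitattributes
    *.png diff=exif
    *.jpg diff=exif

    # .git/config (or ~/.gitconfig)
    [diff "exif"]
        textconv = exiftool

You only see metadata changes (dimensions, timestamps, etc.) rather than pixels, but it beats a wall of binary noise.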
The main problem is not diff but merge. You currently can't merge changes to images, textures etc.
The whole advantage of a DVCS like git is that it allows people to work in parallel and merge effortlessly. The day we can merge Photoshop, Maya, and 3D Studio files, a central repository for art files will not make sense any more. But that day has not come yet, AFAIK.
What kind of artist work can you practically distribute between multiple people and merge? Most things like this are broken into pieces and each piece is tracked separately and assembled later.
I think this is a big mistake programmers make when trying to create a VCS for other industries. A lot of other fields with binary files don't care about merging as much. Take electronic design automation: you track every part's history, and even every module on a board or die layout. But practically, it's difficult to have two people productively work on the same small piece simultaneously.
[edit] I also think a bunch of the reasons programmers require merge are because code is organized into files. Organization into files is arbitrary and often doesn't correlate particularly well with logical structure.
I'm working on something new in this field and would love to hear from people with interesting use-cases (and tales of woe about current tools).
Sadly I don't think diffing & merging of art assets will ever become viable and locking can only work if your assets are controlled at an OS level. The simplest solution is to hide a file if it's locked.
Perforce is not good at dealing with large assets: one artist uploading a single video file can grind 300 other people to a halt. I know this from bitter experience. You can solve it to a degree using proxy servers, but it's tricky and expensive to get right.
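A Perforce proxy (p4p) at each site is the usual mitigation; the invocation is roughly this (host names made up, check the flags against your version):

    p4p -p 1999 -t central.example.com:1666 -r /var/cache/p4p -d
    # -p listen port, -t upstream server, -r local cache dir, -d run as daemon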
FUSE is interesting, but doesn't work on Windows. WebDAV is old and cranky and half broken. Anything that saves immediately will lock your editing software up while the file travels over the network (and with art files that can be minutes! (think video)).
Dropbox is the strongest contender here, as they allow a quick save followed by a slow sync. Sparkleshare has the right idea and a working Windows client (though it uses Git behind the scenes, so it isn't really suitable for video assets). Camlistore is an exciting technology based on the Plan 9 file system (and Git); it should cope with large files in theory, though I haven't stress-tested it. It's still young, and there isn't a good visual client yet.
I'm coming at this problem from the world of D.A.M. (yuck) but worked for years in the AAA games industry (many, many giant art assets). Traditional DAMs have never been any good at dealing with assets still in development. Hopefully we can change that; however, there are so many problems to overcome (see above) that it's unsurprising no one has managed it yet.
Drop me a line if you have any useful/interesting use cases.
Our artists used Alienbrain at a previous job, while devs used p4. Except for the obvious annoyance of having to sync two separate repositories, it worked well, and neither side could be convinced to switch.
One possible solution is online editing, like https://clara.io or Google Docs. These tools have version control built in, and multiple people can simultaneously edit.
You also avoid the large asset problem by always keeping the large assets in the "cloud".
Perhaps somebody knows better than I do, but I'm pretty sure Unity only make the tools, which is a very different proposition! They will have the large files, but not the iteration.
Storing large files somewhere outside the working copy is only part of the solution. Working with artists is going to be a bit painful unless you have centralized file locking, or some other self-managing means of preventing two people modifying the same file by accident. In theory, you could use merge tools for the binary files that artists manipulate, but in practice, these tools don't seem to exist.
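For what it's worth, git-lfs has since grown a basic centralized locking verb (the file path here is made up, and your LFS server has to support the locking API):

    # .gitattributes -- "lockable" makes checkouts read-only until you hold the lock
    *.psd filter=lfs diff=lfs merge=lfs -text lockable

    git lfs lock art/hero.psd     # claim the lock on the server
    git lfs locks                 # see who holds what
    git lfs unlock art/hero.psd   # release it when you're done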
(I can't speak for hg - but an issue with git is that you really have to have a good mental model of how it works to use it effectively. This is something programmers like - or, failing that, are at least practised at doing. But it's really not the sort of thing artists seem to generally enjoy. Certainly very few of those I've worked with...)
> Working with artists is going to be a bit painful unless you have centralized file locking, or some other self-managing means of preventing two people modifying the same file by accident.
I am convinced that file locking is not the ultimate solution, even for artwork. Someone could decide to change the colours of some character while someone else could decide to redraw her arms. Both changes make sense together.
What we need is specialised diff and merge tools for artwork, which nearly any VCS already lets you plug in. Even GitHub has a web UI for diffing images. File locking is a workaround with other well-known problems. Merging divergent work is not any more annoying for artwork than it is for code.
Pixelapse seems to be presenting a compelling case on what VCS for design could look like:
> I can't speak for hg - but an issue with git is that you really have to have a good mental model of how it works to use it effectively.
This is widely touted as a huge difference between hg and git. You don't have to understand hg's revlog format to use hg, but you need to have a solid understanding of blobs, trees, commits, refs, and hashes to understand git.
What you do need to understand for both is that someone can have modified the same file that you're working on while you're working on it, and it is OK that they modified your file at the same time...
Nope, nope nope nope. This keeps coming up again with "we just need the right tools".
The thing is, there's a good chance the people you're working with aren't very technically inclined (which is why they're such awesome artists; they have other focuses). Anything more than the simplest scheme is going to fall over at production scale.
It's been tried many times. The tools that you are talking about (Maya, Photoshop, ZBrush, etc) don't even consider merging in their large-binary workflows. The closest you get to this is referenced scenes in Maya, and that requires a good technical artist to set everything up and police things so that people don't touch things they don't understand.
> Anything more than the simplest scheme is going to fall over at production scale.
Then we need a simpler scheme that still allows people to understand that two can work at the same time. Everyone hates merging regardless of their technical ability, even software developers, but I refuse to believe that it's a hopeless task to build a tool for merging artwork.
That's because merging two things is inherently complex. You need a complete understanding of both how they work and all the things they interact with. That's why everyone hates merging.
You're welcome to take a crack at it, however you're looking at hundreds of hours of work to reverse-engineer all the formats that Adobe and Autodesk use plus whatever new tools are on the horizon that you don't know about yet.
AFAIK there isn't a single tool out there that can render a PSD to PNG reliably (aside from scripted Photoshop), so thinking we would be able to cover the entire suite of tools is a bit naive.
Now if only we had this back in the 1600s. We'd know if the Mona Lisa really did have a different person featured in it during its earlier 'development'!