
Source Control for Art Assets Must Exist - ingve
http://hacksoflife.blogspot.com/2015/12/source-control-for-art-assets-this-must.html
======
rffn
Perforce does what is requested in the article:

- Working with large files

- Getting only the needed files from the server, not all of them

- Not wasting space with .svn/.git copies of the files

- Reasonably fast.

They have quite a few customers in the video game industry because of their
support for huge projects and large files. We are using it for a different
kind of application with similar requirements (100,000s of files, sizes up to
the GB range per file). Perforce is well suited for the job and has good 24/7
support.

~~~
douche
Is there a reasonable hosted option for Perforce? A quick search turned up
Assembla[1]; I haven't heard of them, so I don't know how reputable they are.

GitHub has just become so ubiquitous, you start to forget other source control
systems are out there. If I wasn't still using SVN at work, I'd be in git all
the time.

[1]
[https://www.assembla.com/repositories/perforce](https://www.assembla.com/repositories/perforce)

~~~
redxdev
Aside from what was mentioned, you can also self-host Perforce at a relatively
low cost (provided you don't need more than 20 users, as that's all Perforce's
free license supports).

I personally host a Perforce server on Google Compute Engine for around $15 a
month (g1-small + whatever size persistent disk you want). It's relatively
easy to set up (they provide package repositories for Ubuntu and Fedora,
IIRC), and configuration is largely interactive at setup time or done through
the P4Admin tool (which lets you very easily set up permissions, depots, etc).

I know that isn't fully managed like GitHub and doesn't provide issue
tracking, wikis, etc., but I've found it fairly easy to deal with if you know
a bit about how to set up a Linux VPS.

------
VikingCoder
Google has talked extensively about how they do Source Control, and it's
freaking brilliant:

Perforce for source control. A FUSE file system. You configure it to point at
a revision (or at head) of a branch (almost everyone is always on the main
branch). Then it makes a drive (or folder) look like the Perforce repository,
but read-only. It caches files as you read them. Then, if you try to create,
modify, or delete an asset, it stores those changes locally, but makes them
appear to be in the same directory as the rest of the repository.

I think this would be perfect for art assets.
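The read-through cache plus local write overlay described above can be sketched in a few lines of Python (the class and method names are hypothetical; the real thing is a FUSE driver over Perforce, not an in-memory dict):

```python
# Sketch: a read-only repository view with a local write overlay.
# `depot` stands in for the remote Perforce server.

class OverlayView:
    def __init__(self, depot):
        self.depot = depot     # remote, read-only file store
        self.cache = {}        # files cached on first read
        self.overlay = {}      # local creations and modifications
        self.deleted = set()   # local deletions ("whiteouts")

    def read(self, path):
        if path in self.deleted:
            raise FileNotFoundError(path)
        if path in self.overlay:        # local edits shadow the depot
            return self.overlay[path]
        if path not in self.cache:      # fetch and cache on first access
            self.cache[path] = self.depot[path]
        return self.cache[path]

    def write(self, path, data):
        # Writes never touch the depot; they land in the local overlay
        # but appear alongside the rest of the repository on read.
        self.overlay[path] = data
        self.deleted.discard(path)

    def delete(self, path):
        self.overlay.pop(path, None)
        self.deleted.add(path)
```

Reads are lazy, so you only pay for the files you actually touch, which is exactly what you want with a multi-gigabyte asset tree.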

------
paulmd
You're looking for "bup". It's basically a special flavor of git that works at
the block level rather than the file level, so incremental revisions only need
to store the blocks that changed. It also has a bunch of other optimizations
for storing a large number of potentially large files.

[https://github.com/bup/bup](https://github.com/bup/bup)

More (potentially stale) info here:
[https://lwn.net/Articles/380983/](https://lwn.net/Articles/380983/)
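The block-level idea is content-defined chunking: split each file wherever a rolling hash of the last few bytes matches a boundary pattern, then store the chunks by content hash, so a new revision only adds the chunks that actually changed. A toy sketch (bup's real splitter uses an rsync-style rolling checksum; the window and mask here are purely illustrative):

```python
import hashlib

def chunks(data, window=16, mask=0x3F):
    """Content-defined chunking: cut wherever the rolling sum of the
    last `window` bytes hits the boundary pattern (low bits all zero)."""
    out, start, rolling = [], 0, 0
    for i, b in enumerate(data):
        rolling += b
        if i >= window:
            rolling -= data[i - window]   # slide the window forward
        if (rolling & mask) == 0 and i > start:
            out.append(data[start:i])
            start = i
    out.append(data[start:])
    return out

def store(data, blobstore):
    """Store a file as a list of chunk hashes; chunks already present
    in the blobstore cost no extra space."""
    ids = []
    for c in chunks(data):
        h = hashlib.sha1(c).hexdigest()
        blobstore.setdefault(h, c)   # only new chunks are written
        ids.append(h)
    return ids
```

Because boundaries depend only on nearby bytes, an insertion in the middle of a file only disturbs the chunks around the edit; the splitter resynchronizes and everything after it dedupes against the previous revision.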

~~~
hire_charts
bup is great, but it's worth reading through the limitations before you
commit:

[https://github.com/bup/bup/blob/master/README.md#things-that-are-stupid-for-now-but-which-well-fix-later](https://github.com/bup/bup/blob/master/README.md#things-that-are-stupid-for-now-but-which-well-fix-later)

Because of the nature of bup's solution, some operations haven't been
implemented because they would be prohibitively expensive. Pruning old files,
for example -- the equivalent of git's "filter-branch" -- just doesn't work.

------
blakeyrat
The thing everybody glosses over is the usability issue: Git is a nightmare
for anybody who's not a programmer to actually _use_ day-to-day.

(I _am_ a programmer, and I still find Git to be nightmarish. Sadly our
company adopted it because it's trendy, not because they did any actual
research or study about our source control needs-- the lack of centralized
file locking bites us in the ass every day.)

The reason they stuck with SVN that most interests me is that SVN has some GUI
clients that don't completely suck. Sadly, not true of Git. At least not on
Windows.

~~~
schkoobydu
Git definitely has a steeper learning curve, but there are some very obvious
advantages compared to SVN. I'd recommend SourceTree for a Windows GUI that
doesn't suck. [https://www.sourcetreeapp.com/](https://www.sourcetreeapp.com/)

~~~
blakeyrat
If you think SourceTree doesn't suck, I really don't know what to say. It is
the laziest kind of GUI: they literally just wrapped the CLI application in
this monstrosity, then made a button in the UI for literally every commandline
switch.

No thought or organization, and _certainly_ no usability study, went into
SourceTree.

~~~
schkoobydu
I suppose I shouldn't really be recommending git GUIs, as I mostly use the
command line. I use SourceTree on occasion to get a better perspective when
browsing the history.

------
TillE
git-annex, git-lfs, and Mercurial's largefiles all approach this problem in
more or less the way you want. GitHub's fairly new git-lfs might be the
easiest to use.
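With git-lfs, for instance, you choose which binary patterns get offloaded to the large-file store via `git lfs track`, which records filter rules in `.gitattributes` (the patterns here are just examples):

```
# .gitattributes after running: git lfs track "*.psd" "*.fbx" "*.wav"
*.psd filter=lfs diff=lfs merge=lfs -text
*.fbx filter=lfs diff=lfs merge=lfs -text
*.wav filter=lfs diff=lfs merge=lfs -text
```

Matching files are then stored as small pointer files in git, with the actual blobs fetched on demand from the LFS server.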

------
zubspace
It's strange. We have so many widely available tools for keeping a
collaborative code history, but when it comes to art assets
(images/music/sound/3D models/videos) the general consensus is just dumping
everything into a central repository (svn/perforce) or using some kind of
large-file extension, which does basically the same thing.

Visualising changes, multi-checkouts, and merging are impossible.

Seems ripe for disruption? Or will this stay an unsolved problem forever?

~~~
hellofunk
Is it because code is text and assets are usually binary? How do you do a diff
reliably on a binary file?

~~~
ndepoel
That depends on the file type. Several version control clients can already
detect image files for example and show a visual diff between revisions. There
is still a lot of room for improvement and standardization however.
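Such a visual diff boils down to a per-pixel comparison once the images are decoded; a minimal sketch over plain nested lists (a real client would first decode PNG/TGA/etc. into pixels):

```python
def image_diff(a, b):
    """Return the (row, col) coordinates where two equally-sized images
    differ; pixels can be any comparable values, e.g. (r, g, b) tuples."""
    if len(a) != len(b) or any(len(ra) != len(rb) for ra, rb in zip(a, b)):
        raise ValueError("images must have the same dimensions")
    return {(r, c)
            for r, (ra, rb) in enumerate(zip(a, b))
            for c, (pa, pb) in enumerate(zip(ra, rb))
            if pa != pb}
```

A client can then highlight exactly those coordinates; in practice the hard part is decoding the format, not the comparison.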

~~~
orecht
The main problem is not diff but merge. You currently can't merge changes to
images, textures, etc. The whole advantage of a DVCS like git is that it
allows you to work in parallel and merge effortlessly. The day we can merge
Photoshop, Maya, and 3D Studio files, a central repository for art files will
not make sense any more. But that day has not come yet, AFAIK.

~~~
craigjb
What kind of art work can you practically distribute between multiple people
and merge? Most things like this are broken into pieces, and each piece is
tracked separately and assembled later.

I think this is a big mistake programmers make when trying to create VCS for
other industries. A lot of other fields with binary files don't care about
merge as much. Take electronic design automation: you can track every part's
history and even every module on a board or die layout, but in practice it's
difficult to have two people productively work on the same small piece
simultaneously.

[edit] I also think a bunch of the reasons programmers require merge are
because code is organized into files. Organization into files is arbitrary and
often doesn't correlate particularly well with logical structure.

------
sambeau
I'm working on something new in this field and would love to hear from people
with interesting use-cases (and tales of woe about current tools).

Sadly, I don't think diffing and merging of art assets will ever become
viable, and locking can only work if your assets are controlled at the OS
level. The simplest solution is to hide a file if it's locked.

Perforce is not good at dealing with large assets: one artist uploading a
single video file can grind 300 other people to a halt. I know this from
bitter experience. You can solve it to a degree using standing servers, but
it's tricky and expensive to get right.

FUSE is interesting but doesn't work on Windows. WebDAV is old and cranky and
half broken. Anything that saves immediately will lock your editing software
up while the file travels over the network (and with art files that can take
minutes; think video).

Dropbox is the strongest contender here, as it allows a quick save followed by
a slow sync. SparkleShare has the right idea and a working Windows client
(though it uses Git behind the scenes, so it isn't really suitable for video
assets). Camlistore is an exciting technology based on the Plan 9 file system
(and Git); it should cope with large files in theory, though I haven't
stress-tested it. It's still young and there isn't a good visual client yet.

I'm coming at this problem from the world of D.A.M. (yuck) but worked for
years in the AAA games industry (many, many giant art assets). Traditional
DAMs have never been any good at dealing with assets still in development.
Hopefully we can change that; however, there are so many problems to overcome
(see above) that it's unsurprising no-one has managed it yet.

Drop me a line if you have any useful/interesting use cases.

------
muro
Our artists used Alienbrain at a previous job, while devs used P4. Except for
the obvious annoyance of having to sync two separate repositories, it worked
well, and neither side could be convinced to switch.

------
bryanlarsen
One possible solution is online editing, like
[https://clara.io](https://clara.io) or Google Docs. These tools have version
control built in, and multiple people can simultaneously edit.

You also avoid the large asset problem by always keeping the large assets in
the "cloud".

disclaimer: employee of the maker of clara.io

------
jordigh
Unity, a gamedev shop, uses Mercurial with the largefiles extension:

[http://natoshabard.com/post/122632480712/mercurial-at-unity](http://natoshabard.com/post/122632480712/mercurial-at-unity)

~~~
to3m
Perhaps somebody knows better than I do, but I'm pretty sure Unity only make
the tools, which is a very different proposition! They will have the large
files, but not the iteration.

Storing large files somewhere outside the working copy is only part of the
solution. Working with artists is going to be a bit painful unless you have
centralized file locking, or some other self-managing means of preventing two
people modifying the same file by accident. In theory, you could use merge
tools for the binary files that artists manipulate, but in practice, these
tools don't seem to exist.

(I can't speak for hg - but an issue with git is that you really have to have
a good mental model of how it works to use it effectively. This is something
programmers like - or, failing that, are at least practised at doing. But it's
really not the sort of thing artists seem to generally enjoy. Certainly very
few of those I've worked with...)

~~~
jordigh
> Working with artists is going to be a bit painful unless you have
> centralized file locking, or some other self-managing means of preventing
> two people modifying the same file by accident.

I am convinced that file locking is not the ultimate solution, even for
artwork. Someone could decide to change the colours of some character while
someone else could decide to redraw her arms. Both changes make sense
together.

What we need is specialised diff and merge tools for artwork, which nearly any
VCS already allows you to plug in. Even github has a WUI for diffing images.
File locking is a workaround with other well-known problems. Merging divergent
work is not any more annoying for artwork than it is for code.
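The recolouring-versus-redrawn-arms case above is really a three-way merge where the two edits touch disjoint regions. A toy merge over decoded pixel maps (purely illustrative; real formats like PSD are layered documents, not flat pixel grids):

```python
def merge_pixels(base, ours, theirs):
    """Three-way merge of images decoded to {(x, y): colour} maps.
    A pixel changed on only one side is taken from that side; a pixel
    changed differently on both sides is a conflict."""
    merged, conflicts = {}, []
    for key in set(base) | set(ours) | set(theirs):
        b, o, t = base.get(key), ours.get(key), theirs.get(key)
        if o == t:            # identical on both sides (or both unchanged)
            merged[key] = o
        elif o == b:          # only theirs changed this pixel
            merged[key] = t
        elif t == b:          # only ours changed this pixel
            merged[key] = o
        else:                 # both changed it, differently
            conflicts.append(key)
    if conflicts:
        raise ValueError("conflicting pixels: %s" % sorted(conflicts))
    return {k: v for k, v in merged.items() if v is not None}
```

Disjoint edits (a recolour here, a redraw there) merge cleanly; overlapping edits surface as conflicts, exactly as in text merges.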

Pixelapse seems to be presenting a compelling case on what VCS for design
could look like:

[https://www.pixelapse.com/](https://www.pixelapse.com/)

> I can't speak for hg - but an issue with git is that you really have to have
> a good mental model of how it works to use it effectively.

This is widely touted as a huge difference between hg and git. You don't have
to understand hg's revlog format to use hg, but you need to have a solid
understanding of blobs, trees, commits, refs, and hashes to understand git.

What you _do_ need to understand for both is that someone can have modified
the same file that you're working on while you're working on it, and it is OK
that they modified your file at the same time...

~~~
vvanders
Nope, nope nope nope. This keeps coming up again with "we just need the right
tools".

The thing is, there's a good chance the people you're working with aren't very
technically inclined (which is why they're such awesome artists; they have
other focuses). Anything more than the simplest scheme is going to fall over
at production scale.

It's been tried many times; the tools you're talking about (Maya, Photoshop,
ZBrush, etc.) don't even consider merging in their large-binary workflows. The
closest you get to this is referenced scenes in Maya, and that requires a
_good_ technical artist to set everything up and police things so that people
don't touch things they don't understand.

~~~
jordigh
> Anything more than the simplest scheme is going to fall over at production
> scale.

Then we need a simpler scheme that still allows people to understand that two
can work at the same time. _Everyone_ hates merging regardless of their
technical ability, even software developers, but I refuse to believe that it's
a hopeless task to build a tool for merging artwork.

~~~
vvanders
That's because merging two _things_ is inherently complex. You need a complete
understanding of both how they work _and_ all the things they interact with.
That's why everyone hates merging.

You're welcome to take a crack at it, however you're looking at hundreds of
hours of work to reverse-engineer all the formats that Adobe and Autodesk use
plus whatever new tools are on the horizon that you don't know about yet.

AFAIK there isn't a single tool out there that can render a PSD to PNG
reliably (aside from scripted Photoshop), so thinking we would be able to
cover the _entire_ suite of tools is a bit naive.

------
Arzh
Alienbrain was really popular for games a while back; not sure if it fell out
of favor. [http://www.alienbrain.com/](http://www.alienbrain.com/)

------
CM30
Now if only we had this back in the 1500s. We'd know if the Mona Lisa really
did have a different person featured in it during its earlier 'development'!

