
How do game companies share massive files? - BBC News
http://www.bbc.co.uk/news/business-25037653
======
boothead
As someone said on twitter the other day:

    
    
      If "tech experts" writing news articles aren't really experts, what does that say about all the other experts..?

------
simias
So basically rsync?

Honestly, synchronizing 50GB of content across the globe doesn't sound
terribly novel or difficult to me, or else the article does a pretty bad job
of explaining the difficulty.
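The delta approach the article hints at can be sketched in a few lines. This is a toy, simplified version of rsync's idea - fixed-size block signatures on the receiver side, and a sender that emits "copy" instructions for blocks the receiver already has. (Real rsync uses a rolling weak checksum so it doesn't re-hash at every byte offset, plus a strong hash to confirm matches; none of that is shown here.)

```python
import hashlib

BLOCK = 4  # tiny block size for the demo; rsync uses hundreds of bytes


def block_signatures(data: bytes) -> dict:
    """Receiver: hash each fixed-size block of the old file."""
    sigs = {}
    for i in range(0, len(data), BLOCK):
        block = data[i:i + BLOCK]
        sigs[hashlib.md5(block).hexdigest()] = i
    return sigs


def delta(new: bytes, sigs: dict) -> list:
    """Sender: emit ('copy', offset) for blocks the receiver already
    has, and ('data', bytes) literals for everything else."""
    ops, i, literal = [], 0, b""
    while i < len(new):
        block = new[i:i + BLOCK]
        if len(block) == BLOCK and hashlib.md5(block).hexdigest() in sigs:
            if literal:
                ops.append(("data", literal))
                literal = b""
            ops.append(("copy", sigs[hashlib.md5(block).hexdigest()]))
            i += BLOCK
        else:
            literal += new[i:i + 1]
            i += 1
    if literal:
        ops.append(("data", literal))
    return ops


def apply_delta(old: bytes, ops: list) -> bytes:
    """Receiver: rebuild the new file from its old copy plus the delta."""
    out = b""
    for kind, arg in ops:
        out += old[arg:arg + BLOCK] if kind == "copy" else arg
    return out


old = b"abcdefghijkl"
new = b"abcdXXefghijkl"  # two bytes inserted mid-file
ops = delta(new, block_signatures(old))
assert apply_delta(old, ops) == new
```

With 50GB files where only a handful of assets change between builds, most of the transfer collapses into cheap copy instructions, which is presumably the win being sold here.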

~~~
mryan
> That's because some of the complete game files were as large as 50GB

50GB is not the total size of the files needing to be transferred - _some_ of
the files were 50GB. Knowing the total size would have been interesting for
the sake of discussion.

~~~
debian69
Yeah, that's bollocks. Sorry, but no game has 50GB files at launch, and what
kind of idiotic company would send round a file they weren't using in the
launch code base?

~~~
jswanson
TL;DR: internal workflows, not customer-facing. Enterprise Dropbox.

The article title is misleading. They are actually talking about the internal
workflow.

Raw assets, art assets, and such, can be pretty large.

When you have a studio in country A doing the PC version and a studio in
country B doing the PS3 version, all wanting the same base raw data but
(presumably) transforming it in different ways - whether in how the build
actually happens or in artists tweaking something for a platform - it would
become a nightmare pretty quickly if you had to send it all across a company
private network (often just VPN pipes over the internet itself).

The article is, at its heart, an ad for a company called 'Panzura', which
sounds like an enterprise-specific Dropbox: boxes on-premise, security
guarantees, and probably a hefty price tag.

------
notduncansmith
Not gonna lie, when I saw "delta" I immediately assumed Git or some other
source control solution. What boggles the mind is that these professional game
developers only recently discovered this technology. I didn't think Git was so
exclusively beholden to the web development community.

~~~
NickPollard
Git solves a different problem; sending large binary blobs around is not it.
As another comment said, this is essentially RSync, not git.

I've been out of game development for a year, but unfortunately git is not
being taken up much by large game developers. The main reason cited is git's
handling of binary files, which leads to every clone containing every
historical version of huge files. However, this is possible to work around,
and the benefits of git still make it worth it.
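The duplicated-binaries problem is easy to put rough numbers on. The figures below are illustrative assumptions, not from the thread: git stores each revision of a file as a full object (packfile delta compression rarely helps with already-compressed art assets), so a full clone downloads every revision ever committed, while an SVN/Perforce checkout fetches only the tip.

```python
# Illustrative arithmetic only - asset size and revision count are
# made-up assumptions, not figures from the article.
asset_mb = 500     # one large binary art asset
revisions = 40     # times it was re-exported over the project's life

git_clone_mb = asset_mb * revisions  # a clone carries the full history
svn_checkout_mb = asset_mb           # a checkout fetches only the tip

print(git_clone_mb, svn_checkout_mb)
```

Under those assumptions a single asset costs 20GB per clone, which is why the workarounds (split repos, asset stores outside git) come up in the replies below.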

~~~
Negitivefrags
The binary file reason is big enough, but there's also the ability to check
out a sub-tree without checking out the entire repo. When the repo is 1TB,
there really is no reason for your texture artist to be checking out the raw
uncompressed audio files.

The other reason not to use git in game development is because it's incredibly
hard to use, and more than half the game development team are not programmers
and/or not technical.

Getting them to use good source control discipline is hard enough with
TortoiseSVN. I shudder to think the insane mess we would have using git.

~~~
NickPollard
One option is to use separate repositories - either separate git repos, or
git for code + design data and something like Perforce for assets. We did
this at my first studio (Free Radical Design) and it worked very well.

As for git being 'hard': yes, it's not trivial to learn, but people using
just Perforce/SVN etc. end up misusing them so much that it's almost worthless
- as Linus said, even just sending around tarballs is better.

I think if you're running a team, you need to give your team the benefit of
the doubt that they can learn new tools, learn to use them effectively, and be
more productive. If you spend some time teaching them why the tools are
important and how to use them, then using git isn't really any more difficult
than anything else. It's just different.

~~~
w0rd-driven
This is how I've structured our use at the place I'm at. TortoiseGit is better
than SVN to me. Submodules are the key, though in our situation they work best
when everything can live in the same directory structure.

We create a master repo with all the design assets, statements of work, and
any other files useful for the end product. Then we have a client repo of the
files we serve from the WAMP www directory. Because that lives in a physically
different location, submodules make little sense there. For the WPF builds,
which all live together in the same directory structure, submodules are
perfect. My only hiccup is remembering to commit the submodules via the
master, but that's a habit I need to change. The files in the WAMP repo are
deployed to client computers using git pull with a read-only SSH key, so
people aren't committing from them.

Our WAMP files are primarily text, either straight HTML or usually PHP. We
take care not to commit user-editable sections so we don't overwrite what a
customer ultimately changes. This is a kind of hand-rolled version of the many
git deployment tools now out there, and it works rather well.

I know this isn't gaming, nor are our source files that large, but the PSDs
can become astronomical very quickly on even simple web pages. I can only
imagine what goes into "next-gen games". Git can still cope with binaries, but
you definitely have to plan carefully how you partition things across
repositories. If I can do this, almost anyone can. Kiln brought excellent
binary support to Mercurial, so there's really no need to stick with SVN.
There may be areas where it's still useful, but the distributed nature of a
DVCS is just too compelling when your team is, well... distributed. I hear
great things about Perforce too, and I hope companies pick the better VCS on
its merits rather than assuming their users can't cope with anything new -
there is real benefit in doing so. Again, if I could move our team into git
with very little hand-holding, you would probably be surprised how quickly
people adapt - especially since TortoiseGit isn't that much different from
TortoiseSVN, or TortoiseHg for that matter.

------
unsigner
At some point Microsoft used a third-party solution called Aspera FASP to
accept uploads of final game disc images. It was crazy fast - when you
launched it, it reached our office's net connection speed within 10-15
seconds, and every other internet connection at the office dropped.

~~~
icelancer
I remember that! I worked at MS in one of the gaming capacities (no need to be
specific) and it was unreal how close to maximum saturation FASP would get.
Such an awesome tool.

------
junto
Sohonet 10Gbit/s. Used primarily by post production companies in Soho, London.

[http://www.sohonet.com/sohonet-media-
network/](http://www.sohonet.com/sohonet-media-network/)

------
t0
Sneakernets are still the fastest way to transfer data.
[http://en.wikipedia.org/wiki/Sneakernet](http://en.wikipedia.org/wiki/Sneakernet)

~~~
Piskvorrr
Not necessarily the cheapest, though - express shipping a bunch of disks is an
uncertain venture ("oops, we accidentally dropped it a few times, is that
bad?") and "I need a return cross-Atlantic flight ticket NOW" is sort of
expensive.

That said, I have heard of companies actually couriering semi-large volumes of
sensitive data this way.

~~~
Zenst
Also, in some areas there is a legal aspect: some laws or contracts make
sending somebody over with a briefcase of data an easier way to move forward
without triggering a technicality in the contract or the law.

Indeed, some countries' data protection laws make the legal side of
transferring data most interesting, even today. There is no worldwide standard
on data protection - it's a minefield in itself, let alone for industries like
medical research or companies dealing with country-wide firewalls.

------
Zolomon
I think the article should mention that the game's developer is based in
Sweden and its publisher in California.

Sounds fairer that way.

------
kayoone
"That's because some of the complete game files were as large as 50GB, and
future games with more advanced graphics for the new Xbox One console and
Sony's PlayStation 4 are likely to be even bigger."

Ahem, no - Battlefield 4 is already a PC game, and it's scaled down even on
the next-gen consoles, so console processing power isn't the limiting factor
here.

------
rurounijones
I have always wanted to know more about Valve's Steam architecture.

Why aren't they presenting at more conferences, dammit!

------
moocowduckquack
Above a certain point, it becomes far quicker to send a hard drive by courier.

~~~
hengheng
That used to be true, but given that hard drive transfer rates cap out around
200 MByte/s, you'd have to ship a large bunch of drives, read and written in
parallel, to beat a fast network connection.
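A quick back-of-the-envelope calculation shows where the crossover sits. All figures here are illustrative assumptions (a 1 Gbit/s link, the 200 MB/s sequential drive throughput cited above, a 24-hour door-to-door courier, four drives read/written in parallel), not measurements:

```python
GBIT = 125e6    # 1 Gbit/s expressed in bytes per second
DRIVE = 200e6   # sequential drive throughput, bytes per second
COURIER_H = 24  # assumed door-to-door shipping time, hours


def network_hours(size_bytes: float, link: float = GBIT) -> float:
    """Hours to push the data over the wire."""
    return size_bytes / link / 3600


def sneakernet_hours(size_bytes: float, drives: int = 1) -> float:
    """Write at origin + ship + read at destination; parallel
    drives divide the read/write time."""
    rw = size_bytes / (DRIVE * drives) / 3600
    return 2 * rw + COURIER_H


TB = 1e12
for size in (4 * TB, 50 * TB):
    print(f"{size / TB:>4.0f} TB: network {network_hours(size):6.1f} h, "
          f"courier w/ 4 drives {sneakernet_hours(size, 4):6.1f} h")
```

Under these assumptions the wire wins at a few TB, and the courier pulls ahead somewhere in the tens of TB (or sooner on a slower link) - which is why both comments can be right at different scales.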

~~~
moocowduckquack
If you are constantly mirroring your media drives with multiple redundancy,
you can just unplug a set and post them.

------
planetjones
Of all the technical challenges facing us in the 21st century, sending a 50GB
file quickly across some wire doesn't seem like the biggest!

Also, the feedback I've seen about Battlefield's stability and quality (on
PS4) has been very poor, so I had to laugh when I saw the "to locate defects
and improve quality" quote :)

------
golergka
Sounds a lot like git or some other typical solution.

~~~
samnardoni
git would be pretty bad in this situation. It sounds more like rsync.

~~~
golergka
Thanks, I'll look into that.

------
_sabe_
Sponsored article?

~~~
JonnieCache
Innit. Someone at EA/Panzura's PR firm just got a raise.

