
Six Impossible Problems - wallflower
http://blog.artillerygames.com/2012/07/six-impossible-problems.html
======
mediocregopher
Another problem that needs to be handled is how to keep the source code
secret. For open source/free games this isn't a huge concern, but for a
triple-A company, or a company making a multiplayer game, keeping the source
code out of those haxxors' hands is important. For a browser-based game, can we
do much besides simply obfuscating the JavaScript? If people can pick apart
obfuscated assembly code, JS will be a piece of cake.

And for multiplayer games, how will a company detect a modified client? Short
of issuing a browser plugin there's no access to the low level, so a Warden-
type program like the one Blizzard uses for WoW won't be workable.

I'm not saying these are impossible problems to solve, but I don't think
they're going to be trivial. Hopefully I'm wrong though :)

~~~
kevingadd
You can't do it. Don't bother trying.

It's at least less impossible in Native Client, but the verifier means that
you can only obfuscate so much.

Ultimately the winning model here (that most games with serious competition
end up taking, to some degree) is to never trust the client, at all, with
anything. Everything remotely important is the domain of the server and the
client is basically a dumb renderer that only has the minimum amount of
information necessary to play the game. This does have some unfortunate
consequences for latency and responsiveness, but people tend to hate cheating
more than lag.
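
The server-authoritative model above can be sketched in a few lines. This is an illustrative toy, not any real game's code: the client sends movement *inputs*, never positions, and the server clamps and applies them, so a hacked client gains nothing by lying.

```python
# Server-authoritative movement sketch (all names hypothetical).
# The client only ever sends input deltas; the server owns the state.

MAX_SPEED = 5.0  # world units per tick, enforced server-side


class ServerPlayer:
    def __init__(self):
        self.x, self.y = 0.0, 0.0

    def apply_input(self, dx, dy):
        # Clamp the requested move so oversized deltas from a modified
        # client are silently reduced to the legal maximum.
        dist = (dx * dx + dy * dy) ** 0.5
        if dist > MAX_SPEED:
            scale = MAX_SPEED / dist
            dx, dy = dx * scale, dy * scale
        self.x += dx
        self.y += dy
        # The authoritative position is what gets broadcast to clients.
        return self.x, self.y
```

The client here is exactly the "dumb renderer" described: it draws whatever positions the server echoes back.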

Sometimes you can find domains where cheating is okay in exchange for better
responsiveness, but then it creeps into your design and causes issues. World
of Warcraft, for example, trusts the client to handle pathing and movement
logic in order to create the illusion of low-latency, super-smooth motion. As
a result there is a litany of clever exploits that use this to break
important game systems or acquire subtle advantages over others. For example,
a locked door in a dungeon obviously is going to be impassable to players,
right? What do you mean they teleported through it?

~~~
shasta
_World of Warcraft, for example, trusts the client to handle pathing and
movement logic in order to create the illusion of low-latency, super smooth
motion._

There is a simple solution: Trust but verify. The main reasons not to check
whether your clients are cheating are code complexity and the server
processing required.
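
A minimal "trust but verify" check might look like the following (a sketch with illustrative names and an assumed speed cap, not any shipping game's logic): the client reports its own positions for responsiveness, and the server spot-checks that each reported move was physically possible, snapping the player back when it wasn't.

```python
# "Trust but verify" sketch: accept client-reported positions, but reject
# any move faster than the rules allow and rubber-band to the last
# verified state. MAX_SPEED is an assumed game rule.

MAX_SPEED = 7.0  # units per second of legal movement


def verify_move(last_pos, last_t, new_pos, new_t):
    """Return (accepted_pos, ok); reject impossible moves."""
    dt = new_t - last_t
    if dt <= 0:
        return last_pos, False  # nonsensical timestamps: reject
    dx = new_pos[0] - last_pos[0]
    dy = new_pos[1] - last_pos[1]
    speed = (dx * dx + dy * dy) ** 0.5 / dt
    if speed > MAX_SPEED:
        # Could be a cheat, could be an innocent desync - the server
        # can't tell, so it snaps back either way.
        return last_pos, False
    return new_pos, True
```

Note that the rejection branch is exactly where the ambiguity discussed below lives: the check can't distinguish a speedhack from a desynced but honest client.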

~~~
kevingadd
Well, there's a reason why there are games (DOTA 2, for one modern example I'm
certain about) that don't do 'trust but verify': sometimes if you verify, you
will find that the client is wrong, in a way that probably just means they are
innocently desynced, but might mean they're cheating, and it's hard to tell
the difference. Worse still, those minor desyncs end up causing major gameplay
differences that make the player feel like your netcode is terrible.

Textbook example of the latter: Playing a melee class in Diablo 3. Rubber-
banding and teleporting everywhere, getting killed by things you can't see or
that are on the other side of the screen, etc:

<http://www.youtube.com/watch?v=CFw6MYqUyKs#t=0m15s>

I do wish 'trust but verify' worked at scale though. It feels like one of
those brilliant technology solutions that lets you beat the competitor on
server costs. :D

~~~
Negitivefrags
Trust but verify is absolutely the way that games using client side movement
prediction work, and games without client side movement prediction feel
terrible to play.

World of Warcraft has problems because they don't do much of the verify part.

The kinds of problems you're seeing in this Diablo 3 video are avoidable, and
solving them is essential to making a networked game.

------
wtracy
I've been kicking around a slightly different approach to delivering games
over the web.

Broadly, pick a class of game you intend to work with (I was thinking of
building one for RTSes) and build an engine targeted at that genre. You can
make the engine performant, since you can write it in native code and don't
have to fight the limitations of JavaScript/ActionScript/any other sandboxed
language.

Embed a sandboxed Lua interpreter inside this engine. Force all game-specific
code to run inside this interpreter, making sure that the engine is complete
enough that nothing performance intensive is game-specific.
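
The engine/script split being proposed can be illustrated with a toy Python analogue (a real Lua sandbox would be embedded in native code; everything here, including the API names, is hypothetical): game-specific logic runs in an environment that sees only a whitelisted engine API, nothing else.

```python
# Toy illustration of the sandboxed-scripting idea. NOTE: Python's exec()
# is NOT a real security boundary - this only illustrates the shape of
# the design, where scripts see a whitelisted API and no builtins.

ENGINE_API = {
    # Hypothetical engine calls exposed to game scripts:
    "spawn": lambda kind, x, y: ("spawned", kind, x, y),
    "play_sound": lambda name: ("sound", name),
}


def run_game_script(source):
    # An empty __builtins__ hides open(), __import__(), etc., so the
    # script can only call what the engine explicitly provides.
    env = {"__builtins__": {}}
    env.update(ENGINE_API)
    exec(source, env)
    return env.get("result")


script = 'result = spawn("orc", 3, 4)'
```

The performance argument is the same as with embedded Lua: everything expensive lives behind the API boundary in native code, and the script only orchestrates it.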

Then turn this whole thing into a client that users install. Make it easy to
have links that automatically open in this client. If you're feeling fancy,
make it operate as a browser plug-in, and then technically it will actually
run in the browser. :-)

How many people would develop for such a platform if it existed?

~~~
mechazoidal
Replace 'client that users install' with 'prebuilt targets for all major
mobile+desktop platforms', and you've got Zipline's Moai:
<http://getmoai.com/>

------
malkia
Some things are natural to streaming - voice-over, video, music - others are
very hard.

An FX explosion with tons of particles needs lots of textures/models in
advance to be visualized.

An on-screen model (say, the weapon the player carries in an FPS) is so
detailed that it alone needs several megabytes. If it's a multiplayer game,
you need to be able to change that weapon instantly, so you need to show it
right away.

I said sounds are easy to stream - well, not so if it's the sound of your gun,
which has to arrive in an instant!... And in a multiplayer game, that means a
lot more guns :)

Then the level. In a single-player game you start from a predefined point, so
you can give hints to the streaming system - precache this/that, etc.

Multiplayer, baby! What do you do? You can be spawned at any point...

The UI of a modern AAA game is also pretty big - you need icons, images, etc
of a lot of detailed weapons, attachments, etc.

You need roughly 100MB in an instant to have some good play, and somehow load
the rest (gigabytes) in the meantime. And that's for current-gen consoles; for
high-end PCs and future consoles this could easily grow 10x.

As for the whole procedural thing: it's cool, I like it - but don't overdo it
- it's the artist's arch-nemesis. And more or less only programmer types are
into controlling such procedural "thingies" :)

------
angelbob
World of Warcraft (and now other games) has a great semi-solution to number
six. It turns out that by doing the game-world calculations on the server but
also running them in parallel on the client and syncing regularly, you can
beat the speed-of-light thing in many cases.

And then when the client doesn't know something and guesses wrong, you get
rubber-banding as it snaps back to last verifiable server state.

It's still a really, really good hack.

~~~
psykotic
> World of Warcraft (and now other games)

The technique is much older than that in gaming. Client-side prediction was
the big ticket feature for QuakeWorld in 1996, and Wikipedia says Duke Nukem
3D had it 10 months earlier.

Dead reckoning, a more advanced variation on the same theme, goes back even
further to early work on distributed military simulations. With dead
reckoning, not only does the client extrapolate based on the last update from
the server, but the server also tracks how far each client's predictions would
be from the true state, so it knows how frequently to send updates to maintain
a low enough error. That is, the server simulates the clients simulating the
server.
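
That last sentence can be made concrete with a small sketch (1-D, illustrative names and thresholds, not drawn from any particular simulation standard): the server re-runs the same extrapolation the client runs, and only sends a correction when the client's prediction would have drifted past a tolerance.

```python
# Dead-reckoning sketch: the server simulates the client simulating the
# server, and sends updates only when predicted error exceeds a bound.

ERROR_THRESHOLD = 1.0  # assumed max tolerated position error


class DeadReckoner:
    def __init__(self, pos, vel):
        # Last state actually sent; the client extrapolates linearly
        # from this until told otherwise.
        self.sent_pos, self.sent_vel = pos, vel
        self.sent_at = 0.0

    def tick(self, t, true_pos, true_vel):
        # What the client currently believes, per its extrapolation:
        predicted = self.sent_pos + self.sent_vel * (t - self.sent_at)
        if abs(predicted - true_pos) > ERROR_THRESHOLD:
            # Drift too large: send a correction, reset the baseline.
            self.sent_pos, self.sent_vel, self.sent_at = true_pos, true_vel, t
            return ("update", true_pos, true_vel)
        return None  # extrapolation still good enough; stay silent
```

While the tracked entity moves as extrapolated, no bandwidth is spent; updates flow only when behavior actually diverges.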

For anyone with a deeper interest in this topic, perhaps the best overview of
the trickier issues with networked synchronization and prediction in modern
games can be found in this presentation on Halo: Reach from last year's GDC:
[http://www.gdcvault.com/play/1014345/I-Shot-You-First-
Networ...](http://www.gdcvault.com/play/1014345/I-Shot-You-First-Networking)

------
kevingadd
Part of the problem with the 'just stream the assets' idea is that yes, you
can do it, but it's really expensive to do because of the wide-ranging impact
it will have on your game design. This is easy to miss when you test it out on
simpler games because the scale isn't big enough for the problems to become
visible. To provide a concrete example:

I worked on a title that was one of the first 3D games (if not the first) to
do pervasive streaming. Any new player could go to our website, click a link,
download a 50kb .exe file, and be playing the game immediately - the .exe
would bootstrap itself by downloading the core runtime bits (roughly 4MB
worth) and then load up the full game client which would begin streaming down
textures, models, and sounds to get you into the game. The game itself had
something around 6GB of assets (if I remember right) and as expansions were
released the total size of the game's assets grew closer to 10GB (and is
probably far past that now).

Players loved it, and the bizdev guys loved it because it meant we got more
players in the door and got them playing faster. It also had some great
secondary consequences - we could roll out a new update, and even if it
changed content, players could be back in the game and playing within minutes.
It's still one of the most memorable things about the game and even now few
games have replicated the experience.

During my time on the design team, the streaming technology seemed like solid
gold. The designers didn't have to think about this technology (as far as I
knew), and didn't have to do any work to make it happen. We just built content
and players were able to see it really easily.

Despite all this, the big-budget sequel to the original game _will not_ use
streaming technology. It moved to the model all the big names use, where you
download a huge blob of files and they get unpacked on your disk, just like
World of Warcraft.

Why? It's subtle:

The idea of being able to stream assets on demand to players means that
strange and sometimes terrible things could happen if you don't carefully
think through the entire design of your game. One rather tricky example that I
missed during my time as a designer (but realized the depth of as an engineer)
is that in a multiplayer game, because players can enter and leave an area at
any time, you need to be ready to load all the textures and meshes for a
player's equipment and character customizations at a moment's notice. What
happens if you're missing the 50MB worth of assets you need to render that
player? Is he invisible? What happens if a player is invisible in a Player
versus Player game match? What happens if a player is invisible in a _ranked
tournament for cash prizes_ because the other players don't have his textures
cached?

It goes further if you think about it - entire classes of design tricks and
content aren't possible in this model because they imply being unable to know
in advance what content is needed. Something as simple as randomly assigning a
monster a set of skills when the player enters isn't easy to do, because that
means that the level now depends on _every skill in the game_ along with
potentially all the assets for those skills, because the actual list of
dependencies isn't known until the player has already entered - at which point
it's too late.

Anyway, the point of all this: Artillery aren't the first devs to aspire to
solve all these tough problems automatically for their customers, and they
won't be the last. They're pretty smart guys, so they've got a decent shot at
it, but the real issue is that some of these problems aren't solvable with
technology because they are _fundamental design limitations_ for the kinds of
games people build today. Hat tip to the post author for acknowledging this
with problem #6 (the CEO of a firm once tried to convince me that it was
possible to work around the speed of light with his brilliant physics
technology...). Ultimately, the winners will probably be the people who,
instead of trying to work around these fundamental issues, make them as easy
to understand as possible and provide great tools to mitigate them.
Hopefully Artillery takes that path.

~~~
angelbob
Indeed. Several of these problems actually say a _lot_ about the design of the
games that implement the solutions. This is one of those.

Similarly, "start with procedural and add some bits on top" (one proposed
solution to this) would be a huge, huge change to how you design everything,
and require a different set of tools and a different mindset for designers,
and, and, and...

But man, if you could make it work it would be _amazing_.

Several of his points are the same way. They constrain all kinds of things and
don't look at all like the current game industry we're used to.

That's one reason I don't think they have a prayer of solving all five. Too
many constraints.

If they could do two or three, that would _still_ be brilliant.

~~~
emtel
Realm of The Mad God used procedural generation for its levels, and was hugely
successful, so yes, this way of doing things is a change, but it's also not
completely unprecedented.

~~~
angelbob
Sure. Pretty much none of what he's talking about is unprecedented (except
actually breaking the speed of light). It's just inconvenient and far from
mainstream.

------
dinkumthinkum
We've done all this research and development on hardware. Do we really need
to throw it all in the garbage just to reinvent all these technologies to work
in the browser? What is the real benefit? I mean, kudos to this team, but my
question is a bit broader/more fundamental.

------
thedufer
> Build tools for procedural content generation that work for game designers.
> What if game designers could procedurally generate a map that they mostly
> like, and then make tweaks to it as needed? The client would download both
> the inputs to the procedural map generator and the tweaks, instead of an
> entire raw map file.

This strikes me as nothing more than an awful form of compression. Re-worded,
they're saying: write a lossy compression algorithm, compress the map with it,
then diff the compressed version against the uncompressed one. Then send the
client the compression algorithm, the lossily compressed version, and the
diffs. How does this beat using a known compression algorithm?
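
For concreteness, here is the seed-plus-tweaks scheme under discussion in miniature (all names and the tile vocabulary are made up for illustration; real generators are far more involved): the map is regenerated deterministically from a seed, and the designer's hand edits travel as a small diff.

```python
import random

# Seed-plus-tweaks sketch: the client downloads (seed, tweaks), not the
# raw map, and regenerates the rest locally.


def generate_map(seed, size=16):
    # A seeded RNG makes generation deterministic: same seed, same map
    # on every machine.
    rng = random.Random(seed)
    return [rng.choice(["grass", "water", "rock"]) for _ in range(size)]


def apply_tweaks(tiles, tweaks):
    # tweaks is the designer's diff against the generated map:
    # (index, replacement_tile) pairs.
    tiles = list(tiles)
    for index, tile in tweaks:
        tiles[index] = tile
    return tiles


# Everything the client needs to download for this map:
payload = {"seed": 42, "tweaks": [(3, "lava"), (7, "bridge")]}
client_map = apply_tweaks(generate_map(payload["seed"]), payload["tweaks"])
```

Whether this beats a conventional codec is exactly the dispute here: the payload is tiny, but only for content a generator can plausibly produce.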

~~~
Negitivefrags
Saying that a procedural terrain generation algorithm is just a lossy
compression algorithm with a map compressed through it is simply wrong.

For a good example of a game that uses exactly the technique described, see
this video: <http://www.youtube.com/watch?v=HhyyUiYQolA>

~~~
thedufer
So they're pitching fairly specific things as general-purpose solutions.
(Procedurally generated maps might be great for open worlds, but many games
just need carefully planned maps, for balance purposes if nothing else.) That
explains some of the other points, like networked multiplayer without the game
maker knowing that it's not local.

------
malkia
The Seventh Impossible Problem - How would your QA team fully test that game?

Btw, I've outlined some of my experience working on a streaming game here -
<http://news.ycombinator.com/item?id=2574946>

It was in response to some critique of Second Life:
[http://bbot.org/blog/archives/2011/05/22/the_failure_of_seco...](http://bbot.org/blog/archives/2011/05/22/the_failure_of_second_life/)

------
fleitz
Hasn't OnLive solved essentially all of these problems?

------
vr000m
There is the GRITS framework that Google showcased at I/O
(<https://www.youtube.com/watch?v=Prkyd5n0P7k>); in it, the central server is
authoritative and does game-state prediction to hide the latency between
players.

There is also the P2P data channel being proposed in the WebRTC working group,
which will allow sending data packets (there is another API for sending
multimedia) between browsers.
[1][http://dev.w3.org/2011/webrtc/editor/webrtc.html#peer-to-
pee...](http://dev.w3.org/2011/webrtc/editor/webrtc.html#peer-to-peer-data-
api) [2]<http://tools.ietf.org/html/draft-jesup-rtcweb-data-protocol>

------
sp332
I just installed a preview of MS Office 2013 which uses a streaming system to
(almost) seamlessly deliver program features before you use them. It's pretty
impressive how fast you can get up and running. You can try it out here
<https://www.microsoft.com/office/preview/en>

------
troymc
The browser adds a bunch of constraints. Forget the browser and those
constraints go away. Problems solved.

If you really must offer something in the browser, use something like OnLive
to give people a taste, but then offer a download for the best experience.

~~~
paulhodge
I had the same thoughts. It seems like a small case of lunacy to build an
ambitious state-of-the-art platform for high-performance multiplayer gaming,
while also trying to run inside the browser. They won't even have access to
UDP sockets, which is a huge setback for this kind of work. No vsyncing, no
tight control over input handling, restricted multithreading, and more. If
they manage to write the best code possible, they will only end up with an
experience that's totally mediocre compared to native code.

Speed of light is the least of their problems. Artillery Games, I hereby
disallow you from joking about the speed of light until you are running on a
platform that allows UDP sockets. Sorry, rules are rules.

------
rshlo
I wish these brilliant minds were put to solving more urgent problems, in
areas like water and energy, rather than gaming.

------
epaik
It may be because I'm browsing on my phone, but I couldn't find a contact
email anywhere.

~~~
wtracy
I'm on an actual computer and I can't find one either.

Maybe it's a hiring filter? Figure out our email address to get a chance to
interview? :-)

