
Tencent's AI plays StarCraft II, and wins over 90 percent of the time - zeyfah
https://www.eyerys.com/articles/news/tencents-ai-plays-and-defeats-starcraft-iis-built-ai-full-matches
======
Secretmapper
Looks like the AI is just playing against Starcraft II's AI, which makes this
slightly less impressive, as I first thought it was against human players.

The SC2 AI on higher difficulties does cheat though (extra resources), which
makes this slightly more impressive, but it's still a far cry from human
opponents, as SC2's AI is pretty easy to beat (it would be around Gold on the
ladder, I think).

~~~
Strilanc
If it's just the built-in AIs, you're right that it's much less impressive.
They should be winning 100% of the time.

For example, I don't know if they've fixed this, but you used to be able to
beat the AIs 1v4 on the highest difficulty. There was a bug where the AIs
would surrender individually even though they were on a team. So with an all-
out rush you could get two of them to surrender almost immediately by running
2-3 zealots through their mineral lines. Getting the third one to surrender
required a bit more fighting, and then the fourth was more of a normal game.

The AIs also used to keep their entire army in one big death ball, and if
there were 3-4 fighting units in their base they would send the army home to
defend. Even if their army was 10x bigger than yours and they'd win the base
race. So you could hop reapers into and out of their bases to prevent them
from attacking your base indefinitely (because they kept running home). Also
you were doing pretty serious damage with the reapers while they ran back.

Another exploit was to start building a cheap building in their base area at
the start, somewhat far away from the mining line. They would send a
ridiculous number of workers to attack your building. So they'd lose
significantly more money to lack-of-mining than you did for building and
cancelling a pylon or whatever.

~~~
qubax
> If it's just the built-in AIs, you're right that it's much less impressive.
> They should be winning 100% of the time.

Absolutely. Unless the win-loss ratio includes all games while the AI "learns"
to play the game. It might have lost most of its early games, but now wins
100% of the time.

I suspect it wouldn't take DeepMind long to master StarCraft II and beat
StarCraft's built-in AI 100% of the time.

------
thaniri
For anyone who actually plays SC2, or any real time game where human speed is
a limitation, it should not be surprising that AI would crush human opponents.

This video demonstrates why:
[https://m.youtube.com/watch?v=IKVFZ28ybQs](https://m.youtube.com/watch?v=IKVFZ28ybQs)

AI has crushed humans in Go, the most complicated (widely played) board game.
I'll be impressed again by AI when it beats us in physical sport.

~~~
TulliusCicero
The article is talking about their AI crushing other AIs, not humans. There
aren't any AIs right now that can beat high-level human players, even with
superhuman reflexes and control.

> AI has crushed humans in Go, the most complicated (widely played) board
> game.

In terms of ruleset, Starcraft is at _least_ a few orders of magnitude more
complicated than Go. It doesn't feel more complicated to humans, really, but
the sheer number of variables at play is vastly higher.

~~~
KevinCarbonara
Starcraft 2 is not "orders of magnitude more complicated than Go". It's far
less complicated in terms of options, and far more complicated in terms of
factors like speed/reaction time that don't exist in Go. Those things are
difficult to compare at all, but certainly do not qualify as "orders of
magnitude", which is measurable.

~~~
TulliusCicero
It absolutely is, which is partly why there aren't any good AIs for it yet,
even though we have ones that can crush the best players in the world at Go.

Go's game state, for example, is very small. You can pretty much represent it
with a 19x19 array of 2-bit variables, because each space only has three
possible states. That's only 722 bits. Maybe a few more to track whose turn it
is and how many pieces each player has remaining.

In contrast, Starcraft can have hundreds of units in play from dozens of
types, each of which usually has at least a position value (x and y) and a
health value, and commonly has other things like energy value and cooldown
values. And for each of these you'd need at least a 16-bit int. And that doesn't
even get into the enormously larger possible action space for each "turn".
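As a back-of-the-envelope sketch of the comparison above (the StarCraft unit
count and per-unit fields here are illustrative assumptions, not actual game
data):

```python
# Go: a 19x19 board where each point is empty, black, or white,
# so 2 bits per point are enough
go_bits = 19 * 19 * 2
print(go_bits)  # 722

# StarCraft (illustrative numbers): say 200 units on the map, each
# tracking x, y, health, energy, and a cooldown as 16-bit ints
units = 200
fields_per_unit = 5
sc2_bits = units * fields_per_unit * 16
print(sc2_bits)  # 16000
```

Even this toy count ignores buildings, upgrades, and fog of war, plus the far
larger per-turn action space, all of which widen the gap further.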

------
thomasahle
The paper they published:
[https://arxiv.org/pdf/1809.07193](https://arxiv.org/pdf/1809.07193).

> we propose to model the action structure by hand-tuned rules. By doing so,
> the available actions are reduced to a tractable number, which turns out to
> be easier for designing our decision-making system.

It seems they are not using the hardcore Blizzard/DeepMind interface (pixel-
based), but instead an actions-based one, similar to StarCraft I bots and
OpenAI's Dota bot. Still pretty interesting though.

------
neovive
For those interested, there is an excellent YouTube playlist by sentdex that
covers building a Python AI to play StarCraft II. He's up to 17 videos so far,
and it's a fun way to learn/practice neural networks.

[https://www.youtube.com/playlist?list=PLQVvvaa0QuDcT3tPehHdi...](https://www.youtube.com/playlist?list=PLQVvvaa0QuDcT3tPehHdisGMc8TInNqdq)

------
jcmeyrignac
Another link: [https://thenextweb.com/artificial-intelligence/2018/09/20/te...](https://thenextweb.com/artificial-intelligence/2018/09/20/tencent-created-ai-agents-that-can-beat-starcraft-2s-cheater-ai/)

------
ergothus
While others are talking about how "easy" the built-in AI that can crush me is
(I'm pretty bad), the part I found interesting is that they taught it to "see"
the map from the data. Nothing new in the field, but interesting to me.

Part of me wonders if the techniques applied can be easily used on non-visual
data. AI that "sees" what we cannot. (I realize we aren't talking any kind of
actual AI here, but sometimes it is nice to just wonder... )

------
kinnth
Isn't DeepMind also building an AI for StarCraft? DeepMind vs Tencent would be
a great playoff. I have a feeling DeepMind might dominate tho.

~~~
kwoff
DeepMind collaborated with Blizzard to create the SC2 AI API:
[https://deepmind.com/blog/deepmind-and-blizzard-open-starcra...](https://deepmind.com/blog/deepmind-and-blizzard-open-starcraft-ii-ai-research-environment/)

------
gota
With OpenAI's Dota playing AI making the rounds over major media and serving
as a major advertisement for the game itself, why would Tencent develop and
publish an AI of a game they don't own (Starcraft) as opposed to one they do
own (League of Legends)?

Seems like such a trivially better decision.

~~~
ndh2
Because Starcraft is 1v1 and LoL is 5v5. Poker AIs also focus on 1v1 as
opposed to full ring games. It's easier.

Also because Starcraft has three races, i.e. 9 matchups. But if you pick one
race for your AI, you only have three matchups to worry about. LoL on the
other hand has about 500 champions (give or take), whose strengths and
weaknesses are vastly dependent on the current patch. LoL is less about skill
or strategy, and more about picking the champion of the week.
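The matchup counts above can be sketched as a toy calculation (race names
only; nothing here comes from the article):

```python
from itertools import product

races = ["Terran", "Protoss", "Zerg"]

# Every race can face every race: 3 x 3 = 9 matchups to handle
all_matchups = list(product(races, races))
print(len(all_matchups))  # 9

# Commit the AI to a single race and only 3 matchups remain
my_race = "Zerg"
my_matchups = [(my_race, opponent) for opponent in races]
print(len(my_matchups))  # 3
```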

Starcraft works because they have such a small number of matchups that
balancing them is possible. Certainly not easy, and players keep figuring out
ways to evolve the meta game. Maybe an AI could even find interesting new
strategies. But LoL has so many matchups that their balancing team has an
absolutely impossible task of trying to catch up. It's so complex that they
can't just reason about it in theory; they have to look at win-rate statistics
across different skill ranges. At this point it's more a problem of
data science. And the number of champions keeps growing, because that's how
they make money.

~~~
LitFan
> LoL on the other hand has about 500 champions (give or take)

LoL has 141 champions. Still significantly higher than Dota's 116.

A common note I hear when people discuss OpenAI playing Dota is that it uses
pre-made matchups, which reduces the number of matchups considerably.

Would it be that difficult to have the AI play the picking stage? Compared to
the complexity of the decisions you have to make during the game, the picks
are straightforward - especially if you have performance data on matchups.

The number of practice games the AI would have to play to learn to handle all
the heroes goes up drastically, but that's only a matter of time.

~~~
maaark
I'm pretty sure OpenAI only plays the single simplest hero, mechanically. The
other 115 would be varying degrees of "more difficult" to play competently.

Also, it only played mirror matches. If each team could instead draft any 5 of
the 116 heroes, that's C(116,5) x C(111,5), roughly 2 x 10^16 possible
matchups.
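As a rough check of that draft combinatorics (a toy count assuming each team
picks 5 distinct heroes from a shared pool of 116, with no hero appearing on
both teams):

```python
from math import comb

pool, team_size = 116, 5

# First team picks 5 of 116; second team picks 5 of the remaining 111
matchups = comb(pool, team_size) * comb(pool - team_size, team_size)
print(matchups)  # on the order of 2 x 10^16
```

Mirror matches with a single hero sidestep all of this, which is why that
restriction matters so much.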

~~~
bilkoo
I don't think you need to learn all possible matchups to play any matchup, as
a human would. You'd need to just learn how to play/counter each hero, plus
some 2 or 3 hero combos (not all permutations), so on the order of hundreds?

------
everdev
Has there been any crossover success with gaming AI to other problem spaces?
It seems like they're just highly optimized for a particular game and can't
necessarily replicate that success outside of gaming or even in other types of
games.

~~~
jstarfish
It was never meant to train on Starcraft to dominate Solitaire or Mario, or to
figure out a more efficient way to do laundry. There's no money in that. The
particular interest in Starcraft (an iconic wargame) is meant to lend itself
to the military applications of logistics and troop placement.

At some point AI will be sufficiently advanced to understand what a grunt on a
map looks like in any context, so we'll be able to swap out the Space Marines
with real ones. Whoever can autonomously dominate a battery of RTSes has the
potential to autonomously conduct conventional warfare.

------
debt
Tencent owns PUBG Mobile. I wonder if they're training bots on actual players
from that game and any other ones they own.

I know bots inhabit that game, but they're not very good.

------
InsOp
Site seems to be down; alt link: [https://is.gd/gj6Y9M](https://is.gd/gj6Y9M)
However, as this article states, the bots learned to do little more than a
Zerg rush, which is really not impressive. I would like to see in-game POV
footage to actually get a grasp of how these bots perform.

