When Google DeepMind's AlphaGo bot defeated the world's top Go player earlier this year, many wondered what the next challenge would be. After all, computers have now beaten the world's top chess and Go players. Poker bots hold their own in competitions against the best professional human players. AI even did a bang-up job on Jeopardy! Where can artificial intelligence possibly go now?
HCII Visiting Scholar Kyung-Joong Kim thinks the answer is real-time strategy games. Specifically, StarCraft.
First released by Blizzard Entertainment in 1998, StarCraft is a real-time military strategy game that pits three races against each other as they compete to survive in a far-off portion of the Milky Way. Creating AI agents to play these types of games is challenging, primarily because the games' creators don't share the code, so researchers have little to work with. In the case of StarCraft, though, hackers reverse-engineered the game in 2010, creating a free, open-source programming interface for interacting with it. Blizzard gave the hackers its blessing, and the first custom StarCraft AI agents were born.
Like all real-time strategy games, StarCraft poses AI challenges that its board game counterparts don't, namely time and uncertainty. Board games, like Go, are turn-based. When the human or AI agent plays, they have about a minute to determine the best option among all possible moves. In StarCraft, players are moving at the same time — developing strategy, building and controlling their units, and making quick decisions about resource utilization. AI agents need to do these things instantly, but without using an exorbitant amount of computing power. Kim noted that AlphaGo used a thousand CPUs to make decisions in its one-minute timeframe. AI for games like StarCraft can't have such expensive requirements.
The second difficulty is that the game is fraught with uncertainty, unlike Go or chess, where both players can see all moves. "In StarCraft, there is a kind of fog," Kim said. "You can only see around your units. If you don't have any units in the opponent's space, then you have no information about what's going on there. It's a very difficult challenge."
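The "fog" Kim describes is what AI researchers call partial observability: the agent sees only the tiles near its own units, and everything else is unknown. A toy sketch can make the idea concrete. Everything here — the grid model, the sight radius, the function names — is a hypothetical illustration, not part of any real StarCraft API.

```python
# Toy model of fog of war: an agent observes only tiles within a
# fixed radius of its own units; all other tiles read as unknown.
# Names and parameters are illustrative assumptions, not a real API.

SIGHT_RADIUS = 2  # hypothetical sight range, in tiles

def visible_tiles(unit_positions, width, height, radius=SIGHT_RADIUS):
    """Return the set of (x, y) tiles any friendly unit can see."""
    seen = set()
    for ux, uy in unit_positions:
        for x in range(max(0, ux - radius), min(width, ux + radius + 1)):
            for y in range(max(0, uy - radius), min(height, uy + radius + 1)):
                seen.add((x, y))
    return seen

def observe(true_map, unit_positions):
    """Mask the full game state: tiles hidden by fog read as None."""
    height = len(true_map)
    width = len(true_map[0])
    seen = visible_tiles(unit_positions, width, height)
    return [[true_map[y][x] if (x, y) in seen else None
             for x in range(width)]
            for y in range(height)]
```

With one unit in a corner of a 10-by-10 map, the agent sees only a small patch; an enemy base in the far corner is simply invisible to it — which is why StarCraft agents must scout and reason under uncertainty rather than plan over a fully known board, as a Go program can.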
Since the StarCraft programming interface was released in 2010, competitions between StarCraft AI bots have taken place across the globe, and those agents have been ranked according to their performance against each other — not humans. AI bots, it turns out, can't yet compete equally with humans. In fact, AI agents can win only about one game out of a hundred played against humans.
While some researchers may want to create an AI bot that beats the world's human StarCraft champion, Kim's goal is to make the AI bot humanlike instead.
"Players do not want to play against AI because it's not like a human," Kim said. "We're investigating how we can make something humanlike — that can play at different levels and make mistakes."
Kim's recent work looks at how human players perceive their AI counterparts, with the goal of using this information to improve the AI's humanlike qualities. For a paper presented as "Late-Breaking Work" at the recent Association for Computing Machinery Conference on Human Factors in Computing Systems (CHI 2016) in San Jose, Calif., Kim and his colleagues invited 20 experienced StarCraft players to evaluate the skill levels, overall performance and human likeness of seven AI bots. At the end of 140 matches, the researchers found that an AI bot's win ratio against other AI bots didn't predict how favorably humans viewed it. In fact, human players favored bots with strong micromanagement and decision-making skills, while AI competitions mostly reward combat skill.
Their study also showed that StarCraft players with differing levels of experience evaluated their AI competitors differently. More expert players preferred bots with a well-balanced and integrated skill set, while less experienced players valued bots that demonstrated prowess in one particular skill.
Researchers also included one human player among the AI bots in an attempt to trip up their evaluators, but not one was fooled. All of the evaluators easily identified their human competitor.
The results, outlined in the paper "Evaluation of StarCraft Artificial Intelligence Competition Bots by Experienced Human Players," provide fodder for a research discussion on the best way to rank AI bots, given the discrepancies between how they're ranked in competition and how humans view them during play. Because players at different levels respond differently to different bots, it's important for designers to create AI players that dynamically adjust to the human competitor's level of expertise using machine learning techniques.
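The idea of a bot that adjusts to its opponent's expertise can be sketched very simply: after each match, nudge a difficulty parameter so the human ends up winning roughly half the time. The class below is a minimal illustration of that feedback loop under assumed names and an assumed update rule; it is not the method from Kim's paper, which would use machine learning over richer signals than win/loss alone.

```python
# Minimal sketch of dynamic difficulty adjustment (hypothetical):
# after each match, move a "skill" knob toward whatever level lets
# the human win about half the time.

class AdaptiveBot:
    def __init__(self, skill=0.5, step=0.05):
        self.skill = skill  # 0.0 = weakest play, 1.0 = strongest
        self.step = step    # how quickly the bot adapts per match

    def record_match(self, bot_won):
        # If the bot won, ease off a little; if it lost, play harder.
        if bot_won:
            self.skill = max(0.0, self.skill - self.step)
        else:
            self.skill = min(1.0, self.skill + self.step)
```

A real system would map this skill value onto concrete behaviors — reaction delays, deliberate micromanagement mistakes, simpler build orders — so that "easing off" still looks humanlike rather than obviously throttled.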
"If we can make AI that imitates human behavior in the games it plays, we can make better AI," Kim said. "We can also make StarCraft AI that improves its performance in matches against other AI bots."
While some people have bemoaned the future of human gaming since AlphaGo's victory, Kim thinks we're nowhere near a robot takeover.
"I feel that there should be new, super strong videogame AI, but it doesn't mean the end of human players," Kim said. "I always imagine that humans want to play against AI, but the AI should be very humanlike. We should cooperate, humans and AI."
Story by Susie Cribbs (DC 2000, 2006)