Peer-reviewed article

Predicting combat outcomes and optimizing armies in StarCraft II by deep learning

2021; Elsevier BV; Volume: 185; Language: English

10.1016/j.eswa.2021.115592

ISSN

1873-6793

Authors

Donghyeon Lee, Man-Je Kim, Chang Wook Ahn

Topic(s)

Human Motion and Animation

Abstract

The nature of real-time strategy (RTS) games, which are more complex than turn-based tabletop games such as Go, has drawn attention in the field of artificial intelligence (AI) due to its similarity to real-world problems. In StarCraft II, agents cannot make decisions or control units until they evaluate and compare the expected outcomes of their choices. Among the ways to evaluate outcomes, combat models are an active area of research and serve as a basis for decision-making. The battlefield on which a combat takes place needs to be considered in combat models, because battlefields have enough influence to overturn the outcome of a battle; however, this effect has not been sufficiently examined. We introduce a combat winner predictor that utilizes battlefield and troop information. Furthermore, we propose a constrained optimization framework with gradient updates that optimizes unit combinations based on the combat winner predictor. Experiments on large-scale combat datasets across various battlefields of StarCraft II demonstrate the robustness and speed of the proposed methods: the framework achieved better prediction accuracy and retrieved winning unit combinations faster. Incorporating these frameworks into AI agents can improve their decision-making power.

• Decision-making is a key problem in real-time strategy games.
• The proposed method enhances a StarCraft II AI's decision-making ability regardless of the battlefield.
• A neural-network-based surrogate model enables rapid decision-making.
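The abstract describes two components: a learned combat winner predictor and a gradient-based constrained search for unit combinations that uses the predictor as a surrogate. The sketch below is a minimal illustration of that idea, not the authors' implementation: the network architecture, feature sizes, supply-cost constraint, and projection step are all assumptions introduced here for concreteness.

```python
# Illustrative sketch only (not the paper's code): a small MLP "combat winner
# predictor" over unit-composition and battlefield features, plus gradient
# ascent on a relaxed (continuous) unit-count vector under an assumed supply
# budget. All sizes, names, and constraints are hypothetical.
import torch
import torch.nn as nn

N_UNIT_TYPES = 10      # assumed number of unit types per side
N_FIELD_FEATS = 8      # assumed battlefield descriptor size

class WinnerPredictor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * N_UNIT_TYPES + N_FIELD_FEATS, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, own, enemy, field):
        # Probability that the "own" composition wins the combat.
        x = torch.cat([own, enemy, field], dim=-1)
        return torch.sigmoid(self.net(x))

def optimize_combination(model, enemy, field, supply_cost, budget,
                         steps=200, lr=0.1):
    """Gradient ascent on predicted win probability over a continuous
    relaxation of unit counts, projected back onto the budget each step."""
    own = torch.full((N_UNIT_TYPES,), budget / N_UNIT_TYPES,
                     requires_grad=True)
    opt = torch.optim.Adam([own], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = -model(own, enemy, field)          # maximize win probability
        loss.backward()
        opt.step()
        with torch.no_grad():
            own.clamp_(min=0.0)                   # counts cannot be negative
            total = (own * supply_cost).sum()
            if total > budget:                    # project onto supply budget
                own.mul_(budget / total)
    return own.detach().round()                   # back to integer counts

# Usage with random placeholder data and an untrained predictor:
model = WinnerPredictor()
enemy = torch.rand(N_UNIT_TYPES) * 5
field = torch.rand(N_FIELD_FEATS)
cost = torch.ones(N_UNIT_TYPES)                   # assumed uniform unit cost
best = optimize_combination(model, enemy, field, cost, budget=50.0)
print(best)
```

The relaxation-and-rounding step is one common way to make a discrete unit-combination amenable to gradient updates; it is used here only to make the surrogate-based optimization loop concrete, and the paper's actual constraint handling may differ.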
