
Visual Doom AI Competition 2018 - Multiplayer Track (2)

ViZDoom Reinforcement Learning



UPDATE: Added results of the final evaluation.


Out of 152 agents submitted by 33 teams, we selected the 3 best submissions from 3 different teams, according to the leaderboard, for the final evaluation on 10 new, previously unseen maps. We also added the two best bots from the 2017 edition and the best bot from the 2016 edition.

Winner: bwbell (Ben Bell)

1st Runner-Up: TSAILAB (Tsinghua University & Tencent AI Lab)

2nd Runner-Up: michaelkrax (Michael Krax)

| Team | Bot | Map 1 | Map 2 | Map 3 | Map 4 | Map 5 | Map 6 | Map 7 | Map 8 | Map 9 | Map 10 | Total frags |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| bwbell | Marv2in | 19 | 15 | 53 | 31 | 21 | 51 | 33 | 34 | 32 | 53 | 342 |
| TSAILAB | AWM | 19 | 21 | 30 | 33 | 39 | 27 | 22 | 12 | 19 | 24 | 246 |
| michaelkrax | CVFighter | 26 | 18 | 21 | 21 | 30 | 40 | 16 | 24 | 9 | 29 | 234 |
| Terminators | Arnold4 (1st in 2017) | 13 | -4 | 16 | 15 | 15 | 9 | 16 | 24 | 8 | 17 | 128 |
| TSAIL | YanShi (2nd in 2017) | 6 | 8 | 14 | 18 | 11 | 8 | 13 | 12 | 6 | 21 | 117 |
| IntelAct | IntelAct (1st in 2016) | 2 | -2 | 13 | 12 | 11 | 13 | 14 | 6 | 6 | 20 | 95 |

Visual Doom AI Competition at CIG2018 Multiplayer Track (2)

This competition is run at the IEEE Computational Intelligence and Games (CIG) Conference 2018.

The task is to create an Artificial Intelligence agent that can compete with other agents in Doom deathmatches using only the data available to regular players, without any auxiliary information. The agent has to use the ViZDoom framework to connect to the game.

Track 2 is a full Doom deathmatch (as in previous years) on unknown maps. Agents will compete in multiplayer games, and the best frag collector will emerge victorious.

The singleplayer (Track 1) challenge page is available here.


The task is to create bots that fight against each other in a regular deathmatch where different weapons and items are available. The bots will be ranked by the number of frags, which for this competition is defined as: frags = number of killed opponents - number of suicides.
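The scoring rule above can be sketched as a one-line helper (the function name is ours, not part of the competition API):

```python
def frags(kills: int, suicides: int) -> int:
    """Competition score: number of killed opponents minus number of suicides."""
    return kills - suicides

# Example: a bot that killed 15 opponents but blew itself up twice
print(frags(15, 2))  # 13
```

Note that the score can go negative, as the Map 2 results of Arnold4 and IntelAct in the final-evaluation table show.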

Five maps are provided for training, and more maps can be found at Doomworld. The final evaluation will take place on several secret testing maps.

During the public evaluation phase, bots will play multiple matches against different opponents, and the leaderboard will be updated accordingly.

During the final 2 weeks, the leaderboard will be hidden. Multiple matches on various maps will be played, similarly to the public evaluation period. The best bots (probably 8 of them) will take part in the final matches that determine the top ranking. The finalists will most likely be announced before the final results are published at CIG.


Each team is allowed a single submission with 1 bot at any given time - new submissions override old ones. Teams and bots are not allowed to cooperate.

Before the actual evaluation, the submitted controller will have to pass a simple test: winning a match against the built-in ViZDoom bots.

We also reserve the right to disqualify bots that behave randomly, are evidently unintelligent, or exhibit programmed malicious behaviour.

During the contest, ViZDoom will have somewhat limited capabilities.

You are allowed to:

  • load your config file,
  • use ANY resolution,
  • use the available buttons except for CROUCH,
  • use the available game variables except for POSITION_X, POSITION_Y, POSITION_Z,
  • use ANY available screen format,
  • change the render options (render_crosshair/weapon/decals/particles),
  • set the agent’s name and color via add_game_args(“+name AI +colorset 0”) (or in a config file).

You are NOT allowed to use:

  • network communication other than the game connection itself (unless you need localhost for some reason),
  • the new_episode and send_game_command methods (e.g. for adding bots).

The following will be disabled or ignored:

  • the depth, automap and labels buffers (they will be filled with random noise),
  • changing the mode, scenario path, or vizdoom path,
  • USERX variables,
  • the CROUCH button (it will have no effect),
  • the POSITION_X, POSITION_Y, POSITION_Z game variables (zeros or noise).

  • Submission deadline: August 12, 2018 - 23:59 UTC.



The prizes have been updated, as follows:

Top-3 places will be awarded 500 USD / 300 USD / 200 USD from the IEEE CIS, if eligible. For more information on the awarding policy, please consult this website.

Want to help or support us in any way? Contact us!


Repository with a sample (random) submission


Contact Us

We strongly encourage you to use the public channels mentioned above for communication between the participants and the organizers. In extreme cases, if you have queries or comments that you would like to make through a private channel, you can send us an email at: