

Visual Doom AI Competition 2018 - Singleplayer Track (1)

ViZDoom Reinforcement Learning



UPDATE: The results of the final evaluation are up to date. They have been re-evaluated (a detailed explanation can be found here).

Results of the final evaluation

Out of 204 agents submitted by 51 teams, we selected the 4 best submissions from 4 different teams, according to the leaderboard on 10 new, previously unseen easy maps.

Winner: TSAIL (Tsinghua University & Tencent AI Lab)

1st Runner-Up: DoomNet (Andrey Kolishchak)

2nd Runner-Up: VIPLAB (from Tsinghua University)

Team      Bot           Map 1  Map 2  Map 3  Map 4  Map 5  Map 6  Map 7  Map 8  Map 9  Map 10  Total time (min)
TSAIL     TSAIL          3.90   5.00   0.72   0.34   0.82   0.32   4.06   5.00   5.00    0.20   25.34
DoomNet   DoomNet        0.97   5.00   0.95   1.97   5.00   0.33   5.00   5.00   5.00    0.65   29.86
VIPLAB    agent_viplab   2.45   5.00   5.00   0.86   2.42   0.48   5.00   5.00   5.00    0.33   31.54
ddangelo  DoomGai        5.00   5.00   5.00   5.00   0.65   1.00   5.00   5.00   5.00    0.68   37.33

Visual Doom AI Competition at CIG2018 Singleplayer Track (1)

This competition is run at the IEEE Computational Intelligence and Games (CIG) Conference 2018.

The task here is to create an artificial intelligence agent that is able to finish Doom game levels (not the original levels but randomly generated ones) using the data normally available to human players (mostly visual data), without any auxiliary information. The agent has to use the ViZDoom framework, which gives real-time access to the screen buffer.
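A minimal control loop built on the ViZDoom Python API might look like the sketch below. The config path and the random policy are placeholders for illustration, not part of the competition harness.

```python
import random

def random_policy(num_buttons):
    # Placeholder policy: press one random button per tic (one-hot action).
    action = [0] * num_buttons
    action[random.randrange(num_buttons)] = 1
    return action

def run_episode(config_path="my_agent.cfg"):
    # Imported lazily so random_policy stays usable without ViZDoom installed.
    import vizdoom as vzd

    game = vzd.DoomGame()
    game.load_config(config_path)  # hypothetical config path
    game.init()
    game.new_episode()  # fine for local training; the harness controls episodes
    while not game.is_episode_finished():
        state = game.get_state()
        # state.screen_buffer holds the raw pixels the agent must act on.
        _ = state.screen_buffer
        game.make_action(random_policy(game.get_available_buttons_size()))
    reward = game.get_total_reward()
    game.close()
    return reward
```

A real agent would replace `random_policy` with a learned policy that maps the screen buffer to an action.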

Track 1 challenges agents to beat single-player levels as fast as possible. Level difficulty increases over time, so the entry threshold is low: you do not need sophisticated knowledge to start, and you can learn on the run!

Multiplayer (track 2) challenge page is available here.


The task is to create a bot that completes randomly generated levels of Doom, filled with monsters and resources. The PyOblige random generator is provided for training; the final evaluation will also use maps generated by this generator.
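Generating training maps with PyOblige can be sketched as below. The settings keys and values are assumptions based on the PyOblige README, not an official competition configuration.

```python
def oblige_settings(size="small", mons="few"):
    # Settings dict for PyOblige; key names and values are assumed from
    # the PyOblige README ("size", "mons", "health", "weapons").
    return {"size": size, "mons": mons, "health": "more", "weapons": "sooner"}

def generate_training_wad(wad_path, seed=0, **settings):
    # Imported lazily so oblige_settings stays usable without PyOblige installed.
    import oblige

    gen = oblige.DoomLevelGenerator()
    gen.set_seed(seed)
    gen.set_config(oblige_settings(**settings))
    # generate() writes the WAD file and returns the number of maps in it.
    return gen.generate(wad_path)
```

The resulting WAD can then be loaded for training, e.g. via `game.set_doom_scenario_path(wad_path)`.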

Bots will be ranked by their total completion time over a set of levels; the times for all levels will be summed. Death resets progress on a level without resetting the timer. Each level has a time limit, which is also the maximum (worst) time a bot can record on that level.
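The ranking rule above can be sketched as a simple scoring function. This is an illustration only; the evaluation harness's actual units and per-level limits may differ.

```python
def total_time(level_times, time_limit=5.0):
    """Sum per-level completion times, clamping each at the time limit.

    A level that is not finished within the limit scores the full limit
    (the worst possible time); lower totals rank higher.
    """
    return sum(min(t, time_limit) for t in level_times)
```

For example, `total_time([1.0, 7.0, 2.5])` yields `8.5`: the unfinished second level is clamped to the 5-minute limit.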

Generated maps may contain various monsters, hazardous surfaces (acid and lava), weapons and items, and doors (which need to be opened with the USE key). Difficulty will be moderated as the competition progresses: at the beginning, test evaluation will take place on very simple maps, and if the submitted controllers beat them with ease, the difficulty will be increased.

During the public evaluation period, bots will be scored after submission and the leaderboard will be updated. During the final 2 weeks, the scores will be reset and kept secret; only a list of participants will be published.


Each team is allowed a single submission with one bot at any given time - new submissions will override old ones. Teams and bots are not allowed to cooperate.

Before the actual evaluation, each submitted controller will have to pass a simple test: completing a simple level without monsters.

We also reserve the right to disqualify bots that behave randomly or are evidently unintelligent, that contain programmed malicious behavior, or that violate the rules in any way.

During the contest, ViZDoom's capabilities will be somewhat limited.

Allowed:
  • loading your config file,
  • using ANY resolution,
  • using available game variables except for POSITION_X, POSITION_Y, POSITION_Z
  • using ANY available screen format,
  • changing the render options (render_crosshair/weapon/decals/particles).

Forbidden:
  • network communication (unless you need localhost for something),
  • using the new_episode and send_game_command methods (e.g. for adding bots).

Disabled or ignored:
  • using depth, automap and labels buffers (they will be filled with random noise),
  • changing mode, scenario path, vizdoom path,
  • using USERX variables,
  • using the CROUCH button (it will have no effect),
  • using the POSITION_X, POSITION_Y, POSITION_Z game variables (they will return zeros or noise).
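Put together, a training configuration that stays within these limits might look like the following `.cfg` fragment. This is a hypothetical example: the keys follow ViZDoom's config-file format, but the specific values chosen here are assumptions.

```
# Hypothetical agent.cfg sketch; keys follow ViZDoom's .cfg format.
screen_resolution = RES_320X240
screen_format = GRAY8

# Render options may be changed freely.
render_crosshair = false
render_weapon = true
render_decals = false
render_particles = false

# USE is needed to open doors; CROUCH would be ignored.
available_buttons = { MOVE_FORWARD MOVE_LEFT MOVE_RIGHT TURN_LEFT TURN_RIGHT ATTACK USE }

# POSITION_X/Y/Z would return zeros or noise, so they are omitted.
available_game_variables = { HEALTH AMMO2 }
```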


Submission challenge deadline: August 10, 2018 - 23:59 UTC.


The top 3 places at the end of the challenge (August 12, 2018 - 23:59 UTC) will be awarded 500 USD / 300 USD / 200 USD by IEEE CIS, if eligible. For more information on the awarding policy, please consult this website.

We are also looking for sponsors!

Want to help or support us in any way? Contact us!


Repository with a sample (random) submission


Contact Us

We strongly encourage you to use the public channels mentioned above for communication between participants and organizers. In extreme cases, if there are any queries or comments that you would like to make through a private communication channel, you can send us an email at: