AI Arena

From The Battle for Wesnoth Wiki

Note: The AI Arena should still be usable, but it is currently not maintained. If you encounter any problems with the Arena itself or the content of this page, please let us know either on the forums or the developer IRC channel.

AI Arena is an interactive AI testing framework

It is implemented as the test scenario ai_arena_small (available since r34329, 31 Mar 09).

This documentation is accurate as of r34889 (14 Apr 09).

To launch it, run Wesnoth with the -t parameter (which must not be the last parameter on the command line). For example:

./wesnoth-debug -t ai_arena_small -d

The scenario is located in data/ai/scenarios/scenario-AI_Arena_small.cfg

It is based on the Den of Onis.

The map includes three sides:

1: Human (AI developer)

2: Challenger AI (the side that is tested, 'north team')

3: Champion AI (the side commanding the enemy of the AI being tested, 'south team')

All leaders start off-map in pocketed locations.
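
For orientation, the side setup in scenario-AI_Arena_small.cfg is roughly of the following shape. This is only a sketch assuming standard SideWML keys; the leader types, team names, and other details of the real scenario file may differ.

[side]
    # side 1: the human AI developer, mostly a spectator
    side=1
    controller=human
    canrecruit=yes
    type=Lieutenant
[/side]
[side]
    # side 2: the Challenger AI under test ('north team')
    side=2
    controller=ai
    canrecruit=yes
    type=Orcish Warrior
    team_name=north
[/side]
[side]
    # side 3: the Champion AI commanding the enemy ('south team')
    side=3
    controller=ai
    canrecruit=yes
    type=Elvish Captain
    team_name=south
[/side]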

You can access an interactive menu by right-clicking a map tile.

There are menu options that allow the AI developer to pick a challenge (each challenge has a small description and a unique number) and to pick the AI that will try the challenge. The AI is chosen from a list of .cfg locations; each such file contains the bare *contents* of the SIDE tag, i.e. SideWML and AiWML configuration without the enclosing SIDE tag itself (an example is given below). The test is then loaded (units appear inside the Arena) and the selected AI is hot-redeployed.

Then, the AI developer can end his turn and watch the AI's actions.

Creating new challenges

Please feel free to add more 'AI challenges' and improve existing ones. Each challenge should be AI-independent.

To create a challenge that will be committed into the Wesnoth repository:

  1. Look at the content of the ai/scenarios/ai_arena_small directory.
  2. Assign a unique CODE number (4 digits) for your challenge, pick a NAME (suitable as part of a WML event name) for it, and write a small DESCRIPTION of it.
  3. Create a new file in the ai/scenarios/ai_arena_small directory with the following structure:
[event]
    name=preload
    [lua]
        code = << register_test('CODE-NAME','DESCRIPTION'); >>
    [/lua]
[/event]
[event]
    name=CODE-NAME
    first_time_only=no
[/event]

Use the second event to create the situation. If you do not intend to commit the file, you may use any filename and event name you like; just make sure that it is a legal WML event name and that it matches in both events.
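
For illustration, the second event of a hypothetical challenge might be filled in as follows. The code/name, unit types, coordinates, and hit point values here are made up; only the pattern of spawning units and labelling the situation is the point.

[event]
    name=0001-poison_test
    first_time_only=no

    # a wounded unit for the side under test (side 2, 'north team')
    [unit]
        side=2
        type=Orcish Grunt
        x=10
        y=8
        hitpoints=12
    [/unit]

    # an enemy unit for the opposing side (side 3, 'south team')
    [unit]
        side=3
        type=Elvish Shaman
        x=11
        y=9
    [/unit]

    # a label describing the situation (remember to clear it in the cleanup code)
    [label]
        x=10
        y=8
        text="poison test"
    [/label]
[/event]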

Note: add code that clears all of your labels to the cleanup function, since there is (so far) no way to delete all labels on the map at once.
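
A minimal sketch of clearing one label in that cleanup code, assuming the usual WML behaviour that writing empty text to a hex removes the label placed there:

# remove the (hypothetical) label placed at 10,8 by the challenge above
[label]
    x=10
    y=8
    text=""
[/label]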

Creating new AI configurations

Example of an AI configuration file:

ai_algorithm=formula_ai
[ai]
    eval_list=yes
    [register_candidate_move]
        name=poisoner
        type=attack
        evaluation="{ai/formula/poisoner_eval.fai}"
        action="{ai/formula/poisoner_attack.fai}"
    [/register_candidate_move]
[/ai]

Then use it by typing its name in the interactive menu, or add a reference to it in the AI Arena scenario.
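
If you take the second route, one plausible way to do it (assuming the standard WML preprocessor include syntax and a hypothetical file name) is to pull the file's contents into one of the AI sides of the scenario:

[side]
    side=2
    controller=ai
    # include the bare SIDE contents (SideWML and AiWML) from the configuration file
    {ai/dev/my_formula_ai.cfg}
[/side]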

Where to put new AI configurations

If it should be available for selection as a multiplayer AI and in the AI Arena in both non-debug and debug modes, put it in

data/ai/ais/

If it should be available for selection as a multiplayer AI and in the AI Arena only in debug mode, put it in

data/ai/dev/

Where to put all formulas which may be used by the AIs

data/ai/formula/
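
Putting the three locations together, the layout of the relevant directories is:

data/ai/ais/      # AI configurations selectable in both non-debug and debug modes
data/ai/dev/      # AI configurations selectable only in debug mode
data/ai/formula/  # .fai formula files referenced by the AI configurations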