SoftwareTesting

From The Battle for Wesnoth Wiki

The Wesnoth project uses a few different software testing systems. For graphical and layout features in particular, we use the test scenario: a highly decorated scenario packed with graphical features that we can click around in to quickly spot problems. For non-graphical issues, we also have automated tests, which we (loosely) term unit tests. In the software development industry at large, a unit test normally refers to a small, quick test of a single class or method. Unit tests verify that each unit works correctly in isolation. They are particularly valuable when combined with continuous integration, allowing us to catch regressions as they occur. For continuous integration, we currently use GitHub Actions.

This page is meant to explain how to use the tests and how to add new tests.

Overview

The test systems which we currently use are:

The test scenario

Located at data/test/scenarios/manual_tests/scenario-test.cfg.

This is the default scenario used when wesnoth is run in test mode; it can be launched by running:

./wesnoth -t

The test scenario can also be launched from the main menu. To do this, you need to first open preferences and assign a hotkey to the "Start Test Scenario" action, which has no default assignment. Once you've assigned a hotkey, you can press it while at the main menu to bring up a list of known test scenarios. The test scenario is identified as simply "test".

Other interactive test scenarios can be started in the same way – either using the hotkey, or by adding their ID after -t on the command-line. Most interactive test scenarios are found either in data/test/scenarios/manual_tests or in data/ai/scenarios. The AI test scenarios are set up to be demos for various AI features.
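For example, to launch a test scenario with the (hypothetical) id my_manual_test directly:

./wesnoth -t my_manual_test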

C++ unit tests (game engine and API)

Located at src/tests/.

These can be run by compiling the test suite executable, one of the possible targets when compiling with SCons or CMake. You will need to have the Boost.Test library development files installed, and, in CMake's case, pass -DENABLE_TESTS=ON during configuration.

# Building with SCons:
scons boost_unit_tests

# Building with CMake:
cmake .. -DENABLE_TESTS=ON
make boost_unit_tests

# Running the test suite from the root dir (one above data):
# (Adjust if using a separate build dir)
./run_boost_tests

WML unit tests (WML/Lua API)

Located at data/test/scenarios/ in a variety of subfolders. Note that the tests in data/test/scenarios/manual_tests are not unit tests.

These are Wesnoth test scenarios containing events which run at the start, perform sanity checks, and then immediately report victory or defeat depending on the results. They are not meant to be interactive. They are run using the main Wesnoth game executable, by means of a script; on Unix-based systems, this script is run_wml_tests, located at the root of the repository. See the forums for alternative scripts and for a way to add this to your Visual Studio project file.

./run_wml_tests

By default, this runs all tests listed in the wml_test_schedule file. Alternatively, you can run a specific test with the following command:

./wesnoth -u id_of_test

Submitting unit tests with patches

The project does not require that patches be accompanied by unit tests; however, tests are very welcome and encouraged, especially for patches that include changes or additions to the engine's internal APIs or the public WML/Lua API. For many kinds of patches unit tests aren't appropriate, but it's always good practice to find ways to test your code. Submitting tests with your code helps to ensure that your contribution will keep working in the future, and makes everything easier to maintain.

Similarly, if you add new graphical features, it might be a good idea to add them to the graphical test scenario.

How to add C++ unit tests

To add a C++ unit test, create a .cpp file in src/tests/. A suitable template to start with might be src/tests/test_config.cpp.

In the simplest case, you just have to do the following things:

0. Include the boost unit test framework.

#include <boost/test/unit_test.hpp>

1. Declare a "boost auto test suite" and give it an appropriate name. An auto test suite is a bundle of tests.

BOOST_AUTO_TEST_SUITE( test_my_feature )

[...]

BOOST_AUTO_TEST_SUITE_END()

2. Declare any number of test cases inside the test suite. A test case is BOOST_AUTO_TEST_CASE( ... ) with an identifier as the argument, followed by a C++ method body. The method body should contain statements like BOOST_CHECK_EQUAL( ... , ... ). All of the checks should pass when the function is executed.
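
For instance, a minimal test case, with an illustrative name and checks, might look like this:

BOOST_AUTO_TEST_CASE( test_simple_arithmetic )
{
	// Each check reports its own failure without aborting the test case.
	BOOST_CHECK_EQUAL( 2 + 2, 4 );
	BOOST_CHECK( 1 < 2 );
}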

For an example, you might check src/tests/test_config.cpp.

Please consult the Boost.Test documentation for other styles of checks that you can use, such as BOOST_CHECK, BOOST_REQUIRE_EQUAL, and BOOST_CHECK_THROW.

3. Finally, add an entry for your new .cpp file in source_lists/boost_unit_tests alongside the other test .cpp files, to ensure that it is compiled as part of the test executable. You do not need to register your test cases in any other way; the unit testing framework will execute all of the tests that it finds. If you are using Xcode, also add your tests to the project file and include them in the unit tests target. Not doing this will not block your code from being merged; someone else will add it later on.
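
The entries in source_lists/boost_unit_tests are paths relative to src/, one per line; for the hypothetical file above, the new line would look something like:

tests/test_my_feature.cpp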


If a check fails, you will generally get the suite name, the test case name, the line number, and the exact expression that triggered the failure. BOOST_CHECK_EQUAL will also report the mismatched values. If the test code segfaults, you might get only the test case name, or only the test suite name.

Pro tip

If you have many test cases which test an object that is expensive to construct, and you don't want to construct and destroy it repeatedly, you can use a "global fixture", which essentially puts the object at file scope, where it is available to all the test cases in the suite when they run. It's better to use the Boost fixture system than to simply declare the object at file scope, because the Boost framework then knows what is going on and can report problems more easily.

You can see an example of a global fixture being used in src/tests/test_mp_connect.cpp.
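
A minimal sketch of the pattern, assuming a hypothetical expensive_object class:

#include <boost/test/unit_test.hpp>

// Hypothetical object that is slow to construct.
struct expensive_object {
	expensive_object() { /* expensive setup here */ }
};

// Constructed once before any test case runs and destroyed after all
// of them; because it is registered with Boost.Test, the framework can
// report setup/teardown failures properly.
struct expensive_fixture {
	expensive_fixture() { instance = new expensive_object(); }
	~expensive_fixture() { delete instance; }
	static expensive_object* instance;
};

expensive_object* expensive_fixture::instance = nullptr;

BOOST_GLOBAL_FIXTURE( expensive_fixture );

BOOST_AUTO_TEST_SUITE( test_expensive )

BOOST_AUTO_TEST_CASE( test_fixture_available )
{
	// The fixture object is already constructed by the time any case runs.
	BOOST_CHECK( expensive_fixture::instance != nullptr );
}

BOOST_AUTO_TEST_SUITE_END()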

How to add WML unit tests

To add a WML unit test, make a test scenario .cfg file and place it in the appropriate subdirectory under data/test/scenarios/. Any Wesnoth scenario using the [test] tag can be a valid unit test; however, there are some macros that make writing unit tests simpler. Here's a minimal example.
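
This sketch assumes the GENERIC_UNIT_TEST and RETURN macros from the test data; the test id and condition are illustrative:

{GENERIC_UNIT_TEST "test_my_feature" (
    [event]
        name = start
        # RETURN ends the scenario immediately: victory if the
        # condition holds, defeat otherwise.
        {RETURN ([true][/true])}
    [/event]
)}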

Generally speaking, a unit test is just a scenario with events which cause it to end in response to the start event, without interaction from the user. If you know how to make a Wesnoth scenario with events, and use the [endlevel] tag, you know everything you need to know. There are a handful of macros which can make it a little more succinct.
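
Without the macros, the core of such a test is simply a start event that ends the level; a rough equivalent of the example above (the attribute choices are illustrative) is:

[event]
    name = start
    [endlevel]
        result = victory
        linger_mode = no
        replay_save = no
    [/endlevel]
[/event]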

Test scenarios need to be registered in the test schedule file to be used by the continuous integration systems. The schedule also records the expected result of each test: is it supposed to pass, fail, time out, or result in a corrupted replay? The test schedule file is wml_test_schedule, at the root of the source tree.

#
# Sanity checks of the unit test system
#
0 test_return
1 test_return_fail
0 test_assert
1 test_assert_fail
1 test_assert_fail_two
2 empty_test
4 break_replay_with_lua_random
0 fixed_lua_random_replay_with_sync_choice
0 test_end_turn
...

Add a line with your test scenario's id. The line must begin with a number: the expected result code.

The schedule includes various kinds of expected failure to ensure that the system itself is working. You almost certainly want your own tests to pass, so use code 0 for them.
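
For example, to register the hypothetical test above with an expected result of "pass":

0 test_my_feature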

Consult the UnitTestResult enum of the run_wml_tests script for the possible results of a WML unit test.

The run_wml_tests script also supports alternate test schedules and various other options; consult its help text for more info:

./run_wml_tests -h

For more general info about WML unit tests please refer to this forum thread: http://forums.wesnoth.org/viewtopic.php?t=40449
