OpenTasks/Generic/Add Testing Framework Subsystems
Open Task
Task Name: Add a testing framework for ScummVM's subsystems
Technical Contact(s): Johannes Schickel, Max Horn
Subsystem: Generic
TODO: This task was partially implemented during GSoC 2010. However, a few things are still missing, e.g. dedicated testing of Common::Archive (such a test is important, considering that FSNode implements the Archive protocol incorrectly). Thus, we should not simply delete this task, but rather edit it to remove the parts that have been implemented and flesh out what is left.
Background:
ScummVM is a highly portable application. To achieve this, it provides various interfaces for accessing platform-specific functionality. Our APIs are usually well documented, giving porters and engine authors a clear picture of how the various functions and methods are supposed to behave. In addition to the low-level platform abstraction APIs, we also offer higher-level APIs for engine authors.
For example, we have a low-level filesystem abstraction API: FSNode. It can be used to browse directories, open files, create new files for writing, etc. Since this API is fairly low level, engine authors usually do not use it directly, but go through wrapper code instead. To prevent code duplication and take work off engine authors' shoulders, we introduced the Archive and SearchMan classes. SearchMan makes it easy to access game data files without fiddling with FSNode, as the sketch below illustrates.
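To make the difference concrete, here is a hedged sketch (not taken from the ScummVM source) contrasting the two approaches. The file name DATA.DAT is made up, and the method names follow the Common API as we understand it; check the headers in common/ for the authoritative signatures.

 // Illustrative only: open the same data file via FSNode and via SearchMan.
 #include "common/archive.h"
 #include "common/fs.h"
 
 Common::SeekableReadStream *openViaFSNode(const Common::String &gamePath) {
 	// Low level: locate the directory node, then the file node, then open it.
 	Common::FSNode dir(gamePath);
 	if (!dir.exists() || !dir.isDirectory())
 		return 0;
 	Common::FSNode file = dir.getChild("DATA.DAT");	// hypothetical file name
 	if (!file.exists())
 		return 0;
 	return file.createReadStream();
 }
 
 Common::SeekableReadStream *openViaSearchMan() {
 	// High level: SearchMan already knows the game's search path,
 	// so a single call is enough.
 	return SearchMan.createReadStreamForMember("DATA.DAT");
 }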
However, experience has shown that the documentation of these classes is not always as clear and complete as it should be, and is sometimes misunderstood. In addition, the implementations for the various systems we run on may not adhere to the documented specification, and regressions have occasionally been introduced by changes to the code. Currently we have only a very minimalistic test suite in "tests/", covering only a tiny subset of our code: container classes, string handling and some stream code. As a result, it is hard to do proper regression and implementation testing for other APIs, such as the FSNode API.
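For reference, the existing suite in "tests/" is built on CxxTest; a new test there looks roughly like the following sketch. The class name and the individual checks are illustrative only.

 // Illustrative only: a CxxTest-style test suite as used in "tests/".
 #include <cxxtest/TestSuite.h>
 
 #include "common/str.h"
 
 class ExampleStringTestSuite : public CxxTest::TestSuite {
 public:
 	void test_prefix_and_suffix() {
 		Common::String s("hello");
 		TS_ASSERT_EQUALS(s.size(), (uint32)5);
 		TS_ASSERT(s.hasPrefix("hel"));
 		TS_ASSERT(!s.hasSuffix("world"));
 	}
 };

Extending the suite means adding more such test classes and wiring them, together with any additional ScummVM code they exercise, into the test build.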
The Task:
As the description above shows, we need an automated test system to help ensure our implementations are bug free. We need both a framework for the tests and an extensive test suite. There are several approaches to testing, each of independent interest, which complement each other:
- Unit tests: Add many more tests to our suite in "tests/". This also requires some enhancements to our build system, since we currently do not link the test suite against all of our code. New test mechanisms are needed as well, e.g. to test audio and graphics output (for instance, code that loads an existing reference image and compares it to the output of a graphics primitive; see the sketch after this list). It might also be helpful to enhance the "null" backend so that all unit tests can be run against it, headless, which is good for automated tests on a server without any actual graphics or audio output.
- A "test" engine: Unit tests would probably only be runnable on "desktop" ports like Windows and Linux, and not on ports like PSP or NDS, due to run environment restrictions. To allow extended coverage on *all* ports, implementing a "test" engine would be helpful. In addition to running on all ports, the "test" engine also provides an environment to test things in an integrated, non-isolated way, unlike unit tests. It should test as many features as possible: Game detection, file loading/saving, navigating the file system, testing graphics and audio output, etc. -- and it must be possible to verify everything. I.e. compare results to reference data, output progress data on the screen, create dump logs, and so on. Lots can be done here.
The test suite extension would be easiest to use on Linux and similar systems, and there is already some existing code to build on. A test engine would work on all ports, including game consoles and phones, and allows testing in a more "holistic", complete way. Ideally we would have both, and it would also be possible to run all unit tests from inside the test engine.
Overall, the test engine has the highest priority, since it would let us test all ports without additional per-port work. The primary task is therefore to implement and document at least a test engine which exercises (preferably) all of ScummVM's subsystems extensively; a rough sketch of its shape follows below.
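To give an idea of the overall shape, here is a hedged sketch of what such an engine's entry point might look like. TestbedEngine and the run*Tests() helpers are hypothetical names invented for this page; only the Engine base class, its run() method and the Common::Error codes are real ScummVM interfaces, and a real engine would additionally need detection and plugin registration code (see the Engines HOWTO).

 // Illustrative only: rough shape of a "test" engine's main loop.
 #include "common/error.h"
 #include "engines/engine.h"
 
 class TestbedEngine : public Engine {
 public:
 	TestbedEngine(OSystem *syst) : Engine(syst) {}
 
 	virtual Common::Error run() {
 		// Each group of checks reports its failures so results can be shown
 		// on screen and written to a dump log for later comparison.
 		int failures = 0;
 		failures += runFSTests();
 		failures += runGraphicsTests();
 		failures += runAudioTests();
 
 		return (failures == 0) ? Common::kNoError : Common::kUnknownError;
 	}
 
 private:
 	// Hypothetical test groups; each returns the number of failed checks.
 	int runFSTests()       { /* exercise FSNode, SearchMan, save files... */ return 0; }
 	int runGraphicsTests() { /* draw primitives, compare against reference data */ return 0; }
 	int runAudioTests()    { /* exercise the mixer and audio streams */ return 0; }
 };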
Required Skills:
- Reasonable C++ skills.
- Experience with unit tests.
- Knowledge about various ScummVM subsystems would be useful, but not required.
- Knowledge about how to create a new engine for ScummVM would also be useful. See also our Engines HOWTO.
Also required:
- Access to a non-desktop testing device (PSP, NDS, PS2, WinCE or Symbian mobile, etc.) and the ability to compile and run ScummVM on it (the ScummVM team would help with compiling and running).