Sure; to start with I was thinking of writing tests that exercise the
API, rather than trying to check that the output is letter-perfect.
For example, we know that for xp-laptop-2005-07-04-1430.img and
DTB=0x39000, VA 0x823c87c0 should translate to physical address
0x23c87c0; a process list should return 44 processes, etc.
Basically, for the major API functions that back the commands, I'd
like to be able to verify that they report the same results for a
known image, even as the underlying implementation changes. Of
course, that assumes the current implementation is correct (which, as
it turns out, was *not* the case with handles.py until recently), but
it seems like a good starting point.
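To sketch what I mean, a test like the VA-translation case above could look roughly like this with the unittest module. The function name `translate_va` is hypothetical (the real API call would go through Volatility's address-space layer), and the stub body below only models a trivial kernel identity mapping so the harness pattern is runnable on its own:

```python
import unittest

# Hypothetical stand-in for the real translation API -- the actual
# Volatility function name and signature will differ.  This stub models
# only a trivial identity mapping (PA = VA - 0x80000000) so the
# expected-value test pattern is self-contained; a real test would call
# into the address-space layer against the known image instead.
def translate_va(dtb, va):
    return va - 0x80000000

class KnownImageTests(unittest.TestCase):
    """Expected-value tests pinned to xp-laptop-2005-07-04-1430.img."""
    DTB = 0x39000  # kernel directory table base for this image

    def test_va_translation(self):
        # Known-good value: VA 0x823c87c0 should map to PA 0x23c87c0
        self.assertEqual(translate_va(self.DTB, 0x823c87c0), 0x23c87c0)
```

The point is that the expected values are pinned to the image, not to any particular implementation, so the same tests keep working across the NewObject port.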
-Brendan
On Sep 26, 2009, at 7:37 PM, Tim wrote:
What do people
think about adding some unit tests so we
can make sure we don't introduce regressions as we port things over
to NewObject? Perhaps we could base them on a
well-known image (like the NIST xpsp2 images).
Anyway, just a thought. If I have some time in the next week I'll
see how hard it would be to implement some tests like these using
the unittest module.
I think unit tests are a great idea; at a minimum, we should have
system tests that focus on particular commands. Over the last few
reglookup releases I've mostly been running the same commands on old
versions and new versions, then doing semi-scripted diffs between the
outputs. Scripting something like that would be a start (though it
clearly has limitations in the long run).
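The semi-scripted diff workflow above could be automated in a few lines of Python. This is just a sketch of the pattern: the tool paths and arguments passed to it are placeholders, not real invocations of any particular release.

```python
import difflib
import subprocess

def diff_command_output(old_tool, new_tool, args):
    """Run the same command under two tool versions and return the
    unified diff of their stdout as a list of lines (an empty list
    means the outputs matched).  Tool paths/args are placeholders."""
    old = subprocess.run([old_tool, *args], capture_output=True, text=True).stdout
    new = subprocess.run([new_tool, *args], capture_output=True, text=True).stdout
    return list(difflib.unified_diff(
        old.splitlines(), new.splitlines(),
        fromfile=old_tool, tofile=new_tool, lineterm=""))
```

A wrapper like this could loop over a fixed list of commands and fail loudly on any non-empty diff, which is basically the regression check done by hand today.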
If you come up with anything, I'd be interested to hear about how you
implemented it.
cheers,
tim