Proposal for Open Source Test Bucket


     With this test bucket, we strive to be:

OSTB Structure

        The Open-Source Test Bucket will consist of two main parts: a
directory structure containing the tests and targets, and a web interface that
will provide access to this structure as well as procedural information.
Directory Structure

	The directory structure will be hierarchical.  From a main directory
it will initially have three subdirectories, one for each general area of
work: BPatch, DPCL, and one for the targets.  In each of the work directories
will be directories corresponding to each public class, group (or
miscellaneous function).  The targets directory hierarchy will be organized
as follows:
	targets/
	    lang1/                  (e.g. C, fortran77, fortran90)
	        mpi/
	            nonthreaded/    (target programs)
	            threaded/       (target programs)
	        serial/
	            nonthreaded/
	            threaded/
	    lang2/
	    ...

That is, underneath the main targets directory will be directories
corresponding to particular programming languages, such as C, fortran77, and
fortran90.  From there, we divide targets into either serial or parallel (mpi)
and after that into threaded or nonthreaded.  These lowest directories will
hold the actual target programs.
	Each directory should have a README file stating what tests and
subdirectories it contains; in leaf directories, the README should list every
test and which methods/items from the class each test exercises.
	There should also be scripts to run each directory's tests in batch,
with each parent directory's script invoking each child's test script.

Web Interface

	The web interface for the OSTB will provide an overview (purpose and
goals), a tar file of the entire test bucket (perhaps tars of individual
directories as well), and procedures for adding and removing tests.  The end
user will also be able to view individual test files.  The web interface will
be organized in frames but will first give the user the choice not to use
frames.  Currently I am leaning towards interspersing the html files among
their corresponding directories as opposed to having a separate directory
structure for them.

Procedures & Standards

           All test programs will have to comply with some minimal coding
standards.  Besides the usual "good" coding practices, such as giving
meaningful names to variables, classes, etc., all tests will use a common
message format for both failed and passed tests.  This is especially useful
when one program tests several operations: a program testing, say, load and
start might produce "TEST < load > PASSED" and then go on to say
"TEST < start > FAILED".  A test could also, either on a separate stream or
the same one, give a brief description of the failure.  For example,
"TEST < load > FAILED" could be followed by "The load command failed with ...
arguments."  The description itself would not have a common format.
	   This common output format will be enforced through standard
utility classes and miscellaneous functions provided with the test bucket.
There is a class OutLog that has methods for checking an AisStatus value and
producing a standard PASSED/FAILED message.  For instances where general
information needs to be written to the log, additional write methods are
provided: one produces a standard PASSED/FAILED message when no AisStatus
value is available, and the other simply writes the text it is passed to the
log file.
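A minimal sketch of what such a class could look like follows.  The real
AisStatus comes from DPCL; the stand-in below, and the exact method
signatures, are assumptions made only to illustrate the three write methods
described above.

```cpp
#include <fstream>
#include <string>

// Stand-in for DPCL's AisStatus: this sketch only models
// "did the call succeed" (0 assumed to mean success).
struct AisStatus {
    int code;
    bool ok() const { return code == 0; }
};

// Hypothetical shape of the OutLog utility class.
class OutLog {
    std::ofstream log;
public:
    explicit OutLog(const std::string& path) : log(path) {}

    // Check an AisStatus and emit the standard PASSED/FAILED message.
    void write(const std::string& test, const AisStatus& sts) {
        log << "TEST < " << test << " > "
            << (sts.ok() ? "PASSED" : "FAILED") << "\n";
    }

    // Overload for tests that have no AisStatus value to check.
    void write(const std::string& test, bool passed) {
        log << "TEST < " << test << " > "
            << (passed ? "PASSED" : "FAILED") << "\n";
    }

    // Free-form text, e.g. a brief failure description.
    void write(const std::string& text) {
        log << text << "\n";
    }
};
```

A test might then write `outlog.write("load", sts)` after each operation and
add a free-form description only when something fails.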
        Currently the procedures for adding and removing tests are simple:
  1. Write the test(s) following the guidelines set out in the Standards
     section.  If adding an entire directory of tests, make sure it also
     complies with the directory structure.
  2. Send an email to the owner of the Test Bucket stating the reason
     for the add/remove action, along with a tar file of all the related
     files.
  3. From here, the owner of the Test Bucket will forward the request
     to all core DPCL members for approval.  If a quorum is reached
     (for now "quorum" is simply a majority), the change will be made.


           Eventually we will add a script or program that will analyze the
OutLog output files and produce a simple report (how many tests were run,
how many passed, and how many failed).