TSTool Documentation Quality Control

1. If not already installed, install the data set in its default location (e.g.,
   C:\CDSS\data\colorado_1_2007) – these files will not be modified during testing.
2. Create a parallel folder with a name indicating that it is being used for verification (e.g.,
   C:\CDSS\data\colorado_1_2007_verify20090216).
3. Copy the data set files from step 1 to the folder created in step 2 (e.g., copy to
   C:\CDSS\data\colorado_1_2007_verify20090216\colorado_1_2007) – these files will be
   modified during testing.
4. Create a TSTool command file in the folder created in step 2 that will run the tests (e.g.,
   VerifyTSTool.TSTool). It is often easier to edit this command file with a text editor rather than
   with TSTool itself. The contents of the file are illustrated in the example below. Some
   guidelines for this step are as follows:
   a. Organize the command file by data set folder, in the order that data need to be created.
   b. Process every *.TSTool command file to verify that it runs and generates the same results.
   c. If command files do not produce the same results, copy the command file to a name with
      "-updated" or similar in the filename and then change the file until it creates the expected
      results. This may be required due to changes in the command, for example implementing
      stricter error handling. These command files can then be shared with maintainers of the
      data set so that future releases can be updated.
   d. As tests are formalized, it may be beneficial to save a copy of this file with the original
      data set so future tests can simply copy the verification command file rather than
      recreating it (e.g., save in a QualityControl folder in the master data set). This effort
      allows the creator of the data set to quality control their own work, as well as helping to
      quality control the software.
5. Run the command file – any warnings or failures should be evaluated to determine whether they
   are due to software or data changes. Software differences should be evaluated by software
   developers. It may be necessary to use command parameters such as Version, available for some
   commands, to recreate legacy data formats.

The following example command file illustrates how TSTool software is verified using the full data set
(indented lines indicate commands that are too long to fit on one line in the documentation). Note that
intermediate input files that would normally be modified by other software (e.g., StateDMI for CDSS data
sets) could impact TSTool verification. However, a similar quality control procedure can be implemented
for StateDMI.

Guidelines for setting up each test in the command file are as follows:

1. Remove output files that are generated from each individual command file that is run, using
   RemoveFile() commands. This ensures that a test does not use old results for its output
   comparison.
2. Run each individual command file using the RunCommands() command.
3. Compare the results of the run with the original data set file using the CompareFiles() command.
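
The full example command file is not reproduced here. The following is a minimal sketch of one test
following the guidelines above, assuming a hypothetical Diversions folder and output file in the
colorado_1_2007 data set copy; the file names are placeholders, and command parameters (for example
IfNotFound and WarnIfDifferent) should be confirmed against the TSTool command reference for the
version being tested:

   # Example test for one command file (file names are placeholders)
   # Remove old output so the test cannot pass using stale results
   RemoveFile(InputFile="colorado_1_2007\Diversions\ddh.stm",IfNotFound=Ignore)
   # Re-run the command file that generates the output
   RunCommands(InputFile="colorado_1_2007\Diversions\ddh.TSTool")
   # Compare the regenerated output with the corresponding file in the original (unmodified) data set
   CompareFiles(InputFile1="colorado_1_2007\Diversions\ddh.stm",InputFile2="C:\CDSS\data\colorado_1_2007\Diversions\ddh.stm",WarnIfDifferent=True)

Repeating this RemoveFile()/RunCommands()/CompareFiles() pattern for each *.TSTool command file in the
data set copy, organized by folder in the order that data are created, produces the full verification
command file.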