Analysis Tools

summary of key parameters at each project and for the overall system for a given period of time (e.g., daily for 14 days; daily for 14 days coupled with weekly for 6 months). Examples of key parameters for each project include inflow, elevation, storage, discharge (turbine and spill), and generation. When daily routing is employed and/or water quality issues are important, key parameters could also include flow, elevation, temperature, and DO at specified locations. System parameters, such as system storage, available flood control storage, system energy-in-storage, and system generation, should also be available. This information must be available in report and graphic format, as specified by the user. Graphic displays should be coordinated with the formats being utilized by the DSS system status displays. The user must also be able to specify which projects to include in the analysis. Finally, if the model has been run multiple times (e.g., to compare the effects of forecasted inflow), it must be possible to compare and statistically analyze model runs.

Reservoir System Performance. In addition to being able to view and inspect model output directly, it must be possible to interpret these results in comparison to performance criteria or targets at individual projects and on a system-wide basis.
For example, to evaluate a model run with respect to multipurpose objectives, graphics and reports should be available to document violations of minimum flow requirements for water quality, actual vs. targeted elevations for summer recreation, actual vs. targeted releases for recreational floatways, actual vs. targeted generation, etc. These capabilities must be available for single and multiple runs and allow for comparison between runs. It should be possible to evaluate violation and/or achievement of targets in terms of extent (severity), duration, and frequency. Access to statistical analysis procedures will be required to support this function. The user will be responsible for developing the reservoir system performance criteria.

Model Performance. The capability must exist to compare model results to observed conditions for the same forcing hydrology and operational constraints (i.e., special operations, outages, etc.).

Multi-Objective Tradeoff Analysis

The analysis tools must also include capabilities for evaluating the tradeoffs involved in meeting multiple reservoir system objectives. It is expected that in this version of the PRSYM model, tradeoff analysis will be based upon the use of multiple runs under varying operating assumptions and constraints. The use of more formalized multi-criteria evaluation techniques, however, can be explored. The most cost-effective methodology should be implemented at this time. Nonetheless, procedures must be developed to present the tradeoff analyses in a clear and precise manner that is useful in an operational decision-making environment. The user must work closely with the consultant and/or subcontractor in defining multi-objective criteria or targets.
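To make the violation metrics described above concrete, the sketch below shows one simple way the extent (severity), duration, and frequency of target violations could be computed from a time series of model output. It is a minimal illustration, not part of the PRSYM specification: the flow values and the 100 cfs minimum-flow target are hypothetical, and `violation_stats` is an invented helper name.

```python
def violation_stats(values, minimum):
    """Summarize violations of a minimum-flow target in terms of
    frequency (number of distinct violation events), duration (longest
    run of consecutive violating periods), and extent (worst shortfall)."""
    events = 0        # frequency: count of separate violation events
    longest = 0       # duration: longest consecutive violation run
    run = 0           # current consecutive violation run
    worst = 0.0       # extent: largest shortfall below the target
    for v in values:
        if v < minimum:
            if run == 0:          # a new violation event begins
                events += 1
            run += 1
            longest = max(longest, run)
            worst = max(worst, minimum - v)
        else:
            run = 0
    return {"frequency": events, "duration": longest, "extent": worst}

# Hypothetical daily flows (cfs) checked against a 100 cfs minimum-flow target
flows = [120, 95, 90, 110, 130, 85, 105]
print(violation_stats(flows, 100.0))
# {'frequency': 2, 'duration': 2, 'extent': 15.0}
```

Running the same summary over several model runs would support the run-to-run comparisons called for above, since each run reduces to a small set of comparable statistics.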