Metadata

Template: Drought Mitigation
Title: Intermountain West Climate Summary
Date: 1/17/2006
Description: Water Availability Task Force Meeting Presentation
Basin: Statewide
Doc Type: Presentation

Document Relationships (attachments):
IntermountainWestClimateSummary (\Drought Mitigation\Backfile)
IntermtnClimateSummaryJan2008 (\Drought Mitigation\DayForward)
IntermtntClimateSummaryJune2006 (\Drought Mitigation\DayForward)
FEATURE ARTICLE FROM INTERMOUNTAIN WEST CLIMATE SUMMARY, JANUARY 2006

(Continued from p. 2)

and 33 percent considered below-average. For example, a forecast that calls for a 40 percent probability of above-average temperatures is less certain than a forecast that calls for a 70 percent probability of above-average temperatures. In both cases the projection is for temperatures to fall into the above-average tercile as compared to the actual conditions observed from 1971 through 2000.

White space on the map indicates Equal Chances (EC) of falling into any of the three terciles (i.e., no forecast). Only rarely does the CPC issue a forecast predicting near-average temperatures, indicated by gray shading.

Climate forecast performance

On the FET home page, you'll also see options to "Explore the Forecasts," to consider "How do the forecasts relate to my specific situation?" and to evaluate "Forecast Performance." Select "Forecast Performance" to follow the example here. This is where you can test and compare how CPC forecasts have performed in the past, based on the forecasts issued since 1994. Here we take a step-by-step approach to testing a seasonal forecast's success:

1. The "National Weather Service Climate Prediction Center" option is automatically selected, so there's no need to do anything. (In the future, other options will become available.)

2. Select NWS CPC seasonal climate outlooks (contiguous states).

3. Select precipitation.

4. Select a forecast season, in groups of three months, by sliding the shaded box with your cursor and then clicking on it. The months are listed by their first initial only. Choose DJF to get the three-month seasonal outlook for December, January, and February. The selected grouping will show up below the shaded area as DJF. (If you want to do more than one three-month period, click on each selection and you'll see the selected months listed below.)

5. Select the month or months during which the forecast was issued. Click in the boxes for each year you want. We'll select N (November) for each available year (1994-2004). The three-month seasonal forecasts are issued up to a year in advance and updated every month.

6. You now have the opportunity to select the type of statistical test you'd like to apply to the forecasts. Select the "False Alarm Rate" option. Brief descriptions of the other options (e.g., Probability of Detection, Brier Score) are included at the end of this article.

7. Once you have made your choices, hit "Submit" to launch the program. When the results appear, read the box at the top under "You Chose" to make sure the computer accurately recognized all your choices. (For example, if you did not click on your season selection, the default "All Seasons" will appear.)

8. The results will include national maps color-coded by division and a color bar below that explains the legend (Figure 1b). For these comparisons, the 344 NOAA climate divisions have been grouped into 102 larger divisions. Colorado, Wyoming, and Utah have eleven total divisions under this system, with some divisions that overlap other states. You can see the actual value for a climate division by holding your cursor over it.

Frequency of Forecast Results

Regardless of which category you select, you will first see a map indicating the Frequency of Forecast Results. This shows how often a forecast was actually made about the season of interest, by climate division. A value of 0.322 means a forecast covered some or all of the division about 32.2 percent of the time since 1994, when forecasts were available more than one month ahead. Scroll down to see the results you were seeking.

False Alarm Rate

This comparison considers how often the projected forecast turns out to be wrong, using the category that was predicted to be most likely. To convert the resulting climate division score into a percentage, just multiply the value by 100. So if forecasters called for wet conditions three times, but they only occurred twice, the false alarm rate would be 0.333, or 33 percent. Note that, in this case, low scores are good. To consider how often an issued forecast was accurate, just subtract the False Alarm Rate score from 1 (or the percentage from 100). In this theoretical example, the forecast was accurate 67 percent of the time. In the actual example tested here, scores ranged from 0.5 to 0.857 for "wet" conditions and from 0 to 0.75 for "dry" conditions (Figure 1b). Water managers have indicated they find the False Alarm Rate particularly relevant.
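To make the arithmetic behind these two maps concrete, here is a minimal Python sketch that reproduces the article's worked example. The function names and sample numbers are illustrative only; they are not part of the FET site or any published code.

    def frequency_of_forecast_results(n_seasons_with_forecast, n_seasons_total):
        # Fraction of seasons since 1994 for which a forecast (wet, dry,
        # or near-average) covered some or all of the climate division;
        # a value of 0.322 corresponds to about 32.2 percent of the time.
        return n_seasons_with_forecast / n_seasons_total

    def false_alarm_rate(n_forecasts, n_verified):
        # Fraction of forecasts for a category (e.g. "wet") that did not
        # verify; low scores are good.
        return (n_forecasts - n_verified) / n_forecasts

    # The example from the article: wet conditions were forecast three
    # times but occurred only twice.
    far = false_alarm_rate(n_forecasts=3, n_verified=2)
    print(f"False Alarm Rate: {far:.3f} ({far:.0%})")  # 0.333 (33%)
    print(f"Accuracy: {1 - far:.0%}")                  # 67%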
Show Data Behind the Map

If you want to see the forecasts that were considered for the evaluation, click on a climate division of interest and then click on the "Show the Data Behind the Map" option. First you'll see a description of how to interpret bubble plots, including a sample bubble plot. Then you'll see the data used for the climate division of interest for the season(s) and years indicated.

Besides the False Alarm Rate, there are a number of other options available for evaluating forecasts. To try other techniques, return to the Climate Forecast Performance page. (If you can't find it, return to the FET home page and select "Forecast Performance.")

Modified Heidke Score

This selection is intended for use by National Weather Service (NWS) forecasters, who have historically used this approach to evaluate forecasts. It is included on the FET site because NWS forecasters receive instruction in the use of this tool as part of their ongoing climate training courses, as explained by NWS Climate Services Chief Robert Livezey. He feels that for those not familiar with the Heidke system, the other methods provided (e.g., Frequency of Forecasts, Probability of Detection, False Alarm Rate, Brier Score,

(Continued on p. 4)
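The excerpt breaks off before describing the Heidke approach itself. For orientation only: the standard (unmodified) Heidke skill score compares the number of correct categorical forecasts with the number expected by chance,

    HSS = (H - E) / (T - E)

where H is the number of correct forecasts, T is the total number of forecasts, and E is the number expected to be correct by chance (T/3 when the three tercile categories are equally likely). A score of 1 is a perfect forecast, 0 is no better than chance, and negative values are worse than chance. The "modified" version used on the FET site differs in its details, which this excerpt does not spell out.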