FSL in Review 2000 - 2001

Demonstration Division

Margot H. Ackley, Chief
(Supervisory Physical Scientist)

(303-497-6791)

Web Homepage: http://www-dd.fsl.noaa.gov/

Norman L. Abshire, Electrical Engineer, 303-497-6179
Leon A. Benjamin, Programmer Analyst, 303-497-6031
B. Carol Bliss, Program Support Specialist, 303-497-5866
Michael M. Bowden, Engineering Technician, 303-497-3260
Jeanna M. Brown, Data Technician, 303-497-5627
James L. Budler, Engineering Technician, 303-497-7258
James D. Bussard, Information Systems Specialist, 303-497-6581
Michael G. Foy, Programmer Analyst, 303-497-6832
David J. Glaze, Electrical Engineer, 303-497-6801
Seth I. Gutman, Physical Scientist, 303-497-7031
Kirk L. Holub, Systems Analyst, 303-497-6642
Bobby R. Kelley, Computer Specialist, 303-497-5635
Kathleen M. McKillen, Secretary, 303-497-6200
Scott T. Nahman, Logistics Engineer, 303-497-3095
Michael J. Pando, Information Systems Specialist, 303-497-6220
Brian R. Phillips, Senior Engineering Technician, 303-497-6990
Alan E. Pihlak, Computer Specialist, 303-497-6022
Michael K. Shanahan, Electrical Engineer, 303-497-6547
Scott W. Stierle, Systems Analyst, 303-497-6334
Douglas W. van de Kamp, Meteorologist, 303-497-6309
David W. Wheeler, Electronic Technician, 303-497-6553

(The above roster, current as of publication, includes government,
cooperative agreement, and commercial affiliate staff.)

Address
NOAA Forecast Systems Laboratory, Mail Code: FS3
David Skaggs Research Center
325 Broadway
Boulder, Colorado 80305-3328


Objectives

The Demonstration Division evaluates promising new atmospheric observing technologies developed by NOAA and other federal agencies and organizations and determines their value in the operational domain. Activities range from the demonstration of scientific and engineering innovations to the management of new systems and technologies.

Currently the division is engaged in five major projects:

  • Operation, maintenance, and improvement of the NOAA Profiler Network (NPN), including three systems in Alaska.
  • Assessment of the Radio Acoustic Sounding System (RASS) for temperature profiling.
  • Collection and distribution of wind and temperature data from Boundary Layer Profilers (BLPs) operated by other organizations.
  • Development and deployment of a surface-based integrated precipitable water vapor (IPWV) monitoring system using the Global Positioning System (GPS), known as ground-based GPS-Met.
  • Planning and support activities for a national Mesoscale Observing System initiative that will include profilers and GPS-Met systems.

Organizationally, the division comprises five branches; in practice, the branches work as a fully integrated team in support of the division's overall objectives.

  • Network Operations Branch – Monitors systems' health and data quality, and coordinates all field repair and maintenance activities.

  • Engineering and Field Support Branch – Provides high-level field repair, coordinates all network logistical support, and designs and deploys engineering system upgrades.

  • Software Development and Web Services Branch – Provides software support of existing systems, develops new software and database systems as needed, provides Web support of the division's extensive Web activities, and designs software to support a national deployment of profilers.

  • GPS-Met Observing Systems Branch – Supports development and deployment of the GPS-IPWV Demonstration Network, and provides software development and scientific support.

  • Facilities Management and Systems Administration Branch – Manages all computers, data communications, network, and computer facilities used by the staff and projects of the division.

Network Operations Branch
Douglas W. van de Kamp, Chief

Objectives

The Network Operations Branch is responsible for all aspects of NOAA Profiler Network (NPN) operations and monitoring, including the coordination of logistics associated with operating a network of 35 radars and surface instruments (Figure 24). In addition to the NPN sites, which include GPS integrated precipitable water vapor (GPS-IPWV) capabilities, another 55 NOAA and other-agency sites are also monitored for timely GPS positions and surface observations to produce real-time IPWV measurements. This branch relies heavily on the other branches within the division to maintain and improve NPN real-time data availability to the National Weather Service (NWS) and other worldwide users.


Figure 24. The NOAA Profiler Network of 35 radars and surface instruments. The Alaska sites are shown at the bottom left.

Of the five people in the branch, three are involved with the day-to-day operations and monitoring tasks related to the hardware and communications aspects of the network. They also coordinate all field repair and maintenance activities, interact with field personnel, and log all significant faults. Another person monitors the meteorological data quality from the NPN, and yet another handles all financial aspects related to the continued operation of the NPN, including tracking land leases, communications, and commercial power bills for more than 30 profiler sites. All five continue to work with others in the division to support the operations and maintenance of the NPN, resulting in consistently high data availability statistics for the past six years. The high quality upper-air and surface observations are distributed in real time to a wide range of users, such as NWS forecasters and numerical weather prediction modelers.

Accomplishments

The availability of hourly winds to the NWS remained high for Fiscal Year 2000. A summary of the overall performance of the network for the past 10 years is presented in Figure 25. A decrease in the availability of hourly winds can clearly be seen each year during the spring and summer months, compared with slightly higher availability during the fall and winter months. This pattern can be attributed to increased lightning activity and severe weather during the convective season, which cause more commercial power and communications problems, profiler hardware damage from nearby lightning strikes, and site air conditioner failures during the summer. Based on this trend analysis, the Engineering and Field Support Branch is adding lightning suppression and communications equipment protection to the profiler sites.


Figure 25. NOAA Profiler Network data availability from January 1991 – January 2001.

A very important component of the Network Operations Branch is the logging of all significant faults that cause an outage of profiler data. The duration of each outage is broken down into several states: how long it took to initially identify the failure, to diagnose and evaluate the problem, to wait for repair parts to be sent and received, and to restore commercial power or communications, as well as when and how the fault was ultimately repaired. Analysis of these states reveals important information about the operation of the network, as shown in the examples below.

Each profiler site's mean time between failures (MTBF) over the most recent, nearly five-year period is shown in Figure 26. Also shown are the maximum time (in days) between failures (MaxTBF) and the total number (count) of failures for each site, including all data outages (i.e., power and communications, not just profiler hardware) lasting longer than 24 hours. The "better" sites appear toward the right-hand side of the figure, with many operating longer than one year without an outage.


Figure 26. Mean time between failure for NOAA Profiler Network sites with outages over 24 hours.
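
The statistics in Figure 26 are simple summaries of this fault log. As a purely hypothetical illustration (not the PCC's actual software), the sketch below shows one way the mean and maximum spacing between qualifying outages, and the outage count, could be computed for a single site; the record layout and dates are invented.

    # Hypothetical sketch: summarizing one site's fault log into MTBF-style statistics.
    # The outage list and analysis period below are invented for illustration.
    from datetime import datetime
    from statistics import mean

    def tbf_stats(outage_starts, period_start, period_end):
        """Mean and maximum gap (days) between successive outages, plus the outage count.
        The analysis period bounds are used as bookends for the first and last gaps."""
        events = sorted([period_start] + list(outage_starts) + [period_end])
        gaps = [(b - a).total_seconds() / 86400.0 for a, b in zip(events, events[1:])]
        return mean(gaps), max(gaps), len(outage_starts)

    # Three qualifying outages (longer than 24 hours) at a fictitious site over ~5 years.
    outages = [datetime(1996, 7, 2), datetime(1998, 3, 15), datetime(2000, 6, 9)]
    mtbf, maxtbf, count = tbf_stats(outages, datetime(1996, 1, 1), datetime(2000, 10, 1))
    print(f"MTBF {mtbf:.0f} days, MaxTBF {maxtbf:.0f} days, failures {count}")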

The NPN, currently noncommissioned by the NWS, is routinely monitored by personnel in the Profiler Control Center (PCC) only during normal working hours, 8:00 AM – 5:00 PM on weekdays (45 of the 168 hours in a week, or about 27%). The remainder of the time, the profilers, dedicated communication lines, and Hub computer system operate unmonitored and unattended. Figure 27 shows the distribution of downtime, normalized over the past four years. Generally, more than 50% of downtime was due to waiting for parts to arrive or for a repair person to reach the profiler site. Thus, it was determined that increased staffing of the PCC would have little impact on data availability; however, additional NWS maintenance staff would improve response time.


Figure 27. Distribution of NOAA Profiler Network downtime, normalized over four years from 1997 through 2000.

Figure 28 shows the total number of hours of profiler data lost by fault type (such as component failures, scheduled downtime for maintenance, and power and air conditioner failures) from 1 January 1996 – 1 October 2000. A further breakdown, by fault disposition, is shown in Figure 29. This information is monitored for trends that may be causing outages. The Fault Disposition Category indicates the corrective action that was required to restore normal profiler operations for each data outage in the past two years, and the number of hours of missing data attributed to each fault category.

The largest category by far is "Line Replaceable Units (LRUs) Replaced," meaning that a piece of hardware, an LRU, had to be replaced to restore operations, including the associated waiting time for a technician to respond to the site. The next largest category is "Scheduled Down Time (SDT) Completed," which typically means that preventive system maintenance or antenna measurement/repair activities were completed.

Note the significant number of lost data hours attributed to the local (main 200-amp) breaker being tripped to the open position, usually by lightning-related power surges, and needing only to be reset. From this analysis, the Engineering and Field Support Branch designed and installed the capability to remotely reset the main breaker at each site. The Network Operations group routinely uses this method to restore profiler operations, and also "power cycles" a site, which sometimes clears other problems.

In the next largest category, a "Profiler Maintenance Terminal (PMT) Restart" is performed to restore operations. These outages are typically corrected by logging into the profiler's computer from the Profiler Control Center in Boulder and reentering critical system parameters that have been corrupted, or by simply restarting the profiler's data acquisition cycle. Finally, each profiler must have a current Search and Rescue Satellite Aided Tracking (SARSAT) inhibit schedule in order for the transmitter to radiate and for the profiler to measure the winds. This inhibit schedule (the smallest category in Figure 29) expires very rarely, usually because of an extended primary communications link outage, which causes the profiler's transmitter to shut down as a fail-safe to prevent possible interference with the SARSAT system.


Figure 28. NOAA Profiler Network data lost by fault type over fiscal years 1999 and 2000.


Figure 29. NOAA Profiler Network data lost by fault disposition over fiscal years 1999 and 2000.

The branch has made significant improvements in its ability to remotely monitor activity within the NPN via the World Wide Web. Activities that are now routinely monitored on the Web include information on profiler real-time status, data flow to the NWS Gateway, and ingest of profiler data into the Rapid Update Cycle (RUC) model at the National Centers for Environmental Prediction (NCEP).

A significant improvement was made to the quality control algorithm, primarily affecting the three Alaska profiler sites. The algorithm allows the removal of very specific radial velocities that come from multiple-trip ground clutter returns. Overall data quality has been greatly improved.

The Bird Contamination Check algorithm, developed nearly five years ago within the division, was modified. The original algorithm analyzed only the north and east beams to detect the broader spectral widths caused by migrating birds. Now the spectral width from the vertical beam has also been incorporated into the algorithm. Although the spectral width bird signature is not as broad in the vertical beam, it is still apparent and improves the algorithm's detection ability.
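
As a rough illustration of this kind of test, the sketch below flags a range gate when unusually broad spectral widths appear on at least two beams, using a lower threshold for the weaker vertical-beam signature. The thresholds, field names, and two-of-three voting rule are assumptions for illustration only; they are not the operational NPN algorithm.

    # Hypothetical illustration of a spectral-width bird-contamination check.
    # Threshold values and the voting rule are assumed, not operational settings.
    from dataclasses import dataclass

    @dataclass
    class GateWidths:
        north: float     # spectral width on the north beam (m/s)
        east: float      # spectral width on the east beam (m/s)
        vertical: float  # spectral width on the vertical beam (m/s)

    def bird_contaminated(g: GateWidths,
                          oblique_threshold: float = 4.0,
                          vertical_threshold: float = 2.5) -> bool:
        """Flag a gate when at least two beams show unusually broad spectral widths."""
        votes = 0
        if g.north > oblique_threshold:
            votes += 1
        if g.east > oblique_threshold:
            votes += 1
        if g.vertical > vertical_threshold:  # weaker signature, so a lower threshold
            votes += 1
        return votes >= 2

    print(bird_contaminated(GateWidths(north=5.1, east=4.8, vertical=2.9)))  # True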

Staff continue to evaluate the Radio Acoustic Sounding System (RASS) for temperature profiling. When the Neodesha, Kansas, profiler was recently relocated, the planned addition of RASS was a primary consideration in selecting the new site.

Projections

The division expects to receive its first DEC (now Compaq) Alpha this year to replace the aging MicroVAXes at the profiler sites. The new system will be tested and evaluated, and should lend itself to acquiring raw Doppler spectra, which will lead to improved data quality through reduced ground clutter and internal interference, and to bird rejection in addition to the existing bird detection.

An improved wind and RASS data quality control algorithm has been investigated, and is being designed primarily to detect and correctly flag data contaminated by internal interference.

Additional lower tropospheric (boundary layer) profiler data will be acquired from targets of opportunity around the country. Staff will also continue adherence to sound operating principles that have produced high data availability rates.


Engineering and Field Support Branch
Michael K. Shanahan, Chief

Objectives

The Engineering and Field Support Branch provides high-level field repair, coordinates all network logistical support, and designs and deploys engineering system upgrades. These activities lead to improved operation and maintenance of the NOAA Profiler Network (NPN) and help to increase data availability. The 35-site network is monitored to assure data quality and reliability. Working with others in the Profiler Control Center (PCC), branch staff identify problems using remote diagnostics to analyze the situation and pursue corrective action.

Through an agreement with the National Weather Service (NWS), NWS electronics technicians perform most of the preventive and remedial maintenance. At the PCC in Boulder, staff use the profilers' remote diagnostic capabilities to detect failed components, order Line Replaceable Units (LRUs), and coordinate with the NWS electronics technicians to carry out field repairs. A team of specialized engineer/technicians, called rangers, who are experienced in the design and operation of the profiler systems, handles the more complex problems. As division employees based in Boulder, the rangers can be mobilized to repair the profilers on short notice.

Accomplishments

NOAA Profiler Network

Alaska Profiler Network – The branch was involved with the transition of the Alaska profilers to the NWS. A Memorandum of Agreement was signed in 2000 by NWS headquarters, the NWS Alaska region, and the Office of Oceanic and Atmospheric Research/FSL for the implementation, support, maintenance, and operation of the profilers. NWS headquarters is providing coordination and support for the Alaska Region, and intends to use the three 449-MHz Alaska profilers as operational systems. FSL will continue to operate the profilers as part of the NPN, and the Alaska Region will assume responsibility for onsite maintenance, logistics, and funding of these systems.

The Alaska 449-MHz Profiler Network became operational in October 1999, and has recorded data availability to the NWS of over 90% since June 2000, and over 98% since August 2000 (Figure 30). When the Alaska profilers became operational, the wind profiles in the first five to seven range gates were corrupted due to receiver saturation. A delay of the transmitted pulse through a band pass filter in the transmitter caused a timing problem, whereby the receiver turned on while the RF pulse was still being transmitted. To solve this problem, the signal processor was reprogrammed to send the transmitted pulse sooner to compensate for the delay in the band pass filter. To ensure the integrity of the range gates, a calibration was performed before and after corrective action was taken.


Figure 30. Alaska 449-MHz network data availability to NWS from October 1999 – January 2001.

Accumulation of snow on the antennas in Alaska has caused some loss of data. The Talkeetna and Central profilers have experienced problems when large amounts of snow build up on the antenna and the temperature rises above freezing. The snow melts and then refreezes, distorting the antennas' electromagnetic patterns with high side lobes and degrading the data. Interestingly, when the snow pack is dry and consistent, it has no effect on the antenna pattern. The Alaska electronics technicians have had to remove the snow from the Talkeetna antenna twice this year and once from the Central antenna.

Minor ground clutter problems have also occurred with the Alaska profilers. When the antennas were installed, they were raised higher than usual off the ground for easier maintenance, placing them closer to the top of the security fence and apparently making them more susceptible to ground clutter. The security fence serves a dual purpose, also acting as a clutter fence to suppress ground clutter. Experiments with a higher fence will be conducted in the spring to help diminish the clutter problem.

Profiler Site Relocation – The Neodesha, Kansas, profiler was relocated because the site's landowner changed and that location was unsuitable for the Radio Acoustic Sounding System (RASS) operations. A new site was found about six miles from the original site and relocation was completed last September. Although the move only took one week, data were not available for a month because a component failed and the site was in checkout mode for quality assurance.

Equipment Upgrades – In response to an NWS Service Assessment report following the 3 May 1999 tornado outbreak in Oklahoma, which stated that "The NWS should make a decision on how to support the existing profiler network so that the current data suite becomes a reliable, operational data source," all NPN profiler sites have been outfitted with a remote-control main breaker (Figure 31). Profiler main breaker trips are the second largest contributor to site downtime. In most cases, main breaker trips are caused by AC power fluctuations or power surges during storms. Although the site usually does not sustain any damage from these occurrences, data remain unavailable until a site visit is made to reset the main breaker. To address this problem, the main breakers at all profiler sites were replaced with ones that can be controlled remotely using a touch-tone phone. This capability reduces expenses and increases data availability by eliminating the need for a technician to visit the site simply to reset the main circuit breaker.


Figure 31. A remote control main breaker installed at all NPN sites.

Projections

Alaska Profiler Network – Experiments will be performed to find ways to correct the ground clutter problems at the Alaska profilers. Figure 32 is a photo of the profiler site at Glennallen, Alaska.


Figure 32. Profiler site at Glennallen, Alaska.

Other NPN Sites – The branch will continue providing prompt field repairs, appropriate coordination of network logistical support, and economical equipment upgrades to provide the meteorological community with quality NPN data. A continuing effort involves outfitting sites with wind and temperature profile capability, along with water vapor and surface meteorological measurements. Currently, 9 out of 35 sites are configured with RASS, and plans are to install these units at the Neodesha, Kansas, and Jayton, Texas, sites within the next year.

Two types of surface meteorological instruments, the Profiler Surface Observing System (PSOS) and the GPS Surface Observing System (GSOS), are now located at each profiler site. Plans are to replace these two units with one digital system, PSOS II, at all profiler sites.

The grounding and lightning protection at all sites will be evaluated and upgraded to safeguard against lightning strikes. Existing ground networks will be tested and refurbished if necessary. Communications equipment, profiler components, and computers will be protected to isolate them from the damaging effects of lightning strikes.


Software Development and Web Services Branch
Alan E. Pihlak, Chief

Objectives

The responsibilities of the Software Development and Web Services Branch are to provide software support of existing systems, develop new software and database systems as needed, provide Web support of the division's extensive Web activities, and design software to support a national deployment of profilers.

A recently implemented strategy concerns the development of new software to support future operations of the NOAA Profiler Network (NPN) and its infrastructure. A process reengineering effort during Fiscal Year 2000 indicated that a more effective way to view the NPN of the future is as a set of platforms offering convenient power and communications at which to install meteorological measuring equipment for both operational and research purposes. This strategy drives software development efforts in three areas: migrating profiler operations to the National Weather Service (NWS), continuing to support the record-setting reliability and availability of the NPN data, and becoming the prime focal point on the Web for profiler data.

Accomplishments

The NOAA Profiler program was designated a NOAA "mission-critical system," which required that formal Y2K test plans be produced and executed by a transition team composed of division members representing FSL and other NOAA laboratories. These Y2K test plans were successful, and the NPN experienced no interruption in data delivery to the NWS because of date-related problems. Tests were also performed on the systems for Y2K+1 and the 2001 Leap Year, and both passed with almost no effect.

The branch was directly responsible for enabling NPN data availability and reliability to reach new heights during the fiscal year. In collaboration with the NWS, a monitoring system was implemented so that NWS electronics technicians could begin diagnosing profiler problems with limited support from the Profiler Control Center (PCC). In operation at the NWS Telecommunications Gateway, this system observes the delivery of wind profiler data from the NPN hub. If data are interrupted at a particular profiler, the system delivers a message via AWIPS to the NWS forecast office responsible for maintaining that site. The electronics technician may then diagnose the problem and initiate any action deemed necessary. A simplified sketch of this kind of outage check follows.
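
In the sketch, each site's most recent data time is compared against a maximum age, and any alert is addressed to the office responsible for that site. The site codes, office mapping, age limit, and message text are all invented; they stand in for the actual Gateway monitoring system and its AWIPS message path.

    # Hypothetical outage monitor: flag sites whose latest data are older than a limit
    # and address the alert to the responsible forecast office. All values are invented.
    from datetime import datetime, timedelta

    RESPONSIBLE_OFFICE = {"SITE_A": "WFO_1", "SITE_B": "WFO_2"}  # fictitious mapping

    def check_sites(last_received, now, max_age=timedelta(hours=2)):
        """Return alert strings for sites whose most recent data exceed max_age."""
        alerts = []
        for site, last in last_received.items():
            if now - last > max_age:
                office = RESPONSIBLE_OFFICE.get(site, "UNKNOWN")
                alerts.append(f"{site}: no data since {last:%Y-%m-%d %H:%M} UTC -> notify {office}")
        return alerts

    now = datetime(2000, 6, 1, 12, 0)
    status = {"SITE_A": datetime(2000, 6, 1, 9, 0), "SITE_B": now}
    print(check_sites(status, now))  # only SITE_A is flagged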

A "process reengineering" task was completed that involved documentation of the operations of the PCC, NPN hub, and NPN instrumentation. In this vital task, the computer captures the processes and systems so that their requirements can be extracted and documented, and then used to engineer new systems needed to accomplish goals. Using these requirements, branch personnel began to develop and test a "software toolkit" consisting of low-level software objects that will become the foundation for accelerating efforts in Fiscal Year 2001.

During 2000, staff serviced an average of 18,000 Web hits per month. Thirty percent of these Webpage views were from nongovernmental sites with network domains of ".net" and ".com." For the first time, raw profiler data were made available via the Web. Significant amounts of internal operational documentation were converted to the Web and placed on the division's Intranet site. Informational materials (e.g., the "Facts Fax Bulletin" and the "Chiefs Report") formerly distributed via fax or e-mail were also converted to the Web.

Projections

It is envisioned that an NWS national profiler network will have a network architecture different from the current NPN. In modernizing the NPN, the division will take various steps toward implementation of the new architecture, which will facilitate transition of the NPN to a national network operated by the NWS. The following paragraphs describe three phases that will take place soon.

The first phase of transition to NWS operations consists of altering the delivery mechanism for data acquired at NPN sites. The data are now delivered in five composite bulletins, each containing information from up to eight different sites. When one of these sites is unable to deliver data via landline, the entire bulletin is delayed until it can be transmitted via the backup communication system that uses the GOES satellites. This delay sometimes affects regular delivery of data to the numerical models operating at FSL and the National Centers for Environmental Prediction (NCEP). Beginning this spring, a single message will be sent from each type of measuring instrument, per site, per time period, to alleviate the delays caused by the composite bulletin format. Part of this phase also includes collaboration with NWS regional and headquarters personnel, as well as with other FSL divisions, to produce decoders so that data in the new formats can be accessible on AWIPS.

A prototype of the Wind Profiler Processing Platform (WPPP), which will break the time-based dependency between the Lockheed-Martin manufactured wind profiler and the GOES Data Collection Platform (DCP), will be in place this year. This is important because the manufacturer of the particular GOES DCP for which the profiler was engineered is no longer in business. Additional functions of the WPPP will include moving quality control processing to the actual site, becoming a central collection platform for other collocated instrumentation such as GPS water vapor and RASS, and collecting data for the diagnosis of faults occurring in the collocated instrumentation. The WPPP will produce data ready for display on AWIPS and in a format ready for routing by the NWS and Global Telecommunications System.

The second phase of the transition is installation of a WPPP at the three Alaska network sites, the addition of the WPPP to the database of Line Replaceable Units (LRUs) for NPN wind profilers, and training for NWS personnel, all to be completed in mid-2002.

The third and final phase involves installation of a WPPP at each remaining profiler site, again with training and documentation for NWS personnel. The upgrade of the network to an operating frequency of 449 MHz will also be completed.

The branch will also be working on a joint project between the NWS and the University of Northern Florida to demonstrate the feasibility of remote wireless communications for meteorological measuring instruments. This project will also exemplify new technology including independent "smart" sensors and self-configuring networks.

In 2001, the staff will continue to look for opportunities to use the Web to make operations more efficient and cost effective. The division's Website will be improved in both appearance and operation. It will also be hosted on more modernized hardware, and will use new Java and other current software technologies.


GPS-Met Observing Systems Branch
Seth I. Gutman, Chief

Objectives

The GPS-Met Observing Systems Branch develops and assesses techniques to measure atmospheric water vapor using ground-based GPS receivers. The branch was formed in 1994 in response to the need for improved moisture observations to support weather forecasting, climate monitoring, and research. The primary goals are to define and demonstrate the major aspects of an operational GPS integrated precipitable water vapor (IPWV) monitoring system, facilitate assessments of the impact of these data on weather forecasts, assist in the transition of these techniques to operational use, and encourage the use of GPS meteorology for atmospheric research and other applications. The work is carried out within the division at low cost and risk by utilizing the resources and infrastructure established to operate and maintain the NOAA Profiler Network (NPN).

To accomplish these goals, the branch collaborates with other NOAA organizations, government agencies, and universities to develop a 200-station demonstration network of GPS-Met observing systems by 2005. These collaborations allow the division to build, operate, and maintain a larger network of observing systems with lower cost, risk, and implementation time than would otherwise be possible using laboratory and division resources alone. The cornerstone of this effort is a "dual-use paradigm" within the federal government that allows leveraging of the substantial past and current investments in GPS made by other agencies, such as the U.S. Coast Guard (USCG) and the Federal Highway Administration (FHWA), for purposes such as high-accuracy surveying and improved transportation safety. From a technical standpoint, this is possible because of a fortuitous synergy between the use of GPS for precise positioning and navigation and its use for meteorological remote sensing. The branch is also taking advantage of the substantial effort that NOAA's National Geodetic Survey (NGS) has made to establish a growing network of Continuously Operating Reference Stations (CORS) in the Western Hemisphere. The CORS program collects, archives, and disseminates to the general public the GPS observations made by numerous organizations, including FSL. The branch contributes GPS and surface meteorological data acquired at 56 NPN and other NOAA sites. In fact, the fourth most requested dataset in the CORS network currently comes from the GPS-Met system on the roof of the David Skaggs Research Center in Boulder, Colorado (Figure 33).


Figure 33. The GPS-Met system on the roof of the David Skaggs Research Center.

Accomplishments

Real-Time GPS Meteorology

For the past three years, the branch has been collaborating with the Scripps Orbit and Permanent Array Center (SOPAC) at the Scripps Institution of Oceanography, and the School of Ocean and Earth Science and Technology (SOEST) at the University of Hawaii at Manoa, to develop real-time orbit and data processing techniques for NOAA. The effort has resulted in the first-ever practical implementation of real-time GPS meteorology (GPS-Met). By the end of Fiscal Year 2000, GPS-IPWV observations with millimeter-level accuracy were being made at 56 sites every 30 minutes with about 20-minute latency. The realization of real-time GPS-IPWV for objective weather forecasting (near real time for subjective applications), with no significant loss of accuracy compared with results achieved using post-processing techniques, is the last major milestone in the 1994 GPS-Met Project Plan to be achieved.

To accomplish real-time GPS-Met, three things had to be achieved: accurate satellite orbits had to be available in real time, GPS data had to be acquired from an expanding network of sites in the shortest possible time, and those data had to be processed just as quickly, as described below.

Real-time Satellite Orbits – The calculation of integrated precipitable water vapor from GPS signal delays requires knowledge of the positions of the GPS satellites in Earth orbit with an error of less than 25 cm. In contrast, the accuracy of the orbits commonly available to civilian users of GPS is about 100 cm. Until this year, improved satellite orbits were calculated once each day by seven orbit analysis centers belonging to the International GPS Service (http://igscb.jpl.nasa.gov/). These centers used data acquired over a 24-hour period at about 150 stations within the IGS global tracking network. They processed the data and produced improved orbits for high-accuracy GPS positioning within about 8 hours. As a consequence, users had to wait about 32 hours for sufficiently accurate orbits to be available, an unacceptable delay for real-time applications that include (but are not limited to) weather forecasting. Recognizing that more frequent data would be required for these applications, several of the orbit centers took steps to acquire data from as many of the IGS tracking stations as feasible in the shortest possible time. This resulted in the availability of hourly data from a sufficiently large subset of the network to generate improved satellite orbits every hour using a "sliding window" technique developed at SOPAC. Short-term (two-hour) orbit predictions, also implemented at SOPAC, were shown to be accurate enough for use in real-time GPS-Met. This breakthrough left data acquisition and data processing as the only impediments to real-time GPS-Met.

Real-time Data Acquisition – When the GPS-Met project began, guidance from the Forecast Research Division was that the GPS-Met system should be able to provide timely and accurate data to the next generation of numerical weather prediction (NWP) models, which were expected to have a one-hour data assimilation and forecast cycle by 1998. The data acquisition cycle at the NPN sites was therefore designed and implemented at 30-minute intervals to allow for further improvements in NWP capabilities. Leveraging the GPS assets of other federal agencies, such as the U.S. Coast Guard, permitted NOAA in 1995 to develop the GPS Water Vapor Demonstration Network quickly with low cost and risk. Data from these sites were acquired and distributed by NGS only once each day, so in 1999 NGS began expanding its capabilities to acquire and distribute these data every hour to keep up with user demand for more timely high-accuracy GPS data, making substantial upgrades to its communications and data processing capabilities. Working with the branch in 2000, NGS developed and implemented methods to send GPS and surface meteorological observations from CORS sites to FSL every half-hour, bringing these sites into alignment with the data acquisition capabilities required for a next-generation upper-air observing system. Staff worked with the Facilities Management and Systems Administration Branch to develop and implement advanced server and Internet data transfer capabilities to keep pace with the growing supply of observations needed to complete the 200-station demonstration network by 2005.

Real-time Data Processing – To meet the demands of an hourly NWP data assimilation cycle, observations must be available to the models with less than 20-minute latency. The challenge of acquiring timely data from an expanding network of GPS-Met sites was discussed above, but data processing is an entirely different matter. The task here is to combine these 30-minute observations with improved real-time orbits and other parameters to calculate the signal delays at each site, using common constraints provided by four "long-baseline" fiducial sites. Once this has been accomplished, quality controls have to be applied, the derived wet signal delays have to be mapped into IPWV, and the observations and retrievals have to be made available to FSL and other groups within and outside NOAA. Figure 34 is a generalized diagram of this process. Until this year, "static" data processing was carried out once each day using expensive Unix workstations, and it took about 6 hours to calculate water vapor from 56 sites. The branch developed a scalable distributed processing system using fast and inexpensive personal computer workstations running the Linux operating system. Branch staff also implemented a "sliding window" data processing technique, based on the one developed at Scripps in collaboration with the University of Hawaii, to replace the old technique of generating static 24-hour solutions. This increased data processing speed by a factor of two, while reducing hardware costs by almost a factor of four. A high-level diagram of this system is shown in Figure 35, and a simplified sketch of the processing pattern follows it.


Figure 34. Diagram of data processing for the GPS-Met sites.


Figure 35. Diagram of a scalable distributed data processing system.
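
The sketch below illustrates the general pattern described above: each new 30-minute batch of observations is appended to a rolling window for every site, and the per-site solutions are then farmed out to a pool of worker processes. The window length, the placeholder solver, the pool size, and the site codes are assumptions used to show the pattern, not the branch's actual processing system.

    # Hypothetical sketch of sliding-window, distributed per-site processing.
    # solve_site() is a placeholder; a real solver would estimate zenith delays.
    from collections import deque
    from multiprocessing import Pool

    WINDOW_EPOCHS = 48  # assume the most recent 24 hours of 30-minute epochs are kept

    def solve_site(args):
        site, window = args
        return site, sum(window) / len(window)  # placeholder "solution" for the window

    class SlidingProcessor:
        def __init__(self, sites):
            self.windows = {s: deque(maxlen=WINDOW_EPOCHS) for s in sites}

        def new_epoch(self, observations):
            """Append one 30-minute batch {site: value} and reprocess all sites in parallel."""
            for site, value in observations.items():
                self.windows[site].append(value)
            jobs = [(site, list(window)) for site, window in self.windows.items()]
            with Pool(processes=4) as pool:
                return dict(pool.map(solve_site, jobs))

    if __name__ == "__main__":
        proc = SlidingProcessor(["site_x", "site_y"])            # fictitious site codes
        print(proc.new_epoch({"site_x": 2.41, "site_y": 2.37}))  # placeholder values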

Confirmation of GPS-IPWV Accuracy – Figure 36 shows a comparison between near real-time GPS-IPWV calculated every 30 minutes and static (24-hour) solutions during qualification testing of the new real-time data processing technique in April and May 2000. The 2,050 comparisons show a mean difference of 0.20 mm and a standard deviation of 1.02 mm of delay, which translates to an average difference of about 0.03 ± 0.15 mm of IPWV.


Figure 36. Comparison between near real-time GPS-IPWV calculated every 30 minutes and static 24-hour solutions.
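
For context, the translation from delay to water vapor above follows the conversion commonly used in GPS meteorology: IPWV is the zenith wet delay multiplied by a dimensionless factor of roughly 0.15 that depends on the weighted mean vapor temperature. The constants and assumed mean temperature in the sketch below are illustrative values, not those used by the branch.

    # Illustrative zenith-wet-delay (ZWD) to IPWV conversion.
    # Constants and the assumed mean vapor temperature are not taken from this report.
    RHO_W = 1000.0    # density of liquid water, kg m^-3
    R_V = 461.5       # specific gas constant for water vapor, J kg^-1 K^-1
    K2_PRIME = 0.221  # refractivity constant, K Pa^-1
    K3 = 3.739e3      # refractivity constant, K^2 Pa^-1

    def pi_factor(tm: float) -> float:
        """Dimensionless factor mapping ZWD to IPWV for a mean vapor temperature tm (K)."""
        return 1.0e6 / (RHO_W * R_V * (K3 / tm + K2_PRIME))

    pi = pi_factor(260.0)               # about 0.15 for a cool mean vapor temperature
    print(f"{pi:.3f}")                  # conversion factor
    print(f"{pi * 0.20:.2f} mm IPWV")   # 0.20 mm mean delay difference    -> about 0.03 mm
    print(f"{pi * 1.02:.2f} mm IPWV")   # 1.02 mm delay standard deviation -> about 0.15 mm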

In September 2000, the branch participated in its third water vapor intensive observing period (WVIOP) experiment at the Department of Energy's Atmospheric Radiation Measurement (ARM) facility near Lamont, Oklahoma. The major difference between this WVIOP and previous ones was the availability of near real-time GPS-IPWV data to support on-the-fly comparisons with other instruments. Preliminary results indicate that GPS-IPWV measurement error is closer to 3.5% than the previous estimate of 5% that was determined in 1997. Figure 37 is a comparison of near real-time GPS data and rawinsonde measurements during WVIOP 2000.


Figure 37. Comparison of near real-time GPS data and rawinsonde measurements during WVIOP 2000.

GPS-IPWV Impact on Weather Forecasts

Evaluations of the impact of GPS-IPWV observations on weather forecast accuracy have been conducted by the Regional Analysis and Prediction Branch of FSL's Forecast Research Division since 1997. In 2000, a previously undetected problem with the assimilation of GPS data into the Rapid Update Cycle (RUC) was corrected, and all of the GPS-IPWV observations from 56 sites were made available to the 60-km RUC for parallel runs (with and without GPS). Table 1 summarizes the results, showing that the improvements in 3-hour relative humidity forecasts from the 60-km RUC NWP model with GPS observations (run in parallel cycles using optimal interpolation) are greatest at the lowest levels and are sensitive to the number of stations in the network. Table 2 shows the results for the lowest two levels, 850 hPa and 750 hPa.

Table 1
Improvement in RH forecast accuracy by pressure level and number of stations.
Results are expressed in terms of forecasts from the RUC, with verification from radiosondes.

Pressure Level   18-Station Tests          56-Station Tests
                 (857 in 1998 – 1999)      (421 in 2000)
850 hPa          0.15                      0.38
700 hPa          0.11                      0.41
500 hPa          0.07                      0.21
400 hPa          0.03                      0.01
300 hPa          0.01                      0.01

Period of parallel test statistics: March 1998 – September 1999;
February – November 2000


Table 2
Percentage of forecasts unchanged, improved, or made worse through the addition
of GPS-IPWV data to the 60-km RUC model, 1998 – 1999 versus 2000.

850 hPa      1998 – 1999     2000     Change
Same         54%             40%      -25.9%
Better       28%             37%      +37.0%
Worse        19%             23%      +15.0%

750 hPa      1998 – 1999     2000     Change
Same         54%             40%      -25.9%
Better       25%             39%      +56.0%
Worse        21%             21%      0.0%

Expansion of the GPS Water Vapor Demonstration Network

An agreement between OAR/FSL and the Department of Transportation's Federal Highway Administration (FHWA) was signed this year that allows the installation of NOAA GPS Surface Observing System (GSOS) packages at all Nationwide Differential GPS (NDGPS) sites. About 70 NDGPS sites are expected to be in the water vapor demonstration network by 2004.

The branch added three sites to the network in 2000: one at the Nationwide Differential GPS (NDGPS) site at Driver, Virginia; a second at the last NPN site to receive a GPS water vapor observing system, Slater, Iowa; and the last at the Ground Winds Lidar facility at Bartlett, New Hampshire, operated by the Mount Washington Observatory. Figure 38 shows the network at the end of Fiscal Year 2000, along with stations identified for inclusion in 2001. A total of 56 sites were delivering data at the close of the year.


Figure 38. The GPS Water Vapor Demonstration Network at the end of Fiscal Year 2000, and stations identified for inclusion in 2001.

Projections

A major effort will be made to expand the number of stations in the Water Vapor Demonstration Network in 2001. GSOS payloads will be installed at about 19 more FHWA NDGPS sites, bringing the expected total to about 23 sites next year. An agreement is under discussion with the U.S. Coast Guard and Army Corps of Engineers that will permit the division to install GSOS payloads at all remaining Maritime Differential GPS sites. This agreement will allow the addition of another 20 sites to the network, bringing the potential number of sites in the network to about 95.

The branch will work with various FSL divisions and NCEP to make GPS-IPWV data available to the wider NOAA forecaster and modeler communities, and will work with others in FSL to display GPS-IPWV on advanced NOAA workstations including FX-Net, W4, and AWIPS.

Another collaborative effort involves the UCAR SuomiNet program to add some of their sites to the NOAA/FSL GPS Water Vapor Demonstration Network. Of special interest are sites owned by the University of Oklahoma in and around the ARM site, and one belonging to Plymouth State College in New Hampshire.

In cooperation with the NWS forecast offices in Florida, branch staff will determine whether GPS-IPWV data can improve weather forecasts during the convective storm season. This will be accomplished by working with the Florida Department of Transportation (FDOT) and the NWS forecast offices to install FDOT differential GPS receivers at all forecast offices in Florida, add the sites to the Water Vapor Demonstration Network, and retrieve, process, and provide the data to forecasters in near real time. Plans are to work with the 45th Weather Squadron at Patrick Air Force Base and NASA at Kennedy Space Center to densify the GPS network in the Cape Canaveral/KSC area, and to help them use these data to address their primary weather challenge: lightning prediction.

The branch will collaborate with the NASA Langley Research Center (LARC) on the Clouds and the Earth's Radiant Energy System (CERES) project, a high-priority NASA satellite program with several important goals. The branch's involvement will be to assist LARC in installing a GPS-Met system on an offshore ocean platform, and to provide accurate water vapor data in near real time. Since water vapor is a significant forcing function in atmospheric radiative transfer processes, GPS-IPWV data will be used to constrain LARC's models of the incoming and outgoing shortwave and longwave radiation.


Facilities Management and Systems Administration Branch
Bobby R. Kelley, Chief

Objectives

The Facilities Management and Systems Administration Branch manages the division's communications and computing resources and supports their operation and maintenance. Duties include systems operations, systems maintenance, systems administration, network administration, and NOAA Profiler Network (NPN) telecommunications administration. These responsibilities cover a broad range of computers and communications equipment. NPN processing is accommodated on 13 MicroVAXes configured in two clusters, primary and backup. Other data processing and Webpage hosting are accomplished on a Sun E3000 server. Two Sun Ultra 1 workstations and 10 PCs running Linux are used for data acquisition, processing, and distribution for the GPS Integrated Precipitable Water Vapor demonstration project. Four other PCs running Linux support software development and testing to modernize the NPN processing system. The division's file and e-mail server is a PC running Microsoft NT-4 that provides connectivity to 31 PCs running Microsoft Windows 9x or NT-4. Backup NPN data communications are provided through an Intel 386-based PC running SCO Unix that is connected to a DOMSAT satellite receiver. Telecommunication responsibilities cover 38 NPN data circuits within the lower 48 states and in Alaska.

Day-to-day work includes installing and configuring new components and systems on the division network; network problem isolation and maintenance; coordination with other FSL and building network staff; modifying system configurations to meet division requirements; system problem isolation and maintenance; in-house telecommunications maintenance or coordination of contracted maintenance; peripheral installation and configuration; computer and network security; preventive maintenance; information technology purchasing; property control; and routine system backups.

Accomplishments

Over the last 12 to 18 months, improvements have been implemented to ensure NPN data availability. To eliminate a single point of failure for electrical power, a 70-kVA uninterruptible power system (UPS) was installed and the division's computer facility was connected to the David Skaggs Research Center emergency generator. This work was completed prior to Y2K and has since served the division well on many occasions. The UPS maintains electrical power for the division computer facility until the building emergency generator is online, and it buffers the computer and communications systems against power spikes, eliminating data loss due to electrical power problems.

New branch personnel were trained on NPN operations, with renewed emphasis on monitoring and preventive maintenance; this resulted in very few calls from division customers over the last 12 to 18 months. The branch is also now on call 24 hours a day, 7 days a week to handle data delivery problems for division customers. To protect computer facility equipment, temperature sensors connected to an automatic paging system were installed in the computer room to provide alerts around the clock when abnormal temperature increases are detected. Since the Facility Division is now a recipient of division products and its operations staff provide onsite coverage 24 hours per day, additional monitoring of division data delivery is now in effect. Close coordination between operators in the branch and the Facility Division enables quick response to data delivery problems.

For calendar year 2000, the efforts to eliminate single points of failure and minimize downtime and risk paid off: uptime for the NPN processing systems averaged 99.3%, communications systems uptime averaged 96.8%, and data delivery to the National Weather Service (NWS) averaged 94.4%. Data delivery to the NWS in calendar year 2000 improved by more than 4% over calendar year 1999 (an effective improvement of approximately 5%). A summary of profiler data availability for 2000 is shown in Figure 39.


Figure 39. Summary of profiler data availability from January – December 2000.

Telecommunications risks and costs were minimized by requesting an exception to FTS-2001 for NPN data circuits. Department of Commerce approval of this request was obtained, and telecommunications services through AT&T were retained. This permitted continued use of existing local division equipment valued at approximately $130,000, ensuring continuous, reliable NPN telecommunications services. A savings of approximately $750,000 over five years, compared with proposals for similar services by FTS-2001 providers, will be achieved using the existing AT&T equipment and circuits.

Projections

Plans are to maintain current operations and to ensure continued timely data delivery to all customers. Ongoing improvements support software development and testing for the modernized NPN processing system. A low-cost, high-performance approach is being developed and tested using off-the-shelf PCs running the Linux operating system. The outdated NPN backup data communications system will be replaced with a modern PC running the Linux operating system and the new DOMSAT data communications software. To ensure rapid recovery from computer failures, an improved file backup system will be implemented. In all cases, the low-cost, high-performance approach will be employed to continue meeting mission requirements.

