ESRL Global Systems Division

Advanced Computing Section

The Boulder HPC Facility: Exploring New Computing Technologies for NOAA

The Advanced Computing Section (ACS) of NOAA's Earth System Research Laboratory supports modeling activities in the laboratory and explores new hardware and software technologies needed to run high-resolution weather and climate models more quickly and accurately on High Performance Computing (HPC) systems. The ACS is currently exploring accelerators for use in our weather models. We also developed the Scalable Modeling System (SMS) to provide traditional parallelization support for our weather models; since 1993, SMS has been used to parallelize more than a dozen weather and ocean models, including the Rapid Update Cycle (RUC), Eta, Hycom, POM, FIM, and NIM.
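A parallelization tool such as SMS must handle two recurring chores in distributed-memory weather codes: splitting the model grid among processes and exchanging halo (boundary) values between neighbors. The sketch below is a generic, hypothetical illustration of that pattern in Python, assuming a 1-D grid with single-cell halos; it is not SMS's actual interface.

```python
def decompose(n_points, n_procs):
    """Split n_points as evenly as possible among n_procs subdomains.

    Returns a list of (start, end) index ranges, end exclusive.
    """
    base, extra = divmod(n_points, n_procs)
    ranges = []
    start = 0
    for p in range(n_procs):
        size = base + (1 if p < extra else 0)  # first 'extra' ranks get one more point
        ranges.append((start, start + size))
        start += size
    return ranges

def exchange_halos(subdomains):
    """Copy each neighbor's interior edge value into this subdomain's halo cell.

    Each subdomain is laid out as [left_halo, interior..., right_halo].
    """
    for i, sub in enumerate(subdomains):
        if i > 0:
            sub[0] = subdomains[i - 1][-2]   # left halo <- left neighbor's edge
        if i < len(subdomains) - 1:
            sub[-1] = subdomains[i + 1][1]   # right halo <- right neighbor's edge
```

For example, `decompose(10, 3)` yields `[(0, 4), (4, 7), (7, 10)]`. In a real model these exchanges would be MPI messages between processes; tools like SMS generate that communication code from directives so the scientist writes only the serial algorithm.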

[Image: The Boulder HPC computer]
Intel Harpertown Linux Cluster, 10 GFlops of computing per CPU core (2008)

High Performance Computing

Over the last 25 years, supercomputing has evolved from Cray vector machines to a wide variety of commodity-based and vendor-specific CPU systems. CPUs have grown from a single processor per chip to multi-core designs containing 8 or more CPU cores on a single chip. Modern systems are diverse: they can be shared-memory, distributed-memory, or a hybrid of both. The ACS works with most types of HPC systems in use today and does research and development in many areas of HPC, including:

Parallel Programming
ESMF and NEMS Modeling Frameworks
Cloud Computing
Web Services
Grid Computing

[Image: The Tesla GPU]
NVIDIA Tesla GPU: 1 TeraFlop of computing (2008)

Supporting Model Development

We support the full lifecycle of modeling, including model development, parallelization, optimization, configuration, testing, and evaluation. In addition to providing parallelization support, we work with modelers during development so their codes are designed to take advantage of the latest developments in HPC architectures. For example, our staff have served on the design team of the Flow-following finite-volume Icosahedral Model (FIM) and continue to be involved in its development, testing, and evaluation. We are also helping develop the Non-hydrostatic Icosahedral Model (NIM) so it can run efficiently on both CPU and GPU architectures.

We have also developed two portals to support ESRL and Developmental Testbed Center (DTC) modeling activities:

  • WRF Domain Wizard is used to define model domains for the Weather Research and Forecasting (WRF) model.
  • WRF Portal supports systematic testing and evaluation of FIM, WRF, and other models.

ACS Staff

Mark Govett

Tom Henderson

Jacques Middlecoff

Paul Madden

Jim Rosinski


Prepared by Mark Govett
Date of last update: March 25, 2014