1. CCPP Overview

Ideas for the Common Community Physics Package (CCPP) originated within the Earth System Prediction Capability physics interoperability group (now the Interagency Council for Advancing Meteorological Services; ICAMS), which has representatives from the US National Center for Atmospheric Research (NCAR), the Navy, National Oceanic and Atmospheric Administration (NOAA) Research Laboratories, NOAA National Weather Service, and other groups. Physics interoperability, or the ability to run a given physics suite in various host models, has been a goal of this multi-agency group for several years. An initial mechanism to run the physics of NOAA’s Global Forecast System (GFS) model in other host models, the Interoperable Physics Driver (IPD), was developed by the NOAA Environmental Modeling Center (EMC) and later augmented by the NOAA Geophysical Fluid Dynamics Laboratory (GFDL).

The CCPP expanded on that work by meeting additional requirements put forth by NOAA and brought new functionality to the physics-dynamics interface: the ability to choose the order of parameterizations, to subcycle individual parameterizations by running them more frequently than others, and to group arbitrary sets of parameterizations, allowing other computations (e.g., dynamics and coupling computations) in between them. The IPD was phased out in 2021 in favor of the CCPP as the single way to interface with physics in the UFS.
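The ordering, subcycling, and grouping capabilities are expressed in a suite definition file (SDF). The sketch below is illustrative only (the suite name and scheme names are hypothetical, and real SDFs contain additional schemes and groups); it shows how a group of schemes and a subcycled scheme might be declared:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative suite definition file: names are hypothetical -->
<suite name="example_suite" version="1">
  <!-- Schemes in a group run together; host computations (e.g., dynamics,
       coupling) may occur between calls to different groups -->
  <group name="radiation">
    <scheme>example_sw_radiation</scheme>
    <scheme>example_lw_radiation</scheme>
  </group>
  <group name="physics">
    <!-- loop="2" calls the scheme twice per physics time step -->
    <subcycle loop="2">
      <scheme>example_pbl_scheme</scheme>
    </subcycle>
  </group>
</suite>
```

Because the order of `<scheme>` entries determines the call order, reordering parameterizations is a matter of editing the SDF rather than changing host model code.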

The architecture of the CCPP and its connection to a host model is shown in Figure 1.1. Two elements of the CCPP are highlighted: a library of physical parameterizations (CCPP Physics) that conforms to selected standards and an infrastructure (CCPP Framework) that enables connecting the physics to a host model. The third element (not shown) is the CCPP Single Column Model (SCM), a simple host model that can be used with the CCPP Physics and Framework.
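The selected standards require each parameterization to expose its functionality through entry-point subroutines with explicit argument lists and standardized error handling. The sketch below is illustrative only (the module, scheme, and variable names are hypothetical); it shows the general shape of a CCPP-compliant scheme's run phase:

```fortran
! Illustrative sketch of a CCPP-compliant scheme; names are hypothetical.
! Entry points are subroutines named <scheme>_<phase> (e.g., _init, _run,
! _finalize); all data are passed through the argument list, and errors
! are reported via the errmsg/errflg pair.
module example_scheme

   implicit none
   private
   public :: example_scheme_run

contains

   subroutine example_scheme_run(ncol, temperature, errmsg, errflg)
      integer,          intent(in)    :: ncol            ! number of columns
      real,             intent(inout) :: temperature(:)  ! state to update
      character(len=*), intent(out)   :: errmsg          ! CCPP error message
      integer,          intent(out)   :: errflg          ! CCPP error flag

      ! Initialize CCPP error handling variables
      errmsg = ''
      errflg = 0

      ! ... physics computation on columns 1..ncol ...

   end subroutine example_scheme_run

end module example_scheme
```

The accompanying physics caps, generated by the CCPP Framework, call these entry points on behalf of the host model.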

_images/ccpp_arch_host.png

Fig. 1.1 Architecture of the CCPP and its connection to a host model, represented here as the driver for an atmospheric model (yellow box). The dynamical core (dycore), physics, and other aspects of the model (such as coupling) are connected to the driving host through the pool of physics caps. The CCPP Physics is denoted by the gray box at the bottom of the physics, and encompasses the parameterizations, which are accompanied by physics caps.

The host model must provide functional documentation (metadata) for every variable passed to or received from the physics. The CCPP Framework compares the variables requested by each physical parameterization against those provided by the host model [1] and checks that they are available; if a requested variable is missing, an error is issued. This process exposes the variables passed between physics and dynamics and clarifies how information is exchanged among parameterizations. At runtime, the CCPP Framework communicates the necessary variables between the host model and the parameterizations.
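This metadata is recorded in files alongside the code. The fragment below is an illustrative sketch (the table name and local variable name are hypothetical) of how a single variable might be described so the CCPP Framework can match it against the host model's inventory:

```
# Illustrative metadata fragment; names are hypothetical
[ccpp-arg-table]
  name = example_scheme_run
  type = scheme
[temperature]
  standard_name = air_temperature
  long_name = air temperature to be updated by the scheme
  units = K
  dimensions = (horizontal_loop_extent)
  type = real
  intent = inout
```

The `standard_name` and `units` entries are what the Framework compares: a variable requested by a scheme is matched to a host-provided variable with the same standard name, and mismatches are flagged before runtime.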

The CCPP Physics contains the parameterizations and suites that are used operationally in the UFS Atmosphere, as well as parameterizations that are under development for possible transition to operations in the future. The CCPP aims to support the broad community while benefiting from community contributions. In such a CCPP ecosystem (Figure 1.2), the CCPP can be used not only by the operational centers to produce operational forecasts, but also by the research community to conduct research and development. Innovations created and effectively tested by the research community can be funneled back to the operational centers for further improvement of the operational forecasts.

Both the CCPP Framework and the CCPP Physics are developed as open source code, follow industry-standard code management practices, and are freely distributed through GitHub (https://github.com/NCAR/ccpp-physics and https://github.com/NCAR/ccpp-framework). This documentation is housed in repository https://github.com/NCAR/ccpp-doc.

_images/CCPP_Ecosystem_Detailed-Diagram_only.png

Fig. 1.2 CCPP ecosystem.

The CCPP is governed by the groups that contribute to its development. The CCPP Physics code management is collaboratively determined by NOAA, NCAR, and the Naval Research Laboratory (NRL), and the DTC works with EMC and its sponsors to determine the schemes and suites to be included and supported. The governance of the CCPP Framework is jointly undertaken by NOAA and NCAR (see more information at https://github.com/NCAR/ccpp-framework/wiki and https://dtcenter.org/community-code/common-community-physics-package-ccpp).

The table below lists all parameterizations supported in CCPP public releases, and the CCPP Scientific Documentation describes the parameterizations in detail. The parameterizations are grouped into suites, which can be classified primarily as operational or developmental. Operational suites are those used by operational, real-time weather prediction models. For this release, the only operational suite is GFS_v16, which is used for version 16 of the GFS model. Developmental suites are those that are officially supported for this CCPP release with one or more host models, but are not currently used in any operational models. These may include schemes needed exclusively for research, or "release candidate" schemes proposed for use with future operational models.

Table 1.1 Suites supported in the CCPP for the UFS SRW v2.1.0 release

| Physics Suite      | GFS_v16 (Operational) | RRFS_v1beta (Developmental) | WoFS (Developmental) | HRRR (Developmental) |
|--------------------|-----------------------|-----------------------------|----------------------|----------------------|
| Microphysics       | GFDL                  | Thompson                    | NSSL                 | Thompson             |
| PBL                | TKE EDMF              | MYNN-EDMF                   | MYNN-EDMF            | MYNN-EDMF            |
| Deep convection    | saSAS                 | N/A                         | N/A                  | N/A                  |
| Shallow convection | saMF                  | N/A                         | N/A                  | N/A                  |
| Radiation          | RRTMG                 | RRTMG                       | RRTMG                | RRTMG                |
| Surface layer      | GFS                   | MYNN-SFL                    | MYNN-SFL             | MYNN-SFL             |
| Gravity wave drag  | CIRES-uGWP            | CIRES-uGWP                  | CIRES-uGWP           | GSL drag             |
| Land surface       | Noah                  | Noah-MP                     | Noah-MP              | RUC                  |
| Ozone              | NRL 2015              | NRL 2015                    | NRL 2015             | NRL 2015             |
| Strat H2O          | NRL 2015              | NRL 2015                    | NRL 2015             | NRL 2015             |
| Ocean              | NSST                  | NSST                        | NSST                 | NSST                 |

Only the suites supported with the UFS SRW App v2.1.0 release are listed in the table. Currently, all supported suites use the 2015 Naval Research Laboratory (NRL) ozone and stratospheric water vapor schemes and the NSST ocean scheme.

The operational GFS_v16 suite includes GFDL microphysics, the Turbulent Kinetic Energy (TKE)-based Eddy Diffusivity Mass-Flux (EDMF) planetary boundary layer (PBL) scheme, scale-aware (sa) Simplified Arakawa-Schubert (SAS) deep convection, scale-aware mass-flux (saMF) shallow convection, Rapid Radiative Transfer Model for General Circulation Models (RRTMG) radiation, the GFS surface layer scheme, the Cooperative Institute for Research in Environmental Sciences (CIRES) unified gravity wave drag (uGWD) scheme, and the Noah Land Surface Model (LSM).

The three developmental suites are either analogues for current operational physics schemes, or candidates for future operational implementations.

Those interested in the history of previous CCPP releases should note the following milestones:

  • The first public release of the CCPP took place in April 2018 and included all the parameterizations of the operational GFS v14, along with the ability to connect to the SCM.

  • The second public release, in August 2018, additionally included the physics suite tested for the implementation of GFS v15.

  • The third public release, in June 2019, had four suites: GFS_v15, corresponding to the GFS v15 model implemented operationally in June 2019, and three developmental suites considered for use in GFS v16 (GFS_v15plus with an alternate PBL scheme, csawmg with alternate convection and microphysics schemes, and GSD_v0 with alternate convection, microphysics, PBL, and land surface schemes).

  • The CCPP v4.0 release, issued in March 2020, contained suite GFS_v15p2, an updated version of the operational GFS v15 that replaced suite GFS_v15. It also contained three developmental suites: csawmg with minor updates, GSD_v1 (an update over the previously released GSD_v0), and GFS_v16beta, the target suite at the time for implementation in the upcoming operational GFS v16 (it replaced suite GFS_v15plus). CCPP v4.0 was the first release supported for use with the UFS Weather Model, specifically as part of the UFS Medium-Range Weather (MRW) Application.

  • The CCPP v4.1 release, issued in October 2020, was a minor upgrade that added the capability to build the code using Python 3 (previously only Python 2 was supported).

  • The CCPP v5.0 release, issued in February 2021, was a major upgrade enabling use with the UFS Short-Range Weather (SRW) Application and the RRFS_v1alpha suite.

  • The CCPP v6.0.0 release, issued in June 2022, was a major upgrade in conjunction with the UFS SRW v2.0 release.

[1] As of this writing, the CCPP has been validated with two host models: the CCPP SCM and the atmospheric component of NOAA's Unified Forecast System (UFS) (hereafter the UFS Atmosphere), which utilizes the Finite-Volume Cubed-Sphere (FV3) dynamical core. The CCPP can be used with both the global and limited-area configurations of the UFS Atmosphere. The CCPP has also been run experimentally with a Navy model, and work is under way to connect and validate the use of the CCPP Framework with NCAR models.

1.1. Additional Resources

For the latest version of the released code and additional documentation, please visit the DTC Website.

Please post questions and comments to the GitHub Discussions board of the relevant code repository (ccpp-physics, ccpp-framework, or ccpp-doc).

1.2. How to Use this Document

This document contains documentation for the Common Community Physics Package (CCPP). It describes the CCPP Physics and the CCPP Framework and explains how to use them with a host model.

The following table describes the typefaces and symbols used in this guide.

Table 1.2 Typefaces and symbols used in this guide.

| Typeface or Symbol | Meaning | Examples |
|--------------------|---------|----------|
| `AaBbCc123` (monospace) | The names of commands, files, and directories; on-screen terminal output | Edit your `.bashrc` file. Use `ls -a` to list all files. `host$ You have mail!` |
| *AaBbCc123* (italic) | The names of CCPP-specific terms, subroutines, etc.; captions for figures, tables, etc. | Each scheme must include at least one of the following subroutines: *_timestep_init*, *_init*, *_run*, *_finalize*, and *_timestep_finalize*. Listing 2.1: Fortran template for a CCPP-compliant scheme showing the *_run* subroutine. |
| **AaBbCc123** (bold) | Words or phrases requiring particular emphasis | **Fortran77 code should not be used** |

Following these typefaces and conventions, shell commands, code examples, namelist variables, etc. will be presented in this style:

mkdir ${TOP_DIR}

Some CCPP-specific terms will be highlighted using italics, and words requiring particular emphasis will be highlighted in bold text.

In some places there are helpful asides or warnings that the user should pay attention to; these will be presented in the following style:

Note

This is an important point that should not be ignored!

In several places in the technical documentation, we need to refer to the locations of files or directories in the source code. Since the directory structure depends on the host model (in particular, on where the ccpp-framework and ccpp-physics source code is checked out, and on the directory from which the ccpp_prebuild.py code generator is called), we use the following convention:

  1. When describing files relative to the ccpp-framework or ccpp-physics top-level, without referring to a specific model, we use ccpp-framework/path/to/file/A and ccpp-physics/path/to/file/B.

  2. When describing specific tasks that depend on the directory structure within the host model, for example how to run ccpp_prebuild.py, we explicitly mention the host model and use its directory structure relative to the top-level directory. For the example of the SCM: ./ccpp/framework/path/to/file/A.