3. CCPP Configuration and Build Options

While the CCPP Framework code, consisting of a single Fortran source file and its associated metadata file, can be compiled and tested independently, the CCPP Physics code can only be used within a host modeling system that provides the variables required to execute the physics. It is therefore advisable to integrate the CCPP configuration and build process with the host model's build system. Part of the build process, known as the prebuild step because it precedes compilation, involves running a Python script that performs multiple functions: it configures the CCPP Physics for use with the host model and autogenerates Fortran code to communicate variables between the physics and the host model. The prebuild step is discussed in detail in Chapter 8.

The SCM and the UFS Atmosphere are supported for use with the CCPP. When the UFS Atmosphere is the host model, build configuration options can be specified as CMake options to the build.sh script for manual compilation or through a regression test (RT) configuration file. Detailed instructions for building the UFS Atmosphere and the SCM are provided in the UFS Weather Model User Guide and the SCM User Guide. For both the SCM and the UFS Atmosphere, the ccpp_prebuild.py script is run automatically as a step in the build system, although it can also be run manually for debugging purposes.
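
A minimal sketch of such a manual invocation, assuming the ccpp-scm directory layout, is shown below; the paths, suite name, and optional flags are illustrative and may differ between host models and code versions:

    # Run the prebuild step by hand from the top of a host model checkout.
    # Paths and options shown here are assumptions; see Chapter 8 for details.
    # --config points to the host-model-specific prebuild configuration file,
    # --suites optionally restricts processing to the named SDF(s).
    ./ccpp/framework/scripts/ccpp_prebuild.py \
        --config=ccpp/config/ccpp_prebuild_config.py \
        --suites=SCM_GFS_v16 \
        --verbose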

The path to a host-model-specific configuration file is the only required argument to ccpp_prebuild.py. Such files are included in the ccpp-scm and ufs-weather-model repositories, and any host model that uses the CCPP must provide one with its code. Figure 3.1 depicts the main functions of the ccpp_prebuild.py script during the build. Using information from the configuration file, the script parses the SDF(s) and matches only those provided/requested variables that are actually used within the selected physics suite(s). The script autogenerates software caps for each physics suite as a whole and for each physics group defined in the SDF(s), as well as an API that the host model calls from its (manually written) host model cap. At runtime, a single SDF is used to select the suite to be executed. This arrangement allows for efficient variable recall (done once for all physics schemes within each group of a suite), reduces the memory footprint of the CCPP, and speeds up execution.
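
To make the multiple-SDF case concrete for the UFS Atmosphere, the suites to be processed by ccpp_prebuild.py and compiled into the executable can be selected through CMake options passed to build.sh. The sketch below is illustrative only; the environment variable, application name, and suite names are assumptions, and the options supported by a given code version are documented in the UFS Weather Model User Guide:

    # Build two suites (two SDFs); suite and group caps are generated for both,
    # but only the suite selected at runtime is executed. Names are examples.
    cd ufs-weather-model
    export CMAKE_FLAGS="-DAPP=ATM -DCCPP_SUITES=FV3_GFS_v16,FV3_RAP"
    ./build.sh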

[Image: ccpp_static_build.png]

Fig. 3.1 This figure depicts an example of the interaction between an atmospheric model and the CCPP Physics for one timestep using a single SDF, with execution progressing from left to right. The “Atmosphere Driver” box represents model superstructure code, which may be responsible for I/O, time-stepping, and interactions with other model components. Software caps are autogenerated for the suite and for the physics groups defined in the SDF provided to the ccpp_prebuild.py script. The suite must be defined via the SDF at prebuild time. When multiple SDFs are provided during the build step, multiple suite caps and associated group caps are produced, but only one suite is used at runtime.