From 208ceb06595989f62191309b549288a47edd2878 Mon Sep 17 00:00:00 2001 From: kvrigor Date: Mon, 6 Oct 2025 12:10:05 +0200 Subject: [PATCH 1/9] Collected install instructions into one page --- docs/_toc.yml | 2 - docs/users_guide/installation/README.md | 29 +++++++++++++++ .../installation/source_installation.md | 37 ------------------- 3 files changed, 29 insertions(+), 39 deletions(-) delete mode 100644 docs/users_guide/installation/source_installation.md diff --git a/docs/_toc.yml b/docs/_toc.yml index 7f74453..cb3a4c6 100644 --- a/docs/_toc.yml +++ b/docs/_toc.yml @@ -37,8 +37,6 @@ parts: - caption: Developer's Guide chapters: - - file: users_guide/installation/source_installation - title: Building eCLM from source - url: https://hpscterrsys.github.io/eCLM/src title: eCLM Source Code Browser diff --git a/docs/users_guide/installation/README.md b/docs/users_guide/installation/README.md index 5599edd..15bdf0e 100644 --- a/docs/users_guide/installation/README.md +++ b/docs/users_guide/installation/README.md @@ -1,5 +1,15 @@ # Installing eCLM +## Requirements + +* MPI compilers (e.g. OpenMPI) +* CMake +* LAPACK +* [netCDF C and Fortran libraries](https://downloads.unidata.ucar.edu/netcdf) +* [PnetCDF](https://github.com/Parallel-NetCDF/PnetCDF) + +## Method 1: Build eCLM through TSMP2 (recommended) + The easiest way to install eCLM is through [TSMP2 build system](https://github.com/HPSCTerrSys/TSMP2). 
```sh @@ -11,3 +21,22 @@ cd TSMP2 ./build_tsmp2.sh --eCLM ``` +## Method 2: Building from source (for advanced users) + +```sh +# Download eCLM +git clone https://github.com/HPSCTerrSys/eCLM.git +cd eCLM + +# Create eCLM install directory +mkdir install + +# Set compilers +export CC=mpicc FC=mpifort + +# Build and install eCLM +cmake -S src -B bld -DCMAKE_INSTALL_PREFIX=install +cmake --build bld --parallel +cmake --install bld +``` + diff --git a/docs/users_guide/installation/source_installation.md b/docs/users_guide/installation/source_installation.md deleted file mode 100644 index 2a2a304..0000000 --- a/docs/users_guide/installation/source_installation.md +++ /dev/null @@ -1,37 +0,0 @@ -# Installing eCLM from source - -```{warning} -For advanced users. -``` - -## Requirements - -* MPI compilers (e.g. OpenMPI) -* CMake -* LAPACK -* [netCDF C and Fortran libraries](https://downloads.unidata.ucar.edu/netcdf) -* [PnetCDF](https://github.com/Parallel-NetCDF/PnetCDF) - -## Steps - -```sh -# Download eCLM -git clone https://github.com/HPSCTerrSys/eCLM.git -cd eCLM - -# Create eCLM install directory -mkdir install - -# Set compilers -export CC=mpicc FC=mpifort - -# Build and install eCLM -cmake -S src -B bld -DCMAKE_INSTALL_PREFIX=install -cmake --build bld --parallel -cmake --install bld -``` - -## Reference build scripts - -- [eCLM build on Ubuntu](https://github.com/HPSCTerrSys/eCLM/blob/4d567d2d68cac0fba977914b4a9c3ba199afd0ff/.github/workflows/CI.yml#L70-L121) -- [eCLM build on TSMP2](https://github.com/HPSCTerrSys/TSMP2/blob/master/cmake/BuildeCLM.cmake) From 6910b0ba0c15729ca6c977fa072582d7b3983b69 Mon Sep 17 00:00:00 2001 From: kvrigor Date: Mon, 6 Oct 2025 12:46:52 +0200 Subject: [PATCH 2/9] Quick touch up of Wuestebach test case --- docs/users_guide/case_examples/README.md | 10 +-- docs/users_guide/case_examples/Wuestebach.md | 88 +++----------------- 2 files changed, 11 insertions(+), 87 deletions(-) diff --git a/docs/users_guide/case_examples/README.md 
b/docs/users_guide/case_examples/README.md index 3c98c12..ea43cd6 100644 --- a/docs/users_guide/case_examples/README.md +++ b/docs/users_guide/case_examples/README.md @@ -1,12 +1,4 @@ # Running example cases -Always load the eCLM environment before creating a case. This only needs to be done once per terminal session. - -```sh -source $HOME/load-eclm-variables.sh -``` - -## Cases - ```{tableofcontents} -``` \ No newline at end of file +``` diff --git a/docs/users_guide/case_examples/Wuestebach.md b/docs/users_guide/case_examples/Wuestebach.md index 8b26533..ae9272c 100644 --- a/docs/users_guide/case_examples/Wuestebach.md +++ b/docs/users_guide/case_examples/Wuestebach.md @@ -1,5 +1,8 @@ # Single-Point Wuestebach +```{warning} TODO +``` + This test case at point scale covers the Wuestebach test site that is part of the TERENO network. Wuestebach is a forest site located in the Eifel region in Germany. To set up eCLM and run this test case, follow the instructions below. ```{figure} ../images/wtb_bogena.png @@ -9,34 +12,13 @@ This test case at point scale covers the Wuestebach test site that is part of th Location of the Wüstebach test site within the TERENO Rur/Lower Rhine Valley observatory. Adapted from Bogena et al (2010). ``` -## 1. Copy the namelist files -For JSC users, all required namelist and input files to run this case are in the shared directory `/p/scratch/cslts/shared_data/rlmod_eCLM` - -```sh -mkdir test_cases -cp -r /p/scratch/cslts/shared_data/rlmod_eCLM/example_cases/wtb_1x1 test_cases/ -cd test_cases/wtb_1x1 -``` - -## 1. Download and extract data files (**For non JSC users**) +## 1. Download Wuestebach data files -You can download all required files through the JSC datahub. ```sh -mkdir -p test_cases/1x1_wuestebach -wget https://datapub.fz-juelich.de/slts/eclm/1x1_wuestebach.tar.gz -tar xf 1x1_wuestebach.tar.gz -C test_cases/1x1_wuestebach -``` -The repository contains two directories. 
The `common` directory contains some general input files necessary to run eCLM cases. The `wtb_1x1` directory contains the case specific domain and surface files as well as atmospheric forcing data and a script for namelist generation. - -```sh -# Generate namelists -cd test_cases/1x1_wuestebach -export ECLM_SHARED_DATA=$(pwd) -cd wtb_1x1 -clm5nl-gen wtb_1x1.toml - -# Validate namelists -clm5nl-check . +git clone https://icg4geo.icg.kfa-juelich.de/ExternalReposPublic/tsmp2-static-files/extpar_eclm_wuestebach_sp.git +cd extpar_eclm_wuestebach_sp/static.resources +generate_wtb_namelists.sh 1x1_wuestebach +cd 1x1_wuestebach ``` ## 2. Check the case setup @@ -55,58 +37,8 @@ cat drv_in ## 3. Run the test case -Customize the copied job script `run-eclm-job.sh` as desired. In this example, it is already customized to this test case, you should just adapt the SBATCH parameters `--account` to your compute project and `--partition` to your system. As Wuestebach is a single-column case, the number of processors should be set to 1 (SBATCH parameter `--ntasks-per-node=1`). - -**For non JSC users**: Create a job script in your case directory with: - -```sh -cat << EOF > run-eclm-job.sh -``` -Adapt the SBATCH parameters and then copy the following in the shell file: - -```sh -#!/usr/bin/env bash -#SBATCH --job-name=wtb_1x1 -#SBATCH --nodes=1 -#SBATCH --ntasks-per-node=1 -#SBATCH --account=jibg36 -#SBATCH --partition=batch -#SBATCH --time=1:00:00 -#SBATCH --output=logs/%j.eclm.out -#SBATCH --error=logs/%j.eclm.err - -ECLM_EXE=${eCLM_ROOT}/install/bin/eclm.exe -if [[ ! -f $ECLM_EXE || -z "$ECLM_EXE" ]]; then - echo "ERROR: eCLM executable '$ECLM_EXE' does not exist." 
- exit 1 -fi - -# Set PIO log files -if [[ -z $SLURM_JOB_ID || "$SLURM_JOB_ID" == " " ]]; then - LOGID=$(date +%Y-%m-%d_%H.%M.%S) -else - LOGID=$SLURM_JOB_ID -fi -mkdir -p logs timing/checkpoints -LOGDIR=$(realpath logs) -comps=(atm cpl esp glc ice lnd ocn rof wav) -for comp in ${comps[*]}; do - LOGFILE="$LOGID.comp_${comp}.log" - sed -i "s#diro.*#diro = \"$LOGDIR\"#" ${comp}_modelio.nml - sed -i "s#logfile.*#logfile = \"$LOGFILE\"#" ${comp}_modelio.nml -done - -# Run model -srun $ECLM_EXE -EOF -``` - -Then you can submit your job: - -```sh -sbatch run-eclm-job.sh +```bash +mpirun -np 1 eclm.exe ``` -To check the job status, run `sacct`. - The model run is successful if the history files (`wtb_1x1.clm2.h0.*.nc`) have been generated. From c460fcc8e7733674cc1aa66e5056733b61e94e38 Mon Sep 17 00:00:00 2001 From: kvrigor Date: Mon, 6 Oct 2025 13:57:02 +0200 Subject: [PATCH 3/9] Removed outdated 'create custom case' guides --- docs/_toc.yml | 11 --- .../case_creation/1_create_grid_file.md | 55 ----------- .../case_creation/2_create_mapping_file.md | 39 -------- .../case_creation/3_create_domain_file.md | 37 -------- .../case_creation/4_create_surface_file.md | 43 --------- .../5_modifications_surface_domain_file.md | 39 -------- .../case_creation/6_create_atm_forcings.md | 95 ------------------- docs/users_guide/case_creation/README.md | 14 +-- 8 files changed, 4 insertions(+), 329 deletions(-) delete mode 100644 docs/users_guide/case_creation/1_create_grid_file.md delete mode 100644 docs/users_guide/case_creation/2_create_mapping_file.md delete mode 100644 docs/users_guide/case_creation/3_create_domain_file.md delete mode 100644 docs/users_guide/case_creation/4_create_surface_file.md delete mode 100644 docs/users_guide/case_creation/5_modifications_surface_domain_file.md delete mode 100644 docs/users_guide/case_creation/6_create_atm_forcings.md diff --git a/docs/_toc.yml b/docs/_toc.yml index cb3a4c6..3199bed 100644 --- a/docs/_toc.yml +++ b/docs/_toc.yml @@ -27,13 
+27,6 @@ parts: - file: users_guide/case_creation/README title: Creating a custom case - sections: - - file: users_guide/case_creation/1_create_grid_file - - file: users_guide/case_creation/2_create_mapping_file - - file: users_guide/case_creation/3_create_domain_file - - file: users_guide/case_creation/4_create_surface_file - - file: users_guide/case_creation/5_modifications_surface_domain_file - - file: users_guide/case_creation/6_create_atm_forcings - caption: Developer's Guide chapters: @@ -45,7 +38,3 @@ parts: - file: reference/history_fields - url: https://escomp.github.io/CTSM/release-clm5.0/tech_note/index.html title: CLM5 Technical Note - - url: https://github.com/HPSCTerrSys/eCLM_static-file-generator/blob/main/README.md) - title: eCLM static file generator - - url: https://hpscterrsys.github.io/TSMP2_workflow-engine - title: TSMP2 Workflow Engine diff --git a/docs/users_guide/case_creation/1_create_grid_file.md b/docs/users_guide/case_creation/1_create_grid_file.md deleted file mode 100644 index 1a85996..0000000 --- a/docs/users_guide/case_creation/1_create_grid_file.md +++ /dev/null @@ -1,55 +0,0 @@ -# Create SCRIP grid file - -The first step in creating your input data is to define your model domain and the grid resolution you want to model in. There are several options to create the SCRIP grid file that holds this information: -1. Using the `mkscripgrid.py` script to create a regular latitude longitude grid. -2. Using the `produce_scrip_from_griddata.ncl` script to convert an existing netCDF file that holds the latitude and longitude centers of your grid in 2D (This allows you to create a curvilinear grid). -3. Similar to the first option but using the `scrip_mesh.py` script to create the SCRIP grid file. - -To start the SCRIP grid file creation navigate into the `mkmapgrids` directory where you will find the above mentioned scripts. - -```sh -cd mkmapgrids -``` - -## 1.
Create SCRIP grid file with `mkscripgrid.py` - -To use `mkscripgrid.py`, first open the script (for example using vim text editor) and adapt the variables that describe your grid. These include your grid name, the four corner points of your model domain as well as the resolution (lines 42-50 of the script). Then you can execute the script: - -```sh -python mkscripgrid.py -``` - -```{attention} -The `mkscripgrid.py` script requires numpy and netCDF4 python libraries to be installed (use pip install to do that if not already installed). -``` - -The output will be a SCRIP grid netCDF file containing the grid dimension and the center and corners for each grid point. It will have the format `SCRIPgrid_"Your grid name"_nomask_c"yymmdd".nc` - -## 2. Create SCRIP grid file from griddata with `produce_scrip_from_griddata.ncl` - -Unfortunately, NCL is not maintained anymore in the new software stages. Therefore, in order to use it you first need to load an older Stage and the required software modules: - -```sh -module load Stages/2020 -module load Intel/2020.2.254-GCC-9.3.0 -module load ParaStationMPI/5.4.7-1 -module load NCL -``` - -Next, adapt the input in `produce_scrip_from_griddata.ncl` to your gridfile. This includes choosing a name for your output file "OutFileName", adjusting the filename of your netcdf file in line 9 and the variable names for longitude/latitude in lines 10-11. Then execute: - -```sh -ncl produce_scrip_from_griddata.ncl -``` - -## 3. Create SCRIP grid file from griddata using `scrip_mesh.py` - -Alternatively to the first option, you can use the python script `scrip_mesh.py`. Like the ncl script it can create SCRIP files including the calculation of corners. 
It takes command line arguments like this: - -```sh -python3 scrip_mesh.py --ifile NC_FILE.nc --ofile OUTPUT_SCRIP.nc --oformat SCRIP # replace NC_FILE.nc with your netcdf file and choose a name for your output SCRIP grid file for OUTPUT_SCRIP.nc -``` - ---- - -**Congratulations!** You successfully created your SCRIP grid file and can now move on to the next step. \ No newline at end of file diff --git a/docs/users_guide/case_creation/2_create_mapping_file.md b/docs/users_guide/case_creation/2_create_mapping_file.md deleted file mode 100644 index 1e23c82..0000000 --- a/docs/users_guide/case_creation/2_create_mapping_file.md +++ /dev/null @@ -1,39 +0,0 @@ -# Create mapping files - -To start the mapping file creation navigate into the `mkmapdata` directory where you will find the script needed for this step. - -```sh -cd ../mkmapdata -``` - -Before you run `runscript_mkmapdata.sh` you need to adapt some environment variables in lines 23-25 of the script. For this open the script (for example using vim text editor) and enter the name of your grid under `GRIDNAME` (same as what you used for the SCRIP grid file). For `CDATE`, use the date that your SCRIP grid file was created (per default the script uses the current date, if you created the SCRIPgrid file at some other point, you find the date of creation at the end of your SCRIPgrid file or in the file information). Lastly, provide the full path and name of your SCRIP grid file under `GRIDFILE`. Save and close the script. - -To create your mapping files, you need a set of rawdata. If you are a JSC user you can simply refer to the common data repository by adapting the "rawpath" path in line 29 of the script. - -``` -rawpath="/p/scratch/cslts/shared_data/rlmod_eCLM/inputdata/surfdata/lnd/clm2/mappingdata/grids" -``` - -For non JSC users, download the data and adapt "rawpath" to their new location. 
To download the data to the directory use: - -```sh -wget --no-check-certificate -i clm_mappingfiles.txt -``` - -Now you can execute the script: - -```sh -sbatch runscript_mkmapdata.sh -``` - -The output will be a `map_*.nc` file for each of the rawdata files. These files are the input for the surface parameter creation weighted to your grid specifications. - -To generate the domain file in the next step a mapfile is needed. This can be any of the generated `map_*.nc` files. So, set the environment variable `MAPFILE` for later use: - -```sh -export MAPFILE="path to your mapfiles"/"name of one of your map files" -``` - ---- - -**Congratulations!** You successfully created your mapping files and can now move on to the next step. \ No newline at end of file diff --git a/docs/users_guide/case_creation/3_create_domain_file.md b/docs/users_guide/case_creation/3_create_domain_file.md deleted file mode 100644 index 9ea49b9..0000000 --- a/docs/users_guide/case_creation/3_create_domain_file.md +++ /dev/null @@ -1,37 +0,0 @@ -# Create domain file - -In this step you will create the domain file for your case using `gen_domain`. First, you need to navigate into the `gen_domain_files/src/` directory and compile it with the loaded modules ifort, imkl, netCDF and netCDF-Fortran. - -```sh -cd ../gen_domain_files/src/ - -# Compile the script -ifort -o ../gen_domain gen_domain.F90 -mkl -lnetcdff -lnetcdf -``` -```{attention} -If you get a message saying "ifort: command line remark #10412: option '-mkl' is deprecated and will be removed in a future release. Please use the replacement option '-qmkl'" or the compiling fails, replace `-mkl` with `-qmkl`. -``` - -Before running the script you need to export the environment variable `GRIDNAME` (same as what you used for the SCRIP grid file and in the `runscript_mkmapdata.sh` script). 
- -```sh -export GRIDNAME="your gridname" -``` -Then you can run the script: -```sh -cd ../ -./gen_domain -m $MAPFILE -o $GRIDNAME -l $GRIDNAME -``` - -The output of this will be two netCDF files `domain.lnd.*.nc` and `domain.ocn.*.nc` that define the land and ocean mask respectively. The land mask will inform the atmosphere and land inputs of eCLM when running a case. - -However, `gen_domain` defaults the use of the variables `mask` and `frac` on these files to be for ocean models, i.e. 0 for land and 1 for ocean. So to use them you have to either manipulate the `domain.lnd.*.nc` file to have mask and frac set to 1 instead of 0 (WARNING: some netCDF script languages have `mask` as a reserved keyword e.g. NCO, use single quotation marks as workaround). -Or simply swap/rename the `domain.lnd.*.nc` and `domain.ocn.*.nc` file: - -```sh -mv domain.lnd."your gridname"_"your gridname"."yymmdd".nc temp.nc -mv domain.ocn."your gridname"_"your gridname"."yymmdd".nc domain.lnd."your gridname"_"your gridname"."yymmdd".nc -mv temp.nc domain.ocn."your gridname"_"your gridname"."yymmdd".nc -``` - -**Congratulations!** You successfully created your domain files and can now move on to the final next step to create your surface data. diff --git a/docs/users_guide/case_creation/4_create_surface_file.md b/docs/users_guide/case_creation/4_create_surface_file.md deleted file mode 100644 index fe1d654..0000000 --- a/docs/users_guide/case_creation/4_create_surface_file.md +++ /dev/null @@ -1,43 +0,0 @@ -# Create surface file - -In this step you will create the surface data file using the `mksurfdata.pl` script. -First, we will compile the script with `make` in the `mksurfdata/src` directory. 
- - -```sh -cd ../mksurfdata/src - -# Compile the script -make -``` - -The script needs a few environment variables such as `GRIDNAME` (exported in the previous step), `CDATE` (date of creation of the mapping files which can be found at the end of each `map_*` file before the file extension) and `CSMDATA` (the path where the raw data for the surface file creation is stored) before executing the script. - -```sh -export CDATE=`date +%y%m%d` -export CSMDATA="/p/scratch/cslts/shared_data/rlmod_eCLM/inputdata/" # this works for JSC users only, for non JSC users see below - -# generate surfdata -./mksurfdata.pl -r usrspec -usr_gname $GRIDNAME -usr_gdate $CDATE -l $CSMDATA -allownofile -y 2000 -crop -``` - -```{tip} -The `-crop` option used in `./mksurfdata.pl` will create a surface file for BGC mode with all crops active. If you want to use SP mode, you should run without this option. - -Use `./mksurfdata.pl -help` to display all options possible for this script. -For example: -- `hirespft` - If you want to use the high-resolution pft dataset rather than the default lower resolution dataset (low resolution is at half-degree, high resolution at 3minute), hires is only available for present-day (2000) -``` - -**For non JSC users**: -Non JSC users can download the raw data from HSC datapub using this link or from the official rawdata repository using `wget` before submitting the script. - -```sh -wget "RAWDATA_LINK"/"NAME_OF_RAWDATA_FILE" --no-check-certificate # repeat this for every rawdata file -``` - -You will see a "Successfully created fsurdat files" message displayed at the end of the script if it ran through. - -The output will be a netCDF file similar to `surfdata_"your grid name"_hist_78pfts_CMIP6_simyr2000_c"yymmdd".nc`. - -**Congratulations!** You successfully created your surface data file! In the next step you will learn how to create your own atmospheric forcings. 
\ No newline at end of file diff --git a/docs/users_guide/case_creation/5_modifications_surface_domain_file.md b/docs/users_guide/case_creation/5_modifications_surface_domain_file.md deleted file mode 100644 index 17f7802..0000000 --- a/docs/users_guide/case_creation/5_modifications_surface_domain_file.md +++ /dev/null @@ -1,39 +0,0 @@ -# Modification of the surface and domain file - - -## Handling negative longitudes and the landmask - -eCLM does not accept negative longitudes for the surface and domain file. In case you used a grid file to create your SCRIP grid file which used negative longitudes (instead of creating it through the `mkscripgrid.py` script), these need to be converted into the 0 to 360 degree system used by eCLM. You can use the `mod_domain.sh` script in the main directory `eCLM_static_file_workflow` to do this. - -Before executing the script adapt the paths to your [surface file](https://hpscterrsys.github.io/eCLM/users_guide/case_creation/4_create_surface_file.html#create-surface-file) and [domain file](https://hpscterrsys.github.io/eCLM/users_guide/case_creation/3_create_domain_file.html#create-domain-file). - -`mod_domain.sh` also replaces the `mask` and `frac` variables of your domain file with the information from a `landmask_file` (this `landmask.nc` file should contain the 2D variables `mask` and `frac` that contain your landmask (value 1 for land and 0 for ocean)). This step should not be necessary as you already swapped the `domain.lnd.*.nc` and `domain.ocn.*.nc` file when creating them. However, for some domains (e.g. the ICON grid) the mask from the rawdata may not correctly represent your landmask. Additionally, if you want to replace the surface parameters with higher resolution data (see below), you may need to update the landmask as well to match your surface parameters (e.g. coast lines may have changed). 
- -## Modifying surface parameters - -You may want to modify the default soil, landuse or other land surface data on the surface file if you have measurements or a different data source of higher resolution or similar available. -You can do this by accessing the relevant variables on the surface file. - -Variables you want to modify may include (non-exhaustive list): - -Soil: -- `PCT_SAND`: percentage sand at soil levels (10 levels are considered) -- `PCT_CLAY`: percentage clay at soil levels (10 levels are considered) -- `ORGANIC`: organic matter density at soil levels (10 levels are considered) - -Landuse at the landunit level ([Fig. 2](https://hpscterrsys.github.io/eCLM/users_guide/introduction_to_eCLM/introduction.html#fig2)): -- `PCT_NATVEG`: total percentage of natural vegetation landunit -- `PCT_CROP`: total percentage of crop landunit -- `PCT_URBAN`: total percentage of urban landunit -- `PCT_LAKE`: total percentage of lake landunit -- `PCT_GLACIER`: total percentage of glacier landunit -- `PCT_WETLAND`: total percentage of wetland landunit - -Types of crop and natural vegetation at the patch level ([Fig. 
2](https://hpscterrsys.github.io/eCLM/users_guide/introduction_to_eCLM/introduction.html#fig2)): - -- `PCT_NAT_PFT`: percent plant functional type (PFT) on the natural veg landunit (% of landunit) (15 PFTs are considered see here for a list of PFTs) -- `PCT_CFT`: percent crop functional type (CFT) on the crop landunit (% of landunit) (2 CFTs are considered in SP mode, 64 CFTs are considered in BGC mode, see here for a list of CFTs) - -Land fraction: -- `LANDFRAC_PFT`: land fraction from PFT dataset -- `PFTDATA_MASK`: land mask from pft dataset, indicative of real/fake points \ No newline at end of file diff --git a/docs/users_guide/case_creation/6_create_atm_forcings.md b/docs/users_guide/case_creation/6_create_atm_forcings.md deleted file mode 100644 index 539fea3..0000000 --- a/docs/users_guide/case_creation/6_create_atm_forcings.md +++ /dev/null @@ -1,95 +0,0 @@ -# Create atmospheric forcing files - -There exist a few global standard forcing data sets that can be downloaded together with their domain file from the official data repository via this link. For beginners, it is easiest to start with these existing data files. - -- GSWP3 NCEP forcing dataset -- CRUNCEP dataset -- Qian dataset - - -To run with your own atmospheric forcing data, you need to set them up in NetCDF format that can be read by the atmospheric data model `DATM`. - -There is a list of eight variables that are expected to be on the input files. The names and units can be found in the table below (in the table TDEW and SHUM are optional fields that can be used in place of RH). The table also lists which of the fields are required and if not required what the code will do to replace them. If the names of the fields are different or the list is changed from the standard list of eight fields: FLDS, FSDS, PRECTmms, PSRF, RH, TBOT, WIND, and ZBOT, the resulting streams file will need to be modified to take this into account. 
- -```{list-table} Atmospheric forcing fields adapted from CESM1.2.0 User's Guide Documentation. -:header-rows: 1 -:name: tab1 - -* - Short-name - - Description - - Unit - - Required? - - If NOT required how replaced -* - FLDS - - incident longwave - - W/m2 - - No - - calculates based on Temperature, Pressure and Humidity (NOTE: The CRUNCEP data includes LW down, but by default we do NOT use it -- we use the calculated values) -* - FSDS - - incident solar - - W/m2 - - Yes - - / -* - FSDSdif - - incident solar diffuse - - W/m2 - - No - - based on FSDS -* - FSDSdir - - incident solar direct - - W/m2 - - No - - based on FSDS -* - PRECTmms - - precipitation - - mm/s - - Yes - - / -* - PSRF - - pressure at the lowest atm level - - Pa - - No - - assumes standard-pressure -* - RH - - relative humidity at the lowest atm level - - \% - - No - - can be replaced with SHUM or TDEW -* - SHUM - - specific humidity at the lowest atm level - - kg/kg - - Optional in place of RH - - can be replaced with RH or TDEW -* - TBOT - - temperature at the lowest atm level - - K (or can be C) - - Yes - - / -* - TDEW - - dew point temperature - - K (or can be C) - - Optional in place of RH - - can be replaced with RH or SHUM -* - WIND - - wind at the lowest atm level - - m/s - - Yes - - / -* - ZBOT - - observational height - - m - - No - - assumes 30 meters -``` - -All of the variables should be dimensioned: time, lat, lon, with time units in the form of "days since yyyy-mm-d hh:mm:ss" and a calendar attribute that can be "noleap" or "gregorian". There should be separate files for each month called `YYYY-MM.nc` where YYYY-MM corresponds to the four digit year and two digit month with a dash in-between. - -For single point cases where the atmospheric data has hourly or half-hourly temporal resolution, all data can be in the same monthly files (`YYYY-MM.nc`). 
For regional cases and if the data is at coarser temporal resolution, different time interpolation algorithms will be used for solar radiation, precipitation and the remaining input data so the data needs to be split into three files and placed into three different folders (`Precip/`, `Solar/`, `TPHWL/`). You also need a domain file to go with your atmospheric data which can be the same as the land domain file created in the previous workflow if the spatial resolution of your atmospheric data is the same as of your specified domain. - -For JSC users, an example python script to create forcings for a single-point case based on hourly observations can be found under `/p/scratch/cslts/shared_data/rlmod_eCLM`. - -Simply copy the script to your directory and adapt it to your own data. - -```sh -cp /p/scratch/cslts/shared_data/rlmod_eCLM/createnetCDF_forc_hourly_input.py $HOME -``` \ No newline at end of file diff --git a/docs/users_guide/case_creation/README.md b/docs/users_guide/case_creation/README.md index 30300e6..80f8449 100644 --- a/docs/users_guide/case_creation/README.md +++ b/docs/users_guide/case_creation/README.md @@ -1,5 +1,9 @@ # Creating a custom case +```{warning} +TODO +``` + This workflow will guide you through creating your own input datasets at a resolution of your choice for eCLM simulations. Throughout this process, you will use a range of different scripts to create the necessary files. @@ -22,13 +26,3 @@ cd /p/project1/projectID/user1 # replace projectID with your compute project and git clone https://github.com/HPSCTerrSys/eCLM_static_file_workflow.git ``` -Sourcing the environment file that is contained in the repository will load all the required software modules. - -```sh -cd eCLM_static_file_workflow/ -source jsc.2023_Intel.sh -``` -You are now ready to start with the workflow. 
- -```{tableofcontents} -``` \ No newline at end of file From 805c8f52c67a2b6c4afc91741ab2aa2e6e592fa3 Mon Sep 17 00:00:00 2001 From: kvrigor Date: Mon, 6 Oct 2025 14:21:29 +0200 Subject: [PATCH 4/9] Returned source installation guide and added Quickstart section --- docs/_toc.yml | 6 ++- .../installation/source_installation.md | 37 +++++++++++++++++++ .../README.md => introduction/Quickstart.md} | 0 3 files changed, 41 insertions(+), 2 deletions(-) create mode 100644 docs/users_guide/installation/source_installation.md rename docs/users_guide/{installation/README.md => introduction/Quickstart.md} (100%) diff --git a/docs/_toc.yml b/docs/_toc.yml index 3199bed..bf8f082 100644 --- a/docs/_toc.yml +++ b/docs/_toc.yml @@ -5,8 +5,8 @@ root: INDEX parts: - caption: Introduction chapters: - - file: users_guide/installation/README - title: Installing eCLM + - file: users_guide/introduction/Quickstart.md + title: Quickstart - file: users_guide/introduction/introduction title: Scientific Background @@ -30,6 +30,8 @@ parts: - caption: Developer's Guide chapters: + - file: users_guide/installation/source_installation + title: Building eCLM from source - url: https://hpscterrsys.github.io/eCLM/src title: eCLM Source Code Browser diff --git a/docs/users_guide/installation/source_installation.md b/docs/users_guide/installation/source_installation.md new file mode 100644 index 0000000..2a2a304 --- /dev/null +++ b/docs/users_guide/installation/source_installation.md @@ -0,0 +1,37 @@ +# Installing eCLM from source + +```{warning} +For advanced users. +``` + +## Requirements + +* MPI compilers (e.g. 
OpenMPI) +* CMake +* LAPACK +* [netCDF C and Fortran libraries](https://downloads.unidata.ucar.edu/netcdf) +* [PnetCDF](https://github.com/Parallel-NetCDF/PnetCDF) + +## Steps + +```sh +# Download eCLM +git clone https://github.com/HPSCTerrSys/eCLM.git +cd eCLM + +# Create eCLM install directory +mkdir install + +# Set compilers +export CC=mpicc FC=mpifort + +# Build and install eCLM +cmake -S src -B bld -DCMAKE_INSTALL_PREFIX=install +cmake --build bld --parallel +cmake --install bld +``` + +## Reference build scripts + +- [eCLM build on Ubuntu](https://github.com/HPSCTerrSys/eCLM/blob/4d567d2d68cac0fba977914b4a9c3ba199afd0ff/.github/workflows/CI.yml#L70-L121) +- [eCLM build on TSMP2](https://github.com/HPSCTerrSys/TSMP2/blob/master/cmake/BuildeCLM.cmake) diff --git a/docs/users_guide/installation/README.md b/docs/users_guide/introduction/Quickstart.md similarity index 100% rename from docs/users_guide/installation/README.md rename to docs/users_guide/introduction/Quickstart.md From 05071ce9406d0514027d912a1032d32bad5971b2 Mon Sep 17 00:00:00 2001 From: kvrigor Date: Mon, 6 Oct 2025 14:49:16 +0200 Subject: [PATCH 5/9] Staging changes to INDEX and Quickstart --- docs/INDEX.md | 9 ++++-- docs/users_guide/introduction/Quickstart.md | 34 +++++++-------------- 2 files changed, 17 insertions(+), 26 deletions(-) diff --git a/docs/INDEX.md b/docs/INDEX.md index 76c7c62..446dc29 100644 --- a/docs/INDEX.md +++ b/docs/INDEX.md @@ -1,8 +1,11 @@ -# eCLM Documentation +# Welcome to eCLM! -```{important} -**Welcome!** You are viewing the first version of the documentation for eCLM. This is a living document, which means it will be continuously updated and improved. Please check back regularly for the latest information and updates. +```{warning} +TODO ``` eCLM is based on version 5.0 of the Community Land Model ([CLM5](https://www.cesm.ucar.edu/models/clm)) with simplified infrastructure for build and namelist generation. 
The build system is handled entirely by CMake and namelists are generated through a small set of Python scripts. Similar to CLM5, eCLM is forced with meteorological data and uses numerous input streams on soil properties, land cover and land use, as well as complex parameter sets on crop phenology and plant hydraulics for simulations.
+An overview of eCLM is given in this [poster](https://virtual.oxfordabstracts.com/event/75166/submission/35), which was presented at [RSECon'25](https://rsecon25.society-rse.org).
+
+![eCLM poster](users_guide/images/rsecon25_eclm_poster.jpg)
diff --git a/docs/users_guide/introduction/Quickstart.md b/docs/users_guide/introduction/Quickstart.md
index 15bdf0e..58947c1 100644
--- a/docs/users_guide/introduction/Quickstart.md
+++ b/docs/users_guide/introduction/Quickstart.md
@@ -1,14 +1,10 @@
-# Installing eCLM
+# Quickstart
 
-## Requirements
-
-* MPI compilers (e.g. OpenMPI)
-* CMake
-* LAPACK
-* [netCDF C and Fortran libraries](https://downloads.unidata.ucar.edu/netcdf)
-* [PnetCDF](https://github.com/Parallel-NetCDF/PnetCDF)
+```{warning}
+TODO
+```
 
-## Method 1: Build eCLM through TSMP2 (recommended)
+## Build eCLM
 
 The easiest way to install eCLM is through [TSMP2 build system](https://github.com/HPSCTerrSys/TSMP2).
@@ -21,22 +17,14 @@ cd TSMP2 ./build_tsmp2.sh --eCLM ``` -## Method 2: Building from source (for advanced users) +## Run Wuestebach test case ```sh -# Download eCLM -git clone https://github.com/HPSCTerrSys/eCLM.git -cd eCLM - -# Create eCLM install directory -mkdir install +git clone https://icg4geo.icg.kfa-juelich.de/ExternalReposPublic/tsmp2-static-files/extpar_eclm_wuestebach_sp.git +cd extpar_eclm_wuestebach_sp/static.resources +generate_wtb_namelists.sh 1x1_wuestebach +cd 1x1_wuestebach -# Set compilers -export CC=mpicc FC=mpifort - -# Build and install eCLM -cmake -S src -B bld -DCMAKE_INSTALL_PREFIX=install -cmake --build bld --parallel -cmake --install bld +mpirun -np 1 eclm.exe ``` From b8b3a71bf842b0f3b67a48d311d4426cb98a64f7 Mon Sep 17 00:00:00 2001 From: kvrigor Date: Thu, 9 Oct 2025 13:43:59 +0200 Subject: [PATCH 6/9] Quickstart first draft --- docs/users_guide/introduction/Quickstart.md | 81 ++++++++++++++++++--- 1 file changed, 70 insertions(+), 11 deletions(-) diff --git a/docs/users_guide/introduction/Quickstart.md b/docs/users_guide/introduction/Quickstart.md index 58947c1..8498b46 100644 --- a/docs/users_guide/introduction/Quickstart.md +++ b/docs/users_guide/introduction/Quickstart.md @@ -1,30 +1,89 @@ -# Quickstart +# First Tutorial -```{warning} -TODO +Welcome! This guide walks you through the basic steps in setting up eCLM. eCLM is typically used on an HPC cluster. +Still, you can run small eCLM test cases (e.g. single-point domains) on your laptop. This tutorial aims to teach you +just that: setting up and running a small eCLM test case on your laptop. This workflow remains more or less the same +once you move to an HPC cluster to do some serious eCLM runs. + +An HPC environment and a personal computing environment (*e.g.* your laptop) use different sets of tools to accomplish +the same task. 
However, the goal is not to overwhelm you with tool usage (which could be an interesting exercise in itself),
+but rather to focus on a common workflow that gets you up to speed with eCLM:
+
+[1. Load eCLM dependencies](#my-multi-word-header)
+[2. Build eCLM](#my-multi-word-header)
+[3. Generate namelists](#my-multi-word-header)
+[4. Run eCLM](#my-multi-word-header)
+
+## Prerequisites
+
+**This guide has been written to work on an Ubuntu system**. For Windows/Mac users, I suggest setting up a virtual
+[Ubuntu 24.04 LTS] OS first through a container app (*e.g.* [Podman] or [Docker]).
+
+**Users are also expected to be familiar with using command-line interfaces (CLI).** For GUI users, unfortunately the CLI is
+most of the time the only way to use an HPC cluster. Consider this as preparation for using HPC! You don't have to be
+a CLI wizard; for starters you just need to know how to run your local terminal/console app and what basic commands
+such as `cd`, `ls`, `pwd`, and `cat` do. If you want a refresher, check out the [beginner-friendly shell tutorial by MIT].
+It will arm you with more than enough info to go through this tutorial.
+
+## 1. Load eCLM dependencies
+
+eCLM requires CMake, a Fortran compiler, an MPI library, and NetCDF. On an HPC cluster these libraries are typically
+installed already. In our case, we need to install them ourselves:
+
+```sh
+# Install basic utilities
+sudo apt-get install libxml2-utils cmake
+
+# Install Fortran and MPI compilers
+sudo apt-get install gfortran openmpi-bin libopenmpi-dev
+
+# Install NetCDF libraries
+sudo apt-get install netcdf-bin libnetcdf-dev libnetcdff-dev libpnetcdf-dev
+```
 
-## Build eCLM
+## 2. Build eCLM
 
-The easiest way to install eCLM is through [TSMP2 build system](https://github.com/HPSCTerrSys/TSMP2).
+First, specify a folder where you want eCLM to be installed.
+ +```sh +eCLM_INSTALL_DIR=${HOME}/eCLM # you can change this to any directory +mkdir -p ${eCLM_INSTALL_DIR} # create eCLM install folder +``` + +Get the [TSMP2 build system](https://github.com/HPSCTerrSys/TSMP2). ```sh -# Download TSMP2 git clone https://github.com/HPSCTerrSys/TSMP2.git cd TSMP2 +``` -# Build and install eCLM -./build_tsmp2.sh --eCLM +```sh +./build_tsmp2.sh eCLM --install-dir=${eCLM_INSTALL_DIR} ``` -## Run Wuestebach test case +## 3. Generate namelists + ```sh git clone https://icg4geo.icg.kfa-juelich.de/ExternalReposPublic/tsmp2-static-files/extpar_eclm_wuestebach_sp.git cd extpar_eclm_wuestebach_sp/static.resources generate_wtb_namelists.sh 1x1_wuestebach -cd 1x1_wuestebach +``` + +The last command should generate ... + +## 4. Run eCLM -mpirun -np 1 eclm.exe +```sh +cd 1x1_wuestebach +mpirun -np 1 ${eCLM_INSTALL_DIR}/bin/eclm.exe ``` +## Next Steps + +[Podman]: https://docs.podman.io/en/latest/Tutorials.html +[Docker]: https://docs.docker.com/get-started +[VirtualBox]: https://www.virtualbox.org +[UTM]: https://mac.getutm.app +[Ubuntu 24.04 LTS]: https://hub.docker.com/_/ubuntu +[beginner-friendly shell tutorial by MIT]: https://missing.csail.mit.edu/2020/course-shell From 55ca1b990c43d7303b2fa9dba6cc34acdb930871 Mon Sep 17 00:00:00 2001 From: kvrigor Date: Thu, 9 Oct 2025 13:46:50 +0200 Subject: [PATCH 7/9] Renamed 'Quickstart' to 'First Tutorial' --- docs/_toc.yml | 4 ++-- .../introduction/{Quickstart.md => first_tutorial.md} | 0 2 files changed, 2 insertions(+), 2 deletions(-) rename docs/users_guide/introduction/{Quickstart.md => first_tutorial.md} (100%) diff --git a/docs/_toc.yml b/docs/_toc.yml index bf8f082..e491aa2 100644 --- a/docs/_toc.yml +++ b/docs/_toc.yml @@ -5,10 +5,10 @@ root: INDEX parts: - caption: Introduction chapters: - - file: users_guide/introduction/Quickstart.md - title: Quickstart - file: users_guide/introduction/introduction title: Scientific Background + - file: users_guide/introduction/first_tutorial.md + title: 
First Tutorial
 
 - caption: User's Guide
   chapters:
diff --git a/docs/users_guide/introduction/Quickstart.md b/docs/users_guide/introduction/first_tutorial.md
similarity index 100%
rename from docs/users_guide/introduction/Quickstart.md
rename to docs/users_guide/introduction/first_tutorial.md

From dfcd2a79b5c11707e94a5d9a486dbe346eb16c76 Mon Sep 17 00:00:00 2001
From: kvrigor
Date: Fri, 10 Oct 2025 09:55:30 +0200
Subject: [PATCH 8/9] Fixed broken internal links

---
 docs/users_guide/introduction/first_tutorial.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/users_guide/introduction/first_tutorial.md b/docs/users_guide/introduction/first_tutorial.md
index 8498b46..1879ca7 100644
--- a/docs/users_guide/introduction/first_tutorial.md
+++ b/docs/users_guide/introduction/first_tutorial.md
@@ -9,10 +9,10 @@ An HPC environment and a personal computing environment (*e.g.* your laptop) use
 the same task. However, the goal is not to overwhelm you with tool usage (which could be an interesting exercise in itself),
 but rather to focus on a common workflow that gets you up to speed with eCLM:
 
-[1. Load eCLM dependencies](#my-multi-word-header)
-[2. Build eCLM](#my-multi-word-header)
-[3. Generate namelists](#my-multi-word-header)
-[4. Run eCLM](#my-multi-word-header)
+[1. Load eCLM dependencies](./first_tutorial.md#load-eclm-dependencies)
+[2. Build eCLM](#build-eclm)
+[3. Generate namelists](#generate-namelists)
+[4. Run eCLM](#run-eclm)
 
 ## Prerequisites

From f741b82f98d4d2e22ab09fa8fba7cea482f8a05b Mon Sep 17 00:00:00 2001
From: kvrigor
Date: Fri, 10 Oct 2025 10:10:40 +0200
Subject: [PATCH 9/9] Removed nonfunctional anchor links

---
 docs/users_guide/introduction/first_tutorial.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/users_guide/introduction/first_tutorial.md b/docs/users_guide/introduction/first_tutorial.md
index 1879ca7..ba9b698 100644
--- a/docs/users_guide/introduction/first_tutorial.md
+++ b/docs/users_guide/introduction/first_tutorial.md
@@ -9,10 +9,10 @@ An HPC environment and a personal computing environment (*e.g.* your laptop) use
 the same task. However, the goal is not to overwhelm you with tool usage (which could be an interesting exercise in itself),
 but rather to focus on a common workflow that gets you up to speed with eCLM:
 
-[1. Load eCLM dependencies](./first_tutorial.md#load-eclm-dependencies)
-[2. Build eCLM](#build-eclm)
-[3. Generate namelists](#generate-namelists)
-[4. Run eCLM](#run-eclm)
+1. Load eCLM dependencies
+2. Build eCLM
+3. Generate namelists
+4. Run eCLM
 
 ## Prerequisites
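
---

The tutorial drafted in the patches above installs dependencies, builds eCLM, generates namelists, and runs a test case. Before the build step, a quick sanity check that the toolchain is actually on `PATH` can save a failed CMake configure. Below is a minimal sketch, not part of the patches: the tool names are assumptions based on the Ubuntu packages from step 1 (`nc-config` and `nf-config` ship with the NetCDF development packages), so adjust the list to your own toolchain.

```sh
#!/bin/sh
# Preflight check before building eCLM: verify that the build
# dependencies installed in step 1 are on PATH. The tool names are
# assumptions drawn from the Ubuntu packages used in this tutorial.
missing=0
for tool in cmake gfortran mpicc mpifort nc-config nf-config; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found:   $tool"
  else
    echo "missing: $tool"
    missing=$((missing + 1))
  fi
done
echo "checked 6 tools, $missing missing"
```

If anything is reported missing, re-run the `apt-get install` commands from step 1 before invoking `build_tsmp2.sh`.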