merge geomar dev into latest release #1352
base: release
Conversation
compiles and starts, but misses (as expected) information in existing JSBACH restart files. Cold start not yet implemented
links all currently known files correctly; running not yet tested as info from Shradda is still missing
draft version of ECHAM76-LMU that starts up correctly but still fails due to a land-sea mask mismatch
fixed restart setup in a hacky way (see foci.yaml); added a new runscript for FOCIOIFS
…geomar_dev excludes the conflicting oasis.py
This should be the only difference between geomar_dev and release that matters
allow different streams for restart and output for ECHAM and JSBACH
with FOCI-MOPS with the 12-tile edition
merge geomar dev into geomar branch
separate echam6 namelist templates for foci and foci_lmu
before merging this branch into geomar_dev, differences were too large for an efficient direct merge
see runscripts/foci/README_important.txt for details; add temraw output for SOLVe project in foci-moz
compiling, running and restart now work
added missing coupling dir for foci-moz
…-lmod setup on glogin
… to CMakeLists.txt); XIOS still failing. Fixed RESCUE postprocessing by adding the mandatory new options -l and -a to echam_postprocessing_RESCUE.sh that came in when MOZ was added
mandresm left a comment:
@seb-wahl, as mentioned in our in-person meeting, I finished this review even though this is not meant to be merged into release, so that we can use some of these comments as a basis for the PR of the geomar branch into release.
active: ${lcarbon}

choose_jsbach_with_hd:
    yes:
Would it make sense to change this to true/false? Just to remain consistent with all of our other switches, which normally use booleans.
choose_icon.version:
    2.6.6-nwp:
        input_dir: /pool/data/ICON
        icon_pool: ${input_dir}/grids/private/mpim/icon_preprocessing/source
        grid_dir: ${input_dir}/grids
        aerosol_dir: ${icon_pool}/aerosol_29-06-2018/kinne/${resolution}_${icon.grid_label}
        jsbach_ic_dir: ${input_dir}/grids/private/jsbach/mpim/${grid_id}/land/r0002
        jsbach_forcing_dir: ${jsbach_ic_dir}
I would move most of this block into the icon section itself, instead of keeping it inside computer. The only parameter that seems computer-dependent is input_dir, and that could probably be made independent by setting it to:
input_dir: ${computer.pool_dir}/ICON

destination: oifs-48r1
with_xios: false
major_version: 48r1
uwavein: uwavein
specwavein: specwavein
sfcwindin: sfcwindin
cdwavein: cdwavein
Suggested change (delete these lines):

uwavein: uwavein
specwavein: specwavein
sfcwindin: sfcwindin
cdwavein: cdwavein
These files shouldn't be needed here as they are added when wam is turned on:
esm_tools/configs/components/oifs/oifs.yaml, lines 915 to 927 in 51ef371:
choose_wam:
    1:
        wam_number: "1"
        add_input_files:
            wam_grid_tables: wam_grid_tables
            wam_subgrid_0: wam_subgrid_0
            wam_subgrid_1: wam_subgrid_1
            wam_subgrid_2: wam_subgrid_2
            uwavein: uwavein
            specwavein: specwavein
            sfcwindin: sfcwindin
            cdwavein: cdwavein
OIFS_CFLAGS: '"-fp-model precise -O3 -g -traceback -qopt-report=0 -fpe0 -qopenmp -march=core-avx2 -mtune=core-avx2"' # -qoverride-limits -fast-transcendentals -m64 -fma -pc64"'
OIFS_CCDEFS: '"LINUX LITTLE INTEGER_IS_INT _ABI64 BLAS _OPENMP"'
choose_compiler_mpi:
    intel2022_openmpi:
Check that this does not break anything for AWICM3 (in the esm_tests on GitHub).
# due to duplicate namelists (occur with ICON) we get an
# IndexError: list index out of range
# workaround by seb-wahl
# TODO: find a better solution
provenance_comment = f"no provenance info"
if 0 <= indx < len(config):
    provenance = getattr(config[indx], "provenance", [None])[-1]
    if provenance:
        provenance_comment = f"{provenance['yaml_file']},line:{provenance['line']},col:{provenance['col']}"
I need to debug this; there might be a way of getting rid of this problem. It's strange that this does not happen for ECHAM.
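For reference, here is the same defensive lookup pulled out as a standalone helper; the function name is hypothetical and this is only a sketch of the pattern the workaround uses, not existing esm_tools code:

def provenance_comment_for(config, indx):
    # Fall back to a placeholder when the index is out of range (as happens
    # with duplicate namelists) or when no provenance is attached to the entry.
    comment = "no provenance info"
    if 0 <= indx < len(config):
        provenance = getattr(config[indx], "provenance", [None])[-1]
        if provenance:
            comment = (
                f"{provenance['yaml_file']},"
                f"line:{provenance['line']},col:{provenance['col']}"
            )
    return comment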
# Search for ``post_run_commands``s in the components
for component in config.keys():
    post_run_commands = config[component].get("post_run_commands")
    if isinstance(post_run_commands, list):
        for pr_command in post_run_commands:
            if isinstance(pr_command, str):
                extras.append(pr_command)
            else:
                user_error(
                    'Invalid type for "post_run_commands"',
                    (
                        f'"{type(pr_command)}" type is not supported for '
                        + f'elements of the "post_run_commands", defined in '
                        + f'"{component}". Please, define '
                        + '"post_run_commands" as a "list" of "strings" or a "list".'
                    ),
                )
    elif isinstance(post_run_commands, str):
        extras.append(post_run_commands)
    elif post_run_commands == None:
        continue
    else:
        user_error(
            'Invalid type for "post_run_commands"',
            (
                f'"{type(post_run_commands)}" type is not supported for '
                + f'"post_run_commands" defined in "{component}". Please, define '
                + '"post_run_commands" as a "string" or a "list" of "strings".'
            ),
        )
return extras
Suggested replacement for the block above:

# Search for ``post_run_commands``s in the components
for component in config.keys():
    post_run_commands = config[component].get("post_run_commands")
    if isinstance(post_run_commands, str):
        post_run_commands = [post_run_commands]
    elif isinstance(post_run_commands, list):
        pass
    elif post_run_commands == None:
        continue
    else:
        user_error(
            'Invalid type for "post_run_commands"',
            (
                f'"{type(post_run_commands)}" type is not supported for '
                + f'"post_run_commands" defined in "{component}". Please, define '
                + '"post_run_commands" as a "string" or a "list" of "strings".'
            ),
        )
    for pr_command in post_run_commands:
        if isinstance(pr_command, str):
            extras.append(pr_command)
        else:
            user_error(
                'Invalid type for "post_run_commands"',
                (
                    f'"{type(pr_command)}" type is not supported for '
                    + f'elements of the "post_run_commands", defined in '
                    + f'"{component}". Please, define '
                    + '"post_run_commands" as a "list" of "strings" or a "list".'
                ),
            )
return extras
TODO for @mandresm: make it the same for pre_run_commands and generalize it in a single function that can be called both for pre and post run.
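A rough sketch of what that single function could look like, written here as a free function (in esm_tools it would presumably be a method on the batch-system class); the name get_commands is taken from the suggestion further down, and everything else is an assumption based on the suggested replacement above, not existing code:

def get_commands(config, commands_key):
    # Collect e.g. "pre_run_commands" or "post_run_commands" from every
    # component, accepting a single string or a list of strings.
    # user_error is assumed to be available here, as in the code above.
    extras = []
    for component in config.keys():
        commands = config[component].get(commands_key)
        if isinstance(commands, str):
            commands = [commands]
        elif commands is None:
            continue
        elif not isinstance(commands, list):
            user_error(
                f'Invalid type for "{commands_key}"',
                f'"{type(commands)}" type is not supported for "{commands_key}" '
                f'defined in "{component}". Please, define "{commands_key}" as a '
                '"string" or a "list" of "strings".',
            )
        for command in commands:
            if isinstance(command, str):
                extras.append(command)
            else:
                user_error(
                    f'Invalid type for "{commands_key}"',
                    f'"{type(command)}" type is not supported for elements of '
                    f'"{commands_key}" defined in "{component}". Please, define '
                    f'"{commands_key}" as a "list" of "strings".',
                )
    return extras

Both call sites would then reduce to get_commands(config, "pre_run_commands") and get_commands(config, "post_run_commands").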
| ]["next_submit"] | ||
|
|
||
| # extra entries for each subjob | ||
| post_run_commands = batch_system.get_post_run_commands(config) |
Suggested change:

current:   post_run_commands = batch_system.get_post_run_commands(config)
suggested: post_run_commands = batch_system.get_commands(config, "post_run_commands")
TODO for @mandresm: change it to something like this.
self.process_ordering = full_config[name]["process_ordering"]
self.coupled_execs = []
for exe in self.process_ordering:
    self.coupled_execs.append(full_config[exe]["executable"])
self.runtime = full_config["general"]["runtime"][5]
self.nb_of_couplings = 0
if "coupling_target_fields" in full_config[self.name]:
    for restart_file in list(full_config[self.name]["coupling_target_fields"]):
        self.nb_of_couplings += len(
            list(full_config[self.name]["coupling_target_fields"][restart_file])
        )
if "coupling_input_fields" in full_config[self.name]:
    for restart_file in list(full_config[self.name]["coupling_input_fields"]):
        self.nb_of_couplings += len(
            list(full_config[self.name]["coupling_input_fields"])
        )
TODO for @mandresm: if this belongs exclusively to OASIS, move it to oasis.py.
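If it is moved, one possible shape for the extracted counting logic; the function name is hypothetical, and this sketch assumes the intent is to count the fields per restart file for both dictionaries, so it is not a drop-in copy of the quoted code:

def count_couplings(component_config):
    # Count the coupling fields declared under "coupling_target_fields" and
    # "coupling_input_fields" for a single component configuration.
    nb_of_couplings = 0
    for key in ("coupling_target_fields", "coupling_input_fields"):
        for restart_file, fields in component_config.get(key, {}).items():
            nb_of_couplings += len(list(fields))
    return nb_of_couplings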
if coupler_name == "yac":
    couplingfile = "coupling.xml"
else:
    couplingfile = None
TODO @mandresm: check for attributes in self. If none, then just return (make it OASIS-independent).
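A minimal sketch of that kind of guard; the method and attribute names are assumptions taken from the snippet above, not documented esm_tools interfaces:

def handle_coupling_file(self):
    # Look the coupling file up on self instead of special-casing the coupler
    # name, and return early when the coupler does not define one.
    couplingfile = getattr(self, "couplingfile", None)
    if couplingfile is None:
        return
    # ... continue with the coupler-agnostic handling of couplingfile ...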
!!! nproma = 32 in runscripts/foci/RESCUE/foci-mops-lmu-scenario.yaml !!!
This merge request has the following changes. I started by merging v6.54.5 into geomar_dev:
- dr_hook_...*var not found anymore after the removal of the environment_changes stuff.
- oasis.py: this needs to be reviewed carefully. Please see my comments in oasis.py.
- ${tiles}

Known bugs:
- finished_config.yaml remains in the directory where I run esm_master and I get the following "error" message: