Ignore rules (from the repository's `.gitignore`):
```
profile.bash
**.DS_Store
**/__pycache__
docs/_build/
docs/book/_build/
```
# WRFotron
Installation instructions

WRFotron is a collection of shell scripts that automates WRF(-Chem) simulations (on a cluster with a job scheduling system), including the necessary preprocessing steps. A special focus is the ability to chain simulations, making it possible to create free-running (i.e., no nudging) simulations of chemistry-aerosol-meteorology interactions over longer time periods.
## Contributing
We look forward to receiving your [new issue report](https://mbees.med.uni-augsburg.de/gitlab/mbees/wrfotron/-/issues/new).
If you'd like to contribute source code directly, please [create a fork](https://mbees.med.uni-augsburg.de/gitlab/mbees/wrfotron), make your changes, and then [submit a merge request](https://mbees.med.uni-augsburg.de/gitlab/mbees/wrfotron/-/merge_requests/new) to the original project.
## License and acknowledgment
WRFotron is open source and available under the GNU General Public License (GPL), version 3. If WRFotron was used in a scientific publication, please acknowledge it as follows:
*We acknowledge the use of WRFotron, created by the Chair of Model-Based Environmental Exposure Science at the University of Augsburg, Germany, to automatise WRF(-Chem) simulations.*
## Prerequisites
* A working installation of WRF (and WRF-Chem, separately) v4.0 or greater
* A working installation of WPS v4.0 or greater
* Various preprocessing tools for WRF-Chem (e.g., mozbc, anthro_emis, megan, ...) as required
* All required input data (meteo, geog, emissions, ...)
## General naming conventions
Multiple (chained) simulations that serve a certain purpose, e.g., for a field campaign, are called an `experiment`. The `experiment` name is the main identifier when working with WRFotron. The set of configuration options (namelist.wps, namelist.input, anthro_emis.inp, ...) for an `experiment` is called a `blueprint` and resides in the subfolder `blueprints/<experiment>`.
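For illustration, a blueprint directory might look like this (the experiment name is a placeholder; the symlinks correspond to the machine-specific setup described under Installation below):
```
blueprints/
└── my_experiment/
    ├── namelist.wps
    ├── namelist.input
    ├── anthro_emis.inp
    ├── batch_preambles -> machine_specific/<machine>/batch_preambles
    └── config.bash -> machine_specific/<machine>/config.bash
```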
## Main commands and scripts
`master.bash`: main script to start a simulation for an existing `experiment`.
`batch.bash`: main script to start a chain of simulations for an existing `experiment`. It basically calls `master.bash` multiple times.
These scripts can be called without arguments for help:
```
<me@mymachine>:. master.bash
Call with arguments <experiment> <year (YYYY)> <month (MM)> <day (DD)> <hour (hh)> <forecast time (h)> <spinup time (h)>
or <experiment> <year (YYYY)> <month (MM)> <day (DD)> <hour (hh)> <forecast time (h)> <spinup time (h)> <PID of job dependency>
* <experiment> can be a shortcut name (if experiment directory is found in subdirectory 'blueprints',
and you are calling it from the shell), or the absolute path to the experiment's settings directory.
* <spinup time> needs to be a multiple of meteoInc.
-d dry run, prepare only, do not submit or run
-b do not use batch system, execute directly in shell
```
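As a hedged sketch of how chaining via the job-dependency argument might look (the experiment name, dates, and job ID are placeholders; `batch.bash` automates this pattern):
```bash
# first 24 h segment with 6 h spin-up
. master.bash example 2015 10 12 00 24 06
# suppose the scheduler reported job ID 12345 for the final job of that segment;
# the next segment can then be submitted with a dependency on it
. master.bash example 2015 10 13 00 24 06 12345
```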
## General workflow
When calling `master.bash`, a simulation directory is prepared in `$workDir`, and four jobs are submitted to the job scheduler, each depending on the previous one:
- `pre.bash`: run WPS (and everything to prepare the -Chem part), create wrfinput and wrfbdy (and other) files in `$workDir`
- `main.bash`: run WRF(-Chem)
- `staging.bash`: move WRF(-Chem) output into the staging location at `$stagingDir`, where it waits to be postprocessed
- `post.bash`: at minimum, move output from `$stagingDir` to `$archiveDir`, possibly apply postprocessing actions on the output
These four general phases may consist of several subjobs, which can be seen (and adapted) in the `jobs` subdirectory.
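The three directories involved are set in the experiment config; purely as an illustrative sketch (these paths are made-up placeholders):
```bash
# hypothetical values; the real ones live in the experiment's config.bash
workDir=/scratch/$USER/wrfotron/work        # pre.bash and main.bash run here
stagingDir=/scratch/$USER/wrfotron/staging  # staging.bash moves raw output here
archiveDir=/archive/$USER/wrfotron          # post.bash moves final output here
```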
## Installation
1. Get WRFotron
   * the subfolder `wrfotron` now contains your WRFotron installation
2. Set up a machine profile
   * create a file with all commands to be executed prior to all runs, e.g., loading modules, setting ulimits, etc. (see the sketch after this list):
     `machine_profiles/your_machine`
   * link it to the WRFotron root level as `profile.bash`:
     `ln -s machine_profiles/your_machine profile.bash`
3. Experiment setup
   * create a new experiment folder in `blueprints`, following the existing blueprints as examples:
     `cp -r blueprints/<chosen example> blueprints/<your experiment name>`
   * create an experiment config for your machine
   * link it into the experiment folder:
     `ln -s blueprints/<your experiment name>/machine_specific/<machine>/batch_preambles blueprints/<your experiment name>/`
     `ln -s blueprints/<your experiment name>/machine_specific/<machine>/config.bash blueprints/<your experiment name>/`
4. Test
   * make a run *(dates depend on the available data)*
     (`$workDir` is set in the experiment config)
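As a sketch of what a machine profile might contain (the module names and settings below are placeholders, not WRFotron requirements):
```bash
# machine_profiles/your_machine -- hypothetical example
# commands executed prior to all runs
module purge
module load intel openmpi netcdf   # placeholder module names; use your cluster's
ulimit -s unlimited                # WRF commonly needs a large stack size
export OMP_NUM_THREADS=1           # placeholder environment setting
```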
### Temporary ARC4 instructions
```bash
git clone git@git.rz.uni-augsburg.de:mmbees-a/wrfotron.git
cd wrfotron
git checkout leeds_setup
# alias as leeds
ln -s machine_profiles/arc4 profile.bash
cd blueprints/leeds/
ln -s machine_specific/arc4/batch_preambles batch_preambles
ln -s machine_specific/arc4/config.bash config.bash
cd ../..
. master.bash leeds 2015 10 12 00 24 06  # 24 h forecast with 6 h spin-up
```
### Comments/questions
- How can we use a `main_restart.bash` equivalent here (`jobs/main/02-06`)?
## To create the documentation locally
1. Create the conda environment for the docs (only do this once):
```bash
conda env create --file environment.yml
```
2. Activate the conda environment:
```bash
conda activate wrfotron_docs
```
3. Build the docs locally:
```bash
jupyter-book build docs
# now you can view the book in your browser
```
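Assuming jupyter-book's default output layout, the built book can then be opened directly:
```bash
# the path assumes jupyter-book's default build directory
xdg-open docs/_build/html/index.html   # Linux
open docs/_build/html/index.html       # macOS
```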
### Crontab script
- This script will change the last-accessed date for all the specified directories and files underneath that path.
- Change permissions to 755 on `.not_expire.sh` (`chmod 755 ~/.not_expire.sh`).
- Use the crontab command to edit the crontab file: `crontab -e`
- Then add a line: `0 0 1 */2 * ~/.not_expire.sh`
- This sets up a cron job that will automatically touch the files (and thus reset their last-accessed time) at 00:00 on the 1st of every second month.
- It runs on the login nodes.
```bash
#!/bin/bash
# ~/.not_expire.sh: reset the last-accessed time of everything under /nobackup/$USER
cd /nobackup/${USER}
# -a updates only the access time; -h touches symlinks themselves rather than their targets
find . -exec touch -ah {} +
```
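To verify the setup (assuming the script was saved as `~/.not_expire.sh` as above):
```bash
~/.not_expire.sh   # one-off manual run to confirm the script works
crontab -l         # confirm the new cron entry is registered
```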
Acquire meteorological NCEP GFS files.