23 commits
11694e5
Refactored program for single command usage.
XanthronWriter Jun 20, 2023
e55ddcf
Added some paths to ignore
XanthronWriter Jun 20, 2023
cdc366e
Removed requirements.txt
XanthronWriter Jun 30, 2023
2e1986b
Ramoved __main__.py
XanthronWriter Jun 30, 2023
92eb947
tidied up setup.py
XanthronWriter Jun 30, 2023
271a966
Added build section in readme
XanthronWriter Jun 30, 2023
6cc4bc7
Added some things to ignore
XanthronWriter Jun 30, 2023
560d9ee
Updatet Version to 1.0.0
XanthronWriter Jun 30, 2023
ff09871
do MPI only if mpi4py can be loaded
XanthronWriter Jul 7, 2023
b268884
example
XanthronWriter Jul 7, 2023
3a2cdcd
added execution_dir_prefix to wd
XanthronWriter Jul 7, 2023
bcb13da
Added Sampling
XanthronWriter Jul 7, 2023
b3e4c52
run sampler from commandline
XanthronWriter Jul 7, 2023
4aa46fa
Fixed description for installing the package from sours
XanthronWriter Aug 1, 2023
7e2646d
Separated run from library loading.
XanthronWriter Aug 1, 2023
b03fec6
Made mpi4py optional
XanthronWriter Aug 1, 2023
e241a20
Moved the save path generation to separate function
XanthronWriter Aug 1, 2023
cc20d3c
Improved sampler
XanthronWriter Aug 1, 2023
70ed191
Added job creation for the sampler
XanthronWriter Aug 1, 2023
ad6d920
Added sampler structs to __init__
XanthronWriter Aug 1, 2023
607142a
Updated Setup
XanthronWriter Aug 1, 2023
71065a3
Final changes to the sampler and job system
XanthronWriter Aug 28, 2023
95b4245
Updated the example and added an instruction
XanthronWriter Aug 28, 2023
7 changes: 6 additions & 1 deletion .gitignore
@@ -6,4 +6,9 @@
.idea/dictionaries/thehnen.xml
.idea/modules.xml
.idea/workspace.xml
.idea/propti.iml
.idea/propti.iml
__pycache__
build/
propti.egg-info/
dist/
*.log
10 changes: 10 additions & 0 deletions README.md
@@ -20,6 +20,16 @@ Basic functionality for data analysis of the inverse modelling process is provid

Documentation is provided in [Wiki](https://github.com/FireDynamics/propti/wiki). The folder 'examples' contains application examples tested with FDS version 6.7.

## Building the Package from Source

If you obtained this package directly from the repository, you can build and install it with the following commands:
```bash
python setup.py sdist
pip install --upgrade dist/propti-[version number].tar.gz
# alternatively
pip install --upgrade .
```

## Citation

PROPTI is listed to ZENODO to get Data Object Identifiers (DOI) and allow for citations in scientific papers. You can find the necessary information here:
39 changes: 39 additions & 0 deletions examples/sampling_lhs_01/cone_template.fds
@@ -0,0 +1,39 @@
&HEAD CHID='#CHID#', TITLE='Example from FDS user guide' /

&MESH IJK=3,3,3, XB=-0.15,0.15,-0.15,0.15,0.0,0.3, MPI_PROCESS = 0 /

&TIME T_END=600.0, WALL_INCREMENT=1, DT=0.05 /

&MISC SOLID_PHASE_ONLY=.TRUE., RESTART=.TRUE./

&SPEC ID='METHANE' /
&MATL ID='BLACKPMMA'
ABSORPTION_COEFFICIENT=2700.0
N_REACTIONS=1
A(1) = 8.5E12
E(1) = 188000
EMISSIVITY=#EMISSIVITY#
DENSITY=#DENSITY#
SPEC_ID='METHANE'
NU_SPEC=1.0
HEAT_OF_REACTION=870.0
CONDUCTIVITY = #CONDUCTIVITY#
SPECIFIC_HEAT = #SPECIFIC_HEAT# /

&SURF ID='PMMA SLAB'
COLOR='BLACK'
BACKING='INSULATED'
MATL_ID='BLACKPMMA'
THICKNESS=0.0085
EXTERNAL_FLUX=50 /

&VENT XB=-0.05,0.05,-0.05,0.05,0.0,0.0, SURF_ID = 'PMMA SLAB' /

&DUMP DT_DEVC=5.0 /

&DEVC XYZ=0.0,0.0,0.0, IOR=3, QUANTITY='WALL TEMPERATURE', ID='temp' /

&DEVC XYZ=0.0,0.0,0.0, IOR=3, QUANTITY='MASS FLUX', SPEC_ID='METHANE', ID='MF' /
&DEVC XYZ=0.0,0.0,0.0, IOR=3, QUANTITY='WALL THICKNESS', ID='thick' /

&TAIL /
49 changes: 49 additions & 0 deletions examples/sampling_lhs_01/input.py
@@ -0,0 +1,49 @@
# define variable 'params': sampling parameter set
# define variable 'setups': simulation setup set
# define variable 'optimiser': properties for the optimiser

# import just for IDE convenience
import propti as pr

# fix the chid
CHID = 'CONE'

# define the optimisation parameter
op1 = pr.Parameter(name='density', place_holder='DENSITY',
                   min_value=1e2, max_value=1e4)
op2 = pr.Parameter(name='emissivity', place_holder='EMISSIVITY',
                   min_value=0.01, max_value=1)
op3 = pr.Parameter(name='conductivity', place_holder='CONDUCTIVITY',
                   min_value=0.01, max_value=1)
op4 = pr.Parameter(name='specific_heat', place_holder='SPECIFIC_HEAT',
                   min_value=0.01, max_value=10)
ops = pr.ParameterSet(params=[op1, op2, op3, op4])

# define general model parameter, including optimisation parameter
params = pr.ParameterSet(params=[op1, op2, op3, op4])
params.append(pr.Parameter(name='chid', place_holder='CHID', value=CHID))

# define empty simulation setup set
setups = pr.SimulationSetupSet()

# create simulation setup object
template_file = "cone_template.fds"
s = pr.SimulationSetup(name='cone_pmma',
                       work_dir='cone_pmma',
                       execution_dir_prefix='samples_cone',
                       model_template=template_file,
                       model_parameter=params,
                       relations=None)

setups.append(s)

nsamples = 5
sampler = pr.Sampler(algorithm='LINEAR',
nsamples=nsamples)
time = []
for i in range(nsamples):
    time.append(f"0-00:1{i}:00")
job = pr.Job(template="fds", parameter=[
    ("CHID", CHID),
    ("TIME", time),
    ("NODES", "1")])
66 changes: 66 additions & 0 deletions examples/sampling_lhs_01/instruction.md
@@ -0,0 +1,66 @@
# Instruction
An example for using the sampler to create multiple simulations from one input file with different parameters.
## 1. Clone a Template File
```bash
propti template --name fds
```

The `name` argument clones a template from the directory `propti/jobs` into the current working directory. This allows flexible customisation of simulations through parameter replacement, denoted by `#...#` placeholders.

Template Example:
```bash
#!/bin/bash
# Name of the job
#SBATCH --job-name=#CHID#_#ID#

# On which device the simulation is run
#SBATCH --partition=normal
```

The above is an excerpt from the template script `propti/jobs/fds`. Here, `#CHID#` and `#ID#` are placeholders that get replaced: `#CHID#` must be defined in the input file, while `#ID#` is substituted automatically.
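The substitution itself can be pictured as plain text replacement. A minimal sketch (`fill_template` is a hypothetical helper, not propti's actual implementation):

```python
# Hypothetical sketch of #NAME# placeholder substitution; propti's
# actual implementation may differ.
def fill_template(text: str, values: dict) -> str:
    for name, value in values.items():
        text = text.replace(f"#{name}#", str(value))
    return text

print(fill_template("#SBATCH --job-name=#CHID#_#ID#",
                    {"CHID": "CONE", "ID": "000001"}))
# → #SBATCH --job-name=CONE_000001
```

Note that the leading `#SBATCH` survives untouched, since only exact `#NAME#` tokens are replaced.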


## 2. Create an `input.py` File

A Python file that uses the propti module is required. See `input.py` for an example of how to structure it. It is important that the sampler is created before the jobs.

The sampler currently has two significant setup parameters:
- `algorithm`: Specifies the algorithm used for parameter computation. Currently supported: `LHS` and `LINEAR`.
- `nsamples`: Defines the number of samples that are generated.
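To make the two algorithms concrete, here is an illustrative sketch. It is not propti's implementation, just the standard ideas behind linear and Latin-hypercube sampling over a `min_value`/`max_value` range:

```python
# Illustrative sketch (NOT propti's implementation) of how 'LINEAR' and a
# Latin-hypercube-style 'LHS' sampler could map n samples into [lo, hi].
import random

def linear_samples(lo: float, hi: float, n: int) -> list:
    # evenly spaced values including both endpoints
    if n == 1:
        return [lo]
    step = (hi - lo) / (n - 1)
    return [lo + i * step for i in range(n)]

def lhs_samples(lo: float, hi: float, n: int, rng=None) -> list:
    # one random point per equally sized stratum, then shuffled
    rng = rng or random.Random(0)
    width = (hi - lo) / n
    pts = [lo + (i + rng.random()) * width for i in range(n)]
    rng.shuffle(pts)
    return pts
```

For example, `linear_samples(0.0, 1.0, 5)` gives `[0.0, 0.25, 0.5, 0.75, 1.0]`, while `lhs_samples` guarantees one sample per fifth of the range.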

To create jobs automatically, there are three important settings:
- `scheduler`: Sets the scheduler that is used. Currently only `slurm` is supported, serving as the default value.
- `template`: Refers to the path leading to the template file.
- `parameters`: Defines the parameters slated for replacement. There are three possible ways to define a parameter:
    - `NAME`: Only the parameter name is provided; the value generated by the sampler for the current simulation is substituted.
    - (`NAME`, `VALUE`): Each placeholder is replaced by the same value.
    - (`NAME`, `LIST[VALUE]`): Each placeholder is replaced by the corresponding list entry. The list's length must match `nsamples`.
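The three forms can be combined in a single `parameter` list, following the `pr.Job` call shown in `input.py`. The specific names and values below are made up for illustration:

```python
# Illustrative parameter list combining the three forms described above
# (names and values are hypothetical examples).
nsamples = 2
parameter = [
    "DENSITY",                               # NAME only: value taken from the sampler
    ("CHID", "CONE"),                        # (NAME, VALUE): same value in every job
    ("TIME", ["0-00:10:00", "0-00:11:00"]),  # (NAME, LIST): one entry per sample
]

# the list form must provide exactly one value per sample
assert len(parameter[2][1]) == nsamples
```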



## 3. Initiate the Sampler
Run the following command to process the input file, which creates the simulations together with their job execution scripts:
```bash
propti sampler input.py
```
Once the simulations are ready for execution, start them with:
```bash
propti job start
```
## 4. Query Running Job Information
To get real-time insight into job statuses, run the following command:
```bash
propti job info
```

The output could look like this:
```
NAME | JOBID | ST | TIME | TIME_LIMIT
sample_000000 | 19932566 | R | 7:17 | 20:00
sample_000003 | 19932569 | R | 7:17 | 20:00
sample_000001 | -------- | F | ---------- | ----------
sample_000002 | -------- | F | ---------- | ----------
sample_000004 | -------- | F | ---------- | ----------
3/5 finished with 0 errors.
```
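For post-processing, a status table like the one above can be parsed with a few lines of Python. `summarise` is a hypothetical helper, assuming the column layout shown (`ST` column: `R` = running, `F` = finished); `propti job info` itself already prints the summary line:

```python
# Hypothetical helper that summarises the status table printed by
# 'propti job info'; assumes '|'-separated columns with 'ST' third.
def summarise(table: str) -> str:
    rows = [line.split("|") for line in table.splitlines() if "|" in line]
    states = [cols[2].strip() for cols in rows[1:]]  # skip the header row
    return f"{states.count('F')}/{len(states)} finished"
```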
60 changes: 3 additions & 57 deletions propti/__init__.py
@@ -1,58 +1,4 @@
import logging
__version__ = "1.2.0"
# __version__ must be the first line in order for setup.py to read the version.

#########
# LOGGING
# set up logging to file - see previous section for more details

# get MPI rank for individual log files
import mpi4py
mpi4py.rc.recv_mprobe = False

from mpi4py import MPI
my_rank = MPI.COMM_WORLD.Get_rank()

logging.basicConfig(level=logging.DEBUG,
format='%(asctime)s %(name)-12s %(levelname)-8s %(message)s',
datefmt='%m-%d %H:%M',
filename='propti.{:03d}.log'.format(my_rank),
filemode='w')

# define a Handler which writes INFO messages or higher to the sys.stderr
console = logging.StreamHandler()
console.setLevel(logging.INFO)

# set a format which is simpler for console use
formatter = logging.Formatter('%(name)-12s: %(levelname)-8s %(message)s')

# tell the handler to use this format
console.setFormatter(formatter)

# add the handler to the root logger
logging.getLogger('').addHandler(console)

########
# PROPTI AND SPOTPY

from .spotpy_wrapper import run_optimisation, create_input_file
from .data_structures import Parameter, ParameterSet, \
SimulationSetupSet, SimulationSetup, Relation, DataSource, \
OptimiserProperties, Version
from .basic_functions import run_simulations
from .propti_post_processing import run_best_para

from .propti_monitor import plot_scatter, plot_scatter2, \
plot_para_vs_fitness, plot_box_rmse
from .propti_post_processing import run_best_para, plot_hist, \
calc_pearson_coefficient, collect_best_para_multi, plot_best_sim_exp
from .propti_pre_processing import interpolate_lists

from .fitness_methods import FitnessMethodRMSE, FitnessMethodInterface, \
FitnessMethodThreshold, FitnessMethodRangeRMSE, FitnessMethodBandRMSE, \
FitnessMethodIntegrate


###########
# CONSTANTS

# TODO: respect this variable in scripts
pickle_prefix = 'propti.pickle'
from .lib import *
28 changes: 28 additions & 0 deletions propti/__main__.py
@@ -0,0 +1,28 @@
def main():
    import argparse
    import sys
    commands = ["analyse", "prepare", "run", "sampler", "sense", "job"]

    command = sys.argv[1] if len(sys.argv) > 1 else None
    if command in commands:
        # drop the sub-command so the sub-program parses its own arguments
        sys.argv.pop(1)
        if command == "analyse":
            from .run import propti_analyse
        elif command == "prepare":
            from .run import propti_prepare
        elif command == "run":
            from .run import propti_run
        elif command == "sampler":
            from .run import propti_sampling
        elif command == "sense":
            from .run import propti_sense
        elif command == "job":
            from .run import propti_job
    else:
        parser = argparse.ArgumentParser(
            prog='propti',
            description='Modelling (or optimisation) of parameters in computer '
                        'simulations, with a focus on handling the communication '
                        'between simulation software and optimisation algorithms.',
            epilog="Use 'propti <command> -h' for more information about a sub-program.")
        parser.add_argument("command", choices=commands)
        parser.add_argument("args", nargs="*")
        parser.parse_args()


if __name__ == "__main__":
    main()
83 changes: 83 additions & 0 deletions propti/jobs/fds
@@ -0,0 +1,83 @@
#!/bin/bash
# Name of the Job
#SBATCH --job-name=#CHID#_#ID#

# On which device the simulation is run
#SBATCH --partition=normal

# Maximum time the job can run: days-hours:minutes:seconds
#SBATCH --time=#TIME#

# Number of cores
#SBATCH --tasks-per-node=1
#SBATCH --cpus-per-task=1
#SBATCH --nodes=#NODES#

# Output file name
#SBATCH --output=stdout.%j
#SBATCH --error=stderr.%j


cd #EXECUTION_DIRS#

mkdir ./results
cd ./results

# define FDS input file
FDSSTEM=../

# grep the CHID (used for stop file)
CHID=`sed -n "s/^.*CHID='\\([-0-9a-zA-Z_]*\\)'.*$/\\1/p" < $FDSSTEM*.fds`

# append the start time to file 'time_start'
echo "$SLURM_JOB_ID -- `date`" >> time_start

# handle the signal sent before the end of the wall clock time
function handler_sigusr1
{
# protocol stopping time
echo "$SLURM_JOB_ID -- `date`" >> time_stop
echo "`date` Shell received stop signal"

# create FDS stop file
touch $CHID.stop

# as manual stop was triggered, the end of simulation time was
# not reached, remove flag file 'simulation_time_end'
rm simulation_time_end
wait
}

# register the function 'handler_sigusr1' to handle the signal sent out
# just before the end of the wall clock time
trap "handler_sigusr1" SIGUSR1

# check for the simulation finished flag file 'simulation_time_end'
# if it is found, just quit
if [ -e simulation_time_end ]; then
## simulation has already finished, nothing left to do
echo "FDS simulation already finished"
exit 0
fi

# simulation not finished yet
# create flag file to check for reaching simulation end time
touch simulation_time_end

# Load FDS
module use -a /beegfs/larnold/modules
module load FDS

# set the number of OMP threads
export OMP_NUM_THREADS=${SLURM_CPUS_PER_TASK}

# run FDS executable
mpiexec fds $FDSSTEM*.fds & wait

# set RESTART to TRUE in FDS input file
sed -i 's/RESTART\s*=\s*\(F\|.FALSE.\)/RESTART=.TRUE./g' $FDSSTEM*.fds

# remove the stop file, otherwise the following chain parts
# would not run
rm $CHID.stop