Commit d9dab1b0 authored by Robert Schweppe

Merge branch 'new_release' into 'master'

New release

See merge request !74
parents 3fa0b79b 9bce4ecd
Pipeline #56133 failed with stages in 4 seconds
@@ -152,8 +152,9 @@ show-env-vars:
# template for documentation jobs
.documentation_template: &documentation_template
#only:
# - develop
only:
- develop
- master
stage: build
script:
- module load foss/2019b
@@ -167,19 +168,17 @@ show-env-vars:
- cd latex/ && make > ../doxygen_latex_dev.txt
- cp refman.pdf ../html/mpr_doc.pdf
- cp refman.pdf ../mpr_doc_dev.pdf
- cp refman.pdf ../mpr_doc_mas.pdf # TODO: remove once we have a decent version on master, too
- cd .. && mv html html_dev
- mv doxygen_warn.txt doxygen_warn_dev.txt
- rm -rf latex
# TODO: activate once we have a decent version on master, too
# # same for master
# - git checkout master
# - test -f doc/doxygen-1.8.8.config && cp doc/doxygen-1.8.8.config doc/doxygen.config
# - doxygen doc/doxygen.config > doxygen_log_mas.txt
# - cd latex/ && make > ../doxygen_latex_mas.txt
# - cp refman.pdf ../html/mpr_doc.pdf
# - cp refman.pdf ../mpr_doc_mas.pdf
# - cd .. && mv html html_mas
# same for master
- git checkout master
- test -f doc/doxygen-1.8.8.config && cp doc/doxygen-1.8.8.config doc/doxygen.config
- doxygen doc/doxygen.config > doxygen_log_mas.txt
- cd latex/ && make > ../doxygen_latex_mas.txt
- cp refman.pdf ../html/mpr_doc.pdf
- cp refman.pdf ../mpr_doc_mas.pdf
- cd .. && mv html html_mas
# handle the warnings file (it may be missing on master)
- |
if [ -f doxygen_warn.txt ]; then
@@ -196,11 +195,11 @@ show-env-vars:
- doxygen_log_dev.txt
- doxygen_latex_dev.txt
- doxygen_warn_dev.txt
#- html_mas
- html_mas
- mpr_doc_mas.pdf
#- doxygen_log_mas.txt
#- doxygen_latex_mas.txt
#- doxygen_warn_mas.txt
- doxygen_log_mas.txt
- doxygen_latex_mas.txt
- doxygen_warn_mas.txt
# ##################
# ### BUILD JOBS ###
......
# Multiscale parameter regionalization -- MPR
- The current release is **[MPR v1.0.0][1]**.
- The current release is **[MPR v1.0.1][1]**.
- The latest MPR release notes can be found in the file [RELEASES][3] or [online][4].
- General information can be found on the [MPR website](https://www.ufz.de/index.php?en=40126).
- The mHM comes with a [LICENSE][6] agreement, this includes also the GNU Lesser General Public License.
- The MPR comes with a [LICENSE][6] agreement, which also includes the GNU Lesser General Public License.
- There is a list of [publications using MPR][7].
**Please note:** The [GitLab repository](https://git.ufz.de/chs/MPR) grants read access to the code.
@@ -19,7 +19,7 @@ The online documentation for mHM can be found here (pdf versions are provided th
The model code can be cited as:
> Schweppe, R., S. Thober, M. Kelbling, R. Kumar, S. Attinger, and L. Samaniego (2021): MPR 1.0: A stand-alone Multiscale Parameter Regionalization Tool for Parameter Estimation of Land Surface Models, Geoscientific Model Development (to be submitted)
> Schweppe, R., S. Thober, M. Kelbling, R. Kumar, S. Attinger, and L. Samaniego (2021): MPR 1.0: A stand-alone Multiscale Parameter Regionalization Tool for Parameter Estimation of Land Surface Models, Geoscientific Model Development (submitted)
Please see the original publication of the MPR framework:
@@ -29,7 +29,7 @@ Please see the original publication of the MPR framework:
The model code can be generally cited as:
> **MPR:**
> Schweppe, Robert, Thober, Stephan, Kelbling, Matthias, Kumar, Rohini, Attinger, Sabine, & Samaniego, Luis. (2021, March 31). Multiscale Parameter Regionalization tool -MPR v. 1.0 (Version 1.0). Zenodo. http://doi.org/10.5281/zenodo.4650513
To cite a certain version, have a look at the [Zenodo site][10].
@@ -43,7 +43,7 @@ See also the [documentation][5] for detailed instructions to setup MPR.
1. Add the required transfer functions to the code.
2. Compile MPR with CMake.
3. Run MPR on the example `./mhm`, which uses settings from [mpr.nml](mpr.nml).
3. Run MPR on the example via `./MPR`, which uses settings from [mpr.nml](mpr.nml).
[1]: https://git.ufz.de/chs/mpr/tree/1.0.0
@@ -54,4 +54,4 @@ See also the [documentation][5] for detailed instructions to setup MPR.
[7]: doc/mpr_papers.md
[8]: doc/src/02_dependencies.md
[9]: doc/src/03_installation.md
[10]: https://zenodo.org/
[10]: https://zenodo.org/record/4650513
@@ -2,4 +2,11 @@
## MPR v1.0 (Mar 2021)
- initial release
\ No newline at end of file
- initial release
## MPR v1.0.1 (May 2021)
- reintegrated erroneously deleted tests for Python preprocessors
- fixed some typos in README.md
- updated the hpc-module-loads subrepo
- activated documentation building for master branch by default as well
\ No newline at end of file
@@ -5,8 +5,8 @@
;
[subrepo]
remote = https://git.ufz.de/chs/HPC-Fortran-module-loads.git
branch = v1.1
commit = 6829f263b7a64932325f751212a994c03fd0c35b
parent = 6edbc69c5024c0816dcc3d19fec0542903628a8e
branch = v1.2
commit = d0c99c3686409792f6d90b65343a4016b8de9c32
parent = 72dbd446089af04f726c427b3140a8fb10f9d5e6
method = merge
cmdver = 0.4.3
@@ -10,10 +10,15 @@ All these scripts will load:
- netCDF-Fortran
- CMake
- the MPR Python Environment
- pFUnit - Fortran unit testing framework
- pFUnit - Fortran unit testing framework (_not available for GNU 6.4_)
### Usage
- GNU 6.4 compiler (`foss/2018a` Toolchain):
```bash
source eve.gfortran64 # or
source eve.gfortran64MPI
```
- GNU 7.3 compiler (`foss/2018b` Toolchain):
```bash
source eve.gfortran73 # or
......
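# serial variant (presumably eve.gfortran64, per the Usage section above): plain gfortran as the Fortran compiler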
module purge
module load foss/2018a
module load netCDF-Fortran
module load CMake
module use /global/apps/modulefiles
module load python_env_mpr
export NETCDF_DIR="$EBROOTNETCDF"
export NETCDF_FORTRAN_DIR="$EBROOTNETCDFMINFORTRAN"
export FC=gfortran
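# MPI variant (presumably eve.gfortran64MPI): identical modules, mpifort as the Fortran compiler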
module purge
module load foss/2018a
module load netCDF-Fortran
module load CMake
module use /global/apps/modulefiles
module load python_env_mpr
export NETCDF_DIR="$EBROOTNETCDF"
export NETCDF_FORTRAN_DIR="$EBROOTNETCDFMINFORTRAN"
export FC=mpifort
@@ -9,18 +9,18 @@ Author : Robert Schweppe (robert.schweppe@ufz.de)
Created : 2019-09-05 11:46
"""
import argparse
import math
import warnings
from itertools import product
# IMPORTS
import geopandas as gpd
import numpy as np
import pandas as pd
import tqdm
import xarray as xr
import numpy as np
import argparse
import sys
from shapely.geometry import Polygon
import tqdm
import math
import warnings
from itertools import product
# GLOBAL VARIABLES
MISSING_VALUE = -9999.0
@@ -66,7 +66,7 @@ def parse_args():
# CLASSES
class MyGeoDataFrame(gpd.GeoDataFrame):
def get_SCRIP_vars(self):
#self.check_type()
# self.check_type()
print('getting the centroid lon values')
# centroid_lon = self.check_longitude(self.get_centroid('x'))
centroid_lon = self.get_centroid('x')
@@ -90,22 +90,23 @@ class MyGeoDataFrame(gpd.GeoDataFrame):
# first get the number of corners for each polygon
# subtract 1 because the closing vertex repeats the first
print('getting number of vertices for each cell')
lengths = [len(item.exterior.coords.xy[0]) - 1 if item.geom_type == 'Polygon' else len(item[0].exterior.coords.xy[0]) - 1 for item in self.geometry]
lengths = [len(item.exterior.coords.xy[0]) - 1 if item.geom_type == 'Polygon' else len(
item[0].exterior.coords.xy[0]) - 1 for item in self.geometry]
max_length = max(lengths)
# init the final arrays and set the default missing value
corner_lon = np.zeros((len(self.geometry), max_length))
corner_lon[corner_lon==0]= MISSING_VALUE
corner_lon[corner_lon == 0] = MISSING_VALUE
corner_lat = np.zeros((len(self.geometry), max_length))
corner_lat[corner_lat==0]= MISSING_VALUE
corner_lat[corner_lat == 0] = MISSING_VALUE
# now loop over each polygon and iteratively set the corner values in the target array
# TODO: sort the nodes so the centroid is always left of the node path
# https://stackoverflow.com/questions/1165647/how-to-determine-if-a-list-of-polygon-points-are-in-clockwise-order
for i_item, item in enumerate(self.geometry):
if item.geom_type == 'Polygon':
corner_lon[i_item,:lengths[i_item]], corner_lat[i_item,:lengths[i_item]] = \
corner_lon[i_item, :lengths[i_item]], corner_lat[i_item, :lengths[i_item]] = \
self._check_order(*item.exterior.coords.xy)
elif item.geom_type == 'MultiPolygon':
corner_lon[i_item,:lengths[i_item]], corner_lat[i_item,:lengths[i_item]] = \
corner_lon[i_item, :lengths[i_item]], corner_lat[i_item, :lengths[i_item]] = \
self._check_order(*item[0].exterior.coords.xy)
else:
raise ValueError('unsupported geometry type: {}'.format(item.geom_type))
@@ -115,7 +116,7 @@ class MyGeoDataFrame(gpd.GeoDataFrame):
def check_longitude(self, lon_arg):
if lon_arg.min() <= 0:
warnings.warn('Longitude values are ranging from -180 to 180, converting to range 0 to 360')
lon_arg= np.where(lon_arg<0,lon_arg+360, lon_arg)
lon_arg = np.where(lon_arg < 0, lon_arg + 360, lon_arg)
return lon_arg
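The wrap itself is easy to sanity-check in isolation (a minimal sketch, values arbitrary):

```python
# Minimal sketch of the longitude wrap applied in check_longitude above:
# negative (western) longitudes are shifted into the 0..360 range.
import numpy as np

lons = np.array([-120.0, -10.0, 0.0, 45.0])
print(np.where(lons < 0, lons + 360, lons))  # [240. 350.   0.  45.]
```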
def _check_order(self, lons, lats):
@@ -140,11 +141,11 @@ class MyGeoDataFrame(gpd.GeoDataFrame):
min_index = np.argmin(lats)
while True:
# all neighboring indices
indices = [(min_index-1)%len(lats), min_index, (min_index+1)%len(lats)]
indices = [(min_index - 1) % len(lats), min_index, (min_index + 1) % len(lats)]
# calculate determinant as here: https://en.wikipedia.org/wiki/Curve_orientation
det = \
(lons[indices[1]] - lons[indices[0]])*(lats[indices[2]] - lats[indices[0]]) - \
(lons[indices[2]] - lons[indices[0]])*(lats[indices[1]] - lats[indices[0]])
(lons[indices[1]] - lons[indices[0]]) * (lats[indices[2]] - lats[indices[0]]) - \
(lons[indices[2]] - lons[indices[0]]) * (lats[indices[1]] - lats[indices[0]])
if det == 0:
# the three points are colinear, check next vertices
min_index = indices[2]
@@ -156,6 +157,7 @@ class MyGeoDataFrame(gpd.GeoDataFrame):
# sort ascending
return lons[::-1], lats[::-1]
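For reference, the orientation test above reduces to a self-contained sketch (function name and toy triangle are illustrative, not part of the script):

```python
# Standalone sketch of the curve-orientation test used in _check_order:
# take the vertex with the lowest latitude and evaluate the determinant
# of its two neighbours (see the Wikipedia link above). Positive means
# counter-clockwise; the colinear case (det == 0) is ignored here.
import numpy as np

def is_counterclockwise(lons, lats):
    i = int(np.argmin(lats))
    prev, nxt = (i - 1) % len(lats), (i + 1) % len(lats)
    return ((lons[i] - lons[prev]) * (lats[nxt] - lats[prev])
            - (lons[nxt] - lons[prev]) * (lats[i] - lats[prev])) > 0

lons, lats = np.array([0.0, 1.0, 0.5]), np.array([0.0, 0.0, 1.0])
print(is_counterclockwise(lons, lats))              # True
print(is_counterclockwise(lons[::-1], lats[::-1]))  # False
```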
def handle_latlon_format(ds):
coord_var = ['south_north', 'west_east']
# select all data_vars that have dimension grid_size
@@ -176,13 +178,13 @@ def handle_latlon_format(ds):
df.columns = [data_var]
dfs.append(df)
# we turn the pandas DataFrame into a GeoPandas GeoDataFrame and still need geometry
gdf = gpd.GeoDataFrame(pd.concat(dfs, axis=1), geometry=[[]]*len(dfs[0]))
gdf = gpd.GeoDataFrame(pd.concat(dfs, axis=1), geometry=[[]] * len(dfs[0]))
# convert units to degree
# loop over polygons and calculate number of edges and set geometry item
for i, j in product(ds[coord_var[0]], ds[coord_var[1]]):
xvals = ds['XLONG_bnds'].isel(**{coord_var[1]: j}).values
yvals = ds['XLAT_bnds'].isel(**{coord_var[0]: i}).values
gdf.loc['{}_{}'.format(float(i),float(j)), 'geometry'] = Polygon([
gdf.loc['{}_{}'.format(float(i), float(j)), 'geometry'] = Polygon([
(xvals[0], yvals[0]),
(xvals[1], yvals[0]),
(xvals[1], yvals[1]),
@@ -227,10 +229,10 @@ if __name__ == '__main__':
ds = xr.Dataset(
data_vars={'grid_corner_lon': (['grid_size', 'grid_corners'], corner_lon),
'grid_corner_lat': (['grid_size', 'grid_corners'], corner_lat),
'grid_center_lon': (['grid_size'], centroid_lon),
'grid_center_lat': (['grid_size'], centroid_lat),
},
'grid_corner_lat': (['grid_size', 'grid_corners'], corner_lat),
'grid_center_lon': (['grid_size'], centroid_lon),
'grid_center_lat': (['grid_size'], centroid_lat),
},
attrs={'title': args.name, 'units': units}
)
# add units globally and to each variable
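A minimal single-cell version of the SCRIP-style dataset assembled above is handy for eyeballing the layout (toy coordinates, nothing MPR-specific):

```python
# Toy single-cell SCRIP grid using the same variable/dimension names as
# the Dataset built above; print(ds) shows the expected layout, and
# ds.to_netcdf(...) would write it out (requires a netCDF backend).
import numpy as np
import xarray as xr

ds = xr.Dataset(
    data_vars={
        'grid_corner_lon': (['grid_size', 'grid_corners'], np.array([[0.0, 1.0, 1.0, 0.0]])),
        'grid_corner_lat': (['grid_size', 'grid_corners'], np.array([[0.0, 0.0, 1.0, 1.0]])),
        'grid_center_lon': (['grid_size'], np.array([0.5])),
        'grid_center_lat': (['grid_size'], np.array([0.5])),
    },
    attrs={'title': 'toy grid', 'units': 'degrees'},
)
print(ds)
```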
@@ -258,7 +260,7 @@ if __name__ == '__main__':
EXCLUDE_DIMS = ['grid_center_lon', 'grid_center_lat', 'grid_corner_lon', 'grid_corner_lat', 'grid_imask']
# all MPI-ICON based variable names and their SCRIP counterparts
RENAME_VAR_DICT = {'clon': 'grid_center_lon', 'clat': 'grid_center_lat',
'clon_vertices': 'grid_corner_lon', 'clat_vertices': 'grid_corner_lat'}
'clon_vertices': 'grid_corner_lon', 'clat_vertices': 'grid_corner_lat'}
# all MPI-ICON based dimension names and their SCRIP counterparts
RENAME_DIM_DICT = {'cell': 'grid_size', 'nv': 'grid_corners'}
# all MPI-ICON based data variable names
@@ -274,17 +276,18 @@ if __name__ == '__main__':
# do we have special MPI-ICON names?
if all(mpi_icon_format):
# select and rename
ds = ds[list(RENAME_VAR_DICT.keys())+SELECT_VARS].rename({**RENAME_VAR_DICT, **RENAME_DIM_DICT})
ds = ds[list(RENAME_VAR_DICT.keys()) + SELECT_VARS].rename({**RENAME_VAR_DICT, **RENAME_DIM_DICT})
# set the grid_imask property based on sea_land_mask or to default 1
if 'cell_sea_land_mask' in ds:
ds['grid_imask'] = xr.where(ds['cell_sea_land_mask']>1, 1, 0)
ds['grid_imask'] = xr.where(ds['cell_sea_land_mask'] > 1, 1, 0)
else:
ds['grid_imask'] = (('grid_size',), np.ones_like(ds['grid_center_lon'].values, dtype=int))
# select all data_vars that have dimension grid_size
shp_vars = [data_var for data_var in ds.data_vars if 'grid_size' in ds[data_var].dims and data_var not in EXCLUDE_DIMS]
shp_vars = [data_var for data_var in ds.data_vars if
'grid_size' in ds[data_var].dims and data_var not in EXCLUDE_DIMS]
if not shp_vars:
ds['id'] = (('grid_size',), np.array(range(1,len(ds['grid_size'])+1)))
shp_vars =['id']
ds['id'] = (('grid_size',), np.array(range(1, len(ds['grid_size']) + 1)))
shp_vars = ['id']
# empty container
dfs = []
for data_var in shp_vars:
@@ -300,7 +303,10 @@ if __name__ == '__main__':
df.columns = [data_var]
dfs.append(df)
# we turn the pandas DataFrame into a GeoPandas GeoDataFrame and still need geometry
gdf = gpd.GeoDataFrame(pd.concat(dfs, axis=1))
gdf = gpd.GeoDataFrame(pd.concat(dfs, axis=1, keys=shp_vars))
# merge the MultiIndex into a flat index
gdf.columns = ['_'.join((str(item) for item in _)) for _ in gdf.columns]
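The `keys=`/join combination is easiest to see on a toy frame (column names invented for the demo):

```python
# pd.concat with keys= prefixes each sub-frame's columns with its key,
# producing a two-level MultiIndex; the join then collapses it into
# flat 'key_column' names, mirroring the two lines above.
import pandas as pd

a = pd.DataFrame({'val': [1, 2]})
b = pd.DataFrame({'val': [3, 4]})
df = pd.concat([a, b], axis=1, keys=['id', 'mask'])
df.columns = ['_'.join(str(item) for item in col) for col in df.columns]
print(df.columns.tolist())  # ['id_val', 'mask_val']
```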
# convert units to degree
n_corners = len(ds['grid_corners'])
for var_name in EXCLUDE_DIMS:
@@ -323,7 +329,7 @@ if __name__ == '__main__':
ds['grid_corner_lat'].isel(grid_size=i_polygon)))
# set WGS84 by default
gdf.crs = {'init' :'epsg:4326'}
gdf.crs = {'init': 'epsg:4326'}
gdf.to_file(args.output_file)
else:
raise Exception('Did not get proper filenames. Script only works for conversion of *.shp->*.nc or vice versa')
@@ -20,6 +20,13 @@ from textwrap import dedent
from src_python.pre_proc.mpr_interface import OPTIONS
import re
DOT2TEX_IS_AVAILABLE = True
try:
import dot2tex as d2t
except ImportError:
d2t = None
DOT2TEX_IS_AVAILABLE = False
DOT2TEX_IS_AVAILABLE = True
try:
import dot2tex as d2t
@@ -32,6 +39,7 @@ OUTPUT_FILE_BASENAME = 'MPR_graphviz'
DOT2TEX_GUIDE_HTML = 'https://dot2tex.readthedocs.io/en/latest/installation_guide.html'
DEFAULT_PATH_TO_ROOT = '../../'
DEFAULT_CONFIG_FILE = 'predefined_nmls/mpr_mhm_visual.nml'
# DEFAULT_PARAM_FILE = 'mpr_global_parameter.nml' # NOT USED
DEFAULT_OUTPUT_FORMAT = 'pdf'
DEFAULT_ENGINE = 'dot'
# TODO: this is broken?
......
# encoding: utf-8
"""
File Name : test_create_graphviz_plot.py
Project Name: MPR
Description : test suite for create_graphviz_plot.py script
Author : Robert Schweppe (robert.schweppe@ufz.de)
Created : 2019-03-30 13:30
"""
# IMPORTS
import pytest
from src_python.pre_proc.create_graphviz_plot import Redirecter, GraphvizPlotter, DataArray
class TestRedirecter:
def test_one(self):
assert True
class TestGraphvizPlotter:
def test_one(self):
assert True
class TestDataArray:
def test_one(self):
assert True
# encoding: utf-8
"""
File Name : test_update_tfs_in_fortran_source.py
Project Name: MPR
Description : test suite for update_tfs_in_fortran_source.py script
Author : Robert Schweppe (robert.schweppe@ufz.de)
Created : 2019-03-30 13:30
"""
# IMPORTS
import pytest
from src_python.pre_proc.update_tfs_in_fortran_source import TFConverter, TF, DASource, TFSource, \
replace_in_string, get_index_in_string, WORD_BOUNDARIES
import copy
from textwrap import dedent
class TestTFConverter:
'''
@staticmethod
def check_fortran_code_for_completion(transfer_func_file, print_status=False):
pass
# read mo_mpr_transfer_funcs and check results
# TODO: this was taken out, because the existing code does not work with more complex where and if clauses
# also, the checking of the proper formulation of if and where clauses (if (...) then (...) else (...) end if)
# was moved to the part where the math_string is created, so that should not be necessary anymore
# maybe this part should be moved to a unit test anyway (pytest)
if self.is_test_mode:
fi = open(transfer_func_file + MODIFIED_SUFFIX, 'r')
else:
fi = open(transfer_func_file, 'r')
lines = fi.readlines()
fi.close()
print('\n start checking transfer functions: ')
current_transfer_func = []
for ll, line in enumerate(lines):
if not re.search('func_result', line.replace(' ', '')) is None:
# check whether line before contains if statement
if not re.search('if.*then', lines[ll - 1][:-1].replace(' ', '')) is None:
start_if = True
store_func = lines[ll - 1][:-1].replace(' ', '')
elif not re.search('endif', lines[ll + 1][:-1].replace(' ', '')) is None:
# if clause already processed
continue
else:
start_if = False
store_func = ''
# get the entire function
store_func, ll, line = self._evaluate_func(store_func, ll, line, lines)
#
if start_if:
# check whether line contains else
if not re.search('else', line[:-1].replace(' ', '')) is None:
store_func += line[:-1].replace(' ', '')
# get the second part of the function
store_func, ll, line = self._evaluate_func(store_func, ll + 1, lines[ll + 1], lines)
if not re.search('endif', line[:-1].replace(' ', '')) is None:
store_func += line[:-1].replace(' ', '')
else:
raise ValueError('Incomplete if statement in mo_mpr_transfer_func')
if re.search('if.*if', store_func) and store_func[0] != '!':
current_transfer_func.append(store_func)
elif re.search('func_result.*=', store_func) and store_func[0] != '!':
current_transfer_func.append(store_func)
# remove func_result(i)= pattern
current_transfer_func = [re.sub('func_result...=', '', func) for func in current_transfer_func]
inverted_transfer_func = [self._invert(func) for func in current_transfer_func]
if self.raw_tf_string.replace(' ', '') in inverted_transfer_func:
print('')
print(' Transfer function: ')
print(' {}'.format(self.raw_tf_string))
if print_status:
print(' successfully added to Fortran code!')
else:
print(' already contained in Fortran code!')
print('')
print(' Please proceed with re-compiling mpr!')
print('')
else:
raise ValueError('***ERROR: Transfer function {} has not been added correctly!'.format(self.raw_tf_string))
@staticmethod
def _evaluate_func(store_func, ll, line, lines):
store_func += line[:-1].replace(' ', '')
cont_line = True
while cont_line:
if line[-1] == '&':
cont_line = True
store_func += lines[ll + 1][:-1].replace(' ', '')
ll += 1
else:
cont_line = False
ll += 1
return store_func, ll, lines[ll]
def _invert(self, func):
inv_func = copy(func)
# replace parameters
for pp in range(self.n_params + 1):
inv_func = inv_func.replace('param({})'.format(pp), 'p{}'.format(pp))
# replace predictors
for ii in range(len(self.predictors)):
inv_func = inv_func.replace('x({})%data_p(i)'.format(ii + 1), self.predictors[ii])
return inv_func
'''
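Should the disabled checker above be revived, the TODO's suggestion could start as small as this (a hypothetical test against a local stub of `_invert`, since the real method is still inside the commented-out block):

```python
# Hypothetical seed for the unit test suggested in the TODO above:
# exercise the param/predictor back-substitution performed by _invert
# on a stub, because the real method is still commented out.
def invert(func, predictors, n_params):
    for pp in range(n_params + 1):
        func = func.replace('param({})'.format(pp), 'p{}'.format(pp))
    for ii, predictor in enumerate(predictors):
        func = func.replace('x({})%data_p(i)'.format(ii + 1), predictor)
    return func

def test_invert_stub():
    assert invert('param(1)+param(2)*x(1)%data_p(i)', ['d1'], 2) == 'p1+p2*d1'
```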
def test_one(self):
assert True
class TestTF:
def test_translation(self):
raw_strings = (
'p1 + p2 * d1 + p3 * d2',
"thetas1+thetas2*CLYPPT_M+thetas3*BLDFIE_M+thetas4*SLTPPT_M**(2.0)+thetas5*ORCDRC_M**(2.0)+"
"thetas6*CLYPPT_M**(-1.0)+thetas7*SLTPPT_M**(-1.0)+thetas8*log(SLTPPT_M)+thetas9*ORCDRC_M*CLYPPT_M-"
"thetas10*BLDFIE_M*CLYPPT_M-thetas11*BLDFIE_M*ORCDRC_M+thetas12*topsoil*SLTPPT_M",
'p1+exp((p2+d1)*p3)',
'p1+exp(p3*(p2+d1))',
'PTF_Ks_curveSlope * exp ((PTF_Ks_constant + PTF_Ks_sand * SAND + PTF_Ks_clay * CLAY) * log (10))',
'where((z.lower_bound + (z.upper_bound - z.lower_bound) / 2.0) <= topsoil_boundary) 1.0 else 0.0',
'where (x1 > l1 .and. x1 < l2) l5 else where (x1 < l3) l6 else where (x1 < l4) l7',
'p1 + asin(d1)',
'p1 + atan2(d1)',
'p1 + tanh(d1)',
)
predictors_sets = (
['d1', 'd2'],
['CLYPPT_M', 'SLTPPT_M', 'BLDFIE_M', 'ORCDRC_M', 'topsoil'],
['d1'],
['d1'],
['SAND', 'CLAY'],
['SLTPPT_M', 'z.upper_bound', 'z.lower_bound'],
['x1'],
['d1'],
['d1'],
['d1'],
)
global_params_set = (
None,
{'thetas0': 0,
'thetas1': 1,
'thetas2': 2,
'thetas3': 3,
'thetas4': 4,
'thetas5': 5,
'thetas6': 6,
'thetas7': 7,
'thetas8': 8,
'thetas9': 9,
'thetas10': 10,
'thetas11': 11,
'thetas12': 12,
'-1.0': -1,
'2.0': 2},
None,
None,
{'10': 10.0,
'PTF_Ks_curveSlope': 0.000007055555,
'PTF_Ks_constant': -0.6,
'PTF_Ks_sand': 0.0126,
'PTF_Ks_clay': -0.0064,
},
{'topsoil_boundary': 0.0501, '0.0': 0.0, '1.0': 1.0, '2.0': 2.0},
{
'l2': 2,
'l6': 6,
'l1': 1,
'l7': 7,
'l3': 3,
'l5': 5,
'l4': 4,
},
None,
None,
None,
)
translated_names = (
'p1_pl_p2_ti_x1_pl_p3_ti_x2',
'p1_pl_p2_ti_x1_pl_p3_ti_x3_pl_p4_ti_x2_po_bs_p5_be_pl_p6_ti_x4_po_bs_p5_be_pl_p7_ti_x1_po'
'_bs_p8_be_pl_p9_ti_x2_po_bs_p8_be_pl_p10_ti_l2_bs_x2_be_pl_p11_ti_x4_ti_x1_mi_p12_ti_x3'
'_ti_x1_mi_p13_ti_x3_ti_x4_pl_p14_ti_x5_ti_x2',
'p1_pl_ex_bs_bs_p2_pl_x1_be_ti_p3_be',
'p1_pl_ex_bs_p3_ti_bs_p2_pl_x1_be_be',
'p1_ti_ex_bs_bs_p2_pl_p3_ti_x1_pl_p4_ti_x2_be_ti_l2_bs_p5_be_be',
#'wh_bs_bs_x3_pl_bs_x2_mi_x3_be_di_p1_be_le_p2_be_p3_el_p4',
'wh_bs_bs_x3_pl_bs_x2_mi_x3_be_di_p1_be_le_p2_be_p3_el_p4',
'wh_bs_x1_gt_p1_ad_x1_lt_p2_be_p3_el_wh_bs_x1_lt_p4_be_p5_el_wh_bs_x1_lt_p6_be_p7',
'p1_pl_as_bs_x1_be',
'p1_pl_au_bs_x1_be',
'p1_pl_tx_bs_x1_be',
)
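Reading `raw_strings` against `translated_names`, the naming scheme is a plain token substitution; here is a toy re-implementation of the mappings visible in the pairs above (the token table is inferred from this test data, not copied from the MPR source):

```python
# Toy version of the operator-to-token translation implied by the pairs
# above (pl=+, mi=-, ti=*, di=/, po=**, bs='(', be=')', ex=exp, le=<=);
# predictor renaming (d1 -> x1, ...) is assumed to happen beforehand.
TOKENS = [('**', '_po_'), ('<=', '_le_'),  # multi-character operators first
          ('+', '_pl_'), ('-', '_mi_'), ('*', '_ti_'), ('/', '_di_'),
          ('(', '_bs_'), (')', '_be_')]

def translate(expr):
    expr = expr.replace(' ', '').replace('exp', '_ex_')
    for op, token in TOKENS:
        expr = expr.replace(op, token)
    return expr.replace('__', '_').strip('_')

print(translate('p1 + p2 * x1 + p3 * x2'))  # p1_pl_p2_ti_x1_pl_p3_ti_x2
print(translate('p1+exp((p2+x1)*p3)'))      # p1_pl_ex_bs_bs_p2_pl_x1_be_ti_p3_be
```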
math_strings = (
'func_result(:) = param(1) + param(2) * x(1)%data_p(:) + param(3) * x(2)%data_p(:)',
dedent('''\
func_result(:) = param(1) + param(2) * x(1)%data_p(:) + param(3) * x(3)%data_p(:) + param(4) * &
x(2)%data_p(:) ** ( param(5) ) + param(6) * x(4)%data_p(:) ** ( param(5) ) + param(7) * x(1)%data_p(:) &
** ( param(8) ) + param(9) * x(2)%data_p(:) ** ( param(8) ) + param(10) * log ( x(2)%data_p(:) ) + &
param(11) * x(4)%data_p(:) * x(1)%data_p(:) - param(12) * x(3)%data_p(:) * x(1)%data_p(:) - param(13) &
* x(3)%data_p(:) * x(4)%data_p(:) + param(14) * x(5)%data_p(:) * x(2)%data_p(:)'''),
'func_result(:) = param(1) + exp ( ( param(2) + x(1)%data_p(:) ) * param(3) )',
'func_result(:) = param(1) + exp ( param(3) * ( param(2) + x(1)%data_p(:) ) )',
dedent('''\
func_result(:) = param(1) * exp ( ( param(2) + param(3) * x(1)%data_p(:) + param(4) * &