MPR issues
https://git.ufz.de/chs/MPR/-/issues

Issue #1: mask files for catchments (Robert Schweppe, 2018-04-18)
https://git.ufz.de/chs/MPR/-/issues/1
Allow the user to provide a mask file for catchment borders, so the catchment does not need to be clipped beforehand.

Issue #8: allow LUT-based input_fields (Robert Schweppe, 2021-03-26)
https://git.ufz.de/chs/MPR/-/issues/8
This can be done with nc files, where "ID" is an additional vector in the nc files.

Issue #21: allow variable to be dimension (Stephan Thober, 2020-03-23) [Suspended]
https://git.ufz.de/chs/MPR/-/issues/21
Soil depths can be location dependent. In the vertical averaging, a spatially varying field can be used to calculate the weights.

Issue #26: Allow differently dimensioned data arrays for a transfer function (Robert Schweppe, 2018-08-09)
https://git.ufz.de/chs/MPR/-/issues/26
Enable the following `mpr.nml`:
```
&mainconfig
out_filename = "super_flexible_now.nc"
dim_name_alias(:,1) = "x1", "x2", "x3"
dim_name_alias(:,2) = "y1", "y2", "y3"
dim_name_alias(:,3) = "z1", "z2", "z3"
/
&Data_Arrays
names(1) = "D1"
from_file(1) = "D1.nc"
! dims "x1", "y1", "z3"
names(2) = "D2"
from_file(2) = "D2.nc"
! dims "z1", "x1", "y2"
names(3) = "D3"
from_file(3) = "D3.nc"
! dims "y3", "z2"
names(4) = "D_final"
transfer_funcs(4) = "something_with_3_args"
from_data_arrays(1:3,4) = "D1", "D2", "D3"
/
```
In subroutine `check_data_arrays` for data array `D_final`, the following should happen:
* define the outer union of all dimensions at the highest resolution. Assuming x1 has a higher resolution than x2, which in turn has a higher resolution than x3, this would be: x1, y1, z1
* check for all data_arrays whether they have the same *order* of their dim_aliases (e.g. x,y,z or z,x,y) -> if not, `transpose` to the order that requires the fewest transpose operations in total (in this case x,y,z)
* check for all data_arrays whether they have the same *number* of dim_aliases -> if not, `broadcast`
* check for all data_arrays whether all dimensions have the same bounds (within one dim_alias group) -> if not, `trim` to the inner union
* check for all data_arrays whether all dimensions have the same vector (within one dim_alias group) -> if not, `resample` to the intersection of all vector items
Effectively, this would rearrange:
* D1 to D4 with x1,y1,z1, requiring a `resample` of z3 to z1
* D2 to D5 with x1,y1,z1, requiring a `transpose` of z1,x1,y2 to x1,y2,z1 and a `resample` of y2 to y1
* D3 to D6 with x1,y1,z1, requiring a `broadcast` of y3,z2 to x1,y3,z2 and a `resample` of y3 to y1 and z2 to z1
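The `transpose` and `broadcast` rearrangements above can be sketched with NumPy-style array operations (illustrative Python only; the `harmonize` helper is hypothetical and MPR itself is Fortran):

```python
import numpy as np

def harmonize(arr, src_dims, target_dims):
    """Bring `arr` (with named dims `src_dims`) to the rank and axis
    order of `target_dims`: broadcast missing dims, then transpose."""
    src_dims = list(src_dims)
    # broadcast: prepend a length-1 axis for every target dim the array lacks
    for d in target_dims:
        if d not in src_dims:
            arr = np.expand_dims(arr, axis=0)
            src_dims.insert(0, d)
    # transpose: reorder the existing axes to the target order
    return np.transpose(arr, [src_dims.index(d) for d in target_dims])

# D2 with dims (z, x, y) -> target order (x, y, z): a pure transpose
d2 = np.zeros((3, 4, 5))
print(harmonize(d2, ["z", "x", "y"], ["x", "y", "z"]).shape)  # (4, 5, 3)

# D3 with dims (y, z) -> (x, y, z): broadcast adds a length-1 x axis first
d3 = np.zeros((4, 2))
print(harmonize(d3, ["y", "z"], ["x", "y", "z"]).shape)  # (1, 4, 2)
```

The `resample` and `trim` steps would additionally compare the coordinate vectors within each dim_alias group, which this sketch leaves out.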
`resample` and `trim` functionality is already contained in the `get_weights` and `upscale` procedures. `broadcast` and `transpose` would require new procedures. The `concatenate` functionality should of course be preserved.

Issue #27: issue a warning if a transfer function created NaN due to numerical problems (Robert Schweppe, 2018-08-09)
https://git.ufz.de/chs/MPR/-/issues/27
This should be relatively easy to accomplish and could aid the user in tracing problems to the parameters, transfer functions, or input data. Ideally the warning is issued when stats are printed after the application of the transfer function (min, max, mean, number of NaN).

Issue #33: Improve speed (Robert Schweppe, 2021-11-12)
https://git.ufz.de/chs/MPR/-/issues/33
Currently the speed of MPR falls far behind that of its previous version that was part of the mHM code (ca. 50x for the test basin). There are some possible reasons for that:
* the soil parameters (now 50% of all calculations) of the previous mHM were determined based on a lookup-table with soil classes and their properties and determined for existing classes only (read in of integer ascii grid)
* the (spatial) upscaling did not rely on weights, low-res cell boundaries needed to align with high-res boundaries
* 2D-upscaling was done in one go, other dimensions did not have missing values and were always at first positions in order of dimensions
which contrasts with the new approach:
* we now do that fully distributed without prior classification (this also means, we read float arrays)
* we allow for non-aligned cell boundaries in high and low-resolution
* we allow for any order of dimensions and also allow missing values within all dimensions (e.g. a missing value in a soil horizon)
So the idea is to check the current code for bottlenecks and identify alternative algorithms. After that we might think about shortcuts that can be used when the data fulfill the requirements of mHM data (e.g. dimension order (x, y, remaining dims) and no missing values within an x,y-slice). I was wondering how @schaefed might help us here.
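To make the cost of the new approach concrete: weight-based upscaling with non-aligned cell boundaries amounts to an overlap-weighted average, as in this 1-D pure-Python sketch (a hypothetical helper, not the actual MPR routines):

```python
def upscale_1d(values, src_bounds, tgt_bounds):
    """Overlap-weighted average of source cells onto target cells.
    Bounds are ascending cell-edge coordinates; len(bounds) == len(cells) + 1."""
    result = []
    for lo, hi in zip(tgt_bounds[:-1], tgt_bounds[1:]):
        acc, wsum = 0.0, 0.0
        for v, (slo, shi) in zip(values, zip(src_bounds[:-1], src_bounds[1:])):
            w = max(0.0, min(hi, shi) - max(lo, slo))  # overlap width as weight
            acc += w * v
            wsum += w
        result.append(acc / wsum)
    return result

# four high-res cells of width 1 on [0, 4], upscaled to two target cells
print(upscale_1d([1.0, 2.0, 3.0, 4.0], [0, 1, 2, 3, 4], [0, 2, 4]))  # [1.5, 3.5]
```

With aligned boundaries the weights degenerate to equal fractions and can be skipped entirely, which is exactly the shortcut the old mHM code exploited.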
The scheme is currently as follows, for the example case `TargetVar = SourceVar1 + SourceVar2`:
1. read in SourceVar1 and SourceVar2
2. store the nDim-mask in a flat array, store the data as a flat, packed array, store the dimensions
3. calculate the temporary TargetVar (new data, use mask and dimensions of the SourceVars)
4. check the dimension order of TargetVar; if it differs from that of the SourceVars, transpose the temporary TargetVar
5. upscale each dimension individually (loop over the target dimensions):
a) check whether source dimension and target dimension differ (case 1: no difference -> continue; case 2: source dim not existing -> broadcast dimensions; case 3: difference -> upscale the dimension, see b))
b) compare the dimensions; for each target dim id, calculate the vector of source dim ids, the vector of weights for the source dim ids, and the number of subcells (if the same pair of dimensions was used before, skip this and reuse the previous information)
c) re-insert missing data into the 1D flat data array (so it has the same shape as the 1D flat mask), initialize an empty missing_value_weight array of the same shape
d) loop over the n data chunks (n is the product of the dimension sizes after the current dimension)
e) reshape each chunk to 2D and transpose it (new shape is (size of current dimension, product of the sizes of the dimensions before the current one))
f) loop over the 2nd dim; the resulting vector is then a slice with the length of our current dimension
g) if this vector contains NaNs, recalculate the basic dimension weights, ids and subcells (see b)) and the missing_value_weight
h) loop over the target cells and apply the upscaling function, yielding the new data

Milestone: Next major release

Issue #50: add possibility to assign netcdf attributes to DataArrays and the file (Robert Schweppe, 2020-02-19)
https://git.ufz.de/chs/MPR/-/issues/50
Milestone: Next major release

Issue #51: Investigate usage of InputFieldContainer, why is it used, why only in TransferHelper and not in UpscaleHelper (Robert Schweppe, 2021-11-12)
https://git.ufz.de/chs/MPR/-/issues/51
This concerns API design, as it is not clear so far.

Issue #52: Allow floats or integers in 'transfer_func' directly (Robert Schweppe, 2020-02-25)
https://git.ufz.de/chs/MPR/-/issues/52
Instead of:
```
&Data_arrays
transfer_func(1) = "p1 + p2 * x1"
from_data_arrays(1) = "x1"
from_parameter_values = "1.2", "-0.5"
```
allow
```
&Data_arrays
transfer_func(1) = "1.2 - 0.5 * x1"
from_data_arrays(1) = "x1"
```
Milestone: Next major release

Issue #53: add check for ascending order of coordinate values (Robert Schweppe, 2020-03-04)
https://git.ufz.de/chs/MPR/-/issues/53
When initialized from a namelist, we need a sanity check that the coordinate values are strictly ascending.
Milestone: Next major release

Issue #56: add templates for transfer function and upscaling operator (Robert Schweppe, 2021-11-12)
https://git.ufz.de/chs/MPR/-/issues/56
This shall simplify the process of defining custom functions and operators for users. Maybe we can even provide an interface to Rosetta-type TFs ([Rosetta3](http://www.u.arizona.edu/~ygzhang/rosettav3/) via [Python2Fortran](https://www.noahbrenowitz.com/post/calling-fortran-from-python/))?
Milestone: Next major release

Issue #57: configuration file for namelist (Stephan Thober, 2021-11-12)
https://git.ufz.de/chs/MPR/-/issues/57
Writing the mpr namelist is very tedious. It would be nice to specify the objects independently in a markup language like yml. Something like:
```
L1_fieldcap:
  from_file = false
  transfer_func = 'x + a * y'
  from_data_arrays = 'x', 'y'
```
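Such a generator could be sketched as follows (hypothetical Python; real MPR namelists use 2-D indices such as `from_data_arrays(1:3,4)` for multi-valued entries, which this sketch simplifies to a flat 1-D index):

```python
def to_namelist(arrays):
    """Render a dict of data-array definitions as an &Data_Arrays namelist."""
    lines = ["&Data_Arrays"]
    for i, (name, opts) in enumerate(arrays.items(), start=1):
        lines.append(f'name({i}) = "{name}"')
        for key, val in opts.items():
            if isinstance(val, bool):
                rendered = ".true." if val else ".false."   # Fortran logicals
            elif isinstance(val, (list, tuple)):
                rendered = ", ".join(f"'{v}'" for v in val)  # value lists
            else:
                rendered = f"'{val}'"
            lines.append(f"{key}({i}) = {rendered}")
    lines.append("/")
    return "\n".join(lines)

print(to_namelist({"L1_fieldcap": {
    "from_file": False,
    "transfer_func": "x + a * y",
    "from_data_arrays": ["x", "y"],
}}))
```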
From this yml file, the mpr namelist is then automatically generated.

Issue #58: use from_file and from_data_arrays simultaneously for transferhelper (Robert Schweppe, 2020-04-22)
https://git.ufz.de/chs/MPR/-/issues/58
Make the following configuration possible:
```
name(1) = "my_data_array"
from_file(1) = "file.nc"
transfer_func(1) = "my_data_array * other_data_array"
from_data_arrays(1,1) = "other_data_array"
```

Issue #59: get rid of Warnings concerning hard-coded transfer functions (Robert Schweppe, 2021-11-12)
https://git.ufz.de/chs/MPR/-/issues/59
Remove the warning:
```
'The transfer function name "', trim(name) , '" is not registered.'
```
Also, we need to improve the message about the delete string:
```
log_warn(*) "delete string is not empty, returning input string: " // trim(funcString)
log_warn(*) " detete string = " // trim(deleteString)
```
It is a common warning/error and should be more helpful to the user.
Milestone: Next major release

Issue #61: fully support CFConventions (Robert Schweppe, 2020-05-20)
https://git.ufz.de/chs/MPR/-/issues/61
We should support the [CF conventions](http://cfconventions.org/Data/cf-conventions/cf-conventions-1.7/cf-conventions.html#missing-data). This shall include adding attributes to each produced file:
- global attributes:
- `Conventions` = "CF-1.7"
- `title` = A succinct description of what is in the dataset. -> read from namelist, issue warning if not present
- `institution` = Specifies where the original data was produced. -> read from namelist, issue warning if not present
- `source` = The method of production of the original data. If it was model-generated, source should name the model and its version, as specifically as could be useful. If it is observational, source should characterize it (e.g., "surface observation" or "radiosonde"). -> MPR version, and path to namelist
- `history` = Provides an audit trail for modifications to the original data. Well-behaved generic netCDF filters will automatically append their name and the parameters with which they were invoked to the global history attribute of an input netCDF file. We recommend that each line begin with a timestamp indicating the date and time of day that the program was executed. -> left blank
- `references` = Published or web-based references that describe the data or methods used to produce it.
- `comment` -> MPR paper
- data variable attributes (only for those that are written to file):
- `units` -> read from namelist, issue warning if not present
- `long_name` -> read from namelist, issue warning if not present
- `standard_name` -> read from namelist, issue warning if not present
- `_FillValue` -> already there
- `scale_factor` -> automatically, if user specifies a target dtype in namelist
- `add_offset` -> automatically, if user specifies a target dtype in namelist
- enforce relative order of T, then Z, then Y, then X coordinates in the CDL definition corresponding to the file, issue a warning if not compatible
- coordinate variable attributes (only for those that are written to file):
- see [here](http://cfconventions.org/Data/cf-conventions/cf-conventions-1.7/cf-conventions.html#_data_representative_of_cells)
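The "issue warning if not present" checks could be prototyped independently of the NetCDF layer; in this pure-Python sketch only the attribute names come from CF-1.7, everything else (helper name, message wording) is assumed:

```python
# required attributes per CF-1.7 as listed in this issue
REQUIRED_GLOBAL = ("Conventions", "title", "institution", "source")
REQUIRED_VARIABLE = ("units", "long_name", "standard_name", "_FillValue")

def missing_attrs(attrs, required):
    """Return the required attribute names that are absent from `attrs`."""
    return [name for name in required if name not in attrs]

global_attrs = {"Conventions": "CF-1.7", "source": "MPR, path/to/mpr.nml"}
for name in missing_attrs(global_attrs, REQUIRED_GLOBAL):
    print(f"warning: global attribute '{name}' not set in namelist")
# warning: global attribute 'title' not set in namelist
# warning: global attribute 'institution' not set in namelist
```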
Milestone: Next major release

Issue #63: parse line breaks correctly in transfer func in mpr.nml (Robert Schweppe, 2021-11-12)
https://git.ufz.de/chs/MPR/-/issues/63
This applies to the python script and the preprocessor (replace `\n` by ` `).

Issue #67: ignore order of data arrays in namelist (Robert Schweppe, 2021-11-12)
https://git.ufz.de/chs/MPR/-/issues/67
The data arrays are read from the namelist and the DataArray and TransferHelper types are initialized in a loop there. If a DataArray references another DataArray with a higher index in the list, an error `Data array '<dataarray2>' is not found in available data arrays. Add the data array in the file mpr.nml for DataArray 'dataarray1'` is raised.
It would be good to tolerate that, e.g. do not call `check_and_set_input_field_names` during init, but only after all data arrays are read. Alternatively, set a flag in case a data array is missing, then check for the flag after all data arrays are initialized and resolve the reference again.

Issue #68: append to existing netcdf file (Robert Schweppe, 2021-11-12)
https://git.ufz.de/chs/MPR/-/issues/68
There is a need to append DataArrays to an existing NetCDF file, with the use case of appending to an mHM restart file as needed for the FSO project.

Issue #74: Addition to parameters: Switches and Cases (Sebastian Müller, 2021-11-12)
https://git.ufz.de/chs/MPR/-/issues/74
As we have seen while converting the option "Feddes and global FC dependency on root fraction coef. at SM process(3)=4 (!43)" to the new MPR style, it would be nice to be able to pass switches (logical) and/or cases (integer) to `mpr_read_config`.
This way, we don't need to encode a switch or a case as a float value.

Issue #76: error message is misleading if parameters are not given (Stephan Thober, 2021-12-08)
https://git.ufz.de/chs/MPR/-/issues/76
If parameters that are used in a transfer function are not given, MPR states that the transfer function is not registered, but that is not the case. Instead, it should state that the parameter is not provided.