Subject: model sensitivity analysis Posted: 2/20/2014
I was calibrating my model using observed and simulated water levels of a reservoir.
I have a catchment whose run-off is routed to a reservoir, and the reservoir in turn supplies water to Eldoret town. So I was calibrating the model by comparing the observed reservoir levels (which I know) against the simulated reservoir levels (I have to convert simulated volume into reservoir level) so that the two can be compared directly.
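The volume-to-level conversion mentioned above can be sketched as interpolation on the reservoir's storage-elevation curve. A minimal sketch, assuming a hypothetical rating curve (the volume and level values below are made up for illustration; a real curve comes from a bathymetric survey):

```python
import numpy as np

# Hypothetical storage-elevation curve: cumulative volume (m^3)
# versus water level (m). These numbers are illustrative only.
volumes_m3 = np.array([0.0, 1e6, 3e6, 6e6, 1e7])
levels_m = np.array([0.0, 4.0, 9.0, 13.0, 16.0])

def volume_to_level(volume_m3):
    """Linearly interpolate a simulated volume onto the rating curve."""
    return float(np.interp(volume_m3, volumes_m3, levels_m))

print(volume_to_level(2e6))  # a volume between two curve points
```

Each simulated monthly volume would be passed through a function like this before comparison with the gauged levels.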
In calibration, the land-use factors (effective precipitation and ETc factors) were modified. Effective precipitation was reduced by 10 to 20% by trial and error; this is the percentage of precipitation available for evapotranspiration (the remainder becomes direct runoff). Each month's value was varied when doing the sensitivity analysis.
The ETc values (monthly evapotranspiration for a reference land class) in the catchment, under the climate input in the model, were also modified by trial and error during the sensitivity analysis.
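The one-at-a-time perturbation described above can be sketched as a small loop: run the model at a baseline, nudge one factor by ±10-20%, and check whether the simulated output actually responds. This is a sketch, assuming a hypothetical `run_model` function (the toy stand-in below is not the real catchment model):

```python
# Toy stand-in for the real runoff model: effective precipitation is
# the fraction of precipitation consumed by ET; the remainder runs off.
def run_model(eff_precip_factor, etc_factor):
    precip = [120.0, 80.0, 60.0]  # made-up monthly precipitation, mm
    runoff = [(1.0 - eff_precip_factor) * p for p in precip]
    # Crude ETc scaling, purely illustrative
    return [r * (1.0 - 0.1 * etc_factor) for r in runoff]

baseline = run_model(eff_precip_factor=0.8, etc_factor=1.0)

# One-at-a-time sensitivity: perturb effective precipitation by +/-10-20%
for delta in (-0.2, -0.1, 0.1, 0.2):
    perturbed = run_model(eff_precip_factor=0.8 * (1.0 + delta), etc_factor=1.0)
    changed = any(abs(a - b) > 1e-9 for a, b in zip(baseline, perturbed))
    print(f"delta={delta:+.0%}  output changed: {changed}")
```

If a perturbed run is identical to the baseline, that parameter is not reaching the simulated output, which is exactly the symptom described below.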
The problem I am experiencing: when I run the model, the simulated reservoir volumes (in m3) are all equal, even if I change the model parameters to zero (both parameters or just one). My supervisors do not understand what might be wrong.
Subject: Re: model sensitivity analysis Posted: 2/20/2014
The reservoir level depends not just on inflow from catchment runoff but also on the reservoir operating rules, including the Top of Conservation and Top of Buffer levels. Look at the inflow to the reservoir: does it change when you change the parameters of the runoff model?
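The diagnostic suggested above can be made concrete: export the inflow series from two runs with different runoff parameters and compare them. A minimal sketch, assuming the inflow values have been read from the model's results (the numbers below are hypothetical):

```python
# Compare two exported inflow series (m^3/s). If they are identical,
# the parameter change never propagated to the reservoir inflow, which
# would explain identical simulated reservoir volumes.
def series_identical(a, b, tol=1e-9):
    return len(a) == len(b) and all(abs(x - y) <= tol for x, y in zip(a, b))

run_a = [5.2, 4.8, 6.1]  # made-up inflows, parameter set A
run_b = [5.2, 4.8, 6.1]  # made-up inflows, parameter set B
print("inflow unchanged across runs:", series_identical(run_a, run_b))
```

If the inflow does change but the volumes do not, the operating rules (or a full/spilling reservoir) are masking the signal; if the inflow itself never changes, the parameter is not linked into the runoff calculation.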