Subject: Groundwater Recharge in the MABIA model and the f-factor Posted: 1/31/2018
I have a question regarding the MABIA module. I set up a test catchment called 'MABIA' with two subbranches: one with fallow land use all year, and one where sugar beet is planted in November and millet in July.
Since I am mostly interested in natural recharge, I did not set the agriculture branch to irrigated for now. All other settings are identical for both branches.
The actual catchment of concern lies close to Jaipur in northwestern India and has an arid climate. The mean annual recharge in that area has been estimated at around 70-100 mm/year, and surface runoff in the catchment is zero.
If I run the model, MABIA calculates an annual recharge of 67 mm for the fallow branch, which is close to that estimate.
However, if I plant _any_ crop in the other branch, no groundwater recharge whatsoever occurs in that branch.
Since I left Fraction Covered greyed out, it should be calculated automatically from growth stage, plant type, and so on. However, when I read Fraction Covered from the model via WEAP's API (using `.ResultValue` to retrieve all values of the last simulation year), it reports a constant zero throughout the year.
This happens no matter which crop I plant in that branch. At the same time, WEAP reports significant transpiration for that branch, although transpiration should be essentially non-existent if the fraction covered were really zero. The same thing happens if I manually enter very small constant fc values such as 1e-4.
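For reference, this is roughly how I query the series. It is only a minimal sketch: the `WEAP.WEAPApplication` ProgID and the `ResultValue(path, year, timestep)` call follow the WEAP API documentation as I understand it, while the branch path, year, and timestep count are placeholders for my own setup, not values anyone else should rely on.

```python
# Sketch: read one MABIA result series via WEAP's COM API (Windows only,
# needs pywin32 and an installed WEAP). Paths/years below are placeholders.

def is_constant_zero(values, tol=1e-12):
    """True if every value in the series is numerically zero."""
    return all(abs(v) <= tol for v in values)

def read_last_year_series(weap, branch_path, year, timesteps=12):
    """Read one variable for every timestep of a given year via ResultValue.
    `timesteps` defaults to 12 (monthly); adjust to your model's timestep."""
    return [weap.ResultValue(branch_path, year, ts)
            for ts in range(1, timesteps + 1)]

# Usage on a machine with WEAP installed (path is a placeholder):
#   import win32com.client
#   weap = win32com.client.Dispatch("WEAP.WEAPApplication")
#   fc = read_last_year_series(
#       weap, r"Demand Sites and Catchments\MABIA\Agriculture:Fraction Covered",
#       2017)
#   print(is_constant_zero(fc))   # comes out True for me, which seems wrong
```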
Shouldn't the land cover in that case behave identically to fallow and hence produce the same results?
If I instead set the fraction covered to a constant value of ~0.9, I do get recharge -- but shouldn't that recharge then be consumed by the much higher transpiration?
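To make my expectation explicit, here is the water balance I have in mind (a sketch in my own shorthand, not WEAP variable names; MABIA is, as I understand it, based on the FAO-56 dual crop coefficient):

```latex
% Simple root-zone water balance (surface runoff = 0 in this catchment):
R \approx P - ET_{act} - \Delta S
% Dual crop coefficient split (FAO-56), on which MABIA is based:
ET_c = (K_{cb} + K_e)\, ET_0
% A larger fraction covered raises K_{cb} (transpiration), so with fixed P
% I would expect a *smaller* recharge R for fc = 0.9, not a larger one.
```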
Here you can find plots for the different variables for the last simulation year and a table with annual summaries of the respective variables: