The current metric for assessing water scarcity in Africa should be abandoned in favour of one tailored to local realities, argues Richard Taylor.
Ensuring that people have access to safe and affordable water supplies will be a major challenge for many developing countries as populations grow and agriculture and industry expand over the coming decades.
One of the greatest threats to water security is simply scarcity, where demand outstrips supply. This can result from geography, overexploitation or inadequate infrastructure, and climate change will put further pressure on the availability of freshwater in many parts of the world.
Nations or regions suffering from scarcity can achieve water security through a dual strategy: increasing water availability (for example through improved water storage, wastewater recycling or desalination) and cutting demand (for example through more efficient irrigation or food imports).
Failure to do so can have devastating consequences. Water shortages can ruin crops, increase the incidence of water-borne and water-related diseases, and damage aquatic ecosystems (the shrinking of Lake Chad being a prime example). Water scarcity can also fuel civil strife.
Policies that will help to avert these crises require robust ways of measuring water scarcity to target areas most at risk. But predicting water scarcity, particularly while the climate is changing, is difficult and at present the main method for assessing shortages — the so-called ‘water stress index’ — is not up to the task.
Using this metric, researchers estimate that by 2025, the number of water-scarce nations in Sub-Saharan Africa will increase from 14 to 25, affecting almost half of the region’s projected population of 1.4 billion people.
But such assessments can misrepresent the scale and nature of the water crisis on the continent because the water stress index overestimates water demand and misrepresents freshwater supply.
For example, the water stress index assumes domestic consumption is about 40 cubic metres per person a year. Agricultural and industrial water demand combined is assumed to be 20 times that amount.
These assumptions are based on lifestyles in the developed world and do not reflect the reality of water use in Africa. Actual domestic and industrial water use combined on the continent is closer to 25 cubic metres per person a year — about 10 per cent of the combined average in Europe and 6.5 per cent of the combined average in North America.
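The gap between these assumptions and reported use can be laid out in a short back-of-the-envelope calculation. The sketch below uses only the figures quoted above; the European and North American averages are back-calculated from the 10 per cent and 6.5 per cent shares and are therefore approximate, not independently sourced.

```python
# All values in cubic metres per person per year; illustrative only.
DOMESTIC_ASSUMED = 40                      # water stress index assumption
AG_IND_ASSUMED = 20 * DOMESTIC_ASSUMED     # agriculture + industry: 20x domestic
index_total = DOMESTIC_ASSUMED + AG_IND_ASSUMED

ACTUAL_DOM_IND = 25   # reported actual domestic + industrial use in Africa

# Implied combined averages, back-calculated from the article's percentages
# (approximate assumptions, not measured figures).
europe_avg = ACTUAL_DOM_IND / 0.10         # actual use is ~10% of this
north_america_avg = ACTUAL_DOM_IND / 0.065 # actual use is ~6.5% of this

print(f"Index-assumed total demand: {index_total} m3/person/yr")
print(f"Reported domestic + industrial use: {ACTUAL_DOM_IND} m3/person/yr")
print(f"Implied European average: {europe_avg:.0f} m3/person/yr")
print(f"Implied North American average: {north_america_avg:.0f} m3/person/yr")
```

Even the index's domestic figure alone (40 cubic metres) exceeds the reported combined domestic and industrial use (25 cubic metres), before the twenty-fold agricultural and industrial multiplier is applied.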
Furthermore, irrigated agriculture accounts for more than two-thirds of all global freshwater withdrawals whereas more than 95 per cent of all food production in Sub-Saharan Africa is rain-fed. Despite calls for substantial increases in irrigated agriculture to achieve food security in Sub-Saharan Africa, it is unclear where this water will come from — especially while the environment and climate are changing.
So, actual irrigation water demand in Sub-Saharan Africa is, and will likely remain, a small fraction of that used in countries such as Australia, China, India and the United States.
… and underestimating supply
On the supply side, freshwater availability in the water stress index is calculated from observations and simulations of average river flow — the mean annual river runoff (MARR). But river runoff in Africa can often change dramatically from one season to the next.
Indeed, Sub-Saharan Africa is home to some of the most variable rivers in the world, where dry-season flows may be zero or only a tiny fraction of rainy season flows.
Such variability is only expected to increase as a result of climate change. The use of MARR to estimate water availability in the water stress index ignores this variability, despite its critical importance in planning water security.
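The way an annual mean can hide a seasonal shortage is easy to demonstrate. The monthly flows below are hypothetical numbers invented for illustration, not observations of any river; the point is that the same MARR is consistent with months of zero flow.

```python
# Hypothetical monthly flows for a highly seasonal river (m3/s).
# These are made-up illustrative values, not measurements.
monthly_flow = [0, 0, 2, 15, 60, 120, 150, 110, 45, 10, 1, 0]

# Mean annual river runoff: the statistic the water stress index relies on.
marr = sum(monthly_flow) / len(monthly_flow)
dry_season_min = min(monthly_flow)

print(f"MARR: {marr:.1f} m3/s")                      # average looks ample
print(f"Dry-season minimum: {dry_season_min} m3/s")  # yet flow drops to zero
```

A metric built on the mean alone would rate this river the same as one delivering a steady 43 cubic metres per second all year, even though for three months it provides no water at all.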
Another significant problem with the use of MARR to estimate water availability in Sub-Saharan Africa is that it excludes soil moisture derived from rainfall. But in a region where almost all agriculture is rain-fed, soil moisture provides the single largest source of freshwater for food production.
Towards a new metric
We need to abandon the water stress index as a way of defining water scarcity and insecurity in Sub-Saharan Africa, as it fundamentally misrepresents current and future water crises in the region.
Otherwise, there is a serious risk that precious, limited resources will be wasted on policies that address water 'scarcity' in regions where freshwater demand has been grossly overestimated, such as the Upper Nile basin, while overlooking genuine water scarcity in regions where mean river flows mask seasonal shortages, such as the headwaters of the Limpopo.
A new metric is urgently required that is tailored to the local realities of water demand in Sub-Saharan Africa and accounts for both soil moisture and a region’s seasonal variability in freshwater resources.
There is some progress in this direction. Through research in East Africa, a multidisciplinary team under the UK government’s Ecosystem Services and Poverty Alleviation (ESPA) programme is developing a metric that considers seasonally varying demand, soil moisture and freshwater storage to better inform water security planning in Sub-Saharan Africa.
It is with such metrics that policymakers can design effective, targeted strategies to address actual water scarcity today and in the uncertain future.
Richard Taylor is a reader in hydrogeology at the department of geography, University College London, United Kingdom.