Irrigation can unquestionably be a boon for populations living in arid regions, helping them grow food, but water and the pests that can live within it may bring diseases such as malaria. Now details from a 10-year study reported this week in the Proceedings of the National Academy of Sciences reveal the pros and cons of irrigation in a semidesert area of India, suggesting that any similar irrigation plans need to consider their impacts on health over the long term.
Hundreds of millions of people in the arid region near the border of India and Pakistan depend on the southwestern monsoon for their survival — its regular failure over the centuries has doomed many to drought and famine. The fact that India’s population continues to grow adds further incentive to boost irrigation.
However, irrigation creates standing bodies of water that can last for long periods of time and harbor mosquitoes. Indeed, about 800 million people worldwide may be at risk for contracting malaria due to proximity to irrigation, a number accounting for roughly a sixth of the global malaria burden.
Still, irrigation is not always bad news for health. For instance, while the arrival of irrigation in the latter half of the 19th century initially led to a dramatic spike in malaria in the Punjab region, the area has been a productive farming region since 1947 and now has a low prevalence of malaria. Irrigation provides a pathway to prosperity that over long time scales is expected to counteract some of the negative effects of malaria.
“There is a tension between these two effects of irrigation development in arid regions, and their net effect is very relevant to the question of how long does it take to go over the ‘hump’ of worsening the malaria situation on the road to improving it, especially because they potentially operate over very different time scales,” says mathematical and computational ecologist Mercedes Pascual at the University of Michigan.
To learn more about the effects of irrigation on disease, Pascual and her colleagues analyzed data from a long-term malaria surveillance program operating in a semidesert area of northwestern India. A gigantic irrigation project is underway there that is expected to cover more than 1.9 million hectares and benefit about 1 million farmers.
The scientists combined satellite imagery of vegetation cover with public health records of malaria cases across a large region to track changes as the irrigation project advanced. They found that expanding irrigation increased the risk of malaria in places adjacent to the main branches of the irrigation network, despite intensive and costly spraying of insecticides. This increased risk lasted for more than a decade, longer than previous research had suggested it would.
Intriguingly, “this heightened malaria risk around the main branches of the irrigation project is found directly adjacent to a region with very little malaria, which has been irrigated for a much much longer time and is therefore into what we can call a sustainable state of low malaria, requiring little control despite malaria being present next door,” Pascual says.
Given the magnitude of these projects, the researchers suggest that increased health costs have to be planned for over a long time horizon. The battle against malaria “will not be won with a silver bullet, but with a combination of approaches that include public health, medical, environmental and socio-economic improvements,” Pascual says.
A number of environmental methods for sustainable disease control, such as intermittent irrigation and periodic flushing of canals, have apparently proved effective, affordable and feasible to carry out at the local level. The challenge ahead, then, is to apply these methods over extensive regions and long time spans, Pascual says.