Everyday UFOs: the dark side of drones in development
- Drones and AI can cause stress if communities are not aware of their purpose
- Lack of legislation leaves developing countries vulnerable to data abuse
- Researchers urged to consider ethical implications of AI tools before deployment
They can look like toy planes or alien spacecraft. They make a buzzing sound so annoying they have been banned from American national parks. At best, they combat deforestation and deliver medicines. At worst, they fire missiles.
Drones are becoming ubiquitous and, while they may be controversial, their contribution to science cannot be denied. The small unmanned aircraft, flown remotely by a person on the ground, can get to previously inaccessible places. They use artificial intelligence (AI) to sense underground water, analyse crop health and assess the impacts of natural disasters.
To do so, they gather data, often in the form of high-resolution images. However, to take these images, they fly at low altitudes over the heads of people who may not have been consulted about their use, and therefore may not agree with their presence. While their contribution to development science has been widely lauded, little research has been done on what impact drones and other artificial intelligence equipment have on the communities they are meant to help.
One person who studies the issue is Chris Sandbrook, a senior lecturer in conservation leadership at the University of Cambridge in the United Kingdom. He has found evidence that the presence of drones can create conspiracy theories and suspicions if people are not consulted before their use, thereby having a negative impact on psychological wellbeing.
“People perceive them as having a military association,” he says. “They feel fear about what authorities might know, and what they might do to them.”
Sandbrook uses forest conservation as an example, a field in which it is common to use drones or hidden cameras to monitor wildlife and tree health. He has seen evidence that local people spend less time in forests once they know that camera traps or drones are present, because they feel compromised even if they are doing nothing wrong.
Unfortunately, he says, the ethics of drone and AI use in development research are not discussed as much as they should be. “A lot of researchers are well aware that they cause conflict, because their devices get attacked or vandalised,” he says. “But it is not considered to be a relevant topic for scientific publications.”
Flying responsibly

Drones are now so sophisticated, sturdy and adaptable they are becoming commonplace in development settings. In March 2016, the UN’s children’s emergency fund Unicef started a field test to use drones to deliver medicines in Malawi. Earlier this year, the World Economic Forum in Davos debated a report on how drones can deliver goods to remote places, thereby opening new markets. The South African government uses drones to hunt down poachers, while the Philippines monitors the condition of rural roads through drone imagery.
One company riding the trend is Global Partners, based in Cotonou, Benin. The West African firm provides high-resolution images of land to clients such as farmers, businesses and even the Benin government. Its founder, Abdelaziz Lawani, has seen first-hand the positive impact that drones can have. He and his team undertook a study in which they provided a group of farmers with images detailing the size and layout of their farms, and areas of crop stress.
“Farmers who received those services improved productivity and yield,” says Lawani.
He says his team takes care to consult extensively with communities before using drones, often seeking additional approval locally once a government permit to fly research drones has been obtained. “We’ve never had a case where people were reluctant,” he says, adding that most farmers were so delighted to have a photo of their property they would hang it in their house, “like a picture of your family”.
“But we know that they are excited because they know that the drone taking those pictures is flying with responsibility,” Lawani explains.
The responsibility that comes with operating drones does not just cover community outreach, it also extends to what data is collected and how it is used. Drones can take high-resolution images of identifiable people, while other AI applications, especially those used on phones, gather highly personalised data about the user.
Code of conduct

Developing countries are waking up to the potential dangers of this. In May this year, 42 countries signed up to the OECD Principles on Artificial Intelligence, which act as a set of guidelines to develop national standards on data handling and protection. However, regionally, legislation still varies wildly and, in some cases, is not well implemented, says Effy Vayena, who researches health ethics at ETH Zurich, a university in Switzerland.
“One of the ethical questions around artificial intelligence is the fact that it’s not inclusive in terms of who develops it,” she says. “It’s mostly white men in big companies. But countries have different needs and vulnerabilities. If they are not part of the conversation, we can only speculate about their priorities and values.”
Vayena sees a particular risk in fast-developing countries that need widespread internet access quickly, whose legislators might sign deals detrimental to the people who end up using these services. “A lot of big companies are expanding in these countries in return for their data,” she says. “We need political will at country level, international support and more responsible behaviour from companies — otherwise it will be some other kind of colonialism.”

Sandbrook would also like to see more responsibility from the research community, including better planning for how data gathered by drones is handled and stored. One example he mentions is what happens if a researcher accidentally captures data on illegal activity. “If you get a video of someone trying to kill elephants, do you go to the police? Do you delete the video? These conundrums should be addressed in advance, not as and when they happen.”
For Lawani, such planning is essential to the reputation of his company. Global Partners operates on a strict code of practice, with stringent rules on what data the company will or will not provide. These include refusing to supply images at a resolution high enough to identify people or reveal what they are doing, or any other data that would compromise people’s privacy.
But the company still faces difficulties in expanding abroad, as legislation varies wildly across the African continent and the rest of the world, and keeps changing. Some countries, such as Kenya, banned drones after previously supporting their use, while others charge up to US$6,000 a year for a licence. Uncertainty about their legal status has stifled conservation projects in Mozambique and India.
For Lawani, however, some regulation, even if overly strict, is better than none at all. “Countries without regulation are basically guinea pigs, and that is very negative,” he says. “We know that drones can contribute to the development of our nations, but most countries, at present, have no way to control their use.”