The World Wide Web's inventor wants to make websites more trustworthy. This should be done by encouraging good practice, not imposing strict rules.
Ever since the World Wide Web was made publicly available in 1991, people have worried about how much it can be trusted. This is primarily because of its deliberate lack of editorial control, which means that inaccurate or misleading information can be distributed and accessed as easily as information that is reliable.
Last week, Tim Berners-Lee, the British computer scientist who invented the Web as a way of communicating with professional colleagues around the world, voiced such concerns. His remarks came at the launch of the World Wide Web Foundation, a new organisation that will explore how his invention can improve the lives of individuals around the world.
Berners-Lee said that he was particularly keen for the foundation to make the Web more useful to people living in what he described as "underserved communities". For example, the Web could provide information leading to better health care.
He also emphasised the need to minimise the spread of potentially dangerous or misleading information. For example, he said it was important to make sure that people in developed and developing economies alike "can distinguish reliable healthcare information from commercial chaff".
Berners-Lee pointed out how the Web recently spread global speculation that the world could disappear into a black hole created when the Large Hadron Collider was switched on at the European Organisation for Nuclear Research (CERN), in Geneva.
This speculation led to some alarming headlines across the globe. But, apart from generating a certain amount of angst, there is no evidence that anyone was harmed. And the blame lies as much with irresponsible journalism as with the speed at which crazy ideas can be propagated.
Indeed, one of the virtues of the Internet is that official denials can be — and were — disseminated just as fast.
Similarly, there is little evidence that obscure cults, another of Berners-Lee's targets, are increasingly influential because the Web exists. Certainly, many such organisations are adept at using the Web to spread their ideas. But the invention of printing, many centuries ago, was used the same way.
Democratising the flow of ideas has inevitable hazards. But seeking to restrict this flow has even greater ones, as countries that respect the freedom of the press explicitly acknowledge. All this suggests that any attempt to regulate what can be read on the Web should be approached with extreme caution.
But there remain plenty of legitimate concerns. The problem is not the ease of access to information, but that Web users cannot always interpret this information and assess its validity. That is particularly dangerous when an individual or society uses the information to make important decisions that affect people's lives.
Take HIV/AIDS in South Africa, for example. It is widely thought that South Africa's reluctance to embrace anti-retroviral drugs stemmed largely from the views of the country's recently unseated president, Thabo Mbeki, who argued that scientists had not "proved" that AIDS was caused by HIV.
Less well known is that Mbeki developed his ideas while surfing the Internet. That was how he became aware of a small group of dissident US scientists who disputed the consensus view on HIV/AIDS. It was sufficient to convince him that the scientific community was split on the issue, and led to delays in implementing aggressive anti-HIV/AIDS policies. The delays probably cost hundreds of thousands of lives.
Other consequences of using misinformation on the Web may be less dramatic. But they all tend to endorse Berners-Lee's call for ways to help distinguish what can be trusted and what cannot, particularly when life or death may be at stake.
Most of the responsibility, of course, lies with those who use the Web. Yet urging general caution won't work any better than saying, "Don't believe everything you read in the newspapers."
Readers need to develop their skill at recognising reliable information. And that can be a challenge: whom do you trust? Much effort has been expended on persuading the public that scientists are skilled at separating objective fact from subjective opinion, and can be relied on where such separation is important. But scientists may be discussing topics outside their own expertise, whether black holes or HIV/AIDS.
The solution does not lie in gagging those who misuse their scientific reputation, or even in placing warnings on their websites. Rather, people need to better appreciate how science works. Understanding the role of experiment, the significance of statistical evidence, and the importance of peer review helps people make informed judgements.
Websites such as SciDev.Net that deal with science-related issues have a responsibility to foster such skills. Not everything reported needs to be strictly 'scientific,' nor should reporters refrain from criticising scientists. But such websites need to reflect both the values and limitations of scientific evidence in their own coverage.
Much could be gained by creating an international network of websites committed to such principles. This would go some way towards achieving Berners-Lee's goals of a system of trustworthy sites. And it would use a 'bottom-up' approach, embedded in good practice and professional experience, rather than resorting to 'top-down' measures. Such measures should be distrusted when it comes to mass communication, whatever the technology involved.