How to question numbers
Linda Nordling on how questioning numbers in policy statements gives journalists credibility and clout — and reveals new stories.
Journalists are comfortable with words and can usually spot spin in speeches a mile away, yet when it comes to numbers, many develop a mental block. A budget sheet can give many journalists sweaty palms.
Reporting numbers badly means that copy suffers. Silly mix-ups between millions and billions and other basic errors damage the reputations of individuals and publications alike.
Journalists with poor numeracy fail to hold governments to account over dodgy statistics or unsupported rhetoric. And poor reporting encourages even well-meaning data producers to be selective about the numbers they publish. Why release the 'complete details' of the annual budget, they might think, if journalists will only misinterpret them?
But when journalists report numbers well, their stories gain depth, accuracy and influence. Overcoming your fear and distrust of numbers will make you a better journalist. And if, like many journalists in the developing world, you live in a country where the government is starting to pay attention to science and technology (S&T), you and your colleagues will need to step up your game to monitor whether ambitions are realistic and promises are kept.
This guide explains how to get to grips with numbers, focusing on budgets and on assessing the numerical goals announced for new government projects.
But I'm a science journalist…
You might expect science journalists to be better at handling numbers than your average hack. But even when reading a data-rich scientific paper, most of us rely more than we care to admit on scientists' conclusions, rather than cross-checking with the data.
That mindset serves journalists poorly when they write about government budgets and initiatives to build scientific capacity in their own countries, which is an increasingly important part of being a science journalist.
This is where journalists in Africa — my 'beat' — urgently need to improve in reporting numbers. And probably the same goes for many other developing, and developed, regions of the world.
Watch out for 'single numbers'
'Single numbers' are my biggest gripe in science for development reporting. Let's say a president has made a speech, announcing an unprecedented push for S&T in her country. An article soon hits the Internet quoting the president talking about S&T in rosy terms like 'crucial for development' and 'becoming a knowledge economy'.
Usually, there is a single headline number, such as "we will spend one per cent of GDP on S&T by 2015", or "we will spend US$30 million on science in the next budget year". These figures are easy enough for a journalist to pick up in the hubbub, and are pushed by press officers as flagship ambitions.
But what do they mean? Single figures float around copy like questions, begging to be answered. How much is one per cent of GDP, approximately, and what sort of percentage is being spent this year? Or how does US$30 million next year actually compare with current and previous years?
Single figures, especially when they are sold as ambitious targets set by governments, need other figures to provide context. One per cent might not be an increase at all. Ask press officers or officials for comparative data. Accepting figures uncritically is very poor journalism.
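That kind of context check is back-of-the-envelope arithmetic. As a sketch, here is what it might look like for the 'one per cent of GDP' pledge, with every figure invented for illustration:

```python
# All figures below are invented for illustration.
gdp = 50_000_000_000          # hypothetical GDP in US$
pledged_share = 0.01          # "one per cent of GDP on S&T"
current_spend = 480_000_000   # hypothetical current S&T spend in US$

pledged_spend = gdp * pledged_share
print(f"Pledge amounts to US${pledged_spend:,.0f}")
print(f"Current spend is already {current_spend / gdp:.2%} of GDP")
```

With these invented figures, the country already spends 0.96 per cent of GDP, so the headline pledge would barely be an increase at all.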
Do your sums
It may sound obvious, but simply adding things up, and then doing it again to make double sure, can avoid many number-related mistakes.
For example, if 37 per cent of university applicants are male and 62 per cent female, ask yourself about the missing one per cent. Is it a mistake? Did one per cent not answer the survey, or identify as neither male nor female? It's better to double-check your figures than to assume your readers won't spot the inconsistency.
And if you break down a budget, make sure the parts add up to the whole. Let's say the Bill & Melinda Gates Foundation announces US$20 million for hiccups research, and you report figures allocated to each country. If your breakdown adds up to US$18 million, readers will query the remaining US$2 million (and if the parts sum to over US$20 million, you will look foolish).
Check what you've missed. Is the omitted US$2 million allocated for non-research activities? If so, be clear. Your story will be more informative than the 'official line'.
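The check is simple enough to do on paper, but as a sketch of the hiccups-grant example (the country allocations are invented for illustration):

```python
# Country allocations below are invented for illustration.
headline_total = 20_000_000  # the announced US$20 million
breakdown = {
    "Country A": 7_000_000,
    "Country B": 6_000_000,
    "Country C": 5_000_000,
}

allocated = sum(breakdown.values())
gap = headline_total - allocated
if gap:
    print(f"US${gap:,} unaccounted for -- ask what it covers")
```

Here the parts sum to US$18 million, leaving the US$2 million gap that a reader, or a rival reporter, would spot.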
How big is that?
It often helps to put numbers in context. For instance, a 20-acre wind farm may be difficult to picture. Easier is to say that the farm will be the same size as 'about ten football pitches'.
When it comes to money, know the difference between a million (1,000,000) and a billion (usually 1,000,000,000, i.e. a thousand million, though in some usages it means a million million, so check your publication's convention). Try to express science funding in terms of what it might buy, particularly in local currency.
For example, a US$3 million grant may not sound like a lot in the context of a top US research university, which administers dozens, or even hundreds, of such grants each year. But at an African university such a grant could double the available research budget.
These basics make it easier to spot unrealistic announcements. South Africa's Department of Science and Technology, which bankrolls most of the country's research councils, has a total budget for 2012/13 of just under five billion rand (US$675 million). If you get a press release saying that "the University of the Witwatersrand has received five billion rand for water research", that is probably too much to be realistic; there might be a typo in the release, so it is worth checking.
Or if the African Union announces it will build 30 new nanotechnology centres for ten million rand, that is too little (at 330,000 rand per centre — the annual salary for one mid-career researcher).
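The division behind that sanity check takes one line, using the figures from the announcement above:

```python
# Back-of-the-envelope check on the announcement above.
total_rand = 10_000_000
centres = 30
per_centre = total_rand / centres
print(f"About {per_centre:,.0f} rand per centre")
```

Roughly 333,000 rand per centre: barely one researcher's salary, let alone a building and equipment.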
A little digging added some realism to Ethiopia's aim to produce 5,000 PhDs
Numbers that beg questions
Questioning numbers can expand a story and inspire more questions, not just about other numbers but about the story as a whole.
A few years ago, the government of Ethiopia announced a plan to produce 5,000 new PhDs over a decade. A little research showed the country had produced fewer than 100 PhDs in the previous 50 years. That illustrates the scale of the ambition, but raised the question of feasibility: clearly, Ethiopia did not have the capacity to train the new PhDs on its own.
It turned out that the government was not planning to use only local training capacity. In fact, the programme pivoted around a plan to invite foreign academics to teach and supervise the PhDs. Doing it all in-house would take 400 years, one Ethiopian academic confirmed.
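The scale of the ambition falls out of simple division, using only the figures in the story above:

```python
# Figures taken from the story above.
historical_phds = 100    # fewer than 100 PhDs...
historical_years = 50    # ...in the previous 50 years
target_phds = 5_000
target_years = 10

historical_rate = historical_phds / historical_years   # at most 2 a year
required_rate = target_phds / target_years             # 500 a year
print(f"The target needs {required_rate / historical_rate:.0f} times the historical output rate")
```

A target of 500 PhDs a year against a historical output of at most two a year is the kind of gap that demands an explanation, and the explanation (foreign supervisors) became the story.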
So checking figures gave the reader more information about Ethiopian science and put the government's ambitions in context.
Check others' work
Always check 'simplified' numbers in speeches or press releases.
For instance, if the government announces that it will double science spending in next year's budget, don't just report it. Ask for recent budget reports and check whether the claim holds up. If not, do your own sums and run them past a government official for a response.
And remember that speeches and press releases often show numbers in the best possible light. Does the 'doubling' fund higher salaries for ministry officials, rather than more research? Does it actually amount to much less when accounting for inflation?
Always remember inflation. A stable budget can actually hide a cut in real terms. Science, like everything else, gets a little more expensive every year. If US$20,000 funded a PhD last year, you might have to fork out US$21,000 this year. Science organisations that have flat budgets can often get into financial difficulties. And that could result in good stories.
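A minimal sketch of that real-terms effect, assuming a hypothetical five per cent inflation rate:

```python
# The 5 per cent inflation rate is an assumption for illustration.
last_year_cost = 20_000   # US$20,000 funded a PhD last year
inflation = 0.05

this_year_cost = last_year_cost * (1 + inflation)
real_value_of_flat_budget = last_year_cost / (1 + inflation)
print(f"This year the same PhD costs about US${this_year_cost:,.0f}")
print(f"A flat US${last_year_cost:,} budget is worth about "
      f"US${real_value_of_flat_budget:,.0f} in last year's money")
```

So a budget that looks 'stable' on paper has quietly lost around five per cent of its purchasing power.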
And check whether announcements are 'anticipated budgets', i.e. estimated costs, not secured money. If the money isn't in hand yet, where is it expected to come from?
Also, make a habit of checking figures from interviews if at all possible. Even the best spokesperson will make occasional mistakes. If you want to use a number in your story, ask your interviewee if they are certain of it, and if not, who could corroborate it.
Always check numbers you plan to use with interviewees
Trust the 'this doesn't make sense' feeling at the pit of your stomach and start asking questions. Perhaps the press release is wrong. Perhaps there are plans for raising the rest of the money that you've not been told about. Step back and ask "do things add up?".
Numbers can become angles
Diligently and thoughtfully engaging with numbers can identify new story angles, and even get you a scoop.
For instance, a government budget announcement might headline an increase in S&T spending. But spotting a cut in an interesting-sounding budget line should always prompt questions. It could mean job cuts or research funding being channelled from one area to another. Always look out for declining trends and ask an official or press officer to explain them to you.
Currency fluctuations also have implications. A country with a weakening currency could struggle to pay its dues to large international science collaborations. And a strengthening currency makes foreign grants worth less in local terms, creating potential problems for scientists who depend on such funding.
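To see why a strengthening currency shrinks a foreign grant, convert it at two exchange rates (both invented for illustration):

```python
# Exchange rates below are invented for illustration.
grant_usd = 1_000_000
rate_before = 10.0   # local units per US$ before the currency strengthened
rate_after = 8.0     # fewer local units per US$ after strengthening

local_before = grant_usd * rate_before
local_after = grant_usd * rate_after
shortfall = local_before - local_after
print(f"The same grant now converts to {shortfall:,.0f} fewer local currency units")
```

The dollar amount of the grant has not changed, but the lab's local buying power has dropped by a fifth.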
Going it alone
Editors will come in handy for checking your calculations
As journalists have less and less time to write their stories, it is tempting to cut corners by accepting numbers without questioning them. But once you start to engage critically with numbers you'll find yourself reporting the stories no one else has covered. And when you do, expect closer scrutiny.
If you mess up your calculations and run with a story based on bad maths then you might find government officials or donor press officers less than helpful in the future. But not investigating the numbers lets down your readers and also the taxpayers whose money bankrolls such initiatives.
So use editors and subeditors to help probe your conclusions; or just talk numbers through with a colleague.
The more you get used to working with numbers, the easier it becomes — and the better your stories will be.
Journalist Linda Nordling, based in Cape Town, South Africa, specialises in African science policy, education and development. She was the founding editor of Research Africa and writes for SciDev.Net, Nature and others.