
The Presentation of Statistical Information in the Broadsheet Press

John Martyn

The paper which follows is an overview of the way in which statistical information is presented in the broadsheet press. It is based on an analysis of the available Stats Watch data, a small survey among leading journalists (1), and press treatments of three monthly series of statistics issued by Government Departments. The three series were Labour Market Statistics (ONS), the Retail Prices Index (ONS) and NHS Hospital Waiting Lists (DoH). Each of these (the three series, the survey among journalists, and the analysis of Stats Watch data) was reported on separately to the RSS. This paper draws on those reports, discusses some of the more important findings, considers their implications and makes suggestions designed to be helpful. It also makes use of incidental, yet relevant, knowledge acquired over the Stats Watch years.

The term Stats Watch was coined to describe the monitoring of press comment on official statistics appearing in the broadsheet national press. The monitoring was made possible thanks to an initial grant from the Joseph Rowntree Charitable Trust. Over a period of just over three years (April ’95 to June ’98) monthly bulletins were issued to some 40 subscribers and other interested parties. These bulletins drew attention to, and reproduced, newspaper cuttings which were judged to be of particular interest to those who seek to improve the quality and availability of official statistics in the UK.

The national press

The importance of national papers in drawing attention to matters of public concern is immense and their role is a vital one. Average issue readership of the dailies ranges from just under ten million for the Sun to about 650,000 for the Financial Times (2).

The most successful daily broadsheet papers, based on circulation and readership, are the Daily Telegraph and Times on weekdays, and the Sunday Times and Sunday Telegraph on Sundays. However, when it comes to comprehensive reporting on and discussion of important official statistics, the Financial Times, Times, Guardian and Independent on weekdays, and the Sunday Times, Observer and Independent on Sunday at weekends, are of more consequence. To these should be added the London Evening Standard (Monday - Friday) and the Economist. The Evening Standard is important because, whilst not a national paper, it has a substantial readership in London and the South East and is often the first newspaper to carry news from, and thoughtful comment on, press releases issued earlier that day. The Economist, alone among weeklies, provides authoritative comment and occasional trenchant criticism of important economic and social statistics.

The preceding comments are supported by detailed coding and analysis of the 625 items selected from the papers for inclusion in the Stats Watch bulletins over a three-year period.

Leading journalists

The first listing of leading commentators on official statistics was based on the last 12 months of Stats Watch data (i.e. July ’97 - June ’98). This was then checked against the names of those journalists who had commented in late ’98 and early ’99 on the case study material. This comprised six issues of Labour Market Statistics and two each dealing with the Retail Prices Index and NHS Hospital Waiting Lists. Using both listings, some 20 leading journalists were invited to complete and return a short self-completion questionnaire and, if possible, to grant a short subsequent personal (phone or face-to-face) interview; 12 did so. This relatively good response rate (after one reminder) ‘spoke well’ for journalists’ interest in the inquiry.

This goodwill towards the monitoring of press comment on official statistics, and towards the inquiry, was also evident in conversations with the journalists. It is worth noting, and worth considering how it might be further exploited in the interest of better statistics. In particular there are lessons for those who want their statistics to be sympathetically received and reported on. One caveat, however: when speaking to leading journalists it was found that they expect their ‘contact’ to be familiar with their latest writing. They also change jobs and move around quite frequently.

The questions asked

Asked about the main components of good practice in the presentation of statistics, seven of the 12 journalists said they should be ‘clear’, with a further two saying ‘clear labelling’. Next came ‘source’ or basis with five mentions, accuracy (including honesty and reliability) with four, and ‘time’ (period) or ‘timeliness’ with three.

Another question asked the journalists to rate five different types of statistics in respect of how they are presented to them. The five types were: official statistics, market research, social research, opinion surveys and medical research. Best rated were official statistics, with three journalists saying ‘very good’ and eight saying ‘good’. Two said ‘not so good’.

Next, a long way behind, came market research and social research. For market research, one journalist said ‘very good’ and three said ‘good’, but four said ‘not so good’ and one said ‘poor’. Four said ‘good’ of opinion surveys, seven ‘not so good’ and one ‘poor’. Most journalists could not give a rating to medical research.

Five journalists made suggestions which they said would help improve their use of published statistics and research reports. These ranged from ‘ready availability in Internet or Excel files that would allow manipulation of the data’ to a request for plainer English, fewer graphs and less jargon. Others mentioned ‘better presentation and design of tables’ and ‘treat journalists as intelligent and offer full tabular presentation of findings’, and one deplored ‘the increasing assumption that media is not concerned with details’.

Seven journalists were asked whether they made much use of the Internet. Four said they used it quite widely, two said they did not, whilst one said ‘only the ONS (Office for National Statistics) website’.

Press releases and briefings

There is not much information on the importance and usefulness of these. In the 7 full personal interviews with journalists, when asked how they received or accessed the data they were interested in, only 2 of the 7 mentioned press releases. Similarly, when asked which press briefings they attended, half of those questioned said ‘don’t go’ or ‘not a lot’.

This may be at variance with actual practice, for the half-dozen or so press briefings I attended always appeared well supported, with those attended by Ministers generally expecting ‘questioners’ to give their name and paper. At the one ONS Labour Market Statistics briefing I was able to attend, in the ‘run-up’ to the 1997 general election, all desks were occupied and those present had their questions dealt with fairly and adequately.

It is possible that much greater use was and is made of hard copy, including ONS News Releases, than is at first apparent, for press releases may be picked up by more junior newspaper staff, received by fax or taken from the ONS website.

The process of reporting

ONS News Releases provide a model for the clear and detailed presentation of key government statistics. Whilst relatively few people may be aware of their content, all papers and news agencies receive copies and can access the ONS website. Most papers, however, appear to make very limited use of the detailed analysis when reporting on the figures. That this is so was confirmed by a comparison of the press reports on three series of monthly statistics with the ONS and Department of Health (DoH) press releases on which they were based. A total of 10 ONS and DoH releases were compared with the reports on them which had appeared in national papers, tabloids as well as broadsheets, on the day following their issue. Six dealt with Labour Market Statistics, and two each with the Retail Prices Index and NHS Hospital Waiting Lists. All ten of these monthly releases were reported on separately to the RSS (3).

Press reports on these focus mainly, and often solely, on how the key statistics compare with the corresponding figures for the preceding month. It seems that little space, and perhaps little time, is available for more detailed analysis and comment. In general, too, there is negligible later analysis of such data, with the notable exception in the past of the unemployment statistics, which had been subject to much warranted criticism.

My impression, borne out by my attendance at the labour market statistics briefing, is that most journalists know beforehand which particular figures they are most interested in. Not unnaturally this will limit their appreciation of new non-headline figures.

Subjects given most attention in the press

The analysis of Stats Watch data over the three years July ’95 - June ’98 showed that the main subjects reported on were: the economy (184), health (128), unemployment (58) and education (42). However, the reader is reminded that this classification by subject area is limited to the Stats Watch (SW) listing and that these items had originally been included in the monthly bulletins because they were judged to be of considerable interest to those wishing to improve the integrity and availability of official and other important statistics. The basis of their selection, whilst slightly subjective, is not considered a problem.

Newspaper reports were also categorised by the type of comment made on the statistics. The code ‘important set of statistics’ headed the list with 121 mentions, followed by ‘improving official statistics’ with 82, and ‘questionable statistical procedures’ with 65. ‘Doubtful (or unreliable) statistics’ secured 39 mentions and ‘misleading statistics’ 38. These mentions relate to a total of 625 press reports included in the analysis; the importance of an item was indicated by the nature of the covering headline and the supporting text.

Whilst there is interest in monitoring and classifying comments on official and other important statistics, such work has value only if it is intended to make use of the information. Even if agreed, such a conclusion requires answers to related questions concerning who should act on the information and how. Organisations such as the Institute for Fiscal Studies (IFS) and the National Institute of Economic and Social Research (NIESR) have an interest which is limited to a fairly narrow range of statistics, such as those affecting taxation (IFS) or economic and monetary policy (NIESR). Other bodies too, such as the RSS, the Market Research Society and the Social Research Association, have a limited interest in official statistics, which stops well short of the looked-for systematic monitoring. Hence Stats Watch!

In the run-up to the 1997 general election, the London Evening Standard commissioned (and presumably paid) the Institute for Fiscal Studies to assess the competing claims of the Labour and Conservative parties on taxation and average income. The point here is that such work has to be paid for and undertaken by a well-qualified and prestigious organisation.

Press coverage of selected items

Two measures were used when analysing the Stats Watch data - the number of broadsheets covering each item and the number of column inches devoted to it. Both are crude measures of the extent of coverage, but can be readily applied to the monthly bulletins with their supporting cuttings.

International competitiveness and ‘the true jobless figure’ came top, based on the number of broadsheets reporting. The first of these was based on the World Competitiveness Report (4), produced by the World Economic Forum in Geneva, whilst the second was a report by HSBC, the parent company of Midland Bank, which had estimated that the true unemployment figure was four million, rather than the official 1.7 million at that time. Reports on both these items appeared in six of the broadsheets.

Next, after unemployment, came two education items, both of particular interest. There was the RSS report on school league tables, which stated that attempts to gauge the quality of teaching at a school from the Government’s exam performance figures were flawed, and one on school tests for 11-year-olds. This item covered the School Curriculum and Assessment Authority report that national tests showed more than 300,000 pupils were entering secondary schools with an appalling lack of reading, writing and number skills.

A further report, on the teaching of mathematics, was also well ‘covered’, with four broadsheets commenting on it. It was published by the London Mathematical Society, the Institute of Mathematics and its Applications and the Royal Statistical Society. It had found that maths teaching in schools had been thoroughly undermined by ‘trendy’ theories and was in a state of crisis.

Given the widespread coverage these three education items received, it is worth considering how this was achieved so that the reasons for this success can be noted and used again. Presumably the press releases reporting on these items were cleared well in advance by the sponsoring bodies, and their issue was well timed and directed. Such knowledge will doubtless have assisted the Department of Education in preparing primary schools to adopt a national numeracy strategy and a Government target of 75% of 11-year-olds achieving level 4 in the Key Stage 2 national tests in mathematics by the year 2002. Noted too was the UNESCO World Mathematical Year 2000, which aims to raise the public profile of mathematics and its educational value. These initiatives all served, indirectly, to draw attention to the importance of reliable statistics.

Attention-getting headlines

Competing newspapers, because of the nature of their business, vie with one another in the headlines they use to capture the attention of prospective readers. None more so than the Financial Times, Guardian and Economist.

Headlines which also feature actual numbers and proportions appear to magnify the attention likely to be given to a report. Examples include:

‘A triple whammy’ and ‘Lies, damned lies and PM’s question time’ over a Guardian editorial (31/1/96).

Similarly, ‘Fewer damned lies?’ over an Economist editorial (30/3/96) calls for immediate attention.

‘Statistics’ is, quite properly, a weightier word than ‘numbers’ or ‘proportions’, and an examination of the words with which it is used in headlines is revealing. Outstanding among them are ‘weapons’, ‘analyses’ and ‘official’, followed by ‘standards’, ‘shake-up’ and ‘criminal’.

There are other uncomfortable associations, viz. ‘lies, damned lies and statistics’, ‘the truth behind the statistics’, ‘nothing more than statistical glitch’ and ‘statistical tricks of the trade’. By comparison there are few that are reassuring, certainly not ‘dazzling statistical magic’. In general the use of the word ‘statistics’ in the press is not associated with authoritative figures and pronouncements. This said, however, there are many more headlines suggesting dishonest (or questionable) practices which use other words (5).

This finding does suggest that care is needed when using the words ‘statistics’ or ‘statisticians’ in headlines since many readers may find such references ‘off-putting’, kindling thoughts of ‘clever’ (suspect) figures and procedures.

Graphics and tables

These are seldom used to support the presentation and interpretation of reports on statistics. This presumably is because most reports on numeric changes are based on comparisons with the preceding month’s figures or with ‘last year’. Considering graphics and tables together, the Stats Watch data indicated that the Financial Times and Guardian were most likely to use them, followed by the Independent and Economist.

In general tables were and are clearly labelled and bear separate titles. The source of the data was almost always shown, with evidence that the graphics are sometimes provided by the same source bodies, e.g. the World Economic Forum and the Institute for Fiscal Studies. Good graphics provided by the originating source almost certainly enhance the publication prospects of such data.

Other media: TV and radio

In the early Stats Watch weeks an attempt was made to obtain transcripts of important BBC broadcasts concerning statistics. This was unsuccessful, and it was only through the offices of the Central Statistical Office, now part of the Office for National Statistics, that I was able to obtain a transcript of Claus Moser’s views on the then Government’s suppression of Muriel Nissel’s invited editorial in Social Trends.

Two other examples come to mind. Jeremy Paxman interviewed Martin Weale of the NIESR in connection with his report for the Treasury on the ONS’s earnings data. His remarks included comments on the margins of error which are associated with survey-based estimates; they were clearly expressed and would have been worth obtaining had this been possible. The NIESR might have asked for and obtained a copy.

Another example is a series of very relevant talks given on the BBC by Professor John Allen Paulos. The title of the series was ‘A Mathematician Reads the Newspapers’, and the talks were broadcast at 5.40 p.m. on Radio 4 on Sunday evenings. The first, on 11 April ’99, was entitled ‘My mobile gave me cancer’ and focused on scare stories. The second, on 18 April, had the caption ‘Pregnant Diana Murdered for Love of Muslim’ and revealed how journalists often interpret statistics incorrectly, whilst the third, on 25 April, entitled ‘Ten Reasons Why We Hate Our Bosses’, explained the popularity of top-ten lists. They contained material of considerable interest to statisticians, especially those who have dealings with the media.

Inherent variability

The RSS specification for the analysis of Stats Watch data and the additional research and analysis asked for an investigation of ‘the extent to which press reports of statistical information give some indication of the inherent variability in statistical results’, asking that this should be done for the case study material and for the reporting of statistics generally.

The finding is that in general this did not occur, nor was it found in the press write-ups of the 10 News Releases which together comprised the ‘case study material’. Indeed, it is only rarely made clear whether the statistics reported on derive from a ‘universal return’ or are survey-based. One possible explanation is that such discussion is thought likely to undermine the credibility of the statistics being reported on.
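What such an indication might look like is easily sketched. As a purely illustrative textbook calculation, not drawn from the Stats Watch material or the case study releases, the conventional 95 per cent margin of error attaching to a survey-based estimate of a proportion \(p\) from a simple random sample of \(n\) respondents is

\[
\text{margin of error} \approx 1.96\,\sqrt{\frac{p\,(1-p)}{n}}
\]

so that a poll of 1,000 respondents putting support at 40 per cent carries a margin of roughly plus or minus three percentage points. A single sentence of this kind would be enough to signal the inherent variability of survey-based figures, yet it is almost never offered.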

More relevant material on this aspect is to be found in the full report on the Stats Watch data submitted to the RSS. This is headed ‘Discussion and Treatment of Statistical Data’ and begins with an early Stats Watch bulletin which reported some of the remarks made by David Bartholomew (then President of the RSS) to the House of Commons Select Committee on Employment. This section of that report also includes some noteworthy and highly relevant remarks made by Philip Howard in the Times, together with extracts from the Economist (‘Economic statistics are in a bad way!’) and a report on opinion polling by John Curtice taken from the Scotsman (‘Wobbly Thursday is nothing more than statistical glitch’). All three of these examples discuss the inherent variability of statistical data, two in general, wide-ranging terms, while the third relates to a particular MORI poll taken shortly before the 1997 general election. This said, there are few instances of major attention being devoted to this aspect, i.e. the inherent variability of statistical data, in national newspapers. It is much more common to find isolated examples where statistical terms are used and the statistics given, then passed over with little or no explanation or discussion (6).

General

Given that this article is written for Radical Statistics, it is appropriate to consider what are the lessons and implications of the research and analysis reported on here for the Radical Statistics Group.

Among the particular concerns of the Group is ‘the mystifying use of technical language’. In this connection the language used in press headlines when writing about statistics is particularly relevant, as are the comments made by journalists when questioned in the small-scale survey.

‘The lack of control by the community over the aims of statistical investigations’ is also important, and the abuse of official statistics over the last two decades evinced in the Stats Watch analysis provides powerful support for those who argue the case for an independent statistical service with strengthened representation for users of statistics.

The power structures within which statistical and research workers are employed in the UK often serve to stifle discussion and inhibit the looked-for improvements in the integrity of official statistics. In particular there is still too much diffusion of responsibility for official statistics between Government Departments.

The fragmentation of social problems into specialised fields is bound to obscure connectedness. This imposes a special responsibility on those whose responsibilities and interests bridge such fields (7).

Other organisations

The UK, it seems, is particularly wedded to time-honoured ways of doing things. In support of looked-for change and improvements there is much to be learnt from other statistical organisations, particularly other English-speaking statistical societies and other professional bodies where the proper use of numbers is important.

The Market Research Society (MRS) conducts periodic surveys among its members. It has a robust professional ‘Code’, augmented by a ‘Guide to Good Practice’. These are generally subscribed to, widely supported, and ‘policed’ by the MRS’s Professional Standards Committee. The keenly competitive nature of market research as between agencies ensures that companies, and the senior researchers who lead them, ‘fight their corner’ and are quick to criticise shortcomings and lapses in good practice that are apparent in published (and some unpublished) research. Statistical organisations such as the RSS and the Radical Statistics Group of necessity lack comparable reinforcing agents, and there are few instances of journalists and writers being taken to task for their improper use of statistics.

Finally, attention is drawn to an article by Willem de Vries, Deputy Director-General of Statistics Netherlands, in the April ’99 issue of the International Statistical Review. Its title is ‘Are We Measuring Up...? Questions on the Performance of National Statistical Systems’. It begins with the statement: ‘The ranking of national statistical offices, published by the Economist some years ago, caused mild shock-waves among official statisticians around the world. The first Economist ranking (1991) was primarily based on the timeliness and accuracy of some major statistical series. The second round, in 1993, also took into account judgments of chief government statisticians about the objectivity of statistics (in terms of absence of political interference), reliability of the numbers, the statistical methodology that was applied and the relevance of the published figures.’ The Economist’s research is a signal instance of an important contribution to the furtherance of better statistics being made by the media, and it is hoped that the de Vries article prompts considered further comment by eminent statisticians in this country (8). Its aim, indeed, was to stimulate discussion about the Fundamental Principles of Official Statistics as adopted by the United Nations (9). These 10 Principles are now a widely agreed framework for the mission of national statistical offices, and indeed for the statistical work of official international organisations. A reading of the article confirms that the UK still has some way to go.

Some suggestions

Such conclusions as can be drawn from the research and analysis have already been stated. There follow some suggestions.

  • Statisticians, and those dependent for their livelihood on good, timely and reliable statistics, share an interest in continuing to press for better statistics. The media, and in particular the broadsheet national press together with the Economist, are essential allies in this.
  • The monitoring of press comment has its uses and could serve to alert the new Statistics Commission and others to deficiencies in the Government’s statistical output. Some independent monitoring could prove valuable.
  • In particular, resources are needed to consolidate and extend the role and influence of statistics users. Some relatively small-scale and inexpensive research among ‘key players’, bearing on ‘the best way forward’, could prove useful. This would take in the Radical Statistics Group, the RSS and the Statistics Users’ Council (SUC), as well as the Institute for Fiscal Studies and the National Institute of Economic and Social Research. Properly planned and executed, such research could assist the case for funding a Statistics Users’ Centre.
  • Individual events, such as the coming SUC Annual Conference on Health and Care Statistics, could be used to further individual user group objectives. For example, a self-completion questionnaire could be sent to all those who attend such conferences.
  • Membership surveys among practising statisticians and others could be used to collect information about areas of interest and to explore attitudes to possible new initiatives.
  • And finally, more attention needs to be paid to appointing trouble-shooters whose role is to rebut unfair criticism of statistics in the press and to draw attention, preferably on behalf of some august body, to abuses. Better, more general, knowledge of who these trouble-shooters are is essential.

Notes

  1. The ‘leading journalists’ participating in this survey were: Anthony Brown (The Observer), Diane Coyle (The Independent), Liam Halligan (Channel 4 News), G.Q. Jones (Daily Telegraph), David Lipsey (The Economist), Jeremy Laurance (The Independent), Ben Leapman (London Evening Standard), Larry Elliott (The Guardian), David Brindle (The Guardian), William Keegan (The Observer), Peter Kellner (London Evening Standard) and Janet Bush (The Times).
  2. NRS Readership Survey (December 1996 - November 1997).
  3. Labour Market Statistics for September, October, November and December ’98 and January and February ’99 (ONS First Releases); Retail Prices Index for December ’98 and January ’99 (ONS First Releases); NHS Hospital Waiting Lists for January and February ’99 (Department of Health press releases).
  4. The World Competitiveness Report ranks 48 economies on the basis of 380 economic, financial, social and political criteria and an opinion poll of international executives. The report is produced by the World Economic Forum and the Institute for Management Development, a leading Swiss business school.
  5. An analysis of the Stats Watch items resulted in the listing of headlines suggesting dishonest and suspect practices. These included: ‘Lies, damned lies and the fact of the matter’ (Stephen Glover, Daily Telegraph, 21/1/96); ‘A number game: lies, damned lies and statistics’ (London Evening Standard editorial, 1/10/96); ‘Damned lies and statistics’ (letter to the Guardian from SCPR, 23/1/97); ‘The truth behind the statistics’ (Gerald Baker, FT, 17/2/97); ‘Wobbly Thursday is nothing more than a statistical glitch’ (John Curtice, Scotsman, 11/4/97); ‘Statistical tricks of the trade’ (Lucan Camp, Sunday Telegraph, 26/10/97) and ‘Dazzling statistical magic fails to blind the euro-sceptics’ (Marcus Warren, Sunday Telegraph, 22/2/98).
  6. ‘Measuring unemployment, the debate continues’, David Bartholomew in the November 1995 RSS News (Vol. 23, no. 3); Philip Howard in the Times (5/4/96), ‘They may average out right but some statistics are less than vital’; ‘Damned lies. Economic statistics are in a bad way’ (The Economist, 23/11/96); ‘Wobbly Thursday is nothing more than statistical glitch’, John Curtice in the Scotsman (11/4/97).
  7. Note on the inside front cover of Radical Statistics 77 (Spring ’01) listing the particular concerns of the Radical Statistics Group.
  8. Willem de Vries’ article in the April ’99 issue of the International Statistical Review, entitled ‘Are We Measuring Up...? Questions on the Performance of National Statistical Systems’.
  9. ‘The Fundamental Principles of Official Statistics’ were first adopted by the Economic Commission for Europe during its 47th Session, Geneva, 15 April 1992, and subsequently endorsed by the United Nations Statistical Commission (after some minor amendments).

NB: The research and analysis reported on here are based substantially on an analysis of Stats Watch data commissioned by the Royal Statistical Society; the findings have not hitherto been published.

John Martyn,
University of Surrey Roehampton

