
The Statistics Commission's First End-of-Term Report

Ulric Spencer

Background

Our readers will no doubt be keen critical observers of the statistical landscape and may therefore learn little from a straightforward description and summary of happenings since October 2000 concerning the Statistics Commission. However, not all of us have time to monitor in detail the minutiae of its activities. It will be recalled that it was set up as a result of an extended and widespread consultation exercise on the Green Paper Statistics: A Matter of Trust, published in February 1998. Of the four options offered therein, the establishment of an independent Statistical Commission was the one most favoured by respondents. About 60 per cent of those responding, and over 70 per cent of those expressing a view, encouraged the setting up of such a body. In Building Trust in Statistics, published in October 1999, the government accepted that the best way forward was the creation of such a Commission, independent of government and of the producers of official statistics. It was designed to introduce the necessary distance between Ministers and statistical operations, improve transparency and the public perception of independence, strengthen the priority-setting and quality assurance processes, and provide users with a clear channel to express their views.

Following a transparent appointment process based on merit, in accordance with the Code of Practice published by the Commissioner for Public Appointments, the government stated it would appoint a Commission of seven members, supported by a small permanent staff. The Commission would need to be seen to be independent, command authority and embody a good understanding of statistical issues and of the value of trustworthy statistics in democratic debate. It would also need a definite user focus, though its members need not be professional statisticians. In the event, over 100 applications were received for the post of Chair and over 700 for membership.

The Commission was asked to recommend to Ministers within six months its own machinery for covering the interests of users and producers of national statistics - including the arrangements for taking account of the country/regional and subject-matter dimensions.

The announcement by the Chancellor of the Exchequer, as the Minister for National Statistics, of the appointment of the Chairman, Sir John Kingman, was made on 29 March 2000 and was followed on 7 June 2000 by that of the other seven members - Colette Bowe, Sir Kenneth Calman, Patricia Hodgson, Professor David Rhind, Janet Trewsdale, Derek Wanless and Martin Weale, with Gill Eastabrook, a government statistician from the Department of Health, as Chief Executive. This coincided with the publication of the Framework for National Statistics, which set out the responsibilities of Ministers, the Commission, the National Statistician and Heads of Profession for Statistics.

The Commission was to ensure its ability to assess the needs of users; to consider and comment on the high-level programme for National Statistics, drawing on the views of users and suppliers and taking into account available resources and the compliance costs and management needs of data suppliers; and to advise on areas of widespread concern about quality. It was also to comment on the application of the Code of Practice, the arrangements for promoting professional standards and the quality assurance processes, if necessary carrying out or commissioning audits. It was required to respond to Ministers' requests for advice and, after two years, to review and report on the need for statistical legislation, thereafter keeping the legislative framework under review. It needed to maintain effective communication channels with the Minister for National Statistics, the National Statistician and the devolved administrations to ensure the consistency and co-ordination of statistics on a UK-wide basis.

Finally, it was to submit an annual report to the Minister for National Statistics, commenting on the annual report of the National Statistician and on the fulfilment of its own remit.

The Commission's birth was welcomed by the user community, and it thus initially enjoyed a fair measure of goodwill from that quarter.

Following the recruitment of five staff, the Chief Executive devoted a considerable amount of time to making contact with, and attending meetings of, user groups to find out their concerns and to encourage them to keep the Commission informed of any relevant matters. The Commission also set up a website (www.statscom.org.uk) on which it made available its meeting agendas, minutes and papers. Since October 2000 it has met six times, monthly during 2000 and bimonthly thereafter.

Treasury Committee

The Commission was invited to give evidence to the Treasury Sub-Committee, and Sir John Kingman and Gill Eastabrook appeared before it on 16 November 2000. They were quizzed on the details of its timetable, its role and scope, how it would set about its tasks and the adequacy of its budget. Sir John emphasised that its only role was to advise and comment; it could comment on the misuse of statistics, but this was not its prime concern. Despite the Treasury's repeated insistence that the RPI's scope and definition remained the responsibility of the Chancellor of the Exchequer, the Commission stated that it could give the matter attention. The importance of measuring public confidence in official statistics was reiterated.

Annual report

The Commission's first annual report was published and launched at an open meeting on 18 July 2001. It covers the period from launch in June 2000 to March 2001. Bearing in mind the time taken to set up its office, and that the Commission's first meeting was not until mid-October, the effective active period described is less than half a year.

Of its 47 lavishly illustrated pages, 20 are taken up by its financial statements and about 11 contain text. Its opening page states that the Commission works on behalf of all users and gives as examples the Monetary Policy Committee, epidemiologists, local and central government, businesses, parents, disability organisations and citizens. In monitoring National Statistics, the Commission is continuously considering integrity, quality and relevance, and it takes the opportunity of providing a glossary of terms to demonstrate that its use of such words can be made clear, and that they are not necessarily used in the same way as by other stakeholders.

The stakeholders are identified as users, producers, providers and a wider group which includes Parliament, employers, trade unionists, professional societies and the general public. Initially the Commission has sought to build informal links, and it will then consider the need for more formal structures in order to take account of stakeholders' views.

It has been agreed that the Commission's relationships with the devolved administrations will be similar to those it has with the UK government and Parliament.

Whilst recognising that in its first year there would not be enough time to complete much quality assurance work, the Commission selected five topics for 'audit'.

The reliability of National Statistics ('reliability' being defined in the Glossary as including all aspects of accuracy and validity) was recognised as a major task. The Commission asked the ONS to establish a baseline position, and a full list of National Statistics outputs was produced in February 2001. This was also supplied to the Treasury Committee, although its report did not publish it, despite recognition that the original list published as an annex to the Framework document was not as fully informative about the scope of National Statistics as it might have been. Oddly enough this list is not yet accessible via www.statistics.gov.uk. Can't be that difficult! Moreover, some departments dragged their feet and selected only some of their outputs for inclusion, which makes one wonder about the status of those excluded in relation to National Statistics standards.

The Commission expressed its disappointment with the rate of progress at ONS and hoped that follow-up work would be properly resourced.

Monitoring the NHS Cancer Plan was chosen because of high levels of concern and recognition that policy needs to be rooted in facts. Progress had not advanced beyond the initial stage.

Use of seasonal adjustment in ONS was an example of a technical/methodological issue with applications across a number of outputs. ONS was already undertaking an internal review of the subject, so the Commission's study would not be progressed until it had seen the ONS report.

The quality of Regional GDP estimates, which are used in a variety of contexts such as the allocation of EU structural funds, public service agreement targets for regional growth and by the devolved administrations, was already covered by the National Statistics quality review programme for 2001-2. The Commission would therefore confine itself to making recommendations to the National Statistician on the scope of the review.

Checking on progress with the Average Earnings Index led the Commission to express disappointment that neither the substantial development work nor the strengthening of some of the management procedures had yet been completed.

In the course of pursuing these topics, it became evident to the Commission that the resources available to the National Statistician were insufficient for him to be able to respond readily to the Commission's requests for information and background material. Out of this frustration, it intends to pursue the resourcing issue vigorously during the next year.

In addition the Commission considered a number of other issues - the census, price indices and deflators, numbers of welfare claimants, job vacancy figures, and treatment of mobile phone licences in the National Accounts. It also started to plan its review of the need for statistical legislation and accepted an offer from the RSS to organise a meeting of international experts to obtain independent briefing on the role and organisation of official statistics as seen from around the world.

The Commission's strongest criticism was of the failure to issue a consultation draft of the National Statistics Code of Practice a year after the introduction of National Statistics. Nor had the high-level programme or the quality assurance structure been made available to the Commission, which had therefore been unable to comment on them.

18 July 2001 Meeting

Those who attended this meeting in Great George Street - at which User Groups' representatives were well in evidence - were treated to a lively question and answer session. Six of the eight Commissioners and the Chief Executive lined the top table; the highest scoring batsman was the Chairman, Sir John Kingman.

Introducing the Report, Sir John said the Commission was a new organisation, outside the government 'tent' and not subject to the Official Secrets Act. It was critical of the length of time it was taking for the new National Statistics Code of Practice to see the light of day, which it attributed to foot-dragging by departmental Ministers.

Questions (and answers) included: what efforts had been made to contact official statistics 'non-users', ie those who did not trust them? (Commission was not a 'Post Office', just 'observers'); what active support were users getting? (Commission was open to suggestions); the unsatisfactoriness of user groups being run on a shoestring by volunteers (Commission was not a market research outfit and was concerned with the views of more than just users, but its door was open); large companies' views should be tapped (would welcome input from them); could user groups' resources be augmented by using the Commission's website? (could be considered, but Commission was not a user group 'mother figure' or funder); how are decisions made on which issues to pursue? (could not pursue all issues raised [ie question not answered]); do users want to see accuracy measures published alongside the data? (impractical, but would like to encourage publication along the lines of US weather reports - eg '70% chance of rain').

The hitherto silent Commissioners were then invited to indicate their contribution to the Commission's work and each did so briefly.

It was evident from the nature and tone of the questions that some of the user groups were registering a degree of disappointment with the attitude and reactions of the Commission to their concerns (cf the goodwill expressed when the Commission was announced). In this respect it appeared to be reactive rather than proactive. No doubt users will continue to maintain pressure on the Commission to heed their concerns, so that it does not merely pay them lip service. After all, users are 'written into' the scenario as prime stakeholders.

Ulric Spencer
