
Implementation of quality indicators in the Finnish statistics production process

Kari Djerf, Statistics Finland
Q2008, Rome, Italy


Contents

1. Current situation

2. Challenges

3. Steps to proceed and time schedule


1. Current situation

Data exist - but:

lots of data are collected for various reporting purposes and do not necessarily serve this one, and

(too) much of the data is stored in inconsistent forms


Current situation – Strategic and/or performance indicators

Collected centrally for all follow-up operations (now 2 or 4 times a year)

Partly at the agency level, partly at department level

Many of these indicators can be retrieved from a general planning and performance database


Current situation – Strategic and/or performance indicators - 2

Examples of indicators:
Public confidence (every 2nd year)
Delays in publications and releases
Nonresponse rates of some key sample surveys
Share of electronic data collection in all data collection
Response burden

→ Not very suitable for continuous follow-up of quality of individual statistics!


Quality indicators to be collected and compiled

Product quality indicators

Process indicators

Very often the two are inseparable – it is a matter of opinion which category an indicator actually belongs to

Goal: accumulate information, do not add unnecessary burden on subject-matter departments!


Quality indicators to be collected and compiled - 2

Structure to follow the ESS quality dimensions:

Relevance
Accuracy
Timeliness and punctuality
Accessibility and clarity
Comparability
Coherence


Quality indicators to be collected and compiled - 3

Obviously the focus is on accuracy and timeliness/punctuality, but probably all dimensions will be covered (a small timeliness/punctuality sketch follows below)

Most ESS standard quality indicators are suitable as such, some may not be directly applicable
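
Timeliness and punctuality translate naturally into numeric indicators. As a minimal sketch (the function names and dates are hypothetical, not the official ESS indicator definitions), timeliness can be expressed as the lag from the end of the reference period to publication, and punctuality as the delay against the pre-announced release date:

```python
from datetime import date

def timeliness_days(ref_period_end: date, release: date) -> int:
    """Days from the end of the reference period to publication."""
    return (release - ref_period_end).days

def punctuality_days(scheduled: date, actual: date) -> int:
    """Days by which the release missed its pre-announced date (0 = on time)."""
    return max((actual - scheduled).days, 0)

# Hypothetical example: a monthly statistic for March 2008.
print(timeliness_days(date(2008, 3, 31), date(2008, 5, 15)))  # 45
print(punctuality_days(date(2008, 5, 14), date(2008, 5, 15)))  # 1
```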


Traditional indicators from sample surveys

Unit response/nonresponse rates
household surveys: long time series
business surveys: incomplete data

Unit nonresponse rates divided into some key domains or classifications (see the sketch below):

Reason for nonresponse
Demographics (gender, age, region, education, industry, size…)
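
A minimal sketch of how unit nonresponse rates could be broken down by domain, assuming a flat file of sampled units with a response flag and classification variables; the column names and figures are purely illustrative:

```python
import pandas as pd

# Illustrative sample file: one row per sampled unit.
sample = pd.DataFrame({
    "responded": [1, 0, 1, 1, 0, 1],
    "reason":    [None, "refusal", None, None, "non-contact", None],
    "region":    ["North", "North", "South", "South", "South", "North"],
})

# Unit nonresponse rate, overall and by domain (here: region).
overall = 1 - sample["responded"].mean()
by_region = 1 - sample.groupby("region")["responded"].mean()

# Distribution of nonresponse by reason.
by_reason = sample.loc[sample["responded"] == 0, "reason"].value_counts(normalize=True)

print(overall)
print(by_region)
print(by_reason)
```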


Traditional indicators from sample surveys - 2

Item response/nonresponse rates
not calculated sufficiently in household and business surveys (a sketch follows below)

Evaluation of the effects of both types of nonresponse on survey results for key parameters:

some results exist for both household and business surveys
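
Item nonresponse rates can be derived per variable among responding units. A possible formulation, with made-up variable names and values:

```python
import pandas as pd

# Responding units only; NaN/None marks a missing (unanswered) item.
respondents = pd.DataFrame({
    "income": [2500.0, None, 3100.0, None],
    "hours":  [38.0, 40.0, None, 37.5],
})

# Item nonresponse rate per variable = share of missing values.
item_nonresponse = respondents.isna().mean()
print(item_nonresponse)  # income 0.50, hours 0.25
```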


Traditional indicators from sample surveys - 3

Reliability estimates (standard errors, CVs or CIs)
have been reported for most household surveys
to a lesser extent in business surveys (cut-off samples are problematic)

Survey-specific indicators most probably to be included

Editing and imputation indicators
currently under development: indicators to be retrieved from the validation and editing process
e.g. edit failure rates, imputation rates and their effect (esp. important in business statistics) – see the sketch below
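
A rough sketch of how some of these accuracy indicators might be computed. The CV assumes simple random sampling without weights or finite population correction, and the edit/imputation flags are invented column names, not an actual Statistics Finland system:

```python
import math
import pandas as pd

df = pd.DataFrame({
    "turnover":  [120.0, 95.0, 310.0, 80.0, 150.0],  # reported values
    "edit_fail": [False, True, False, False, True],  # failed a validation edit
    "imputed":   [False, True, False, False, False], # value replaced by imputation
})

# Coefficient of variation of the estimated mean (simple random sampling).
n = len(df)
mean = df["turnover"].mean()
se_mean = df["turnover"].std(ddof=1) / math.sqrt(n)
print(f"CV of mean turnover: {se_mean / mean:.1%}")

# Process indicators from the validation and editing step.
edit_failure_rate = df["edit_fail"].mean()
imputation_rate = df["imputed"].mean()
# Effect of imputation: share of the estimated total coming from imputed values.
imputed_share = df.loc[df["imputed"], "turnover"].sum() / df["turnover"].sum()
print(edit_failure_rate, imputation_rate, imputed_share)
```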


Traditional indicators from sample surveys - 4

Response burden of household surveys has been measured since the 1970s as interview time, plus occasional evaluation of self-completed questionnaires or diaries (see the sketch below)

Measurement of response burden of business and institutional surveys is currently under development

Cost model to be developed
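
A minimal illustration of a burden indicator of the kind described here: total and average respondent time aggregated from per-interview durations (the figures are invented):

```python
# Hypothetical interview durations in minutes for one survey round.
interview_minutes = [22, 35, 18, 27, 30]

total_hours = sum(interview_minutes) / 60
mean_minutes = sum(interview_minutes) / len(interview_minutes)
print(f"total burden: {total_hours:.1f} h, mean interview: {mean_minutes:.1f} min")
```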


Traditional indicators from censuses and administrative data

Coverage rates – to be evaluated with respect to critical contents! (a sketch follows below)

Measurement errors, esp. the correspondence between administrative and statistical concepts – important, but these normally stay stable unless changes occur

Editing (and imputation) rates
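
One way to express coverage rates for a register-based statistic is the share of target population units missing from the administrative source (undercoverage) and the share of register units outside the target population (overcoverage). The identifier sets below are purely illustrative:

```python
# Hypothetical unit identifiers.
target_population = {"A1", "A2", "A3", "A4", "A5"}
register_units = {"A1", "A2", "A4", "A6"}

undercoverage = len(target_population - register_units) / len(target_population)
overcoverage = len(register_units - target_population) / len(register_units)
print(f"undercoverage: {undercoverage:.0%}, overcoverage: {overcoverage:.0%}")
```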


2. Technical challenges

Periodicity

Requirements by various stakeholders

Metadata standard(s)

Various data sources


Technical challenges - periodicity

Example: Labour Force Survey

In Finland a monthly survey since 1959; many indicators are comparable since 1984

Current EU-LFS regulations: quarterly with annual combination of data

→ Indicators must be calculated monthly, quarterly and annually – some can be aggregated, most cannot (see the sketch below)
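
The aggregation problem can be made concrete with nonresponse rates: a quarterly rate is not the plain average of the monthly rates unless the monthly sample sizes happen to be equal, so the underlying counts must be stored. A sketch with invented figures:

```python
# (sampled units, respondents) per month in one quarter - invented figures.
months = [(1500, 1290), (1500, 1320), (1800, 1440)]

monthly_rates = [1 - r / n for n, r in months]        # approx. [0.14, 0.12, 0.20]

# Wrong: averaging the monthly rates ignores unequal sample sizes.
naive_quarterly = sum(monthly_rates) / len(monthly_rates)   # approx. 0.1533

# Right: recompute the rate from the pooled counts.
n_total = sum(n for n, _ in months)
r_total = sum(r for _, r in months)
quarterly = 1 - r_total / n_total                     # 0.15625

print(naive_quarterly, quarterly)
```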


Technical challenges - stakeholders

EU regulations, IMF, OECD etc. differ in definitions and requirements

EU regulations differ VERY much from each other in the extent of quality reporting and the derivation of the indicators (EU-SILC, LFS, PEEIs etc.)

→ The new statistical law may improve the situation in general, but some very subject-dependent indicators might be left aside


Technical challenges – different types of statistics

Sample surveys
Censuses and other total enumerations
Administrative sources and registers
Indices
National accounts

→ Technical solutions must be flexible to allow different types of indicators


Technical challenges - metadata

The SDMX standard is to take over

The new ESMS is to include some indicators, which may or may not be similar across statistical domains

→ The system must technically allow retrieving as many of the required indicators as possible directly (a hypothetical record sketch follows below)
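
To retrieve an indicator regardless of statistical domain, each value needs a small amount of structured metadata. The record below is a hypothetical sketch loosely inspired by the ESMS idea of standard quality fields; it is not an actual SDMX or ESMS structure:

```python
from dataclasses import dataclass

@dataclass
class QualityIndicatorRecord:
    """Hypothetical metadata wrapper for one indicator value."""
    statistic: str      # e.g. "LFS"
    indicator: str      # e.g. "unit_nonresponse_rate"
    dimension: str      # ESS quality dimension, e.g. "Accuracy"
    ref_period: str     # e.g. "2008-Q1"
    periodicity: str    # "monthly" | "quarterly" | "annual"
    value: float
    source_system: str  # where the value was retrieved from

rec = QualityIndicatorRecord("LFS", "unit_nonresponse_rate", "Accuracy",
                             "2008-Q1", "quarterly", 0.156, "survey_db")
print(rec)
```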


Technical challenges – existing data sources

Obviously the biggest challenge!

Subject-matter statistics do not compile and store data in a similar manner: many data warehouse systems were developed for one purpose only. The new harmonised statistics production model will improve this gradually.

Next, proper database tools must be found to store the data and facilitate easy reporting (a sketch follows below)
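
As a sketch of what such database tools could look like, the snippet below stores indicator values in one flexible SQLite table and queries them back for reporting; the schema is an assumption for illustration, not the system actually chosen:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE quality_indicator (
        statistic  TEXT NOT NULL,  -- e.g. 'LFS'
        indicator  TEXT NOT NULL,  -- e.g. 'unit_nonresponse_rate'
        dimension  TEXT NOT NULL,  -- ESS quality dimension
        ref_period TEXT NOT NULL,  -- e.g. '2008-Q1'
        value      REAL,
        PRIMARY KEY (statistic, indicator, ref_period)
    )
""")
conn.execute("INSERT INTO quality_indicator VALUES (?, ?, ?, ?, ?)",
             ("LFS", "unit_nonresponse_rate", "Accuracy", "2008-Q1", 0.156))

# Easy reporting: one query per statistic, indicator and time range.
for row in conn.execute(
        "SELECT ref_period, value FROM quality_indicator "
        "WHERE statistic = 'LFS' AND indicator = 'unit_nonresponse_rate'"):
    print(row)
```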


3. Steps to proceed - Cross-sectional data collection

A self-assessment of all statistics next autumn:
Quality reports
Available indicators
Available metadata

Obviously it will resemble the DESAP approach in content

Analysis of indicators to include important ones and exclude redundancies


Cross-sectional data collection - 2

Find a "good cocktail" of indicators and start retrieving them

Database construction 2008/2009

Programs for reporting

… and the system to be working in 2-3 years!


THANK YOU VERY MUCH
FOR YOUR ATTENTION!
