Performance indexes, maturity models, and other assessment tools provide an analytical framework for evaluating the capacity of government data systems. In this stocktaking report, commissioned for the Data for Development Global Research Hub (D4D.net) with the support of the International Development Research Centre (IDRC), Open Data Watch (ODW) maps twelve widely used performance indexes and assessment tools to the pillars of the Global Data Barometer (GDB) and the stages of the Data Value Chain (DVC) to identify similarities and differences in concept and coverage. The results show that none of the indexes or tools currently available provides balanced coverage of government data systems, suggesting that a more comprehensive measure, or a combination of complementary indexes and tools, is needed to fully capture the functions of the data ecosystem.
The next section describes the inventory of currently available performance indexes and assessment tools. We use the term performance index to describe composite indicators that measure the performance of statistical systems or government functions and produce a numerical score. Assessment tools are typically survey-like processes that document the functioning and management of statistical systems without producing a summary score. After introducing the indexes and tools, we review their country coverage, maturity, and sustainability.
In the following sections, we map the contents of the indexes and tools to the conceptual frameworks of the four GDB core pillars and the stages of the DVC. To gain further insight into the relationships of the quantitative performance indexes, we examine the correlations of their scores and sub-scores. In the final section we summarize the results of the stocktaking with a set of findings and recommendations.
INVENTORY OF INDEXES AND TOOLS
The performance indexes and assessment tools included in this analysis were selected by Open Data Watch after a review of possible candidates and consultations with other index and tool producers. To ensure their relevance and comparability to the GDB, only indexes and tools applicable to government data in multiple sectors were included. Assessment tools that are currently in use and performance indexes that have had at least one edition since 2019 were included in the research inventory. Seven performance indexes provide cross-country comparisons using a numeric score, while five assessment tools enable countries or organizations to assess government data and statistical systems based on qualitative and quantitative measures. All indexes and tools included in the inventory seek to measure some element of data governance, availability, quality, openness, or use and impact. The indexes and tools included in the inventory are shown in Table 1.
Table 1: Inventory of performance indexes and assessment tools
|NAME||PUBLISHER||COUNTRY COVERAGE||FIRST EDITION||LATEST EDITION|
|OURdata Index||Organization for Economic Co-operation and Development (OECD)||OECD countries||2015||2020|
|Open Data Inventory (ODIN)||Open Data Watch||Global||2016||2020|
|Statistical Performance Indicators (SPI)||World Bank||Global||2019||2019|
|Ibrahim Index of African Governance (IIAG)||Mo Ibrahim Foundation||Africa||2007||2020|
|Worldwide Governance Indicators (WGI)||World Bank||Global||1996||2019|
|Use of Statistics Index||PARIS21 (OECD)||Global||2019||2020|
|European Open Data Maturity Assessment (EODMA)||European Data Portal||Europe||2015||2020|
|Open Data Readiness Assessment||World Bank||NA||2013||2015|
|Data Quality Assessment Framework||International Monetary Fund (IMF)||NA||2003||2012|
|Joined-Up Data Maturity Assessment||Global Partnership for Sustainable Development Data (GPSDD)||NA||2019||2020|
|Open Data Demand Assessment||The GovLab||NA||2018||2018|
|Open Data Maturity Model||Open Data Institute||NA||2015||2015|
The list below describes the selected indexes and tools.
- OURdata Index: This index measures data availability, data accessibility, and government support for data in 37 Organization for Economic Co-operation and Development (OECD) member and partner countries to identify open data policy achievements and challenges.
- Open Data Inventory (ODIN): This index assesses the coverage and openness of official statistics in 187 countries to identify gaps, promote open data policies, improve access, and encourage dialogue between national statistical offices (NSOs) and data users.
- Statistical Performance Indicators (SPI): This index assesses the maturity and performance of official statistical systems to improve development outcomes and track progress towards the Sustainable Development Goals (SDGs). The SPI covers 218 countries and territories, but complete data are only available for 174.
- Ibrahim Index of African Governance (IIAG): This index measures and monitors governance performance in 54 African countries. It comprises four sub-indexes that measure dimensions of governance. The sub-index on foundations for economic opportunity includes measures of the capacity of the statistical system.
- Worldwide Governance Indicators (WGI): This index is a composite of other indexes that measure perceptions of the quality of governance in 200 or more countries and territories. There are six WGI indexes. The Voice and Accountability index includes indicators of the reliability of basic economic and financial statistics for 204 countries.
- Use of Statistics Index: This index measures the use of official statistics in development plans and documents by government decision makers in 184 countries with a focus on basic use, data disaggregation, monitoring and evaluation, and national policy documents such as national development plans and poverty reduction strategy papers.
- European Open Data Maturity Assessment (EODMA): This index assesses the level of open data maturity in 32 European countries, examining the policies and programs in place to promote and support open government data, foster high-quality data publication, and increase impact.
- Open Data Readiness Assessment (ODRA): This tool can be used to conduct an action-oriented assessment of the readiness of a government or individual agency to evaluate, design, and implement an open data initiative.
- Data Quality Assessment Framework (DQAF): This tool allows countries to conduct self-assessments of five dimensions of data quality to guide country efforts to strengthen statistical systems, to inform preparation of International Monetary Fund (IMF) reports, and to guide data users in evaluating data for their own purposes such as policy analysis, forecasts, and economic performance. There are seven dataset-specific DQAFs, all concerning macroeconomic statistics, and one DQAF on household income and poverty.
- Joined-Up Data Maturity Assessment: This tool can be used by national statistical offices (NSOs) and other entities that control or process data in the development sector to assess organizational, human, data, and technological interoperability within a statistical system.
- Open Data Demand Assessment: This tool provides open data policy makers and practitioners with an approach for identifying, segmenting, and engaging with demand for open data to empower data champions within public agencies.
- Open Data Maturity Model: This tool enables an organization to assess how well it publishes and consumes open data and to identify actions for improvement.
Assessments are needed in low- and middle-income countries as well as in high-income countries.
Country coverage of the global indexes has increased in recent years. For example, in 2015 ODIN provided scores only for 125 low- and middle-income countries before adding high-income countries in 2016 for a total of 173; in 2020 ODIN included 187 countries. The Use of Statistics Index included 69 countries in 2019 but expanded to 174 in 2020.
Most assessment tools do not have a central registry of their application. Because of this, and because self-assessments may not be reported outside the country, it is not possible to give a definitive list of where and how often they have been applied. The World Bank lists 18 countries that conducted an Open Data Readiness Assessment between 2013 and 2020. The IMF includes data quality assessments as part of its Reports on the Observance of Standards and Codes (ROSC). The IMF’s Dissemination Standard Bulletin Board lists 83 countries that have had data ROSCs conducted since 2001. Similar statistics for the Joined-Up Data Maturity Assessment, the Open Data Demand Assessment, and the Open Data Maturity Model are not available.
Maturity: Characteristics of older and recent indexes and tools
The length of time an index or tool has existed can be an indication of its maturity, as each iteration involves reviews and critiques that provide an opportunity to further refine and hone the index or tool. The oldest index or tool in the inventory is the WGI, first published in 1996. The only others introduced before 2010 were the IIAG (2007) and the IMF’s DQAF (2003). The most recently introduced are the SPI (a successor to the World Bank’s Statistical Capacity Index) and the Use of Statistics Index.
Recent indexes and tools are exploring ways to provide objective measures of data use and impact.
- The Use of Statistics Index is the only index or tool focused exclusively on the use of statistics. It employs keyword searches to track the use of terms referring to data and statistics in national development plans and documents.
- The SPI includes indicators in its first pillar to measure data use by different sets of stakeholders, but so far, direct measurement of data use is only included for dimension 1.5 (data use by international bodies) due to a lack of an established methodology for data use by government, civil society, and academia.
- The EODMA seeks to capture examples of re-use and impact. Recognizing that there is no generally accepted methodology for measuring impact, the EODMA questionnaire asks for examples of the re-use of government data in policies, publications, or research and for subjective impressions of the impact of open data in particular settings.
- The Open Data Readiness Assessment is designed to address both the supply and demand for open data, but it advises “… to use this tool alongside other tools that focus more deeply on specific areas of interest (e.g., civil society demand for Open Data or technical capacity of the public sector).”
- The Open Data Demand Assessment evaluates the demand for open data through a series of questions concerning the impact of open data.
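The keyword-search approach used by the Use of Statistics Index can be sketched in a few lines. The term list and sample text below are illustrative assumptions; the actual keywords and scoring rules of the index are not documented in this report.

```python
import re

# Hypothetical term list; the index's real keyword set is not given here.
TERMS = ["statistics", "indicator", "census", "survey", "data source"]

def count_term_mentions(document_text, terms=TERMS):
    """Count case-insensitive whole-word occurrences of each term."""
    text = document_text.lower()
    return {term: len(re.findall(r"\b" + re.escape(term) + r"\b", text))
            for term in terms}

# Invented excerpt standing in for a national development plan
plan = ("The national development plan sets indicator targets. "
        "Official statistics and census data will monitor progress; "
        "survey results and statistics inform each indicator.")
print(count_term_mentions(plan))
```

A real implementation would also need to handle multiple languages, plural forms, and document formats such as PDF, which is where most of the practical difficulty lies.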
Sustainability: Regular financing for indexes and tools
Assessments require sponsors that are able and willing to bear the cost of regular updates while making them available as public goods.
Many of the indexes and assessment tools are initiatives of large international organizations such as the IMF, OECD, the European Commission, and the World Bank. In fact, a quarter of all assessments (the WGI, SPI, and ODRA) are funded directly by the World Bank. Because these tools and indexes are sponsored by large and well-funded international organizations, the risk of financial instability should be minimal, but changes in organizational priorities or budgetary pressures can put even well-established programs in jeopardy.
For other measures, sustainability depends on the ability of non-governmental organizations to support them. The IIAG, for example, was developed and is funded by the Mo Ibrahim Foundation, a non-grant-making and non-fundraising organization focused on governance and leadership in Africa. ODIN was developed by Open Data Watch, a nonprofit organization that relies on grants to sustain it. Similarly, The GovLab and GPSDD are nonprofit organizations funded by a combination of grants and contracts.
MAPPING THE INDEXES AND TOOLS
The GDB intends to provide a comprehensive framework for assessing a country’s data ecosystem. The mapping of each index or tool examines the indicators that contribute to their overall assessment and locates them in the relevant GDB pillar. The purpose of the mapping exercise is to document how and where the 12 indexes and tools overlap with the GDB’s core pillars and where there may be gaps in their coverage.
A similar mapping has been carried out using the framework of the DVC. The DVC adds a different perspective, describing the stages of the data life cycle from production to use. It takes a process-oriented approach to describe the value added to data at each stage, and it provides a way to examine which parts of the data life cycle have been emphasized or overlooked in assessments of data systems.
Mapping to the Global Data Barometer’s core pillars
The four pillars of the GDB measure data governance, data capabilities, data availability, and data use and impact. Definitions for each pillar, taken from the Global Data Barometer’s Handbook, are summarized in Table 2.
Table 2: Global Data Barometer core pillars
|GLOBAL DATA BAROMETER PILLAR||DEFINITION|
|DATA GOVERNANCE||This pillar includes factors related to the management of data and the presence of legal and policy frameworks to guide that management (e.g., presence and content of policies on data protection, open data policies, or open data licenses).|
|DATA CAPABILITIES||This pillar includes factors related to a country’s ability to create, manage, and use data effectively, including data producer and data user capabilities (e.g., the implementation or existence of data literacy training for civil servants or non-governmental stakeholders, presence and quality of data portals, or the ability to monitor reuse of data).|
|DATA AVAILABILITY||This pillar includes factors related to a country’s data completeness, sectoral coverage, data quality, and data openness (e.g., the timeliness and frequency of data publication, the interoperability of data, or the implementation of international data standards and classifications).|
|DATA USE AND IMPACT||This pillar includes factors related to a country’s level of data use by various actors and the impact of their data (e.g., evidence of data use by the private sector, evidence of data use by the public sector, or evidence of data use by academia).|
The indicators of each of the indexes and tools were mapped to the four GDB pillars. The mapping is summarized in three levels of emphasis shown in Table 3:
- Blue represents high emphasis, which occurs when the percentage of indicators mapped to a pillar is significantly higher than for other pillars.
- Yellow represents low emphasis, which indicates that only a handful of indicators are mapped to a pillar.
- No color represents no emphasis, meaning no indicators were mapped to the pillar.
Indicators that did not fit into any pillar were rare; those that did occur most often concerned topics outside the scope of the four GDB pillars, such as financing for data collection and publication or infrastructure capabilities.
Note that a high emphasis on a pillar is not an indication of the depth of an index’s or tool’s coverage relative to others. The WGI, for example, places a high emphasis on data availability and none on the other pillars but includes a more limited set of availability indicators than, say, ODIN. Furthermore, the categorization of each indicator according to the four pillars was a subjective process, and therefore the percentages in Table 3 are meant to provide only a general sense of emphasis rather than an exact weight.
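The emphasis classification behind Table 3 can be illustrated with a short sketch. The 40 percent threshold and pillar labels below are assumptions for demonstration only; the report treats the categorization as a subjective judgment rather than a fixed cut-off.

```python
from collections import Counter

PILLARS = ["governance", "capabilities", "availability", "use_impact"]

def pillar_shares(indicator_pillars):
    """Percentage of an index's indicators mapped to each GDB pillar."""
    counts = Counter(indicator_pillars)
    total = len(indicator_pillars)
    return {p: round(100 * counts[p] / total) for p in PILLARS}

def emphasis(share, high_threshold=40):
    """Illustrative cut-off: the report does not publish exact thresholds."""
    if share == 0:
        return "none"  # no color in Table 3
    return "high" if share >= high_threshold else "low"  # blue / yellow

# Example: an index whose 10 indicators mostly measure availability,
# matching the profile of the ODIN row in Table 3 (10% / 0% / 90% / 0%)
mapping = ["availability"] * 9 + ["governance"]
shares = pillar_shares(mapping)
print({p: emphasis(s) for p, s in shares.items()})
```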
Data availability receives greatest emphasis with data use and impact receiving the least emphasis.
Differences are also apparent when comparing performance indexes to diagnostic tools. For instance, indicators on data availability are more common in performance indexes, while indicators that measure data capabilities are more commonly found in the assessment tools. This likely reflects the relative ease of assigning objective scores to track data availability by external reviewers, while the measurement of capabilities requires an internal review. Data use and impact also receives slightly greater focus among indexes, but it remains a particularly challenging area of research that many indexes are still developing. The SPI proposes five dimensions to measure data use, but all but one are still under development according to its methodology.
Table 3: Indicator coverage across GDB pillar (% of indicators in each pillar)
|NAME||DATA GOVERNANCE||DATA CAPABILITIES||DATA AVAILABILITY||DATA USE AND IMPACT|
|Open Data Inventory||10%||0%||90%||0%|
|Statistical Performance Indicators||6%||6%||76%||12%|
|Ibrahim Index of African Governance||20%||20%||60%||0%|
|Worldwide Governance Indicators||0%||0%||100%||0%|
|Use of Statistics Index||0%||0%||0%||100%|
|European Open Data Maturity Assessment||37%||20%||5%||37%|
|Data Quality Assessment Framework||14%||5%||81%||0%|
|Open Data Readiness Assessment||22%||72%||6%||0%|
|Joined-Up Data Maturity Assessment||37%||25%||25%||13%|
|Open Data Demand Assessment||0%||19%||12%||69%|
|Open Data Maturity Model||25%||67%||0%||8%|
The most common and less common indicators used to measure concepts within each GDB pillar are shown in Table 4. Examples illustrate the type of indicator that could be included in each category. They are based on indicators from existing indexes and tools but are not shown verbatim. Annex 1 provides a representative list of examples from specific indexes and tools that correspond with each pillar.
Table 4: Common and less common indicators grouped by GDB pillars
|GLOBAL DATA BAROMETER PILLAR||COMMON INDICATORS||LESS COMMON INDICATORS|
|DATA GOVERNANCE||
Data management policies
Ex: Mechanisms are in place to monitor data quality and openness
Open data policies
Ex: Existence of an open by default policy and open data initiatives
Open data licenses
Data protection/privacy policies
Ex: Existence of legal and policy framework for the protection of personal privacy
Ex: Existence of formal requirements to consult with users prior to data publication
|DATA CAPABILITIES||
Data producer skills and resources
Ex: Designated staff for data management and data stewardship
Government support for reuse
Ex: Existence and frequency of programs/events designed to promote data reuse among different types of users
Data user skills and the enabling environment
Ex: Strength of research communities’ capabilities in data analysis
Ex: Visible political support for open data
|DATA AVAILABILITY||
Ex: Availability of specific indicators
Ex: Adherence to international guides or use of internationally accepted classification systems
Data openness and accessibility
Ex: Availability of data in machine-readable and non-proprietary formats made available free of charge
Availability and quality of metadata
Ex: Comprehensiveness of metadata, use of internationally accepted standards for metadata dissemination
Ex: Availability of indicators by sex, age, disability status, and other characteristics
|DATA USE and IMPACT||
Evidence of use
Ex: Examples of data reuse by civil society, national legislature/executive branch, academia, and other users
Evidence of impact
Ex: Examples of data reuse that show impact (economic growth, innovation, policy development, improvement of service delivery, increase in institutional transparency and accountability)
Mechanisms in place to monitor use or impact
Ex: Existence of activities or mechanisms to monitor data use/impact, existence of methodology to measure impact of data
The mapping exercise in Table 3 shows the least emphasis is on the data use and impact pillar. Table 4 shows that indicators in all four pillars tend to focus more on data production than data use. For example, under data capabilities, indicators on the capabilities of producers are common, but indicators on data user capabilities are less common.
Mapping to the Data Value Chain
The DVC allows us to look at indicators along different dimensions, complementing the structure of the GDB.
Table 5: Stages of the Data Value Chain
|DATA VALUE CHAIN STAGE||DESCRIPTION|
|DATA COLLECTION||This stage includes indicators concerning policies on and practices of data collection, data quality, adherence to standards and classifications, and data openness. It also includes feedback mechanisms to inform data collection processes and indicators of the capacity to collect data.|
|DATA PUBLICATION||This stage includes indicators concerning the amount and type of data and metadata published, along with how the data are published (file formats, download options, data portal functionality).|
|DATA UPTAKE||This stage includes indicators concerning explicit actions taken by the data producers to increase the use of data. It includes open data licenses, measures to increase data accessibility, data literacy programs, hackathons, and other similar activities.|
|DATA IMPACT||This stage includes indicators that measure evidence of use or reuse of data for policy decision making, business, or project creation (ex: application development), academic, news or other reports, or other uses.|
The mapping exercise for the DVC stages uses the same methodology as the mapping done for the GDB pillars, with blue reflecting high emphasis, yellow reflecting low emphasis, and no color reflecting no emphasis. While the majority of indicators within the indexes and tools could be mapped across the DVC stages, indicators related to enabling factors such as governance, financing, producer capabilities, infrastructure, and feedback loops do not relate directly to the stages of the DVC and were not included.
As Table 6 shows, data publication receives the most emphasis across the indexes and tools, which coincides with the emphasis on data availability in the GDB. Data uptake, which relates to connecting users to data and active promotion of use, receives significant focus among the assessment tools, but less among the performance indexes, which reflects the relative ease of including a focus on capacity as part of internal assessments. Data collection receives less emphasis by both indexes and tools. As with the GDB pillars, there is a lack of emphasis on data impact. The mapping to the Data Impact DVC stage closely matches the Data Use and Impact pillar of the GDB.
Table 6: Indicator coverage across DVC stages (% of indicators in each stage)
|NAME||DATA COLLECTION||DATA PUBLICATION||DATA UPTAKE||DATA IMPACT|
|Open Data Inventory||0%||87%||13%||0%|
|Statistical Performance Indicators||69%||17%||3%||11%|
|Ibrahim Index of African Governance||17%||67%||17%||0%|
|Worldwide Governance Indicators||0%||100%||0%||0%|
|Use of Statistics Index||0%||0%||0%||100%|
|European Open Data Maturity Assessment||9%||40%||15%||37%|
|Data Quality Assessment Framework||41%||55%||4%||0%|
|Open Data Readiness Assessment||9%||5%||62%||24%|
|Joined-Up Data Maturity Assessment||19%||45%||36%||0%|
|Open Data Demand Assessment||0%||8%||0%||92%|
|Open Data Maturity Model||9%||36%||55%||0%|
Table 7: Common and less common indicators grouped by DVC stages
|DATA VALUE CHAIN STAGE||COMMON INDICATORS||LESS COMMON INDICATORS|
|DATA COLLECTION||
Data quality measures
Ex: Source data are routinely assessed, e.g., for coverage, sample error, response error, and non-sampling error
Ex: Various open data events are organized by a mix of actors (public and private sector, civil society, and academic bodies) throughout the country to foster the exchange on the open data topic
Ex: Measures of the extent to which sources being used enable the necessary statistical indicators to be generated
Open data policies
Ex: Open Data policies and strategies are in place at national level
Ex: Compliance with privacy legislation; the development of data inventories in the public bodies at national, regional, and local levels is defined as priority in the national policy and/or strategy
|DATA PUBLICATION||
Dissemination mechanisms, standards, and activities
Ex: Portal visibility is enhanced by organizing/attending info sessions and/or events to promote the national portal
Accessibility of data
Ex: Data accessible free of charge and in open formats on the central/federal data portal.
Availability of data
Ex: How many data sets are available?
Existence of metadata
Ex: Documentation of concepts, scope, classifications, basis of recording, data sources, and statistical techniques is available
Quality of published data
Ex: Statistics are consistent within the dataset.
|DATA UPTAKE||
Data promotion and engagement
Ex: Existence of formal partnerships with businesses and the civil society to support data re-use
Open data plans and policies
Ex: The national open data strategy incentivizes the re-use of open data by both the public and private sectors and access to real-time data.
Capabilities of users
Ex: The organization promotes the availability of third-party learning resources and tools, and training activities for civil servants working with data are in place
|DATA IMPACT||
Evidence of impact
Ex: Conducted or financed research on the socio-economic impact of open data.
Use and re-use of data
Ex: Various re-use examples exist that show the impact of open data in enabling better policy- and decision-making processes.
Demand for data
Ex: What is the extent of intra- and inter-government actual demand and latent demand for data?
Indicator mapping: Summary
The previous two sections have examined the contents of the performance indexes and assessment tools by mapping their indicators to the pillars of the GDB and the stages of the DVC. The purpose of the mapping was to identify concepts that are well-measured by the indicators used by the indexes and tools and, conversely, concepts that are not well measured or for which few indicators have been found. During the mapping exercise, two pillars of the GDB were found to align closely with two stages of the DVC: Data Availability in the GDB and the Publication stage of the DVC and Data Use and Impact in the GDB and Impact in the DVC. This report will discuss the mapping of these pillars and stages jointly and the remaining pillars and stages separately.
Data governance (GDB)
Few indexes or tools included indicators strictly related to data policies or management practices such as protection or privacy.
Data capabilities (GDB)
This pillar represents a broad collection of activities, some of which could be included in other pillars. Government support for re-use of data also appears in the data use and impact pillar; data management functions may be included under data governance; and data creation activities may be included in the data collection and publication stages of the DVC and the data availability pillar of the GDB.
Quantifiable measures of user skills are not included among existing indexes.
Survey design, adherence to standards, data compilation, and timeliness are all relevant to data collection.
Data availability (GDB) and publication (DVC)
Disaggregation is needed to measure differences in gender, age, ethnicity, and other important variables.
BOX: Elevating a Focus on Gender Data
Data that are disaggregated by sex or that reflect gender-related issues allow decision makers to develop better policies and initiatives that improve lives and help achieve gender equality. Sponsors of indexes and tools should ensure that a gender focus is included in all dimensions. However, only four of the indexes reviewed here have at least one indicator related to gender. And none of the diagnostic tools have an explicit focus on gender.
Gender-related indicators are included in the following performance indexes:
- Open Data Inventory
- Statistical Performance Indicators
- Use of Statistics Index
Many indexes and tools include indicators that may measure sex disaggregation indirectly. For example, the SPI covers the availability of data to measure the SDGs beyond Goal 5, and while many indicators needed to measure these goals require sex disaggregation, this requirement is not made explicit, and the extent of sex disaggregation is not captured in the assessment.
Uptake involves connecting users to the data through intermediaries and encouraging use. Most indicators related to this topic focus on producer activities to promote uptake or engagement with government data. Some also take note of official plans and policies. Legislation that promotes data use is also monitored to some extent. However, as noted under the GDB capability pillar, there are few measures of data literacy or the ability of users in government or in civil society to understand and make decisions using data.
Data use and impact (GDB) and impact (DVC)
Direct measurement of data use remains rare. The PARIS21 Use of Statistics Index is an exception, directly measuring the occurrences of statistical terms and indicators in national policy documents.
A closely related problem is measuring the demand for data. The demand for data reflects the benefits they bring, but because open data are a public good, it is difficult to get data users to reveal the value they receive from using data. The ODDA assessment tool attempts to gauge the demand for open data by first identifying a significant public problem, the data needed to solve that problem, and actors that can use the data. ODDA’s systemic approach follows a series of steps that have parallels in the GDB pillars and the DVC stages. They begin with an assessment of data quality, data governance, and data availability and then engage with stakeholders who will use the data, producing valuable outcomes. Valuing those outcomes remains the core problem of quantifying benefits and measuring impact.
Quantitative comparison of performance indexes
The seven performance indexes produce scores and rankings that can be compared with one another. As a test of their similarities and differences, we computed correlation coefficients using the index scores and, where available, the sub-indexes from which they are composed. Indexes that measure similar concepts may be expected to produce similar scores (allowing for differences in scaling), and therefore exhibit positive correlations. Weak correlations do not imply that the indexes are deficient but only that they are measuring different concepts. However, correlations may also be affected by confounding variables, such as income levels or geographic location, that have a similar effect on all indexes.
Table 8 shows the pairwise correlations of the overall index values. Because the WGI does not produce an overall score from its six indexes, only the WGI index of voice and accountability, which includes data-related indicators, was used. Comparisons between the seven indexes are limited to the countries they have in common in the most recent year for which data are available. Four of the performance indexes are global in scope: ODIN, SPI, Use of Statistics, and the WGI. The remaining three are limited to a region (the European Open Data Maturity Assessment and the Ibrahim Index of African Governance) or to the membership of an international organization (the OECD’s OURdata Index). Gaps in country coverage further reduce the countries shared between pairs of indexes.
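The shared-country comparison can be sketched as follows: scores are restricted to the countries two indexes have in common before computing the Pearson correlation. The country codes and scores below are invented for illustration and are not actual index values.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def shared_country_correlation(index_a, index_b):
    """Correlate two {country: score} dicts over their common countries only."""
    shared = sorted(set(index_a) & set(index_b))
    return pearson([index_a[c] for c in shared], [index_b[c] for c in shared])

# Hypothetical scores for two indexes with partially overlapping coverage
index_a = {"AAA": 60, "BBB": 75, "CCC": 82, "DDD": 45}
index_b = {"BBB": 70, "CCC": 88, "DDD": 40, "EEE": 55}  # "AAA" not covered
r = shared_country_correlation(index_a, index_b)
print(round(r, 2))
```

This restriction to common countries is why regional indexes such as the IIAG cannot be correlated with indexes, like the EODMA, whose coverage does not overlap with theirs at all.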
However, similar measures are not widely available for other types of government data.
The EODMA is less strongly correlated with the other indexes. It may be measuring a different concept than the SPI, ODIN, and the two governance indexes, perhaps one that is less closely linked to the capacity of official statistical systems. It may also be the case that, among the subset of countries included in the EODMA, there is less of an association between the adoption of open data policies and the outcomes measured by the other indexes. Finally, PARIS21’s Use of Statistics Index and the OECD’s OURdata Index give quite different signals for the countries to which they apply. The Use of Statistics Index is negatively correlated with the SPI, OURdata, EODMA, and WGI, and has positive but near-zero correlations with ODIN and the IIAG. OURdata is negatively correlated with the SPI, ODIN, and the WGI. Correlations between the IIAG and OURdata or EODMA cannot be computed because the latter two indexes do not include African countries.
Table 8: Correlations between overall index values (%)
|SPI Overall 2019||ODIN Overall 2020/21||PARIS21 Use of Statistics 2019||OURData Overall 2019||European Open Data Maturity Assessment Overall 2020||Worldwide Governance Indicators Voice and Accountability 2019||Ibrahim Index of African Governance Overall 2019|
Looking at their areas of emphasis within the GDB pillars in Table 3, the four indexes with the greatest emphasis on data availability—SPI, ODIN, WGI, and IIAG—are the most strongly correlated. EODMA and OURData, which have similar profiles and place the greatest emphasis on governance, are moderately correlated with each other, although EODMA is somewhat more strongly correlated with the SPI, ODIN, and WGI, while OURData, which measures adherence to the International Open Data Charter, exhibits weakly negative correlations with the rest of the indexes. This suggests that measurements of policies or commitments as reported by governments may not be good predictors of outcomes.
The PARIS21 Use of Statistics index is the only index that directly measures data use, and its anomalous correlation results merit further investigation.
The relationships between the performance indexes were further investigated at the sub-index level. A mapping of indicators between the sub-indexes was used to identify similarities and differences, along with correlations between the sub-indexes. A discussion of these results and the full correlation matrix is available in a supplementary report.
FINDINGS AND RECOMMENDATIONS
This report has compared the scope and contents of selected performance indexes and assessment tools with the framework of the GDB pillars and the stages of DVC. The mapping exercise has identified concepts within these frameworks that are well measured by current indexes and tools and some that are less well measured. A correlation analysis has revealed other similarities and differences across the performance indexes. The findings suggest several general recommendations.
The mapping of the twelve indexes and tools to the GDB pillars shows that none places strong emphasis on all four pillars. Data availability is the primary focus of most indexes, while the assessment tools place greater emphasis on data capabilities; data use and impact is the least well-measured pillar. Employing the DVC framework, the performance indexes focus on data publication, while the assessment tools put greater emphasis on measures of data uptake.
This final section highlights the principal findings from this review and makes recommendations to address them. These recommendations are not intended solely for the development of the GDB but address more generally the strengths and weaknesses of the performance indexes and assessment tools currently used to monitor the production and use of government data. Taken together, the indexes and tools offer a multidimensional view that can be used to identify needs, set policies, and improve the performance of national data systems. Collaboration between the producers of indexes and tools, statistical agencies, and data users can make them even more useful.
Measuring GDB pillars
|DEFINING DATA GOVERNANCE||Data governance is a broadly defined concept. Many activities included under data governance may also be counted as elements of data capabilities, availability, and use. Few indexes or tools include indicators of data protection or privacy policies.|
|Recommendation:||Indexes seeking to measure governance should adopt a clear, well-delimited definition of the domain of data governance and seek indicators that provide evidence of good policies and data management practices, including data protection and privacy measured from both a producer and a user point of view.|
|GREATER FOCUS ON USER CAPABILITIES||Data capability receives relatively little attention from the performance indexes but somewhat greater attention from readiness or maturity assessments such as the ODRA and ODMM. Indicators of internal capabilities for data management and of programs to encourage data use are more readily available than measures of user skills.|
|Recommendation:||The readiness and maturity assessments offer useful measures of data capabilities with a particular focus on user skills and uptake. A broad-based index measure could set itself apart by developing quantifiable measures of user capabilities. Doing so would contribute to the better integration of users into the data ecosystem.|
|INCREASE ATTENTION TO DATA DISAGGREGATION||Data availability and the corresponding publication stage of the DVC are headline topics for many indexes and tools, which draw attention to data gaps. But only a few fully assess the availability of the disaggregated indicators needed to measure differences by gender, age, ethnicity, or other characteristics of vulnerable populations.|
|Recommendation:||Measures of data availability provided by indexes and assessment tools should go beyond counts of aggregate indicators to include assessments of the availability of all relevant disaggregations. Greater attention should also be given to the availability of adequate metadata and to demonstrated adherence to open data practices.|
|MEASURING DATA USE AND IMPACT||Older indexes tend to emphasize indicators that measure data production, while more recent indexes have begun to measure data use and impact. Data use and impact has received greater emphasis in performance indexes applied to high-income countries. However, these measures often rely on indicators that report the existence of policies or programs to encourage data use, or on anecdotal reports of data used in policy and planning documents; they do not directly measure data use or quantify the results. An exception is the PARIS21 Use of Statistics index, which was found to be weakly and negatively correlated with most other indexes.|
|Recommendation:||Further research should be done to evaluate policies and programs that monitor data use and its impacts. Can these methods be applied more generally to produce quantified measures of data use and impact that can be applied to countries across the world? These efforts are still in their infancy. They need encouragement and rigorous testing.|
|STARTING WITH DATA COLLECTION||Data collection—the identification, collection, and processing of raw data—is a complex process that requires capable, well-governed, and adequately funded statistical organizations. The indexes and tools reviewed here are more concerned with the outcome of the data collection stage than data collection activities.
|Recommendation:||Data collection through census, surveys, and administrative records is the starting point for producing high quality data. A comprehensive assessment of the data ecosystem should include measures of the data collection stage, including survey design, adherence to standard definitions and classifications, frequency of data collection, good practices in data compilation, and timeliness of publication.|
Structure of indexes and tools
|OBJECTIVE, VERIFIABLE MEASURES NEEDED||Some indexes and many of the assessment tools rely on self-reported indicators of the adherence to policies or subjective assessments of current practices. Objective measures of the implementation and outcomes of policies and programs are less often available.
|Recommendation:||Self-assessments are important tools for internal evaluations but are less useful as a yardstick for comparison with other organizations or countries, particularly when they are based on qualitative or subjective assessments. Reliance on self-reports—through surveys or interviews—may introduce biases from self-interested reporters or through non-response. Indexes or tools used to make comparisons between countries or programs should be based on objective indicators that can be reliably measured over time. Verifiable indicators provide incentives to make productive changes and reduce incentives to “game the system.”|
|INDEXES OF DATA AVAILABILITY ARE STRONGLY CORRELATED||Strong correlations between ODIN and the SPI and the broad-based indexes of government performance—the WGI and IIAG—suggest a consensus on the measurement of the coverage, openness, and capacity of official statistical systems. However, similar measures are not widely available for other types of government data.
|Recommendation:||Measures that encompass all data produced by governments should consider extending the ODIN and SPI methodology to datasets produced outside the national statistical system to provide a comprehensive measure of the availability and openness of public data.|
|BALANCE DATA PRODUCTION AND DATA USE||The indicators used by performance indexes and assessment tools are more likely to reflect the activities of data producers than data users.
|Recommendation:||The GDB should balance its four pillars by including indicators that provide robust measures of their concepts from both a producer and user viewpoint. Although ex-post weights can be used to provide a numerical balance of the pillar results, they do not compensate for information that goes unmeasured because of missing or inadequate indicators.|
|IMPROVING COUNTRY COVERAGE||Gaps in country coverage and infrequent or irregular updating limit the usefulness and comparability of indexes between countries and over time.|
|Recommendation:||Assessments of the data ecosystem are needed in low- and middle-income countries as much as in high-income countries to guide their development. Indicators used to measure performance should be available for all countries at regular intervals. These assessments require sponsors willing to bear their cost while making them freely available as public goods.|
|STRENGTHENING COLLABORATION||None of the indexes or tools cover the full scope of the GDB pillars or the stages of the DVC, but taken together and with recommended changes, they can provide an informative view of the current state of data systems and guidance for their development.
|Recommendation:||This report has benefited from the work of the sponsors of the indexes and tools discussed here, including a productive webinar that discussed their purpose and role. Continuing this collaboration with country representatives and other stakeholders would provide a mechanism for the further development of these measures and the efficient use of resources for implementation and sustainability. As with all efforts to increase the use and impact of data, a data collaborative dedicated to improving the measurement of data systems should include the producers of tools and indexes, their users, and the intended beneficiaries of their use.|
Dang, H.-A., Pullinger, J., Serajuddin, U., & Stacy, B. (2021). Statistical Performance Indicators and Index: New Tool to Measure Country Statistical Capacity. Retrieved from https://documents1.worldbank.org/curated/en/440191616164007723/pdf/Statistical-Performance-Indicators-and-Index-A-New-Tool-to-Measure-Country-Statistical-Capacity.pdf.
European Data Portal. (2020). Measuring Open Data Maturity, sixth edition. Retrieved from https://data.europa.eu/sites/default/files/method-paper_insights-report_n6_2020.pdf.
Lafortune, G., & Ubaldi, B. (2017). OECD OURData Index 2017: Methodology and Results. Paris: OECD. Retrieved from https://www.oecd-ilibrary.org/governance/oecd-2017-ourdata-index_2807d3c8-en.
PARIS21. (2020). About Use of statistics index. Retrieved December 3, 2021, from PARIS21 Statistical Capacity Monitor: https://statisticalcapacitymonitor.org/indicator/127.
PARIS21. (2021). Measuring References to Statistics in National Policy Documents. Paris: OECD. Retrieved from https://paris21.org/sites/default/files/2021-05/PARIS21-paper_Measuring%20References%20to%20Statistics.pdf.
World Bank. (2020). Worldwide Governance Indicators. Washington, DC: World Bank. Retrieved from https://info.worldbank.org/governance/wgi/Home/Documents.
ANNEX I. Example indicators mapped across the GDB pillars
ANNEX II. Example indicators mapped across the DVC stages
 The EODMA questionnaire notes that “[A]lthough the impact dimension sets a strong focus on open data re-use cases, the European Data Portal does not consider the availability of re-use examples as direct evidence of impact.”
 The PARIS21 index of data literacy was not included in this study because it was last produced in 2016.
 The IIAG includes ODIN subscores for coverage and openness among its 46 indicators of the Foundations of Economic Opportunity, which also includes the World Bank’s Statistical Capacity Index that has since been replaced by the SPI.