Open Data is a much-debated topic and – since the Obama administration launched Data.gov on May 21, 2009 – an international competition, too. Nearly 400 open-data portals have emerged since then. But all too often the focus is on the number of published datasets rather than on their content.
This issue has been addressed by Open Knowledge (okfn) with its Global Open Data Index: ‘…simply putting a few spreadsheets online under an open license is obviously not enough. Doing open government data well depends on releasing key datasets in the right way.
Moreover, with the proliferation of sites it has become increasingly hard to track what is happening: which countries, or municipalities, are actually releasing open data and which aren’t? Which countries are releasing data that matters? Which countries are releasing data in the right way and in a timely way?
The Global Open Data Index was created to answer these sorts of questions, providing an up-to-date and reliable guide to the state of global open data for policy-makers, researchers, journalists, activists and citizens.’
The Challenge: Be more than a simple measurement tool
The result is a list of key datasets, published on Google Docs. For the National Statistics category (in okfn’s definition), these are the (few) chosen datasets:
‘Key national statistics such as demographic and economic indicators (GDP, unemployment, population, etc).
To satisfy this category, the following minimum criteria must be met:
– GDP for the whole country updated at least quarterly
– Unemployment statistics updated at least monthly
– Population updated at least once a year’
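As a minimal illustration, the freshness criteria above can be expressed as a simple check of each indicator’s last update date. This is only a sketch: the Index itself is survey-based, and the field names, dates, and day thresholds used here are my own assumptions, not part of okfn’s methodology.

```python
from datetime import date

# Maximum allowed age per indicator, derived from okfn's minimum criteria
# (hypothetical thresholds: "quarterly" ~ 92 days, "monthly" ~ 31, "yearly" ~ 366).
MAX_AGE_DAYS = {
    "gdp": 92,           # GDP updated at least quarterly
    "unemployment": 31,  # unemployment statistics updated at least monthly
    "population": 366,   # population updated at least once a year
}

def meets_criteria(last_updated, today):
    """Return, per indicator, whether its last update is recent enough."""
    return {
        indicator: (today - last_updated[indicator]).days <= max_age
        for indicator, max_age in MAX_AGE_DAYS.items()
    }

# Example: unemployment data two months old fails the monthly requirement.
status = meets_criteria(
    {
        "gdp": date(2015, 4, 1),
        "unemployment": date(2015, 3, 1),
        "population": date(2015, 1, 1),
    },
    today=date(2015, 5, 1),
)
```

In this example, `status["unemployment"]` is `False` (61 days old, limit 31), while GDP and population pass.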