Winners of the 4th European Statistics Competition are announced!

Over 11,000 students from 16 European countries signed up for the fourth edition of the European Statistics Competition (ESC). The ESC is a competition organized by Eurostat and volunteering National Statistical Institutes, addressed to secondary school students with the aim of encouraging them to become familiar with statistics and official statistical sources.

This year 61 teams participated in the European finals, where they prepared a video on the topic »Information and disinformation: Official statistics in a world overflowing with data«. The teams were very creative and demonstrated in various ways the importance of official statistical data in everyday life.

A jury of European experts selected the top five videos in both the age group 14–16 (category B) and the age group 16–18 (category A). The winners of each category will receive their prize at a virtual award ceremony that will take place on 16 June 2021.

Winning team Category A: VERDE (Portugal)

Other finalist teams Category A:

Winning team Category B: TEGLZAROŽE (Slovenia)

Other finalist teams Category B:

Congratulations to all winning teams and their mentors!

The Challenge of Smart Data

Official statistics have never been exempt from the changes taking place around them. Numerous organisations at national and international level deal with these changes constantly, and it is always interesting to see what the 2019 assessment of future challenges looks like.

One example

Here is an example: the closing speech by Kurt Vandenberghe of the European Commission (Directorate A) at the conference on New Techniques and Technologies for Statistics (NTTS 2019).

He focuses on data collection (especially smart data), the necessary qualifications and possible support from AI. Dissemination, contact with data users and questions about the comprehensible presentation and correct use of data are left out. There is also no reference to the potential of linked data, with which more could be drawn from existing sources.

The following text is the last part of Vandenberghe’s speech, including the conclusion. I have adjusted the layout a little and added highlights:

‘So how will the future look like?

I recently came across a statement on a Eurostat website that in the course of the third decade of this century “Most if not all data is expected to be organic, i.e. by-products of people’s activities, systems and things, including billions of smart devices connected to the internet”. In that new context there is a growing need to examine and make use of the potential of “B-to-G”, business to government data transfer. This involves data from social media, mobile phones, Internet of Things, etc. There should be a new role for statistical institutions, captured by the term “smart statistics”.

I quote from the same Eurostat NTTS related page: “Smart Statistics can be seen as the future extended role of official statistics in a world impregnated with smart technologies.” Finally there is the issue of trusted smart statistics, again with an important role for official statistics, ensuring not only the validity and accuracy of the outputs, but also respecting data subjects’ privacy and protecting confidentiality.

Privacy and confidentiality are a growing concern and we need more research on techniques and technologies helping to avoid misuses of data on individuals and enterprises.

I guess what we will see in the coming years is, however, not one technique replacing existing ones, but a coexistence of and synergies between established and new data sources and techniques, of public and private ones, and of general and specialised providers that complement each other. This will include traditional questionnaire-based surveys, and administrative data sources, alongside new techniques such as big data. While some of these sources will provide basic structural information in high quality, others will provide more timely data on key trends.

What will be increasingly important is to have rich meta-information and knowledge about the quality of these sources and to guarantee and create trusted statistics, including trusted smart statistics.

And in all of this we cannot forget the role that people with the right skills will play. We saw already in the last few years that there is a strong growth in Europe in the demand for big data analysts and for managers who know how to deal with big data. This is only expected to grow further. To avoid a skills gap we will have to encourage young people to take up studies in these fields and educational institutions to provide corresponding courses. In the debate around “the future of work” (future technological change might endanger traditional jobs), there is one thing that is certain: the need for data analysts will grow further.

And I guess it is safe to say that they will be increasingly supported by Artificial Intelligence. Artificial Intelligence can help to make sense of increasingly large amounts of data, to check the validity and improve their quality, relieving statisticians from routine tasks. Artificial Intelligence could help us analysing data with greater scope, scale and speed. In fact, a lot of what I said before and what you have discussed during the conference relates – directly or indirectly – to artificial intelligence – although AI does not seem very prominent on the programme. Paraphrasing Isaac Asimov’s quote about computers, we could say ‘I don’t fear AI, I fear the lack of it’. And maybe we should especially fear a lack of a European AI. Europe needs to lead on AI and develop AI that respects European values and makes the lives of Europeans better. The Commission is therefore increasing its annual investments in AI by 70% under the research and innovation programme Horizon 2020. It will reach EUR 1.5 billion for the period 2018-2020, and resources will grow further after 2020.’
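Just to make the idea of relieving statisticians from routine tasks concrete, here is my own small sketch (not part of Vandenberghe’s speech) of rule-based plausibility checks of the kind that automation or AI assistance could take over and extend; the survey variables and thresholds are invented for illustration only.

```python
# Sketch: automated plausibility checks that could relieve statisticians
# from routine validation work. Column names and thresholds are invented
# for illustration only.
import pandas as pd

def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Flag suspicious records in a (hypothetical) monthly turnover survey."""
    checks = pd.DataFrame(index=df.index)
    # Rule-based check: values must be non-negative
    checks["negative_value"] = df["turnover"] < 0
    # Statistical check: flag values far outside the robust range
    median = df["turnover"].median()
    mad = (df["turnover"] - median).abs().median()
    checks["outlier"] = (df["turnover"] - median).abs() > 5 * mad
    # Consistency check: the reported total should equal the sum of components
    checks["inconsistent"] = (
        (df["domestic"] + df["exports"] - df["turnover"]).abs() > 1e-6
    )
    return df[checks.any(axis=1)]

if __name__ == "__main__":
    sample = pd.DataFrame({
        "turnover": [100.0, 250.0, -5.0, 10_000.0],
        "domestic": [60.0, 200.0, 0.0, 9_000.0],
        "exports": [40.0, 50.0, -5.0, 500.0],
    })
    print(validate(sample))  # rows 2 and 3 are flagged
```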

Smart data and appropriate processes

Smart data is the challenge in data collection. What has to be considered, and how processes have to be adapted so that the different data sources meet the standards of official statistics, is the subject of discussion. Here, too, are two examples (from 2018).


Are Current Frameworks in the Official Statistical Production Appropriate for the Usage of Big Data and Trusted Smart Statistics?
Bertrand Loison, Vice-Director, Swiss Federal Statistical Office; Diego Kuonen, CEO, Statoo Consulting & Professor of Data Science, University of Geneva

From the abstract:
‘As a sequential approach of statistical production, GSBPM (“Generic Statistical Business Process Model”) has become a well-established standard using deductive reasoning as analytics’ paradigm. For example, the first GSBPM steps are entirely focused on deductive reasoning based on primary data collection and are not suited for inductive reasoning applied to (already existing) secondary data (e.g. big data resulting, for example, from smart ecosystems). Taken into account the apparent potential of big data in the official statistical production, the GSBPM process needs to be adapted to incorporate both complementary approaches of analytics (i.e. inductive and deductive reasoning) … . ‘

Kuonen D. (2018). Production Processes of Official Statistics & Data Innovation Processes Augmented by Trusted Smart Statistics: Friends or Foes? Keynote presentation given on May 15, 2018 at the conference “Big Data for European Statistics (BDES)” in Sofia, Bulgaria (https://goo.gl/RMfpfB).

Towards a Reference Architecture for Trusted Smart Statistics
Fabio Ricciato, Michail Skaliotis, Albrecht Wirthmann, Kostas Giannakouris, Fernando Reis, EUROSTAT Task Force on Big Data, 5, rue Alphonse Weicker, L-2721 Luxembourg

From the abstract:
‘ …. we outline the concept of Trusted Smart Statistics as the natural evolution of official statistics in the new datafied world, where traditional data sources (survey and administrative data) represent a valuable but small portion of the global data stock, much thereof being held in the private sector. In order to move towards practical implementation of this vision a Reference Architecture for Trusted Smart Statistics is required, i.e., a coherent system of technical, organisational and legal means combined to provide an articulated set of trust guarantees to all involved players. In this paper we take a first step in this direction by proposing selected design principles and system components …. .’

What are they doing …?

… and how do statistical institutions present what they do?

In times of fake news and austerity measures, statistical offices increasingly feel the urge to inform the public about their work and about the usefulness and necessity of trustworthy statistics.

But how to proceed?

Public relations specialists know countless ways to get messages to their target groups. A traditional and usually quite boring one is the annual report. Annual reports are often just an obligation and are treated accordingly.

Annual reports as ambassadors for public statistics

Are annual reports still such boring reading under the changing circumstances mentioned above? Let’s look at a few examples.

 

#1 European Official Statistics

The European Statistical Governance Advisory Board (ESGAB) publishes this report, which focuses on fake news and trust issues. It is mainly an oversight report with recommendations to be re-evaluated the following year.

Not everyone’s reading, but it offers some interesting facts about the European statistical infrastructure.


‘ … this year’s Report focuses on the importance of good governance to maintain and increase trust in official statistics, ensuring appropriate access to administrative and privately-held data, and the practical challenges of coordinating NSSs.
Chapter 1 looks first at the challenge of maintaining and enhancing trust in official statistics when there is conflicting information provided by non-official sources or when statistical indicators fail to relate to citizens’ actual experiences. Access to administrative records and privately-held data is then examined, highlighting some of the difficulties encountered by NSIs and the need to ensure that the transposition of the new Regulation on General Data Protection into national law does not hinder access to data for statistical purposes. Finally, the challenge of coordination within NSSs is discussed, particularly in relation to ONAs.
Chapter 2 provides ESGAB’s overview of the implementation of the Code of Practice, ..
Chapter 3 reviews ESGAB’s activities over its first nine years, … ‘ (p.10)

Glossary

European Statistics Code of Practice (‘the Code’)
The European Statistics Code of Practice sets the standards for developing, producing and disseminating European statistics. It builds on a common definition of quality in statistics used in the European Statistical System, composed of national statistical authorities and Eurostat. …

European Statistical Governance Advisory Board (ESGAB, ‘the Board’)
ESGAB provides an independent overview of the implementation of the Code of Practice. It seeks to enhance the professional independence, integrity and accountability of the European Statistical System, key elements of the Code, and the quality of European statistics …

European Statistical System (ESS)
The European Statistical System is a partnership between the European Union’s statistical authority, i.e. the Commission (Eurostat), the National Statistical Institutes (NSIs) and Other National Authorities (ONAs) …

Some interesting facts given in this report:


#2 UK

The UK report is of a similar type to the EU one: somewhat more systematic, with clear performance targets and evaluated indicators … and tons of financial data.

‘This year has been a challenging one for those of us working in official statistics. Numbers were very much in the news in the run-up to the EU referendum and since. Examples of bad use of numbers and misrepresentation of statistics can cast a shadow over the validity and integrity of evidence. However, information that can be accepted and used with confidence is essential to good decision making by governments, businesses and individuals.’ …’ (John Pullinger,p.4)
‘The 2007 Act requires that the Authority produces a report annually to Parliament and the devolved legislatures on what it has done during the year, what it has found during the year and what it intends to do during the next financial year. This report fulfills that responsibility.’ (p.9)
‘STRATEGIC OBJECTIVES
To achieve its mission, over five years the Authority will focus on five perspectives:
a helpful, professional, innovative, efficient and capable statistical service will, we believe, serve the public good and help our nation make better decisions.’ (p.9)
‘KEY PERFORMANCE INDICATORS
The Authority’s Business Plan includes a number of Performance Metrics through which we monitor performance. Our performance against these indicators is summarised in the table below. It is important to note our targets are always used to stretch performance ..’ (p. 9)
And some interesting facts:

 

#3 Sweden

Sweden reports concisely on a few central goals, along with the obligatory information on the organisation and its infrastructure.

‘Statistics Sweden plays a key role in public infrastructure. Its task is to develop, produce and disseminate official and other government statistics. The Official Statistics Act sets out a number of criteria concerning statistical quality, in which statistical relevance is a top priority.’ (Joakim Stymne, p. 4)

‘Punctuality in publishing remained high and amounted to 99 percent. No corrections that were considered serious were made to the published statistics during the year, and there were fewer internal error reports than in 2016.’ (p. 7)


‘During 2017, Statistics Sweden has studied how its customers and users view the agency and its products in different ways.’ (p. 10)

#4 Switzerland

Switzerland differs from the other reports in two ways:
– It shows not only the activities of the Office but also the state of the country across various topics (the milestones of the multi-annual statistical programme and, at the same time, a small statistical yearbook).
– And it is very personal: the people responsible for the statistics become visible.

German and French only

‘The first half of the legislative period is over, and with it the first two years of the multi-annual statistical programme 2016–2019. The goals and priorities set out in it are the guidelines for the work of federal statistics. The milestones planned for 2017 were implemented successfully. … … the mandate of federal statistics is summarised as follows: “At the centre of the mandate of federal statistics are the production and communication of user-oriented information on important areas of life in our society. Among other things, this information serves the planning and steering of central policy areas, whose status and development can be observed and assessed with the help of statistical information.”’ (Georges-Simon Ulrich, p. 5)

The state of statistics in the topic areas: e.g. Population

And the targets for the future: focal points and priority developments in the coming year:

Some interesting facts about structure and publishing

Staff

Publishing


#5 Germany

Germany takes quite a different approach: the annual report is more like a scientific magazine, with interviews and contributions on focal topics.


‘ People are being guided more by their emotions and less and less by facts – this is how we might sum up the post-truth debate which reached its hitherto climax last year, culminating in “postfaktisch” (post-factual, or post-truth) being chosen as the German Word of the Year 2016. …
I hope that all of the other topics dealt with in this report provide you with a good insight into all matters figure-related and that, in so doing, we can enhance your trust and confidence in official statistics.’ (Dieter Sarreither, p.3).
The table of contents shows how this report is designed as a magazine.

Some interesting information about the office.

This report also gives itself a personal touch and shows the responsible management personnel.

# Conclusion

Annual reports are certainly not the most effective way of informing the public about the activities and importance of statistical institutions. They cannot stand alone; they must be embedded in broader PR measures. Then they can, especially if they are well made, contribute a lot to the understanding of official statistics.

 

 

 

 

 

Easy-to-understand Statistics for the Public

In a recently published Eurostat paper, the authors call for innovative forms of communication from official statistics so that it does not lose its socially important role. Among other things, they demand ‘… to tell stories close to the people; to create communities around specific themes; to develop among citizens the ability to read the data and understand what is behind the statistical process.’

Telling Stories

The UNECE hackathon that has just been completed responds to this challenge.
‘A hackathon is an intensive problem-solving event. In this case, the focus is on statistical content and effective communication. The teams will be challenged to “Create a user-oriented product that tells a story about the younger population”. During the Hackathon, fifteen teams from nine countries had 64.5 hours to create a product that tells a story about the younger population. The teams were multidisciplinary – with members from statistical offices and other government departments. The product created should be innovative, engaging, and targeted towards the general public (that is, not specialists). There was no limit on the form of the product, but the teams had to include a mandatory SDG indicator in the product.
The mandatory indicator was “Proportion of youth (aged 15-24 years) not in education, employment or training” SDG indicator (Indicator 8.6.1).‘ (Source)

Winners

And the hackathon shows impressive results, even if only a few organisations participated.

The four winners are:

My Favourites

My favourites are number 3 from the National Institute of Statistics and Geography (INEGI-Mexico) and number 2 from the Central Statistical Office of Poland.

Why?

The Mexican solution…

…is aesthetically pleasing and easy to use. The interaction is left to the user, who can control its pace individually.

The diagrams do not stand alone, but are explained by short texts as the reader scrolls.

The results are not simply presented as given. Rather, the concepts are explained and questioned; the statistics come with their methodological background.

The Polish solution…

…starts with a journalistic approach. Here too, the interactivity can be controlled by the user at the desired speed.

At the end, the authors also seek direct contact with the users: a quiz personalizes the statistical data and gives users an individual assessment of where they stand with regard to these statistics.

Success Factors

The two applications mentioned above combine decisive user-friendly features:
– visual attractiveness,
– easy-to-understand navigation that users can control according to their needs,
– a journalistic approach,
– concise and instructive explanations,
– personalization,
– hints on the methodological background.

Many of the other applications show frequently encountered weaknesses: too much information, and no courage to leave something out and concentrate on the most important elements. This leads to long texts and complex navigation, with the effect that users quit quickly.

The Good, the Bad and the Ugly

Communication of statistics in times of fake news

In a recent paper, Emanuele Baldacci (Director, Eurostat) and Felicia Pelagalli (President, InnovaFiducia) deal with the ‘challenges for official statistics of changes in the information market spurred by network technology, data revolution and changes in information consumers’ behaviours’ (p. 3).

Three scenarios

The status-quo or bad scenario:

‘Information will continue to be consumed via multiple decentralized channels, with new information intermediaries emerging through social platforms, digital opinion leaders, technologies that reinforce belonging to peers with similar profiles and backgrounds, including in terms of beliefs.’  … ‘Under this scenario it is likely that increased competition from alternative data providers will put pressure on the official statistics position in the information ecosystem and lead to drastic reduction of public resources invested in official statistics, as a result of the perceived lack of relevance.’ (p.8)

 

The ugly scenario:

‘Big oligopoly giants will emerge by integrating technologies, data and content and providing these to a variety of smaller scale platforms and information intermediaries, with limited pricing power for further dissemination. In this scenario, data generated by sensors and machines connected to the network will increasingly create smart information for individuals. However, individuals will not participate in the data processing task, but will be mostly confined to crowdsourcing data for digital platforms and using information services.’
‘In this scenario, official statistics will be further marginalized and its very existence could be put in jeopardy. More importantly, no public authority with significant influence could be in charge of assessing the quality of data used in the information markets. Statistics as a public good may be curtailed and limited to a narrow set of dimensions. …  Official statisticians will appear as old dinosaurs on the way to extinction, separated from the data ecosystem by a huge technology and capability gap.’ (p.9)

 

The good scenario:

The authors do not stop here. They also see a good scenario, but one that implies a huge engagement.

This scenario is ‘predicated on two major assumptions.
First, the information market will be increasingly competitive by sound regulations that prevent the emergence of dominant positions in countries and even more important across them.
Second, official statistics pursue a strong modernization to evolve towards the production of smart statistics, which fully leverage technology and new data sources while maintaining and enhancing the quality of the data provided to the public.
In this scenario, official statistics will generate new more sophisticated data analytics that cater to different users by tailored information services. It uses network technologies (e.g., blockchain, networks) to involve individuals, companies and institutions in the design, collection, processing and dissemination of statistics. It engages users with open collaborative tools and invests heavily in data literacy to ensure their usability. It strengthens skills and capacity on statistical communication to help users understand in transparent manners what are the strengths and limitations of official statistics.’ (p. 9/10)

 

Actions needed to face the challenges ahead

The good scenario already depicts some needed actions to be taken by official statisticians. The authors conclude with proposals that are not really new, ideas that have been on the table for some time but are not so easy to implement.

‘It is important to change mindsets and practices which have been established, in order to put in contact the citizens with official statistics, to make data accessible, to expand the understanding of their analysis, to support individuals, business and institutions in the decision-making process.

The key issue is how to be authoritative and to develop quality knowledge in the new and changing information market. It is important to know the rules and languages of the media platforms used for communication; to overcome the technicalities; to tell stories close to the people; to create communities around specific themes; to develop among citizens the ability to read the data and
understand what is behind the statistical process. In summary, put people at the center (overused phrase, but extremely valuable):
⎯ communicate statistics through engaging experiences and relevant to the people who benefit from them;
⎯ customize the content;
⎯ adopt “user analytics” to acquire the knowledge of the “users” through the analysis of data (web and social analytics) and the understanding of people’s interaction with the different platforms.’ (p.11)

And the concluding words call for external assistance:

‘It will be essential for statisticians to build more tailored data insight services and team up with communication experts to play a more proactive role in contrasting fake news, checking facts appropriately and building users’ capacity to harness the power of data.’ (p.12)

 

 

 

 

 

Corporate nieuws

Eurostat’s biennial scientific conference on New Techniques and Technologies for Statistics (NTTS) is over, a labyrinth of a website is online and tons of documents are published somewhere.

CBS Corporate nieuws summarizes the important trends discussed:
1) New data sources and the consequences
2) The importance of proactive communication
3) Big Data and algorithms in official statistics


Corporate websites

Why take this information just from CBS (the Dutch statistical office)? Because CBS Corporate nieuws is an excellent example of the second trend: proactive communication, proactivity in delivering (statistical) information to users. The website makes corporate information public and gives insights into the activities of CBS and its statistics. You see topics …

… and the people behind it.

The target audience of this corporate website is enterprises, administrations, journalists, students and anyone else who may be interested.

A shorter English version is integrated into the CBS website.

Corporate websites like CBS’s are not very common. They are resource-intensive, but they probably help a great deal in understanding statisticians’ mission and work … and in motivating employees.

 

 

 

 

There is no New Thing under the Sun – Yes and No

Twitter reminded me that there’s #NTTS2017 going on, Eurostat’s biennial scientific conference on New Techniques and Technologies for Statistics (NTTS).

The opening session also focused on official statistics and its current and future role in a world of data deluge and alt-facts. What will official statistics be in 30 years?
In Diego Kuonen’s presentation and discussion on ‘Big Data, Data Science, Machine Intelligence and Learning’ I heard an answer to this question that reminded me of a text in the Bible: “… that [thing] which is done is that which shall be done: and there is no new thing under the sun”.
And this is to be understood not in a static but in a dynamic sense:
The work statistical institutions are doing today will be the same that they will do tomorrow … BUT a work adapted to the changing context.
The algorithms (understood in a broad sense as ‘a set of rules that precisely defines a sequence of operations’) used in collecting, analyzing and disseminating data will change; manual work will, and must, be replaced by automation and robots. But the core role of being a trusted source of data-based and (in all operations) transparently produced information serving professional decision making will remain.
The challenge will be that these institutions
– are known,
– are noted for their veracity,
– are consulted
and with all this can play their role.
In this fight to be heard, humans will always play a decisive part.
That’s a clear message (as I understood it) of a data scientist looking ahead.
PS. A step towards automation consists of preparing and using linked data. See the NTTS 2017 satellite session “Hands-on workshop on Linked Open Statistical Data (LOD)”
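To give an idea of what this means in practice, here is a minimal sketch of querying linked open statistical data. The SPARQL endpoint and dataset URI are placeholders (example.org), and the query assumes data published with the RDF Data Cube vocabulary; it illustrates the idea rather than pointing to a real service.

```python
# Minimal sketch of querying linked open statistical data (LOD).
# The endpoint URL and dataset URI below are placeholders, not real services;
# the query assumes the RDF Data Cube vocabulary (qb:) is used.
import requests

ENDPOINT = "https://example.org/sparql"  # hypothetical SPARQL endpoint

QUERY = """
PREFIX qb:   <http://purl.org/linked-data/cube#>
PREFIX sdmx: <http://purl.org/linked-data/sdmx/2009/measure#>

SELECT ?observation ?value
WHERE {
  ?observation a qb:Observation ;
               qb:dataSet <https://example.org/dataset/population> ;
               sdmx:obsValue ?value .
}
LIMIT 10
"""

response = requests.post(
    ENDPOINT,
    data={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print each observation URI with its observed value
for row in response.json()["results"]["bindings"]:
    print(row["observation"]["value"], row["value"]["value"])
```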

Post Post-Truth


‘Fake news’ and ‘post-truth’ (postfaktisch) are the words that dominate many of today’s discussions about truth in communication.

' ... in post-truth [post] has a meaning more like ‘belonging to a time in which the specified concept [truth] has become unimportant or irrelevant’' (https://www.oxforddictionaries.com/press/news/2016/11/15/WOTY-16).

False information or even lies are not new in the information business. And therefore many, and ever more, websites help to separate wrong from right:

The Reporters’ Lab maintains a database of global fact-checking sites.


And Alexios Mantzarlis ‘collected 366 links, one for each day of the year … to understand fact-checking in 2016’.

 

Official Statistics’ Ethical Codes

Official statistics have long collected, analyzed and disseminated statistical information, and they too are confronted with wrong citations, misuse of statistics and lies. Many of the ethical codes of official statistics recommend acting against such false information.

‘In 1992, the United Nations Economic Commission for Europe (UNECE) adopted the fundamental principles of official statistics in the UNECE region. The United Nations Statistical Commission adopted these principles in 1994 at the global level. The Economic and Social Council (ECOSOC) endorsed the Fundamental Principles of Official Statistics in 2013; and in January 2014, they were adopted by General Assembly. This recognition at the highest political level underlines that official statistics – reliable and objective information – is crucial for decision making.’


Two paragraphs are of special interest:

‘ 2. Professional standards and ethics
To retain trust in official statistics, the statistical agencies need to decide according to strictly professional considerations, including scientific principles and professional ethics, on the methods and procedures for the collection, processing, storage and presentation of statistical data.’

AND:

‘4. Prevention of misuse
The statistical agencies are entitled to comment on erroneous interpretation and misuse of statistics.’

The European Statistics Code of Practice says in principle 1:

1.7: The National Statistical Institute and Eurostat and, where appropriate, other statistical authorities, comment publicly on statistical issues, including criticisms and misuses of statistics as far as considered suitable.


N.B.: Wikipedia’s page on Misuse of statistics presents a broad view of how readers can be fooled by many types of misuse.

It’s dissemination – …

False, and especially deliberately false, information as a weapon for manipulating decisions isn’t new either. But what is new is how such information spreads: with the help of social media, dissemination reaches a new level (some say comparable to Gutenberg’s printing press in earlier times).

Anne Applebaum gives a practical illustration of how it can work:

‘I was a victim of a Russian smear campaign. I understand the power of fake news.

It was a peculiar experience, but I learned a lot. As I watched the story move around the Web, I saw how the worlds of fake websites and fake news exist to reinforce one another and give falsehood credence. Many of the websites quoted not the original, dodgy source, but one another. There were more phony sites than I’d realized, though I also learned that many of their “followers” (maybe even most of them) are bots — bits of computer code that can be programmed to imitate human social media accounts and told to pass on particular stories.
….
But it is also true that we are living through a global media revolution, that people are hearing and digesting political information in brand-new ways and that nobody yet understands the consequences. Fake stories are easier to create, fake websites can be designed to host them, and social media rapidly disseminates disinformation that people trust because they get it from friends. This radical revolution has happened without many politicians noticing or caring — unless, like me, they happened to have seen how the new system of information exchange works.’

 

2017

May 2017 become the year of people who know about the power and the dangers of misleading information!
My best wishes to the colleagues in official statistics for their professional production and dissemination of information … and perhaps statistical dissemination will need to become more active on social media, too.

 

 

20 Years Ago

1996

On the 2nd of September 1996, Statistics Switzerland published its brand-new website, www.bfs.admin.ch. It was one of the first (if not the first) websites of the Swiss Federal Administration (www.admin.ch).


In three languages…

… and already with quite rich structure and content.


The Wayback Machine …

… shows the developments since 1996


https://archive.org/web/.

1996:
Handmade with FrontPage as editing software


https://web.archive.org/web/19970502093430/http://www.admin.ch/bfs/stat_ch/eber_m.htm

November 2004:
New layout, made with Day Communiqué as Content Management System and a database for file download


December 2007 ff.
Layout adapted to the general layout of admin.ch. The same CMS, since bought by Adobe and renamed Adobe Experience Manager (AEM)



The next one …

… must be based on a new technology:

  • CMS remains state of the art for content presentation
  • Assets come from databases
  • Web services (via a web service platform) deliver assets from databases to the presentation platform.

And with such a three-layer architecture the website will be able to display data from ubiquitous databases and also offer data to third parties via web services: open-data compatible.
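To make the web-service layer a bit more tangible, here is a minimal sketch of such a service in Python/Flask. The route, the asset identifiers and the in-memory stand-in for the databases (with dummy values) are invented for illustration; a real implementation would run on the web service platform and read from the actual asset databases.

```python
# Minimal sketch of the web-service layer: assets live in databases
# (stubbed here as a dict with dummy values) and are delivered as JSON
# both to the CMS presentation layer and to third parties (open data).
# The route and asset identifiers are invented for illustration.
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for the asset databases behind the web-service platform
ASSETS = {
    "population-total": {"title": "Permanent resident population", "unit": "persons", "value": 8_400_000},
    "gdp-total": {"title": "Gross domestic product", "unit": "CHF million", "value": 650_000},
}

@app.route("/api/assets/<asset_id>")
def get_asset(asset_id: str):
    """Deliver a single asset as JSON to the presentation layer or to third parties."""
    asset = ASSETS.get(asset_id)
    if asset is None:
        abort(404)
    return jsonify(asset)

if __name__ == "__main__":
    # e.g. curl http://localhost:5000/api/assets/population-total
    app.run(port=5000)
```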

Disrupting Dissemination – From Print to Bots

Digitally disrupted data production

“The collection of statistics has been digitally disrupted, along with everything else, and there are important questions about collection methods and whether or not “big data” genuinely offers promise for a giant leap forward in the productivity of official statistics.”

This statement in the Financial Times of August 20th, 2015 deals with the UK’s Office for National Statistics (ONS). Its title: “UK needs a statistical strategy to catch up with digital disruption”. Its message: the ONS (and, I think, all of official statistics) has problems keeping up “with the profound changes in the structure of the economy during recent decades.”

The “Independent Review of UK Economic Statistics” (March 2016) by Professor Sir Charles Bean, Professor of Economics at the London School of Economics, goes deeper and gives 24 recommendations, some of them obviously valid for statistics production and producers in general.


“Innovation and technological change are the wellspring of economic advancement. The rapid and sustained rise in computing power, the digitisation of information and increased connectivity have together radically altered the way people conduct their lives today, both at work and play. These advances have also made possible new ways of exchanging goods and services, prompted the creation of new and disruptive business models, and made the location of economic activity more nebulous. This has generated a whole new range of challenges in measuring the economy.” (p.71)

“Measuring the economy has become even more challenging in recent times, in part as a consequence of the digital revolution. Quality improvements and product innovation have been especially rapid in the field of information technology. Not only are such quality improvements themselves difficult to measure, but they have also made possible completely new ways of exchanging and providing services. Disruptive business models, such as those of Spotify, Amazon Marketplace and Airbnb, are often not well-captured by established statistical methods, while the increased opportunities enabled by online connectivity and access to information provided through the internet have muddied the boundary between work and home production. Moreover, while measuring physical capital – machinery and structures – is hard enough, in the modern economy, intangible and unobservable knowledge-based assets have become increasingly important. Finally, businesses such as Google operate across national boundaries in ways that can render it difficult to allocate value added to particular countries in a meaningful fashion. Measuring the economy has never been harder.” (p. 3)

And: “Recommended Action 4: In conjunction with suitable partners in academia and the user community, ONS should establish a new centre of excellence for the analysis of emerging and future issues in measuring the modern economy.”  (p.118)

Disrupting Dissemination of Statistics

The rise of new technologies followed by new information behavior has also disrupted existing dissemination formats (from print to digital) and dissemination practices (from quasi-monopolistic to open and multiple).

A well-known example of disrupted dissemination is given by Wikipedia, and its subject is Wikipedia itself:
“The free, online encyclopedia Wikipedia was a disruptive innovation that had a major impact on both the traditional, for-profit printed paper encyclopedia market (e.g., Encyclopedia Britannica) and the for-profit digital encyclopedia market (e.g., Encarta). The English Wikipedia provides over 5 million articles for free; in contrast, a $1,000 set of Britannica volumes had 120,000 articles.” (Article: https://en.wikipedia.org/wiki/Disruptive_innovation)

In fact, disruptive tendencies happen on both sides: in producing and in presenting or accessing statistical information.

Some thoughts about this:

  1. Until the end of the 20th century, print was the main channel for disseminating statistics. Libraries in statistical offices and in society had their very vital role.
  2. The internet opened a new channel: statistical offices’ websites appeared; access to databases and attractive data presentation (visual, storytelling, see e.g. this) were top themes and the stuff of long discussions. Access to data was now simple and open to everyone.
  3. With the open data initiatives, not only accessing but also disseminating statistical information got much easier. Nearly everyone could become a data provider. License fees no longer hindered the redissemination of official statistics, and APIs or web services provided by statistical offices made this possible in an automated way.
    Statistics can easily be integrated into the websites and apps of non-official data providers, with all the chances this brings for democratic conversation and all the risks of data misuse.
  4. All this gives statistics a much more important role in communication processes. On the other hand, communicating with statistics gets simpler: letters, telephone calls and even e-mails become cumbersome compared with the possibilities bots (will) provide. With a stats bot in my daily used messenger, I ask for a piece of statistical information, and the bot uses a search engine or connects me directly to a statistical expert (a minimal sketch of such a bot follows after this list).
    “Brands that already have full-fledged apps and responsive websites can take advantage of bots’ ability to act as concierges, handling basic tasks and micro-interactions for users and then gracefully connecting users with apps or websites, as appropriate, for a more involved experience.” (Adam Fingerman, venturebeat, 20.7.2016)
  5. What’s next? Innovation with disruption goes on, but disruption does not always mean destruction: it is still a wise decision to keep some information in paper format. A statistical yearbook with key data lasts for centuries; a website, an API or a bot does not.
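And here is the minimal sketch of the stats-bot idea from point 4, assuming a hypothetical dissemination API: the base URL, the indicator codes and the response format are all placeholders. A real bot would additionally hook into a messenger platform and, as described above, hand over to a human statistical expert when it cannot answer.

```python
# Minimal sketch of a "stats bot": map a user's question to an indicator,
# fetch the figure from a statistical office's web service and reply.
# The API base URL, indicator codes and response format are placeholders.
import requests

API_BASE = "https://api.example-statistics.org/v1"  # hypothetical dissemination API

INDICATORS = {
    "unemployment": "une_rt_m",     # placeholder indicator codes
    "inflation": "prc_hicp_manr",
    "population": "demo_pjan",
}

def answer(question: str) -> str:
    """Very naive intent matching: look for a known indicator keyword."""
    for keyword, code in INDICATORS.items():
        if keyword in question.lower():
            resp = requests.get(
                f"{API_BASE}/data/{code}", params={"format": "json"}, timeout=30
            )
            if resp.ok:
                latest = resp.json().get("latest", {})
                return f"Latest {keyword}: {latest.get('value')} ({latest.get('period')})"
            return "The statistical service is not reachable right now."
    # Fall back: hand over to a human statistical expert
    return "I could not match your question; connecting you to a statistical expert."

if __name__ == "__main__":
    print(answer("What is the current unemployment rate?"))
```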