When William Playfair started using visualisations in his books, he saw them as a means to bring information to his readers faster:
From William Playfair’s ‘Lineal Arithmetic; applied to shew the progress of the commerce and revenue of England’, 1798
And he did it with copperplate charts, like this:
Today search engines have the same ambition: to open up information quickly and efficiently. And plenty of articles discuss how to do this better than competing information providers. The following table does this in visual form, much as Playfair would have done (?).
For statistical websites these elements play a positive role: Cf, Lq, Ln and Ta. Some work surely remains to be done for Sr and Ss.
Search Google for cinema or weather in a region and you will get more than a link: the weather forecast and the showtimes for today or tomorrow …
Increasingly, search engines will provide more than just links: the information itself. To do so, Google has (since 2009) used semantic markup on web pages to present search results as information rather than as links to the sites containing that information. Such so-called rich snippets describe people, reviews, products, recipes, etc.
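To illustrate how such semantic markup works, here is a minimal sketch of how a crawler might extract rich-snippet properties from a marked-up page. The HTML fragment uses the data-vocabulary.org Review vocabulary that Google's rich snippets supported; the parser itself is a simplified illustration (it only handles flat `itemprop` attributes), not Google's actual implementation:

```python
# Sketch: extracting rich-snippet properties from microdata markup.
# Simplified illustration, not Google's actual parser.
from html.parser import HTMLParser

SNIPPET = """
<div itemscope itemtype="http://data-vocabulary.org/Review">
  <span itemprop="itemreviewed">L'Amourita Pizza</span>
  <span itemprop="rating">4.5</span>
  <span itemprop="reviewer">Ulysses Grant</span>
</div>
"""

class MicrodataParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.properties = {}
        self._current = None  # itemprop whose text we are collecting

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "itemprop" in attrs:
            self._current = attrs["itemprop"]

    def handle_data(self, data):
        if self._current and data.strip():
            self.properties[self._current] = data.strip()
            self._current = None

parser = MicrodataParser()
parser.feed(SNIPPET)
# parser.properties now maps itemprop names to their text values,
# e.g. the rating "4.5" and the reviewer's name.
```

A search engine that understands this vocabulary can then show the rating and reviewer directly in the results page instead of a bare link.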
Wolfram Alpha has this ambition, too. But Wolfram follows another road: incoming search queries are analyzed via natural-language recognition and linked to the Wolfram Alpha knowledge base, which then delivers the corresponding content:
For ‘weather Spain’, Wolfram Alpha does even better than Google ;-)
And now we see a step forward by Google & Co. in the direction of the Semantic Web: on 2 June 2011, Google, Bing and Yahoo! announced schema.org, a ‘new initiative to create and support a common set of schemas for structured data markup on web pages. Schema.org aims to be a one stop resource for webmasters looking to add markup to their pages to help search engines better understand their websites.’
This is the next step after rich snippets and one further step towards the Semantic Web in action. But: unfortunately, Google doesn’t use an existing standard like RDF! :-(
Many new markup categories will be added. Is there something relevant for statistical sites? Perhaps ‘GovernmentOrganization’ and ‘DataType’.
Website providers now have to decide how to integrate such new markup into their content in order to be well represented in search engines.
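One way to integrate such markup is to generate it from the content a site already has. The sketch below is purely illustrative: the helper function is hypothetical, and while GovernmentOrganization is a real schema.org type, the choice of properties (name, url) and the example values are assumptions for demonstration:

```python
# Hypothetical helper: wrap existing content in schema.org microdata.
# The function name and property choices are illustrative assumptions.
from html import escape

def microdata_item(itemtype, properties):
    """Wrap a dict of properties in schema.org microdata spans."""
    spans = "\n".join(
        f'  <span itemprop="{escape(key)}">{escape(value)}</span>'
        for key, value in properties.items()
    )
    return (f'<div itemscope itemtype="http://schema.org/{itemtype}">\n'
            f"{spans}\n</div>")

html_fragment = microdata_item("GovernmentOrganization", {
    "name": "Statistics Switzerland",
    "url": "http://www.bfs.admin.ch",
})
```

The point is that the markup can be layered onto existing pages without changing what human visitors see; only the crawlers read the extra attributes.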
Widgets or gadgets, or whatever they are called, are little programs that can be embedded in your website or in your Netvibes, iGoogle, Pageflakes, Facebook etc. They take the intelligence from other sites on the web and bring it to your site. Popular examples are calendars, weather forecasts, tag clouds … or soon Trendalyzer ;). A complex example is Yahoo! Pipes. There are thousands of them out there on www.widgetbox.com or www.programmableweb.com.
Official statistics are using RSS feeds extensively, but not the more elaborate widgets (true?). Search widgets could be a first and very useful test case on huge statistics sites, where information is often hidden from impatiently searching users. A small test can be seen on Statistics Switzerland (German, French). It uses one widget for Google and another that introduces a search tool with user feedback (www.eurekster.com).
There are several tools that offer interactive comparisons between websites. It’s quite amusing to play with them. I did so with Websitegrader and got results like these for some statistics websites … and for blogstats:
In general, I think these comparisons give a first hint and are interesting in themselves, but they should never be used or understood as scientific analyses. Deeper investigation is needed.
A second test with Statistics France shows only very small differences from the first report, so the tool at least seems to be internally consistent.
Some explanations of the measured categories (partly from Websitegrader):
“A website grade of 97/100 means that of the hundreds of thousands of websites that have previously been evaluated, Websitegrader’s algorithm has calculated that this site scores higher than 97% of them in terms of its marketing effectiveness. The algorithm uses a proprietary blend of over 50 different variables, including search engine data, website structure, approximate traffic, site performance, and others.” – From Websitegrader. Not very transparent!
“Google PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page’s value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves important weigh more heavily and help to make other pages important.” – From Google
Traffic rank by Alexa. Alexa is an online service that measures and compares millions of sites on the Internet.
Inbound links: One of the most important measures for a website is how many other sites link to it. The more links the better.
Google indexed pages: This number is the approximate number of pages that have been stored in the Google index. The Google web crawler visits the website periodically and looks for new content for its index.