eGovernment Monitor Network
Project no.: CIP 224998
Measure paper 4: Innovative measurement
Deliverable Number: D 4.4
Editor: Xavier Heymans
Authors: Eleni Vergi, Lasse Berntzen, Annika Nietzio,
Eric Velleman, Christophe Strobbe, Juliane
Jarke, Mikael Snaprud, Andrea Velazquez
Dissemination Level: This version RESTRICTED FOR eGovMoNet
Status: Draft Version Report
Insert abstract here
Purpose and context of measure paper 4 in eGovMoNet
Present an overview of lessons learned regarding the challenges of implementing measurement innovation, including a revised version of the measurement description template and the associated indicators, based on the results of the 4th meeting (30 November / 1 December 2009).
Version  Status  Date      Change                                                Author
0.1      DRAFT   07/01/10  First preliminary draft, compilation of the           Andrea Velazquez
0.2      DRAFT   22/01/10  Adding purpose in abstract + annexes (other           Xavier Heymans
                           sources of information to consider for paper)
0.21     DRAFT   31/01/10  Adding input: Second Life, Annex 3, more eGov 2.0,    Andrea Velazquez
                           and expanded Facebook
Introduction
Structure of the measure paper
Section 2: Innovative tools used in eGovernment
1 Twitter
1.1 Exploring the use of web 2.0 technologies: Twitter
1.2 Twitter in Government
1.3 Twitter & Politics
1.4 Twitter @ the European Parliament elections
1.5 Twitter & protesting
1.6 Twitter can succeed where websites fail
1.7 Stats on the growth of Twitter
1.8 Twitter @ Government: Objectives and metrics
2 Facebook
2.1 Innovative measurement using social media
3 Blogs
3.1 Blogs in eGovernment: benefits and disadvantages
3.2 Examples of Blogs in eGovernment
4 Policy consultation
4.1 Participatory budgeting in the City of Freiburg
4.2 Osale: the Estonian eParticipation tool
5 Non-government initiatives and participatory measurements
5.1 Literature review
5.2 Software and tools
5.4 Screenshots
6 Privacy and security issues
6.1 Introduction: why privacy and security issues are included in the innovative paper
7 Games
8 Second Life
9 Methodology impact of the eGovernment awards
Section 3: Innovative tools used for eGovernment measurements and what could be measured
Section 4: Innovative measurement approaches: Cases
1 Innovative measurement case in France
2 More innovative approaches (to complete)
Annexes – Additional sources and input to evaluate
1 Annex – Input from the note takers and from the Knowledge Cafe table hosts
2 Annex – Link to folder on eGovMoNet site
3 Annex – Social Value Case Study
4 Annex – Input Case Piemonte Italy
Insert introduction here
This paper consists of four sections, including this introduction, and four annexes:
Section 1: Introduction
Section 2: Innovative tools used in eGovernment (how are they used?)
Section 3: Innovative tools used for eGovernment measurements and what could be measured
Section 4: Innovative measurement approaches: cases
1.1 Exploring the use of web 2.0 technologies: Twitter
Twitter is a free social networking and micro-blogging service that enables its users to
send and read messages known as tweets. Tweets are text-based posts of up to
140 characters displayed on the author's profile page and delivered to the author's
subscribers who are known as followers. Senders can restrict delivery to those in their circle
of friends or, by default, allow open access.
Several applications have been designed to make Twitter compatible with, and easily accessible through, PCs and mobile phones, e.g. the Social Whale and several similar clients.
The community of Twitter is constantly growing, and the 'social footprint' of this new digital medium is anything but negligible. A representative example is NASA, which received a specially designed Shorty Award in 2008 for its near real-time Twitter status updates on the unmanned Mars Phoenix Lander mission. In May 2009, astronaut Mike Massimino used Twitter to send updates during the Hubble Space Telescope repair mission (STS-125), the first time Twitter was used by an astronaut in space.
[Image: NASA ISS Tweetup, 2009-10-21]
1.2 Twitter in Government
Twitter, characterised as the most popular web-based 'micro-blogging' service, is applied successfully in various initiatives ranging from policy development, customer service and public sentiment monitoring to traffic management and disaster recovery.
The platform is experiencing a phenomenal adoption curve and is used increasingly by government departments, Members of Parliament and other stakeholders, as well as by millions of businesses, non-governmental organisations and individuals. It is free to use, has a relatively low impact on resources, and has the potential to deliver various benefits.
Below is a selection of cases illustrating successful uses of Twitter in the public sector.
1.3 Twitter & Politics
"Europatweets" provides everyone with the opportunity to follow what European actors say and to retweet interesting thoughts.
Europatweets (http://europatweets.eu) is an effort to promote the use of new technologies in democracy and to encourage transparency in Europe. Europe has often been criticised, at many levels, for its lack of transparency and communication with the public, while the European Parliament has been said to be "disconnected from the people they are supposed to represent". During the European elections in June 2009, Europatweets was released as a Twitter-based platform to deliver more information to the people.
According to Europatweets, a hundred elected officials and candidates to the European Parliament are using Twitter to keep their supporters informed. Usage is still modest, but it grows day by day. The site ranks users by country, by political group, by number of tweets sent, and by most-followed candidates.
1.4 Twitter @ the European Parliament elections
The European Parliament, meanwhile, is now on Twitter in 22 languages and benefited from the immediacy of Twitter to cover the election night of 7 June in real time. Tweets in English were tagged @EU_Elections_en.
A characteristic example of the 'power' of Twitter is reflected in the following statement: "The Progressive Alliance of Socialists & Democrats in the European Parliament website's audience has doubled in one year, because of Twitter", said Tony Robinson, head of the Internet Unit for this parliamentary group in Strasbourg, during the 2009 European eDemocracy Award presentation.
1.5 Twitter & protesting
With the Iranian government blocking mobile phone networks and access to many
websites, the social utility Twitter has emerged as a powerful tool for protesters to get
their stories out.
In June 2009, following allegations of fraud in the Iranian presidential election, protesters
used Twitter as a rallying tool and as a method of communication with the outside world after
the government blocked several other modes of communication. On June 15 Twitter
rescheduled a planned 90-minute maintenance outage after a number of Twitter users and
the US State Department asked Twitter executives to delay the shutdown because of
concerns about the service's role as a primary communication medium by the protesters in
Iran. CNN's coverage of the conflict was criticized in tweets with the hashtag #CNNfail.
Twitter was also used to organize DDoS attacks against Iranian government websites.
The digital wave of support from thousands of users to the UK's National Health Service
A lot of 'noise' was created around the "We love the NHS" campaign on Twitter, launched by a television comedy writer, Graham Linehan, and supported by tens of thousands of users of the microblogging service who rushed to defend the British health system. The campaign, which ran under the #welovetheNHS hashtag, was initiated in response to attacks on the British National Health Service made by opponents of President Barack Obama's health insurance reform proposals. The campaign received the support of several politicians, including British Prime Minister Gordon Brown.
1.6 Twitter can succeed where websites fail
The Washington State Department of Transportation (WSDOT) is using Twitter to help the organisation manage the impacts of major weather events.
The US Government is slowly finding real-world purposes for Twitter, according to the Govtech article "Twitter is a Continuity of Operations Tool, State Agency Discovers".
The example of WSDOT is indicative of how Twitter can be integrated into an actual operations tool.
WSDOT maintains a website informing citizens about emergency situations, such as snowstorms and other major weather events. However, as the website is a popular source of traffic updates, it sometimes cannot handle a sudden spike in page hits.
WSDOT has therefore considered posting a "neutered", bare-bones version of its website that contains a link to the Twitter feed. Quoting a statement by the organisation's representative: "One of the things that we're considering, if we get into an emergency situation like that, we can update Twitter and our blog with our handheld BlackBerry or iPhone or whatever we have. It's a continuity of operations opportunity for us."
And he continues: "On July 31, three major traffic incidents nearly brought the Web site down - it's a very popular site for getting traffic information. Our Web guru started 'tweeting' on the situation, and suddenly the number of people who were following us went from 20 to 160. Ever since, WSDOT has been spreading the word about its Twitter feed."
At the moment http://twitter.com/wsdot has nearly 7,300 followers (Dec 2009).
1.7 Stats on the growth of Twitter
According to a release from Nielsen Online in February 2009 about the growth of Twitter, the penetration of this technology is remarkable:
- 1,382% year-over-year growth in February 2009.
- Total unique visitors grew from 475,000 in February 2008 to seven million in January 2009.
- Twitter was the fastest-growing member community site for the month of February; Zimbio (240%) and Facebook (228%) were the second- and third-fastest growing.
- Twitter is not just for kids: in February 2009, adults aged 35-49 had the largest representation on Twitter - almost 3 million unique visitors from this age group (almost 42% of the entire audience).
- 62% of the audience access Twitter from work only, while only 35% access it only from home - watch out for those "Twitter is killing our productivity at work" articles to start flowing out of the mass media any day now.
- It's all about mobile: in January 2009, 735,000 unique visitors accessed Twitter via their mobile device.
- The average unique visitor went to Twitter 14 times during the month.
- They spent an average of 7 minutes on the site.
- In Q4 2008, 812,000 unique users sent or received Twitter text messages (via AT&T or Verizon cell phones).
- The average number of tweets per person for the quarter was nearly 240.
The above information is from a news item in Marketing Charts titled "Twitter Posts Meteoric 1,382% YoY Growth", which was based on a blog posting from Nielsen Wire titled "Twitter's Tweet Smell Of Success".
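The year-over-year growth figure can be sanity-checked from the visitor numbers quoted above. A quick sketch (using the rounded figures given in the text, so the result only approximates Nielsen's reported 1,382%):

```python
# Sanity-check the year-over-year growth rate from the rounded
# unique-visitor figures quoted above (Nielsen's exact counts differ slightly).
visitors_2008 = 475_000    # February 2008
visitors_2009 = 7_000_000  # "seven million" a year later

growth_pct = (visitors_2009 - visitors_2008) / visitors_2008 * 100
print(f"Year-over-year growth: {growth_pct:.0f}%")  # ~1374%, close to the reported 1,382%
```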
1.8 Twitter @ Government: Objectives and metrics
Template Twitter strategy for Government departments
In July 2009, a very informative paper was released by Neil Williams, head of corporate digital channels at the Department for Business, Innovation and Skills (BIS), UK, turning BIS's Twitter strategy into a generic template Twitter strategy for departments.
The paper gives an overview of the various objectives a governmental department could set when applying a Twitter strategy, as well as the metrics for assessing those objectives.
Moreover, the author explicitly presents sources for the collection of evaluation data to assess the performance of a Twitter strategy. Quoting from the paper:
Web analytics and clickthroughs from URLs in our tweets (using bit.ly) – to
track referrals from Twitter to our web pages
Twitter surveys – regular ‘straw poll’ surveys on Twitter to ask for feedback
Twitter data – the follower/following data presented in our Twitter account
Third party tools – analytics tools including measures based on:
o re-tweeting (Retweet Radar, Twist)
o online reputation (Monitter, Twitter Grader)
o impact and influence (Twinfluence, Twittersheep)
o unfollowers (Qwitter)
Alert services – tweetbeep.com and other methods for tracking mentions of
Real time observation - http://twitterfall.com/ and similar tools
Analysis of our followers using http://tweepler.com/ and similar tools”
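Evaluation data from the sources listed above eventually has to be combined into one report. A minimal sketch of such an aggregation, using invented sample records (the field names and numbers are illustrative assumptions, not part of the BIS template; real data would come from web analytics, bit.ly statistics and the Twitter account itself):

```python
# Illustrative sketch: combining per-tweet evaluation data from several of
# the sources listed above into simple summary metrics. All field names and
# sample records are hypothetical.
from collections import Counter

tweet_log = [
    {"tweet_id": 1, "clickthroughs": 120, "retweets": 14, "mentions": 3},
    {"tweet_id": 2, "clickthroughs": 45, "retweets": 2, "mentions": 1},
    {"tweet_id": 3, "clickthroughs": 210, "retweets": 30, "mentions": 8},
]
followers_start, followers_end = 1_500, 1_830  # snapshots at start/end of period

totals = Counter()
for record in tweet_log:
    for key in ("clickthroughs", "retweets", "mentions"):
        totals[key] += record[key]

summary = {
    "tweets": len(tweet_log),
    "total_clickthroughs": totals["clickthroughs"],
    "avg_retweets_per_tweet": totals["retweets"] / len(tweet_log),
    "follower_growth_pct": (followers_end - followers_start) / followers_start * 100,
}
print(summary)
```

The point of the sketch is that each metric maps back to one of the quoted data sources: clickthroughs to bit.ly, retweets and mentions to third-party tools, and follower growth to the account's own data.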
Social media is about networking and collaboration. The most widely
used social media platform, Facebook, has 350 million registered users.
Most of these users frequent Facebook on a regular basis, often several
times every day. According to statistics provided by Facebook, 50% of
its users visit Facebook every day.
2.2 Facebook used by Government
Today, Facebook is adopted by different levels of government, agencies, politicians, and other stakeholders and stakeholder groups. Initially, Facebook let users create personal profiles and groups, and a person could join any number of groups. Later, Facebook introduced the possibility to create pages for a cause, politician or business.
A person can then become a fan of a page. Groups and pages are the appropriate way to introduce an organisation to the Facebook community.
Facebook could be used for the following purposes:
- Network building
- Information provision, e.g. post news and upcoming events
- Mobilizing the community, e.g. to work as volunteers for a specific cause
- Interacting with the community, e.g. to consult with community on specific themes
2.3 Facebook used for measurements
Can Facebook be used for e-government measurements? There are two
reasons why it can:
1) Facebook has an open API that allows applications to be executed within the Facebook environment. Among other things, this API has been used to create surveys and quizzes. It would be possible to use such surveys to measure customer satisfaction in the context of electronic service provision. The API even provides some basic authentication to avoid tampering through repeated submissions by the same user.
2) The number of municipalities using the Internet as a communication channel with their citizens is steadily increasing. Data collected by the author in November 2009 showed that 26 Norwegian municipalities were actively using Facebook to interact with and inform their citizens. Altogether 73 municipalities were present on Facebook, but the remaining profiles were either established by third parties or not in active use.
So far, there is no indication that Facebook has been used to collect measurement data. However, some of the 26 municipalities have asked their citizens to submit comments on municipal performance.
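The "basic authentication to avoid tampering by repeated submissions" mentioned above amounts, in practice, to deduplicating responses per user identifier. A hedged sketch of that idea (user ids, timestamps and ratings are invented for illustration; this is not Facebook's actual mechanism):

```python
# Sketch: deduplicating survey submissions by user id, keeping only each
# user's most recent answer - the kind of tamper protection mentioned above.
# All data is invented for illustration.
submissions = [
    {"user_id": "u1", "ts": 1, "satisfaction": 4},
    {"user_id": "u2", "ts": 2, "satisfaction": 5},
    {"user_id": "u1", "ts": 3, "satisfaction": 2},  # repeat submission by u1
]

latest = {}
for s in sorted(submissions, key=lambda s: s["ts"]):
    latest[s["user_id"]] = s["satisfaction"]  # later entries overwrite earlier ones

ratings = list(latest.values())
print(len(ratings), sum(ratings) / len(ratings))  # 2 respondents, mean 3.5
```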
2.4 Metrics for Facebook use
How can the use of Facebook be measured? The following list shows some possible metrics for evaluating the use of social media like Facebook:
- Number of friends/group members/fans
- Number of posts
- Number of pictures
- Number of discussion topics, number of comments
- Network building / information provision / mobilizing community / interacting with community
- Links to government/agency website
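Counts like those above only become comparable across pages once they are rolled into a summary. A small sketch of one possible aggregate (the page data and the choice of "interactions per post" as the ratio are assumptions for illustration, not an established scoring method):

```python
# Sketch: rolling the raw counts listed above into one comparable figure.
# The numbers and the chosen ratio are illustrative assumptions only.
page_metrics = {
    "fans": 420,
    "posts": 35,
    "pictures": 12,
    "discussion_topics": 9,
    "comments": 140,
    "links_to_agency_site": 6,
}

# One crude aggregate: citizen interactions (comments + topics) per post.
interactions_per_post = (
    (page_metrics["comments"] + page_metrics["discussion_topics"])
    / page_metrics["posts"]
)
print(f"Interactions per post: {interactions_per_post:.2f}")
```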
The term "weblog" describes a website or web page, typically maintained by an individual or an organisation, who posts brief entries regularly on an unlimited number of subjects. Blogs are often considered an online diary of sorts, and articles cover topics encompassing every possible subject matter. Entries are typically listed in reverse chronological order.
Some blogs offer commentary or opinion, while others report facts and are an actual source of news. A blog often includes pictures of the subject and links to other blogs or websites that discuss the subject.
3.1 Blogs in eGovernment: benefits and disadvantages
Through blogs, government can share information with citizens. Tools such as blogs foster citizens' networked activity and their political engagement.
Blogs in eGovernment prove attractive to citizens because they are easy to use, accessible, and dynamic. Blogs offer the possibility of categorising and unifying content. Moreover, they provide a means of receiving feedback from readers. Perhaps the main benefit of governmental weblogging is the ability to communicate directly with the community, giving a human face to organisations or politicians.
Blogs can assist the public administration in the following tasks: the dissemination of information, the promotion of participation, the tracking of an activity or project, and the generation of a social network.2
However, the use of blogs poses some disadvantages, e.g., the lack of editorial control and of an ethical framework (especially when it comes to ethical issues and defamation), legal accountability, and the misinterpretation of personal comments as the position of the government. In addition, political parties often use blogs as a diffusion platform to spread information against their opponents in order to gain media attention. Civil servants experience difficulties in updating their blogs and reading feedback from readers due to their tight schedules.
3.2 Examples of Blogs in eGovernment
1. A Port Talbot councillor in the United Kingdom has embraced the internet to communicate with his electorate. Labour councillor Anthony Taylor has set up an internet blog detailing his day-to-day work. It was started in December 2008 and has since proved popular with residents in his Taibach ward. Describing his motivation for the site, Councillor Taylor said: "I started the blog as a way of communicating with as many people as possible, particularly the younger generation who are used to that sort of thing. It seems to be quite popular. It's very handy for people in that it gives them a chance of seeing what's going on at things like PACT meetings if they can't attend." On the blog, web users can read about advice surgeries, local news, Councillor Taylor's meetings and appointments, and issues affecting Taibach.
2. The Australian senator for Queensland, Andrew Bartlett, has his own blog, where he posts his daily activities and campaigns. He has also included a poll on a proposal raised for a national issue.
3. An eGovernment blog in Norway could be found at:
2 Carlos Guardian, "Blogs in eAdministration and the diffusion of the knowledge on eGovernment", Internet Global Congress, Barcelona, Spain.
4 Policy consultation
Public consultation websites could be interesting when thinking about innovative measurement approaches (and Web 2.0), given the large number of people participating through such online portals. How are comments filtered, how are discussions facilitated and monitored... and how could this be used as a way to receive feedback from citizens, or to feed results from online discussions into eGov studies on, e.g., user satisfaction or impact measurement?
Here are two examples of projects with which we may want to start a discussion concerning how their work and experience with "huge surveys" could be relevant to eGov measurements. Both contact persons have presented their projects at ePractice workshops, so they might be available for a talk at our conference.
4.1 Participatory budgeting in the City of Freiburg
The city of Freiburg im Breisgau used the DEMOS approach (http://www.epractice.eu/cases/demos) combined with the DEMOS budget planner (www.demos-budget.eu) to conduct a participatory budgeting trial from 7 April to 9 May 2008. A total of 1,861 citizens registered for the online platform and participated actively in the online discourse and the budget planner. The website counted 15,000 visitors who viewed 240,000 pages, giving a very high ratio of 16 pages viewed per visitor. Participants wrote 757 articles in the discussion forums, created 1,291 individual budgets for Freiburg (with 914 text-based reasons given), and compiled 22 wikis discussing specific issues to be addressed by Freiburg's public administration.
contact: Rolf Luehrs, TuTech Innovation GmbH, email@example.com
4.2 Osale: the Estonian eParticipation tool
www.osale.ee is an Estonian portal supporting open consultation and inclusive policy-making. It was established in 2007. During its first two years of use, over 70 public consultations were carried out, initiated by all line ministries (the government structure includes 11), the National Audit Office, and the State Chancellery. Interest in the consultations is quite significant: the website has 5,000 visits per consultation on average and over 2,500 registered users. Among them are individual citizens, but also representatives of organisations, e.g. business organisations, CSOs, or associations who issue a statement on behalf of their members.
contact: Hille Hinsberg, State Chancellery, firstname.lastname@example.org
5 Non-government initiatives and participatory measurements
In this chapter we take a closer look at eGovernment measurements and surveys that are conducted in a bottom-up approach. Traditionally, measurements are carried out in a top-down mode: a government institution, or a third party with an interest in the research and results, carries out or commissions a study with a specific question in mind. The objective of this type of measurement is usually well-defined: for instance, to study the effect of policy changes or to monitor the development and uptake of a specific eGovernment service.
With the development of ICT, and especially web technologies, access to government and the implementation of government services have changed fundamentally. Not only do citizens have more direct access to the services (although not to the public servants), they also have the possibility to connect with other citizens in similar situations who are using the same service. In the networked society, people are no longer individuals in their interaction with government. They can connect easily with others. Web 2.0 technologies facilitate communication and the exchange of experiences, and their emergence has opened up a wide range of possible applications using, re-using, and sharing public sector information.
One area of application of Web 2.0 technologies is the user-driven development of
services. The government or other organisations make available the raw data (such
as map data, statistics, or voting records). The processing and analysis of the data is
carried out independently by individual citizens or initiatives. In this way the data
presentation is tailored towards the needs of the users. New connections and
relationships can be discovered more easily.
Example: They Work For You (UK) http://www.theyworkforyou.com
Provides citizens with a range of information on their politicians, such as: who their local Member of Parliament (MP) is; what MPs said in Parliament; summaries of how MPs have voted; text of debates in Parliament; video of MPs talking in debates; written questions MPs have submitted to government departments, and the answers they've got back; email alerts whenever an MP speaks or a topic is mentioned in Parliament; and comments and annotations from users on what has been said.
Example: FollowTheMoney (EU) http://www.followthemoney.eu
Aims to make it easier for European citizens to understand the EU budget: how it
gets decided, where the money comes from and how it is spent. FollowTheMoney.eu
is developed by journalists, researchers and activists who want to make the
European Union work in a more transparent and accountable way.
These initiatives provide valuable insight into government activities and can contribute a lot to transparency. However, their primary goal is not the assessment and evaluation of eGovernment services; therefore, they will not be discussed further here. Instead, the remainder of this chapter presents Web 2.0 initiatives targeted at eGovernment measurement. After a short literature review, we discuss a few concrete examples and conclude with some early observations.
5.1 Literature review
In their article on benchmarking eGovernment, Codagnone and Undheim [CodagnoneUndheim2008] describe the concept of participatory measurement:
«Finally, a new emerging trend mentioned by Millard and exemplified in
the innovative proposal by Osimo [BenchmarkingeGovWeb20] is that of
participatory measurement, directly involving individual citizens and/or
citizen's representative groups. This means providing them with a voice,
not simply treating them as passive respondents as in classical user
satisfaction surveys. Interactive, deliberative, consultative – such
measurements entail asking users to provide also input on the relevant
criteria and dimensions to be measured.»
The authors also state reasons for the use of participatory measurements to achieve methodological pluralism:
«Methodological pluralism entails also that measurements are not entirely
objective and neutral and entails some level of subjectivity in the
methodologies mostly used by the executive branch, which calls for
external evaluation produced by independent auditing institutions and
also for participatory measurements involving directly the citizens.»
Osimo [BenchmarkingeGovWeb20] argues that Web 2.0 is a new, important
paradigm for eGovernment. In his opinion, transparency could be the new flagship of
eGovernment. The existence of Web 2.0 features, such as data sharing and
distribution through RSS and open APIs, should be used as indicators. He uses the
term eGovernment 2.0 to describe this development. Based on earlier benchmarking
efforts, the author proposes to use a set of basic data items. Examples of data items
are beneficiaries of public funding (agriculture, EU structural funds, etc.), draft
legislation, planning applications, air pollution data, MPs' votes, party donations, and citizens' feedback and satisfaction surveys.
Wauters and Lorincz [WautersLorincz2008] also recommend a new design. They
suggest that a holistic view on administrative burden can be obtained «based on the
user experience which may include more external assessments (such as focus
groups) and eventually even participatory, bottom-up design (i.e. governments with
end-users) of impact measures.»
5.2 Software and tools
In this section we describe and discuss the software and tools used in participatory
measurement with the help of examples.
Example: Sunshine Review (US) http://sunshinereview.org
Empowers citizens across the United States to share information with each other
about their local units of government – cities, counties, school districts, and state
agencies. The first focus of Sunshine Review is an awareness-building effort to
evaluate the transparency of City websites, County websites, School district
websites, and State agency websites. The Sunshine Review is a Wiki website that
anyone can edit.
The Sunshine Review uses Wiki software, which allows anybody to edit any page. The advantage is a low threshold for contribution: users are not required to register or log in. Many people are already familiar with Wiki editing, and if they are not, it is easy to learn. The main focus of Sunshine Review is the collection of information on the transparency of government in the US. Criteria like «Is the information on budgets, open meetings, elected officials, administrative officials, taxes, etc. available on the website?» are rated according to the three stages "present", "missing", or "incomplete". If an item is rated as "present", the claim can be backed up by providing a link to the respective web page. The Wiki does not only contain the measurement results; it is also used to define, discuss, and improve the evaluation methodology.
As in any Wiki or similar online community, technical maintenance does not take up many resources. The challenging part is community building: a critical mass of contributors and relevant content must be reached to obtain publicity and credibility among citizens, public officials, and the media.
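The "present" / "missing" / "incomplete" ratings described above can be aggregated into a single transparency score per website. A minimal sketch of that step, assuming an invented checklist and a 1 / 0.5 / 0 scoring scheme (Sunshine Review itself does not publish such a formula):

```python
# Sketch: aggregating Sunshine Review-style ratings into one transparency
# score per website. Criteria, ratings and the 1 / 0.5 / 0 scoring scheme
# are assumptions for illustration only.
SCORES = {"present": 1.0, "incomplete": 0.5, "missing": 0.0}

ratings = {  # criterion -> rating, as entered on the wiki
    "budget": "present",
    "open_meetings": "incomplete",
    "elected_officials": "present",
    "taxes": "missing",
}

# Average the per-criterion scores into a fraction between 0 and 1.
score = sum(SCORES[r] for r in ratings.values()) / len(ratings)
print(f"Transparency score: {score:.1%}")
```

The same tally could then be compared across cities or school districts, which is exactly the kind of bottom-up measurement this chapter discusses.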
Example: Rate Your Politician Ltd. (UK) http://www.r8yourpolitician.co.uk
An eDemocracy web platform for the UK that provides a shared space for communities to engage and interact. R8YourPolitician is an independent organisation, free from any government funding or political influence. The public can rate politicians, and politicians can engage and interact with citizens more easily and transparently. The site allows the public to ask questions of politicians, put questions on important issues to the public, and get a quick real-time analysis of the results.
The site owners state that their mission is to become the global market leader in the use of the Internet and information technology to collect high-quality, in-depth data on public opinion, voter trends, and market research, providing a wide range of clients, including individuals, companies, political parties, and educational establishments, with an overview of the results and the provision of eDemocracy services.
R8YourPolitician conducts political opinion research over the Internet. In addition to
answering the polls, registered users can also set up their own polls and define new
questions. Although the idea of letting users pose their own questions is very
appealing, the participation rate on this web site seems to be quite low. Most polls
attract only very few answers, so the results become irrelevant.
One reason for this might be that citizens (as potential contributors) ask themselves:
Who performs the measurement? In this case it is a company that aims to become a
“market leader” in the field. However, the service provided by R8YourPolitician does
not offer any distinctive features. In fact, other web sites provide
more advanced platforms for collecting opinions and discussing ideas.
Example: Public services 2.0 using Uservoice.com
Uservoice.com is a commercial web site where users can set up a forum on a
specific topic. Other users (called customers or voters) can then enter their own
feedback and suggestions on the topic. They can also discuss and vote on
suggestions made by others.
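The suggest-and-vote mechanics described above can be sketched in a few lines. All class and method names here are illustrative assumptions; this is not Uservoice's actual API.

```python
from collections import Counter

class SuggestionForum:
    """Minimal sketch of a Uservoice-style forum: users post
    suggestions on a topic and vote for the ones they support."""

    def __init__(self, topic):
        self.topic = topic
        self.suggestions = {}   # suggestion id -> text
        self.votes = Counter()  # suggestion id -> vote count
        self._next_id = 1

    def suggest(self, text):
        sid = self._next_id
        self._next_id += 1
        self.suggestions[sid] = text
        return sid

    def vote(self, sid):
        self.votes[sid] += 1

    def top(self, n=3):
        """Suggestions ranked by votes, most supported first."""
        return [(self.suggestions[sid], count)
                for sid, count in self.votes.most_common(n)]

forum = SuggestionForum("Public services 2.0")
a = forum.suggest("Open up public data")
b = forum.suggest("Put all forms online")
for _ in range(3):
    forum.vote(a)
forum.vote(b)
print(forum.top())  # [('Open up public data', 3), ('Put all forms online', 1)]
```

Ranking by vote count is what lets the organisers claim a suggestion reflects the consensus of a larger group.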
The initiative “Co-creating an open declaration on public services 2.0”
(http://eups20.wordpress.com/) used a Uservoice.com forum to collect policy
recommendations: «What are the top things that EU governments should do in the
next 3 years to use the web to transform public services?»
There are other eGovernment and eParticipation projects using the Uservoice.com
platform. For instance, the Obama CTO initiative in the US (http://www.obamacto.org).
Many people are familiar with commenting on the Internet or rating items. The ease
of use and the freedom to add one's own view on an issue are important features of
the application. The fact that independent initiatives are responsible for the content
leads to a win-win situation: people are willing to participate if they believe that the
organisers will be able to communicate the outcome of the discussion and make
their voice heard by the government. On the other hand, any statements made by the
organisers will have more weight because they are based on the consensus of a
larger group of people.
A major difference between traditional top-down measurements and participatory
bottom-up measurements is the motivation: Why is the measurement carried out?
In a way, the definition of the scope is already an important part of the measurement
itself. Participatory measurements target areas where citizens feel the need for
a change, the need to express their opinion. The surveys are often more focussed,
for instance towards a specific service, regulation, or institution. Some initiatives even
go beyond measurement: they have a clear idea that things need to change and set
up online petitions.
Another advantage is the collaborative aspect. The Wiki approach used by Sunshine
Review has many contributors, which makes it possible to collect very detailed
information on a large number of web sites. Results can be reviewed and adjusted by
other contributors.
The increased use of Web 2.0 applications to deliver services on official government
web sites will have an effect on the way measurements are carried out in the future.
Web 2.0 allows a more direct interaction and hence can be extended to include direct
feedback as well.
- Codagnone, C. and Undheim, T.A.: Benchmarking eGovernment: tools, theory, and practice (2008), http://www.epractice.eu/en/document/287913
- Osimo, D.: Benchmarking eGovernment in the Web 2.0 era: what to measure, and how (2008), http://www.epractice.eu/en/document/287915
- Wauters, P. and Lorincz, B.: User satisfaction and administrative simplification within the perspective of eGovernment impact: Two faces of the same coin? (2008)
- Sunshine Review, Transparency checklist,
Figure 1: Rate Your Politician, User submitted polls
Figure 2: UserVoice.com page for “Public Services 2.0”
6 Privacy and security issues
.6.1 Introduction: why privacy and security issues are included in this paper
8 Government engagement in Second Life
Second Life is a virtual 3-dimensional world that is accessible via the Internet
(www.secondlife.com). It was first launched in 2003 by Linden Lab and has now
reached over three million users. Each user enters and interacts in Second Life
through an avatar. Users may socialise with others, create and/or trade virtual
properties, or use learning spaces.
More and more real-life organisations have set up Second Life branches. It is not
only used by individuals or companies, but also as an educational platform by
universities or governmental bodies.
Linden Lab runs a newsletter to keep interested citizens informed about
governmental activities (https://lists.secondlife.com/cgi-
Screenshot 1: An avatar has entered the House of Sweden, the second embassy to be
opened in Second Life (2007). Run by the Swedish Institute, it aims to promote Swedish culture.
Screenshot 2: Birmingham Island was established by Birmingham City Council in the UK in
2008. It offers organisations and companies the possibility to present themselves in the virtual
world. Virtual Birmingham is said to exemplify the opportunities of virtual technologies in
service delivery and citizen engagement. For more information see also
Further information and examples about governmental engagement in Second Life
may also be found in the government WIKI of Second Life
Web 2.0 – Is There A Second Life for e-Government? By Stephen Hilton (Connecting
Bristol, Bristol City Council & DC10plus)
Second Life feature part 2: Welcome To 'Mixed Reality' by Dan Jellinek
eGovernment in Virtual Worlds
9 Methodology impact of the eGovernment awards
10 More in Government 2.0
In the context of Government 2.0 it is more than interesting to look at
of the Dutch government: only available in Dutch, but a very interesting
example of a grass-roots initiative by civil servants.
The Australian Government 2.0 Taskforce recently published the report
“Engage: Getting on with Government 2.0”.
It contains many references and examples for the use of Web 2.0 tools by
government and could be a valuable resource for the 4th measure paper.
This section will list examples of innovative approaches across countries:
1 Innovative measurement case in France:
Below is an example of an innovative measurement in France:
Permanent user panels for a new relationship with the administration: to
listen to users and better take into account their expectations and needs
regarding administrative simplification, the DGME will from now on rely on
two permanent panels, managed by the independent polling institute IPSOS.
This new and innovative relationship between the administration and the user
is reinforced by the creation of a dedicated space on the EnsembleSimplifions
site. Panellists will be able to submit proposals there, react to remarks made
by other users, gain better visibility of the work in progress (study
programmes), and be informed of the results of the studies.
The development of this new arrangement will make it possible to put in place
an innovative system for listening to users. It is part of a proactive approach
to improving public services based on consulting users.
2 More innovative approaches (to complete)
Compile and complete.
Annexes – Additional sources and input to evaluate
1 Annex - Input from the note takers and from the Knowledge Cafe
Notes from the meeting in Ghent: the report of the meeting is published at
It includes notes from the knowledge café.
2 Annex – Link to folder on eGovMoNet site
The last version of this document is available on the eGovMoNet site in the
3 Annex – Social Value Case Study
Extract from Jenner, S. (2009) 'Realising benefits from Government ICT
investment – a fool's errand?' Academic Publishing
Example 5 – the CJS IT Approach to Wider Social Value
As we saw in Chapter 4, the Root Cause Model was developed to articulate
and quantify the forecast impact of the CJS IT portfolio on major problems
('Consequences') in the Criminal Justice System. This analysis was also used
to estimate the potential value of the CJS IT Portfolio on the economic and
social cost of crime, i.e. if the model indicated a value to the Criminal Justice
System from reductions in re-offending or ineffective trials, for example, then
the value of this reduction to the public, to specific groups such as victims, and
to public sector organisations that respond to the consequences of crime, such
as the National Health Service, could also be assessed. Specific benefits included:
§ Time spent by offenders on bail provides them with the opportunity
to commit further offences (Consequence 2). By improving information
relating to supervision, monitoring and enforcement on bail and
improving the decisions that are made when awarding bail, such
offences may be reduced.
§ Improved risk assessments and information sharing enables
improved offender management and targeted interventions
contributing to reduced re-offending (Consequence 12).
§ Assuming that offenders are convicted more frequently as a result
of improved detections (Consequence 14), and that some convictions
lead to desistance from crime, then additional detections should
shorten criminal careers and reduce the number of offences
§ Information sharing helps reduce the number of ineffective trials
(Consequence 6) so realising time savings for people who attend
court (including witnesses) or are involved in preparation for trials.
Independent economists were employed to quantify and value these benefits,
using existing economic research such as Home Office Research Study 217
and OLR 30/05 into the scale and cost of crime and the number and cost of
different categories of crime – violent, sexual, common assault, robbery,
burglary, theft and criminal damage.
The result was a series of forecasts of the social value to be derived from
investment in projects within the CJS IT Portfolio. As always it is important
that benefits claims are as robust as possible and in the case of social value
benefits, this is particularly problematic – there are no representatives able to
sign forecasts off on behalf of society or to evidence benefits realisation, and
attribution of many outcomes is fraught with methodological difficulties.
Nevertheless the output was regarded as reasonably sound due to the fact that:
§ Firstly, the research was undertaken by professional economists
and the findings were subject to peer review.
§ Secondly, a conservative approach was adopted to the estimation
of impacts and their valuation in economic terms – for example:
general deterrent effects of convictions on other offenders were not
assumed; fraud was excluded because of a lack of reliable volume
estimates; and potential impacts on persistent and prolific offending
were also excluded due to the degree of uncertainty on impact;
§ Thirdly, a range of estimates was calculated based on confidence
in the underlying data and analysis. Quality scores were used to rate
confidence in each estimate as follows:
Confidence | Key criteria
5 | Bottom-up estimate; disaggregated to cover the main sub-types; based on a representative sample; based on a large sample; utilizing market prices for benefits.
4 | Bottom-up estimate; based on a small representative sample; utilizing willingness-to-pay analysis.
3 | Near-complete aggregate data triangulated against some bottom-up calculations; some disaggregation by main categories.
2 | Top-down estimate; based on near-complete aggregate data; no disaggregation.
1 | Top-down estimate; many groups/types encompassed by one average; estimates based on small, unrepresentative samples.
Figure 12: Social Value Confidence & Quality ratings
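The report states that a range of estimates was calculated based on confidence in the underlying data. The sketch below shows one way such quality scores could widen the range around a central estimate; the spread percentages are invented purely for illustration and are not the economists' actual figures.

```python
# Illustrative only: maps a quality/confidence score (1-5, as in the
# table above) to an uncertainty band around a central estimate.
# The spread widths below are assumptions, not the study's values.

CONFIDENCE_SPREAD = {5: 0.05, 4: 0.10, 3: 0.20, 2: 0.35, 1: 0.50}

def estimate_range(central_value, confidence):
    """Widen the estimate range as the confidence score drops."""
    spread = CONFIDENCE_SPREAD[confidence]
    return (central_value * (1 - spread), central_value * (1 + spread))

low, high = estimate_range(100.0, 3)  # quality score 3 -> wider band
print(low, high)
```

Reporting ranges rather than point estimates is one way to make social-value claims "as robust as possible" when attribution is uncertain.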
The other key point to note is that the model was dynamic rather than static –
the value was updated on a quarterly basis to reflect: changes in the Root
Cause Model; programme delivery; updated crime cost and volume data; and
research into areas not covered by the original model and to improve the
quality of the estimates.
4 Annex – Input Case Piemonte Italy
This case was submitted for a previous eGovMoNet paper but was not
included (sent after the deadline). Source: BR. Michela
Document available in eGovMoNet folder:
Template filled in by Michela Pollone, CSP, firstname.lastname@example.org
While as many as 15 Member States may currently have already
adopted an eGovernment measurement framework, there is no
consistent way to describe these frameworks.3 The objective of
developing a template is to facilitate the comparison of
descriptions of eGovernment measurement frameworks,
including descriptions of the methods and tools used, the
practices followed, and the implementation stage and challenges.
The template will be used to describe and to share the current practices and
the experience with deploying them. In this way, the common features and
the differences among the methods can be identified in a collaborative way to
support converging practices.
The template will be improved by taking into account the main issues discovered when
using it for descriptions and comparisons in the network meetings. These
results will be actively disseminated among relevant stakeholders, also
beyond the network consortium. In that way eGovMoNet will contribute to the
coordination and harmonization of national measurement initiatives.4
When filling in the form below, please provide references (links) to further
information on the method properties, and examples if available, for example
questionnaires or lists of tests.
We may also try to cover what is useful for services providers to measure.
3 See the “eGovernment Progress in EU27+: Reaping the benefits” report, p. 9, available at:
4 The need for convergence and strategic relevance in the applied measurement
approaches has been signalled in two recent reports: European eGovernment 2005-
2007: Taking stock of good practice and progress towards implementation of the i2010
eGovernment Action Plan, reviewing eGovernment development across Europe in the
past two years (see p. 38 in www.epractice.eu/files/media/media1671.pdf), and Efficiency &
Effectiveness in eGovernment: Key Activities 2007-2010.
1. Name and country/countries/organisation(s) of the measurement
REGIONE PIEMONTE (ITALY) monitors and measures e-government on
its own territory, through different local bodies and organisations that
support e-gov and information society policy and projects. These
organisations, who act on behalf of the Regione Piemonte, are listed
below and their monitoring and measuring activities are described in this
paper. The different activities started independently in each organisation
and have a different focus, but they have been working in synergy in the
last 3 years to monitor and measure e-government services. In particular,
these organizations are:
1) OSSERVATORIO ICT DEL PIEMONTE
has been monitoring EGOV SERVICE USAGE since 2002. A survey is
conducted on a yearly basis, using the CATI (computer-assisted telephone
interviewing) collection method to interview Piedmont citizens and enterprises.
The OSSERVATORIO ICT DEL PIEMONTE was created to support the
WI-PIE Programme, the regional policy to spread broadband across the
regional territory. Its original goal was monitoring the progress of the
programme and its impacts. eGovernment was considered a part of
broadband diffusion, just as monitoring e-government was a part of
monitoring the advancement of the Information Society in the region.
2) CRCPIEMONTE (http://www.ruparpiemonte.it/e-gov/crc.shtml) has
been conducting a yearly web survey since 2003 in order to monitor the
ONLINE OFFER OF E-GOVERNMENT SERVICES by the main local PAs'
websites. Services are classified according to content type and
interaction level (following the four-level EU and CAPGEMINI typology).
3) LABORATORIO ACCESSIBILITA E USABILITA (http://lau.csi.it) has
the mission of promoting accessibility among local PAs in Piemonte, with
particular focus on small and very small municipalities. Measuring the
ACCESSIBILITY LEVEL OF LOCAL PAS' WEBSITES, with reference to the
WAI guidelines, is one of the LABORATORIO's activities, aimed at
monitoring the impact of the promotion and assistance activity.
No specific measurement is explicitly addressed to e-government impact
and customer satisfaction. However, information about impact and customer
satisfaction can be derived from the data collected in the different
measuring activities, and some of the complex indicators are relevant
for impact and customer satisfaction.
2. Does the measurement cover evaluation of implemented solutions and/or
support strategic decisions related to eGovernment?
3. What is the purpose of making measurements?
To deliver better services?
To improve the efficiency of governments?
The medium-term purpose is to monitor the effectiveness of
implemented initiatives and running policies. A medium-to-long-term purpose
is improving government efficiency.
The LABORATORIO di ACCESSIBILITA E USABILITA started its
activity in 2005. Its mission is to promote accessibility among local PAs in
Piemonte, with particular focus on small and very small municipalities.
Measuring the ACCESSIBILITY LEVEL OF LOCAL PAS' WEBSITES is
one of the LABORATORIO's activities, aimed at monitoring the impact of the
promotion and assistance activity. The measurement methodology refers to the
WAI guidelines and to the national law on public website accessibility
(Legge Stanca, 4/2004).
4. Examples of services that can be covered by the measurement
methodology. See also Appendix B.
All the services provided by local PAs' websites.
1. Support for strategic decisions, please indicate examples, e.g.
selection of projects to invest in etc.
One of the main results coming from the measurement, in terms of
support to strategic decisions, is that accessibility remains a
regional priority year after year, worth funding as a regional policy and
worth promoting in dissemination events and training courses thanks to the LAU.
2. Nature of evaluation result. Please indicate the outcome of the
measurement such as estimated amount saved, citizen take up ratio,
score for level of measured value, or some other form of result.
The evaluation results are meant to direct part of the LAB's activity and
to focus part of the investments towards dissemination and training.
Please include an overview setting out the context for the impact or user
satisfaction measurement.
3. Limitation of scope for the methodology.
Can it for example be used to determine user satisfaction of both
citizens and businesses?
Is the method only applicable to a certain group of citizens (e.g. those
who have internet access)?
The method is applicable to websites and not to citizens.
The methodology is meant to measure the adoption of accessibility
guidelines by a sample of local PAs' websites in Piemonte. The final aim
(medium-to-long term) is to improve the quality of public websites and
eGovernment services and to maintain citizen satisfaction.
The LAU also provides ad hoc consultancy for local PAs that request it:
in these cases, the method is applied to a limited group of cases and of
citizens (e.g. the accessibility level of an upper school website was
measured).
Is there a channel in place to suggest improvement of services?
eGovernment service users are involved only in usability test sessions.
A separate template should, if necessary, be completed for
each eGovernment service covered (and the template
should identify what that service is).
The template should report which eGovernment services are
covered by the measurement methodology.
A. General properties of the measurement methodology
A.1. Repeatability – is there in the methodology a way to quantify
to what extent two independent measurements of the same
measurement object will lead to the same result?
Measurement is repeated once a year, paying attention to longitudinal
comparability. The methodology doesn't explicitly foresee a way to quantify to what
extent two independent measurements of the same measurement object
will lead to the same result.
A.2. Independence of size – to what extent will the measurement
method cope with measuring different sizes of eGovernment
applications? Does the method have a scale factor that allows
comparison of different results? For example, is there a way to measure
a small service for a garage building permit and a service for a
building permit for a shopping centre with the same methodology?
A.3. Estimated cost of using the methodology
Monetary, e.g. license fees for needed software or proprietary
parts of the methodology.
Resources needed to carry out measurements in terms of person-hours.
Other resources needed to carry out the measurement.
Note that the cost may also depend on whether the
measurements can be carried out within the organisation or
whether a third party has to be commissioned.
Resources needed to carry out measurements consist mainly of
person-hours within the organisation (CSI-PIEMONTE). The same kind
of resources (person-hours) is needed to verify and, where necessary, update
the methodology, interpret the collected data, and write, edit and promote
reports. Training sessions require the typical organisational effort and costs.
A.4. Cost effectiveness: relation between how much the
measurement costs and what impact it has.
Improvement of the accessibility level of local public websites, an increasing
number of requested consultancy sessions, and increasing participation in
training sessions are considered positive indicators of the cost-impact relation.
A.5. Prerequisites to carry out and use the measurement:
involvement and motivation needed;
understandability for the audience: is there any evidence that the
results from the measurements can be efficiently communicated?
o Local authority endorsement is a prerequisite to carry on the measurement.
o Scientific independence of the organisation carrying out the measurement,
interpreting the data and converting it into knowledge useful for decision making
is part of the organisational model of the observatory.
o The LAB pays attention to making monitoring results visible and
understandable. Results are communicated through reports,
dissemination-discussion workshops, training sessions addressed to
local decision makers and web developers
A.6. Stability of the methodology over time to allow for
comparisons of measurement results over time.
The methodology is basically stable and allows comparisons along historical
series starting from 2005. Improvements have been made and some
question variables have been added over the years.
A.7. Accuracy, does the method give any indication of how
accurate the measurement results are?
Information not available
A.8. Degree of standardization, does the method refer to
established standards such as WAI from W3C.
The method refers to
o WAI from W3C standards
o the Italian law about public website accessibility (Legge Stanca, 4/2004)
o guidelines produced by the working group number 1 "Metodologia" of
the Italian “Commissione interministeriale permanente per l'impiego
delle ICT a favore delle categorie deboli o svantaggiate”
A.9. Number and type of administrations using the method,
preferably a list of them.
Only LAU is using the methodology at the moment. Analysis is applied to
over 400 local PAs.
A.10. Multidimensionality (multi or uni)? See details in Appendix
Information not available
A.11. Cause and effect analysis basis for indicators, relevance? Is
there evidence that the methodology actually measures e.g. user satisfaction?
No direct evidence
A.12. The role of the person in charge of using the measurement
results within the government agency responsible for the
measured subject.
B.1. Is the methodology designed to promote targeted change? If
so please specify in which level:
Implementation: the LAB's final goal is making most local public websites
in Piemonte accessible and consistent with national and international
guidelines. The main activities are dissemination, consultancy and training
directed at local PAs, especially small ones.
So, measuring the accessibility level of local PA websites, with reference to
the WAI guidelines and to the national law on public website accessibility,
is one of the LABORATORIO's activities, aimed at monitoring the impact of
the promotion and assistance activity.
B.2. Is the measurement intended to support strategic decisions,
e.g. the selection of projects to invest in?
The monitoring, consultancy and training activities of the LAU are meant to
introduce a culture of, and attention to, accessibility in local PAs, for
example by introducing accessibility requirements into tenders for local
web and eGovernment service design and implementation.
5. Who are the intended users of the measurement results?
Please select among the given target groups and indicate the expected
use of the measurement results.
B.1.1. Policy makers, e.g. to facilitate the evaluation of a given policy.
B.1.2. Service providers, e.g. to provide evaluation services.
B.1.3. Software vendors, e.g. to support product profiling and
Maybe, as an indirect effect
B.1.4. Developers, e.g. to support quality assurance of an online service.
B.1.5. Web site owners, e.g. to support quality assurance.
B.1.6. Researchers, e.g. to identify good practice.
Maybe, as an indirect effect
B.1.7. NGO's, e.g. to monitor given properties of interest such as
accessibility or transparency.
B.1.8. Other, please specify
6. What are the measurements for?
B.2.1. Identify good practice.
Yes. But public reports present only aggregated results
B.2.1. Inform and learn (Benchlearning).
B.2.2. Measure impact of policies.
B.2.3. Assess current status.
Yes: this is the main goal
B.2.4. Strategic or operational, e.g. inclusion, efficiency, etc.
Yes, to improve the efficiency and general quality of the eGovernment offer in Piemonte.
B.2.5. Justify expenditure.
Yes, on a second level. In terms of public investments, one of the results
coming from the measurement, in terms of support to strategic decisions, is
that accessibility remains a regional priority year after year, worth
being funded as a regional policy and promoted in dissemination events
and training courses thanks to the LAU.
B.2.6. Determine user satisfaction: to identify the users' specific
needs in order to customise the services to fit users' specific needs
in the future, to obtain better knowledge of the citizens' attitudes,
opinions, expectations, habits, perceptions and satisfaction levels
with the delivery of public services, to identify the users' perceptions
of the quality of services provided, the perceived institutional
reputation and credibility, etc.
No, this is not the real focus.
B.2.7. Evaluate current status to
Yes, all of these.
Only as a secondary effect, in possible benchmarking of local websites in
terms of accessibility.
B.2.9. Community building.
Yes, with the aim of building a community of interest and of professional
web designers fully aware of accessibility guidelines and their advantages.
The description of the following properties should enable a practitioner to
understand the basics of how the methodology is to be used.
Who carries out the measurement?
C.1.1. Independence of measurement subject
Does the methodology require independence between those
who carry out measurement and the measurement subject
(Third party organisation carries out measurement)?
The LAB personnel, who carry out the measurement, are not involved in
e-gov website and service development for the measured local PAs.
C.1.2. Number of internal staff members involved in the
measurement for each public institution/each service.
The LAU personnel consists of about 10 people. About 5 people are
dedicated to measurement activities each year.
C.1.3. Number of external persons involved in the measurement for
each public institution/each service.
C.1.4. Training requirements for evaluators (do they need any
certification?)
Evaluators do not need or have a formal certification. Their competence
and expertise are attested by their CVs, their working experience (both inside
and outside CSI-PIEMONTE) and their training path.
How is the measurement carried out?
Automatic, please refer to tool(s) used.
Feedback from eGovernment service users.
A combination of automatic and manual, please specify.
Yes, through accessibility page validators in combination with expert
evaluation sessions and usability test sessions.
In detail, the technical tools used are:
o Operating systems: Windows 2000 / XP
o Graphic browsers: Internet Explorer 5.x / 6 / 7; Firefox 2.x / 3.0
o Screen reader: Jaws 8.0
o Screen reader emulator: Fangs 1.0
o Accessibility Toolbar 2.0 for Internet Explorer
o Web Developer Toolbar 1.1.x for Firefox
o Web accessibility test: TAW3 on-line
o W3C QA Markup Validation Service
o W3C CSS Validator
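A minimal sketch of the kind of automated check the validators listed above perform, here testing a single WAI-style checkpoint: every image must carry a text alternative. Real tools such as TAW cover far more checkpoints; this only illustrates the principle, using Python's standard-library HTML parser.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collect <img> tags that lack an alt attribute."""

    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.violations.append(attrs.get("src", "<no src>"))

def check_page(html):
    """Return the src of each image missing a text alternative."""
    checker = AltTextChecker()
    checker.feed(html)
    return checker.violations

page = ('<html><body><img src="logo.png">'
        '<img src="map.png" alt="City map"></body></html>')
print(check_page(page))  # ['logo.png']
```

Automated checks like this cover only what can be tested mechanically, which is why the LAU combines them with expert evaluation and usability test sessions.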
On line, please refer the method(s) used. E.g. on line
questionnaires, crawler technologies etc.
Off line, please refer the method(s) used. E.g. phone calls, face to
face interviews or focus groups.
Yes, usability test sessions
Measurement coverage: subset or exhaustive sampling
For example, are all pages of the website evaluated, or all citizens of e.g.
a municipality interviewed? If not, how is the sampling carried out.
► Manual / Automatic sampling.
The sample is made of over 450 local PAs' websites. In detail:
o Regione Piemonte website (1) +
o websites of the Province (8) +
o websites of Aziende Sanitarie Locali (local agencies for health care) +
o websites of municipalities with more than 1500 inhabitants (377 upon
For each website, a sample of pages is analysed in relation to the requisite
under test (e.g. a sample of pages containing downloadable forms, a sample of
pages containing tables, a sample of pages containing downloadable PDF
files, a sample of pages containing multimedia materials, etc.).
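The per-requisite page sampling described above can be sketched as follows. The category names and the sample size are illustrative assumptions; the LAU's actual sampling rules are not specified here.

```python
import random

# Sketch: from each category of pages relevant to a requirement
# (downloadable forms, tables, PDF files, multimedia, ...), draw a
# small sample for manual/automatic inspection.

def sample_pages(pages_by_category, k=5, seed=None):
    """Draw up to k pages from each category of a website's pages."""
    rng = random.Random(seed)
    return {category: rng.sample(pages, min(k, len(pages)))
            for category, pages in pages_by_category.items()}

site_pages = {
    "forms": [f"/forms/{i}" for i in range(20)],
    "tables": ["/budget", "/staff"],
}
sample = sample_pages(site_pages, k=3, seed=42)
```

Sampling per category rather than per site ensures that each requirement is tested against pages where it can actually fail.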
► Which sampling approaches are used?
Sequence of steps in scenarios for end-users, e.g. an online service to
apply for Kindergarten.
Stop criteria based on achieved accuracy of measurement result.
Other stop criterion to end the sampling.
Most of the smaller municipalities do not have an official website; almost
no accessibility measurement would be possible outside the above sample.
C.3.3. How are interviewees/respondents for a
Where to measure?
C.4.1. Scope: where is the method used? Constituency
(organisation, country, region etc.)
The reference territory for the measurement is the regional territory. The
constituency using the method is the independent public organisation (see
the list above).
What to test - Measurement subject?
C.5.1. Web sites:
Static or informational web pages, PDFs, or documents
Downloadable documents or forms (including forms the user
must send in or e-mail)
Search engines or functions providing searches on static web
pages or documents
Intranet services or applications;
Forms submitted online requesting information
Submitting a complaint, or similar function
All of the above apart from intranet services or applications and
submitting a complaint.
C.5.2. eGovernment web services, such as those covered in the
CapGemini 20 services report, or the Top of the web-User
Satisfaction and Usage Survey of eGovernment services. Please
provide link to the service.
C.5.3. Public Access Terminals.
C.5.4. Further channels like RSS, SMS etc. (e.g. are there other
options to use the service for a citizen who can't / doesn't want to
use the eGovernment channel?)
C.5.5. Users: multi-perspective consideration, single expert
consideration or web service users.
C.5.8. Project proposals
C.5.9. Project beneficiaries
C.5.10. Case study: ePractice
C.5.11. Public procurement process
C.5.12. Technical back office
Log from back office software, such as web logs
Re-use of input from users, to avoid users needing to access
several web sites and fill out repeated form fields with the same
information.
How to test?
C.6.1. Definition of tests, e.g.
C.6.2. Definition of questionnaires and their questions, preferably a
reference to the actual questions.
C.6.3. Procedure for selective deployment of tests / questions e.g.
depending on the outcome of previous results.
When to test?
Is the measurement carried out before the event (e.g. the
implementation of an eGov service), when using it, or after it has
been used? (ex ante,5 in between, or ex post6)
Measurement is carried out on a regular (yearly) basis.
Measurement frequency (periodicity)
Measurement is carried out on a yearly basis. Measurement campaigns
took place in 2005, 2006, 2008 and 2009.
Timing of measurement: external trigger (event driven), e.g. new
service, complaint, error, elections, tax declaration.
Duration of measurement: one/all
Time span to cover
Measurement is carried out in summer and lasts a couple of
months. It covers the previous year's time span.
What part of the value chain is being measured?
Impact of the services
[5] Ex ante ("before the event"): based on forecasts, where the difficulty lies in
estimating future circumstances or conditions.
[6] Ex post, translated from Latin, means "after the fact"; it is another term for actual
returns. The use of historical returns has traditionally been the most common way to predict
the probability of incurring a loss on any given day. Ex post is the opposite of ex ante,
which means "before the event".
Score calculation and statistics used
Please indicate the nature of the quantitative information, such as
financial (e.g. value in Euros), a count (number of submitted forms), or a
score computed according to a given calculation or some other method.
Quantitative indicators are expressed as a percentage, computed upon
the total of examined cases. There are three types of indicators:
1. Macro indicator: are the websites in the sample consistent with the
requirements of the law (L.4/2004)? In other words, are all 22
requirements satisfied?
2. Single requirement indicator: are the websites in the sample satisfying
requirement number 1, 2, …, 22? (see figures 1 and 2)
3. Subindicators, used to compose the single requirement indicator (see
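The three indicator levels described above can be sketched in code. The following is a minimal Python illustration, where the site names, evaluation results, and the boolean pass/fail model are invented for the example; the actual LAU computation may differ:

```python
# Sketch of the percentage indicators described above, assuming each
# site is evaluated against the 22 requirements of L.4/2004 and each
# requirement result is recorded as a pass/fail flag (an assumption
# made for this example).

N_REQUIREMENTS = 22

# Hypothetical evaluation results: site -> list of 22 pass/fail flags.
results = {
    "site-a": [True] * 22,
    "site-b": [True] * 21 + [False],
    "site-c": [False] * 2 + [True] * 20,
}

def single_requirement_indicator(results, req_number):
    """Percentage of examined sites satisfying requirement req_number (1-based)."""
    passed = sum(1 for flags in results.values() if flags[req_number - 1])
    return 100.0 * passed / len(results)

def macro_indicator(results):
    """Percentage of examined sites satisfying all 22 requirements."""
    compliant = sum(1 for flags in results.values() if all(flags))
    return 100.0 * compliant / len(results)

print(macro_indicator(results))                  # ~33.3: only site-a passes all
print(single_requirement_indicator(results, 1))  # ~66.7: site-a and site-b pass
```

Subindicators would be computed the same way over the checks that compose a single requirement and then combined into the requirement-level figure.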
C.8.2. How to compute the score.
C.8.3. How to aggregate scores .
C.8.4. Statistical properties.
C.8.5. Comparisons, e.g. time series or needs and wants/actual
Reporting of the evaluation results
1) Score card, classifications etc.
Tables, scorecards, percentage indicators, longitudinal series
2) Levels of aggregation, e.g. web site, region, country etc.
Region, province, cluster according to dimension of the administrations
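Aggregating from web-site level up to a region, province, or cluster can be sketched as follows; the province names and per-site scores are invented for the example, and simple averaging is assumed rather than the LAU's actual aggregation rule:

```python
from collections import defaultdict

# Hypothetical per-site scores (percentage of requirements met),
# tagged with the aggregation unit each site belongs to.
site_scores = [
    ("site-a", "Torino", 95.0),
    ("site-b", "Torino", 80.0),
    ("site-c", "Cuneo", 60.0),
]

def aggregate_by_unit(site_scores):
    """Average per-site scores within each aggregation unit
    (here a province; a region or dimension cluster works the same way)."""
    buckets = defaultdict(list)
    for _site, unit, score in site_scores:
        buckets[unit].append(score)
    return {unit: sum(vals) / len(vals) for unit, vals in buckets.items()}

print(aggregate_by_unit(site_scores))  # {'Torino': 87.5, 'Cuneo': 60.0}
```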
3) Access to underlying data. Are the results reported with access to
the raw data?
4) Conformance claims. Does the method support claims of
conformance to any given standards?
Yes: WAI and Legge Stanca (the Italian law on public website accessibility)
5) Understandability. Have any tests for understandability been carried out?
6) Accessibility for people with disabilities. Are the reports designed to be accessible to people with disabilities?
7) Usefulness for different stakeholders. Please indicate evidence to
demonstrate usefulness for different stakeholders.
Evidence of usefulness includes the increasing number of requested
consultancy sessions and the increasing participation in training sessions.
How to interpret the results
4. Transparency of measurement.
Methodology is fully described in the reports
5. Repeatability by third parties: is it possible to recompute the score?
6. Results available to external bodies or not.
7. Different reports to address different stakeholders.
Public reports provide only aggregated data. Workshops and seminars are
organised to present and explain results. Disaggregated and raw data are
provided on request only to the organisation they refer to.
Accounted cost for deploying the methodology.
This section should include some information on the real use and impact of this
measurement method.
D.1. Does the measurement lead to policy actions? Please provide
examples if available.
maintenance of the LAU with Regional funds.
D.2. What are the lessons learned from this exercise? (For example:
the measurement was organised in focus groups, but the results were
not very consistent because the focus groups were not homogeneous.)
o direct contact with stakeholders, interaction with the territory (local
PAs decision makers and website and eGovernment responsible),
dissemination and evangelization are pivotal factors
o a practical approach is necessary, one that starts from the accessibility
law in force and gives practical tools to website and eGovernment
responsibles
o integrating measurement with training and consultancy is the key to
having an impact on the diffusion of accessibility culture and guidelines
D.3. What is the innovativeness of the methodology?
D.4. What were the difficulties carrying out the measurement?
o finding the right structure and level of detail for reporting, in order both
to be complete and to popularise the accessibility philosophy
o elaborating indicators that refer first to the Italian law and only on a
second level to the WAI guidelines
o work is under way to automate part of the measurement, in
cooperation with Regione Emilia Romagna
D.5. The relevance of the measurement in the organisational context.
D.6. Is there any evidence that the measurement results are used for
improvements of the eGovernment services?
Training and consultancy contents and formats have been improved
according to the measurement results
A measurement method will need updates to stay relevant over time and
to take into account lessons learned from deploying the method. Here we
are looking at the properties of this maintenance process.
E.1. Open Process, e.g. as defined by W3C at:
E.2. Method document license: for example, Creative Commons
Share-Alike, or not stated.
E.3. Organisation responsible for the maintenance process, for
example the national government agency using it, ISO, or some
other body.
CSI-PIEMONTE has an ISO certification, applied to all its
maintenance processes. At the present stage, no detailed information
is available.
E.4. Sustainability: continuous improvement, evolving methodology, Is
the method regularly updated?
E.5. Change management. Is there a process to collect comments,
e.g. from those using the method or those responsible for the measured
subjects?
5. Please specify any other important properties that could be included.
6. Any other proposals for template improvements.
See LAU publications and reports (in Italian only):
o Valutazione dell'accessibilità dei siti web della PA piemontese 2008
o Valutazione dell'accessibilità dei siti web della PA piemontese 2006
o Valutazione dell'accessibilità dei siti web della PA piemontese (2005,
o La PA accessibile - Progettare servizi web per tutti i cittadini (instant
User groups formed by organisational units can consist of more than one
role, and questions may arise about how multi-perspective the
measurement should be. These aspects may not change the contents of
the model, but they may change the procedure by which data are
gathered.
Roles per group: This perspective somewhat overlaps with 'user
groups'. We can distinguish three to four different roles that can be included in
the measurement:
End user: is actually working with the system/product
Beneficiary: does not work with the system but gets the benefits
Manager: does not work with the system but supervises its use
Developer, maintainer, administrator: all are involved in keeping the
system at work without using it as customers.
Obviously, measuring user satisfaction will easily produce different results
depending on the roles that have been included in data gathering. This
leads to the following component.
Multiperspective measurement: If there are different user groups, it must
be decided "who is in it or not":
measurement can rely exclusively on just one dominant user group
or can take into account all user groups,
measurement can aim at all users of an user group or concentrate
on a representative sample,
if a user group contains different roles, all roles or just one might be
included,
users are taken randomly with no specific selection procedure, or the
selection is carried out automatically or manually by an operator,
the selection of users is limited in advance by specifying the amount of
data that has to be evaluated.
'one for all' approach: an expert (or a team of experts) makes all the
necessary determinations without directly contacting all or almost
all of the user groups included.
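The selection options listed above can be expressed as a small filtering-and-sampling routine. This Python sketch is only an illustration: the user records, group names, and sample sizes are invented, and a fixed-seed random draw stands in for whatever selection procedure an operator would actually apply.

```python
import random

# Hypothetical user records: (user_id, group, role). The groups and
# roles mirror those discussed above; the data itself is invented.
users = [
    (1, "citizens", "end user"), (2, "citizens", "beneficiary"),
    (3, "citizens", "end user"), (4, "businesses", "end user"),
    (5, "businesses", "manager"), (6, "agency", "developer"),
]

def select_respondents(users, groups=None, roles=None, sample_size=None, seed=0):
    """Apply the selection choices described above: restrict to some
    user groups and/or roles, and optionally draw a random sample
    whose size is limited in advance."""
    pool = [u for u in users
            if (groups is None or u[1] in groups)
            and (roles is None or u[2] in roles)]
    if sample_size is not None and sample_size < len(pool):
        pool = random.Random(seed).sample(pool, sample_size)
    return pool

# One dominant user group, end-user role only:
print(select_respondents(users, groups={"citizens"}, roles={"end user"}))
# All groups, sample size limited in advance to 3 respondents:
print(len(select_respondents(users, sample_size=3)))
```

A 'one for all' expert assessment would bypass this selection entirely, which is why the two approaches tend to yield different satisfaction figures.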
P. Röthig / WiBe-TEAM PR
After discussing this draft and brainstorming how we can carry out
the research activity, we feel it is necessary to define the level of
detail of eGovernment we need to focus on. Should we focus on the national
level, the regional level, or both? Should we focus only on services that are
central to eGovernment solutions, e.g. tax or the land register, or should we go
deeper, e.g. cultural events, support for the elderly, packaged solutions for
minority groups, programmes for people with disabilities, etc.?
Another question is whether we should focus only on citizens or on both citizens
and businesses. See the survey from 2004, "Top of the Web - User Satisfaction
and Usage Survey of eGovernment services", where the services were
distributed as follows:
Social security benefits
Application for building permission
Declaration to the police
Birth and marriage certificates
Enrolment in higher education
Announcement of moving
Social contribution for employees
Registration of a new company
Submission of data to the statistical office