WG1: Reporting Tools

Background info

This paper summarizes the input received from industry and NGOs in response to the
questionnaire circulated after the meeting held in February, together with the major issues
raised and the solutions proposed by the stakeholders.

The final aim of this document is to provide guidelines for CEO Coalition members to
facilitate the implementation of reporting mechanisms within their services where needed or
relevant.

These mechanisms should be aimed at improving the management/treatment of well-grounded
notices, while respecting the legal framework in place and fundamental rights. Accordingly,
the impact that any proposed reporting mechanism may have on data protection should be
further analyzed.

A PowerPoint presentation is annexed to this document, containing best practices from
different sectors in relation to reporting mechanisms already in operation that could be
deployed among the CEO Coalition members.

Following the statement of purpose, we have committed:

To deliver robust mechanisms for reporting content and contacts that seem harmful to kids.
These should be available across specific online services and devices, covering clear and
commonly understood reporting categories, while avoiding regulatory double jeopardy in areas
regulated by other means.

Summary of the WG questionnaire

After the meeting held on 27th February, where CEO Coalition members and third parties
raised their concerns with regard to reporting tools, a questionnaire was circulated to
identify areas for improvement and where action was requested in relation to the scope,
categories, escalation, feedback, design and location of the reporting tools.

Below you will find a summary of the responses received, classified by industry and NGOs.

Industry

Scope: There is a certain consensus on what kind of content should be reported. However,
most players underline the dramatic difference, also in view of the follow-up to be given to
the report, between illegal content and inappropriate/harmful content. In general, potentially
harmful/inappropriate content or behavior includes pornography, racism, illegal, abusive or
harmful material or activity, extreme violence, discrimination, bullying, grooming, identity
theft or impersonation, and material affecting the user's dignity, honor or ability to use the
service freely. Almost everyone thinks that the content to be reported should be defined
according to the existing regulatory framework.

Categories: Two different kinds of reports were identified: those regarding illegal activities
and those concerning content that could be harmful to children or in breach of the provider's
Terms & Conditions, but not actually illegal. Categories could be helpful to escalate and/or
prioritize reports. Taking into consideration the multiplatform approach and the diversity of
issues that may be reported, there was agreement among industry that it is important to
leave categories open.

It seems important to educate kids on how to use the reporting system and to offer categories
on what is harmful and not allowed within the service they are using.

Escalation: Concerning who should receive reports and how, there is also a consensus among
providers that they should receive the reports in the first place and then escalate them to the
relevant helplines/hotlines or authorities as appropriate (this does not include material such
as child abuse material, where many providers forward the user directly to appropriate
hotlines or the national LEAs). It is also stated that common, but flexible, criteria could help
to handle different types of reports and that the mechanism should be easy and accessible.
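The escalation flow described above can be sketched in code. This is a purely illustrative sketch: the category names, escalation targets and the `Report` structure are assumptions for illustration, not drawn from any actual provider's system.

```python
# Hypothetical sketch of category-based report escalation. The mapping
# below is illustrative: each provider would define its own categories
# and targets according to the applicable legal framework.
from dataclasses import dataclass


@dataclass
class Report:
    category: str
    description: str


# Illegal material is escalated externally; Terms & Conditions breaches
# stay with the provider's own moderation team.
ESCALATION_PATHS = {
    "child_abuse_material": "national_hotline",   # forwarded directly
    "grooming": "law_enforcement",
    "bullying": "provider_moderation",
    "tos_violation": "provider_moderation",
}


def route_report(report: Report) -> str:
    """Return the escalation target for a report; unknown categories
    default to the provider's own moderation queue."""
    return ESCALATION_PATHS.get(report.category, "provider_moderation")
```

A common, but flexible, set of criteria could then amount to agreeing on the keys of such a mapping while leaving each provider free to choose the targets.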

Feedback: The service provider should offer clear information to users about the reviewing
process where appropriate, and depending on the design of its process, the service provider
may choose to offer additional information to users. Industry will work on improving the way
they give feedback, so users reporting abusive or harmful content have more information on
how their reports are handled.

    •   Proposal:

        o    Industry will work on improving the way they give feedback, so users reporting
             abusive or harmful content have more information on how their reports are
             handled.
Design: There is no consensus whether it should be a unique design (text or graphic) or if it
may differ among countries, platforms, providers or websites.

Location: Regarding where the button/icon should be placed, all agree that it should be
visible; some think it has to be near the content to be reported, others in the main menu of
the website, and a third group thinks it should be placed in the browser so that it is available
on every website. There is no unanimity about whether the button/icon should be placed on
every website or only on those that allow user-generated content.

    •   Proposal:

        o    For certain services and devices a good example of best practice seems to be to
             place the icon within the browser, enabling users to get in direct contact with
             NGOs to seek help with a single click. However, we need to bear in mind that there
             are different services and devices that need to be taken into account. Industry
             proposes to work with NGOs/Hotlines in order to provide browser extensions.
        o    Also, mobile operators and other industry stakeholders will work to provide
             reporting apps for connected devices.
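To make the one-click browser extension proposal concrete, the handoff to an NGO/hotline might look like the sketch below. The payload fields and the endpoint address are hypothetical assumptions for illustration only, not a format agreed by the Coalition.

```python
# Illustrative sketch of the data a one-click reporting extension might
# hand to an NGO/hotline. Field names and the endpoint URL are assumed,
# not part of any agreed specification.
import json


def build_report_payload(page_url: str, category: str, anonymous: bool = True) -> str:
    """Assemble a minimal JSON report. Anonymous by default, since the
    browser-extension route is meant to work for non-registered users."""
    payload = {
        "reported_url": page_url,
        "category": category,
        "anonymous": anonymous,
        # Placeholder address; each hotline would publish its own endpoint.
        "endpoint": "https://hotline.example.org/report",
    }
    return json.dumps(payload)
```

Because the extension is managed by the hotline or NGO rather than the visited site, it can offer reporting even where the platform itself provides no tool.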

Non-governmental Organizations

Their opinion is similar to that of industry, sometimes stating more categories, but essentially
covering the same types of behavior/content.

Categories: They also agree to establish categories in order to make reporting more efficient
and easy-to-prioritize.

Regarding the way to report, they all agree that the system should be easy to find (visible on
every platform) and easy to use, in order to promote reporting. They also add that the system
should be transparent, and the person reporting should be able to follow the progress of
his/her report and to receive feedback once the issue has been analyzed.

Design: Some see value in establishing one similar icon that could be identifiable across
platforms and countries; nonetheless, there is no consensus on this point. In fact, some think
that a single button would not work at all.

Location: It is agreed that the icon/button should be placed in a visible location on any
website where there could be content or behavior potentially harmful to children and young
people. There is no consensus on whether websites, providers or authorities should be
responsible for this icon/button.

State of the art: Industry responses to issues raised by NGOs

NGO demand: The report mechanism needs to be easy to find. It needs to be available
effortlessly for all users, where problems tend to arise.
Industry reply: It is important to stress that there is no need to have a reporting tool on each
single website. Reporting tools should be available in appropriate locations in order to allow
the user to report abuses. This excludes content that is fully controlled by the service
provider and thus poses no risk to minors. A browser extension could be a solution for certain
services and devices in cases where there is no reporting system in place or the platform is
not offering one.

NGO demand: The reporting tool should be available for non-users of the service, so adults
would not need to register if they wish to report.
Industry reply: Industry will work to identify ways to enable parents, carers and teachers to
make reports without having to sign up and register to the service in question, minimizing
the risk of a high increase in false or abusive reports. However, the number of false or
meaningless reports might rise, making it more difficult to support, in a timely and proper
manner, those children who really have something to report. In any case, for hosting
platforms or social networks not supporting anonymity, anonymous reporting cannot be a
solution. Again, browser extensions managed by hotlines and NGOs could be a solution for
offering reporting tools to non-registered users and even anonymous reporting (if the hotline
or the NGO believes this is an important feature). However, due consideration should be
given to those cases where operators receiving the report keep direct contact with the LEAs,
in order not to lengthen the process.

NGO demand: Testing and evaluation of the mechanism by children should be necessary; it
has to be easy to use.
Industry reply: The CEO Coalition companies support appropriate testing and evaluation of
reporting tools in order to ensure that the tools are relevant and easy to use.

NGO demand: Users should learn how to use the reporting system in order to avoid misuse.
Industry reply: Industry will assess possibilities for providing better information to users in
order to enable appropriate reports. Some companies may consider investing in user
education and digital literacy programs in relation to reporting tools.

NGO demand: It is advisable to employ reporting options reflecting children's own conception
of the problem (e.g. "embarrassing pics"), and to include the most common problems faced by
users of the service and common online risks.
Industry reply: Child-friendliness is important, and the reporting options have to take this
into account and be understandable for all users.

NGO demand: Industry should structure these report mechanisms in a way that provides
feedback.
Industry reply: NGOs have called for the provision of feedback to users reporting harmful
content, in particular acknowledging receipt of the notice. Some social networking providers
have launched pilot projects to address this concern and provide users with a tool to monitor
their reports and obtain feedback on how they have been handled. It has to be noted that
individual feedback to users who report is not always possible (depending on the number of
reports received). In any case, it should be ensured that any feedback does not hinder LEA
investigations and does not violate the fundamental right to data protection. Other services,
for example the gaming sector, have raised concerns because of the potential for abuse, in
particular in interactive games. More generally, this cannot always be done, particularly if a
criminal act is involved, so this would need to be well circumscribed.

NGO demand: The report mechanisms should be independently reviewed or evaluated. There
needs to be a system in place for evaluating how these mechanisms are working and whether
they are as effective as they can be.
Industry reply: The answer to this question is closely related to the overall review process
within the CEO Coalition, through a self-declaration about successful implementation of
agreed guidelines and independent review of achievements. Where processes are in place,
the sharing of best practices will provide continuous input for companies to improve their
reporting mechanisms on an ongoing basis.

The graphic summarizes the input received from NGOs in relation to the categories that
should be taken into account for the reporting systems. However, after analyzing several sites
hosting UGC, the categories already proposed for reporting largely exceed this proposal.

This should be taken as a minimum set of categories to propose to customers, where these
categories shouldn't be an option for the user.

Final Recommendations
In order to meet the goals of the statement of purpose we have accepted with regard to
'simple and robust reporting tools for users', Industry proposes the following deliverables:

    •   Mobile operators and other industry stakeholders will work to provide reporting apps
        for connected devices;
    •   Industry will work with NGOs/Hotlines in order to provide tools such as browser apps
        or direct links from websites and other messaging applications, where appropriate and
        applicable for each different sector, to enable users to get in direct contact with
        NGOs/Hotlines to seek help with a single click;
    •   Industry will work on improving the way they give feedback and providing updates on
        reports of abuse, where appropriate and applicable for each different sector, while not
        hindering LEA investigations, so users reporting abusive or harmful content have
        more information on how their reports are handled;
    •   A minimum set of categories is proposed within this progress report (see above) to
        take into account for those services where UGC is hosted.

Annex I (will follow)

We have pulled together a slide deck summarizing the reporting tools already existing in the
market (icons on websites, apps for smartphones and browser extensions) as well as current
practices for the reporting systems of several CEO Coalition members (Facebook, Google,
Habbo Hotel, Hyves, Orange, Tuenti, Vivendi). The slide deck includes new actions
undertaken by certain CEO Coalition members since the launch of the Coalition to address
the concerns expressed by third-party stakeholders, for example feedback to users, browser
extensions and apps for smartphones.
