Draft Recommendation on the roles and responsibilities of internet intermediaries

A wide, diverse and rapidly evolving range of actors, commonly referred to as internet intermediaries, facilitate interactions on the internet between natural persons and between natural and legal persons by offering and performing a variety of functions and services. Some connect users to the internet, enable the processing of information and data, or host web-based services, including for user-generated content.

Others aggregate information and enable searches, and give access to, host and index content and services designed and/or operated by third parties. Some facilitate the sale of goods and services, including audio-visual services, and enable other commercial transactions, including payments. Intermediaries may carry out several functions in parallel. They may also moderate and rank content, including through automated processing of personal data, and may thereby exert forms of control which influence users’ access to information online in ways comparable to media, or they may perform other functions that resemble those of publishers.

Internet intermediaries also develop their own rules, usually in the form of terms of service or community standards, which often contain content-restriction policies. Moreover, intermediaries collect, generate, retain and process a wealth of information and data from and about users. These activities may interfere with, among other rights, users’ rights to privacy and freedom of expression. Effective reporting and complaints mechanisms may be lacking, may be insufficiently transparent and efficient, or may be provided only through automated processes.

Guidelines regarding the roles and responsibilities of internet intermediaries

Any request, demand or other action by public authorities addressed to internet intermediaries that interferes with human rights and fundamental freedoms must be prescribed by law, must be exercised within the limits conferred by law and must constitute a necessary and proportionate measure in a democratic society. States should not exert pressure on internet intermediaries through non-legal means.

States should ensure that legislation, regulation and policies related to internet intermediaries are interpreted, applied and enforced without discrimination, also taking into account multiple and intersecting forms of discrimination. States should further take into account the substantial differences in size, nature, function and organisational structure of intermediaries when developing, interpreting and applying the legislative framework, in order to prevent possible discriminatory effects.

States should ensure that legislation, regulation and policies relating to internet intermediaries do not unduly restrict their operation or the free flow of trans-border communication. Any legislation must clearly define the powers granted to public authorities as they relate to internet intermediaries, particularly when exercised by law enforcement. The law must indicate the scope of discretion in order to protect against arbitrary application.

State authorities should obtain an order from a judicial authority, or from another independent administrative authority whose decisions are subject to judicial review, when demanding that intermediaries restrict access to content. This does not apply in cases concerning content that is illegal irrespective of context, such as content involving child sexual abuse material, or in cases where expedited measures are required in accordance with the conditions prescribed in Article 10 of the Convention.

When internet intermediaries restrict access to third-party content based on a state order, state authorities should ensure that effective redress mechanisms are made available and that applicable procedural safeguards are adhered to. When intermediaries remove content based on their own terms and conditions of service, state authorities should not consider this a form of control that makes the intermediaries liable for the third-party content to which they give access.

State authorities should not directly or indirectly impose a general obligation on intermediaries to monitor content which they merely give access to, or which they transmit or store, be it by automated means or not. When addressing any request to internet intermediaries or promoting, alone or with other states or international organisations, co-regulatory approaches by internet intermediaries, state authorities should avoid any action that may lead to general content monitoring.

States should ensure, in law and in practice, that intermediaries are not held liable for third-party content to which they merely give access or which they merely transmit or store. State authorities may hold intermediaries co-responsible for content that they store if the intermediaries do not act expeditiously to restrict access to content or services as soon as they become aware of their illegal nature, including through notice-based procedures. State authorities should ensure that notice-based procedures are not designed in a manner that incentivises the take-down of legal content, for example through inappropriately short timeframes.

Notices should contain sufficient information for intermediaries to act upon them. Notices submitted by states should be based on the states’ own assessment of the illegality of the notified content, in accordance with international standards. Any restriction of content should be notified to the content producer or issuer as early as possible, unless this interferes with ongoing law-enforcement activities. Information should also be made available to users seeking access to the content, in accordance with applicable data protection laws.

In order to ensure that access to illegal content, as determined either by law or by a judicial authority or other independent administrative authority whose decisions are subject to judicial review, is effectively prevented, states should co-operate closely with intermediaries to secure the restriction of such content in line with the principles of legality, necessity and proportionality. They should further consider that automated means, which may be used to identify illegal content, currently have a limited capacity to assess context.

Internet intermediaries should make available, online and offline, effective remedies and dispute resolution systems that provide prompt and direct redress for grievances of users, content providers and affected parties. Complaint mechanisms should include built-in safeguards against conflicts of interest when the company directly administers the mechanism, for example by involving oversight structures. Intermediaries should not include in their terms of service waivers of rights or hindrances to effective access to remedies, such as mandatory jurisdiction outside a user’s country of residence or non-derogable arbitration clauses.