Summary
Internet intermediaries should ensure the transparency of, and easy access to, their policies and practices regarding online content management, strategic dissemination and automated processing.
Quotes
- In view of the foregoing, the Committee of Ministers: (…) - draws attention to the necessity of critically assessing the need for stronger regulatory or other measures to ensure adequate and democratically legitimated oversight over the design, development, deployment and use of algorithmic tools, with a view to ensuring that there is effective protection against unfair practices or abuse of position of market power.
- In view of the foregoing, the Committee of Ministers: (…) - emphasises in particular the need to assess the regulatory frameworks related to political communication and electoral processes to safeguard the fairness and integrity of elections offline as well as online in line with established principles. In particular it should be ensured that voters have access to comparable levels of information across the political spectrum, that voters are aware of the dangers of political redlining, which occurs when political campaigning is limited to those most likely to be influenced, and that voters are protected effectively against unfair practices and manipulation.
- In order to redeem its promise of fostering a culture of informed public debate and active participation in the democratic process, it is of the utmost importance that individuals are empowered to understand this environment and its challenges. (…) To this end, individuals need to develop a wide range of skills for media and information use and an awareness of their rights and responsibilities in relation to the use of digital tools and technologies.
- In view of the foregoing, the Committee of Ministers: (...) - acknowledges the necessity to consider the growing responsibilities of those internet intermediaries, notably online platforms, which through their wide geographical reach and user engagement act as main gateways for news dissemination and generate important revenue from online news. Their active role in providing services of public value and their influence in the media ecosystem should be accompanied by public interest responsibilities developed through self-regulatory mechanisms or other appropriate and proportionate regulatory or co-regulatory frameworks, aimed to ensure, inter alia, that: a) With due regard to their status as important sources of information and communication, the intermediaries’ criteria by which they curate, categorise and rank online content and thus influence, through automated or human-directed processes, the visibility, accessibility and promotion of news and other journalistic publications, are transparent and applied in line with freedom of expression principles, notably the right to receive and impart information.
- The Assembly considers that social media companies should rethink and enhance their internal policies to uphold more firmly the rights to freedom of expression and information, promoting the diversity of sources, topics and views, as well as better quality information, while fighting effectively against the dissemination of unlawful material through their users’ profiles and countering disinformation more effectively.
- The Assembly calls on social media companies to: 11.1. define in clear and unambiguous terms the standards regarding admissible or inadmissible content, which must comply with Article 10 of the European Convention on Human Rights and should be accompanied, if need be, by explanations and (fictional) examples of content banned from dissemination.
- The Assembly calls on social media companies to: (...) 11.2. take an active part not only in identifying inaccurate or false content circulating through their venues but also in warning their users about such content, even when it does not qualify as illegal or harmful and is not taken down; the warning should be accompanied in the most serious cases by the blocking of the interactive functions, such as “like” or “share”; 11.3. make systematic use of a network analysis approach to identify fake accounts and bots, and develop procedures and mechanisms to exclude bot-generated messages from their “trending” content or at least flag their accounts and the messages they repost.
- Internet intermediaries should ensure that all terms of service agreements and policies specifying the rights of users and all other standards and practices for content moderation and the processing and disclosure of user data are publicly available in clear, plain language and accessible formats.
- Internet intermediaries should clearly and transparently provide meaningful public information about the operation of automated data processing techniques in the course of their activities, including the operation of algorithms that facilitate searches based on user profiling or the distribution of algorithmically selected and personalised content, such as news. This should include information on which data is being processed, how long the data processing will take, which criteria are used, and for what purpose the processing takes place.
- When restricting access to content in line with their own content-restriction policies, intermediaries should do so in a transparent and non-discriminatory manner. Any restriction of content should be carried out using the least restrictive technical means and should be limited in scope and duration to what is strictly necessary to avoid the collateral restriction or removal of legal content.
- Internet service providers should provide users with clear, complete and publicly available information with regard to any traffic management practices which might affect users’ access to and distribution of content, applications or services.
- Diversity of media content can only be properly gauged when there are high levels of transparency about editorial and commercial content: media and other actors should adhere to the highest standards of transparency regarding the source of their content and always indicate clearly when content is provided by political sources or involves advertising or other forms of commercial communications, such as sponsoring and product placement. This also applies to hybrid forms of content, including branded content, native advertising, advertorials and infotainment.
- Hosting service providers should be encouraged to publish clear, easily understandable and sufficiently detailed explanations of their policy in respect of the removal or disabling of access to the content that they store, including content considered to be illegal content.
- States should encourage social media, media, search and recommendation engines and other intermediaries which use algorithms, along with media actors, regulatory authorities, civil society, academia and other relevant stakeholders to engage in open, independent, transparent and participatory initiatives that: – improve the transparency of the processes of online distribution of media content, including automated processes.
- States should encourage social media, media, search and recommendation engines and other intermediaries which use algorithms, along with media actors, regulatory authorities, civil society, academia and other relevant stakeholders to engage in open, independent, transparent and participatory initiatives that: (...) – implement the principle of privacy by design in respect of any automated data processing techniques and ensure that such techniques are fully compliant with the relevant privacy and data protection laws and standards.
- The Committee of Ministers therefore, under the terms of Article 15.b of the Statute of the Council of Europe, recommends that member States, in consultation with private sector actors and civil society, develop and promote coherent strategies to protect freedom of expression, access to information and other human rights and fundamental freedoms in relation to search engines in line with the Convention for the Protection of Human Rights and Fundamental Freedoms (...), in particular by engaging with search engine providers to carry out the following actions: – enhance transparency regarding the way in which access to information is provided, in order to ensure access to, and pluralism and diversity of, information and services, in particular the criteria according to which search results are selected, ranked or removed.
- The Committee of Ministers, under the terms of Article 15.b of the Statute of the Council of Europe, recommends that member States, in consultation with private sector actors and civil society, develop and promote coherent strategies to protect and promote respect for human rights with regard to social networking services, in line with the Convention for the Protection of Human Rights and Fundamental Freedoms (...), in particular by engaging with social networking providers to carry out the following actions: (...) − raise users’ awareness, by means of clear and understandable language, of the possible challenges to their human rights and the ways to avoid having a negative impact on other people’s rights when using these services.
- In view of the foregoing, the Committee of Ministers: (…) - encourages member States to assume their responsibility to address this threat by (…) e) empowering users by promoting critical digital literacy skills and robustly enhancing public awareness of how many data are generated and processed by personal devices, networks, and platforms through algorithmic processes that are trained for data exploitation. Specifically, public awareness should be enhanced of the fact that algorithmic tools are widely used for commercial purposes and, increasingly, for political reasons, as well as for ambitions of anti- or undemocratic power gain, warfare, or to inflict direct harm.
- Therefore, the Venice Commission has issued two recommendations which remain highly relevant and need to be implemented: - Revising rules and regulations on political advertising, in terms of access to the media (updating broadcasting quotas, limits and reporting categories, introducing new measures covering internet-based media, platforms and other services, addressing the implications of micro-targeting) and in terms of spending (broadening the scope of communication channels covered by the relevant legislation, addressing the monitoring capacities of national authorities).
- In order to account for how they address their human rights impacts, business enterprises should be prepared to communicate this externally, particularly when concerns are raised by or on behalf of affected stakeholders. Business enterprises whose operations or operating contexts pose risks of severe human rights impacts should report formally on how they address them.
- Therefore, the Venice Commission has issued two recommendations which remain highly relevant and need to be implemented: - Ensuring accountability of internet intermediaries, in terms of transparency and access to data, enhancing transparency of spending, specifically for political advertising. In particular, internet intermediaries should provide access to data on paid political advertising, so as to avoid facilitating illegal (foreign) involvement in elections, and to identify the categories of target audiences.
- Measures such as the adoption of corporate digital ethics codes and of self-regulatory mechanisms to resolve conflicts between companies and users would also allow greater regulatory flexibility for the benefit of users and companies, while easing the relationship with government and promoting co-responsibility for online behaviour.
- Companies should take effective measures to ensure transparency of their policies and practices, including the application of their terms of service and of computation-based review processes, and respect due process guarantees. To this end, companies should publish regular information on their official websites regarding the legal basis of requests made by governments and other third parties and regarding the number or percentage of requests complied with, and about content or accounts restricted or removed under the company’s own policies and community guidelines.
- Intermediaries should publish a clear and comprehensive content moderation policy, together with human rights safeguards against arbitrary censorship and transparent review and appeal processes.
- Political campaigning undertaken by political parties, candidates and other individuals online entails responsibilities not only for governments but also for platforms and intermediaries, which should develop codes of conduct that make explicit their respect for such fundamental rights and put in place strategies for their effective enforcement in line with the respective national rules on political campaigning.
- Ensure that online services include, by design, safeguards against disinformation; this should, for example, include detailed information on the behaviour of algorithms that prioritise the display of content, as well as the development of testing methodologies.
- Online platforms should disclose their detailed content policies in their terms of service and clearly communicate this to their users. These terms should not only define the policy for removing or disabling access to content, but also spell out the safeguards that ensure that content-related measures do not lead to over-removal. In particular, online platforms' terms of service should clearly spell out any possibility for the users to contest removal decisions as part of an enhanced transparency of the platforms' general removal policies.
- Digital actors should, as relevant, be transparent about the use and any practical impact of any automated tools they use, albeit not necessarily the specific coding by which those tools operate, including inasmuch as those tools affect data harvesting, targeted advertising, and the sharing, ranking and/or removal of content, especially election-related content.
- Online platforms should, beyond the minimum legal requirements, operate as transparently as possible, in particular by giving users the tools they need to identify the creators of content and understand its prioritisation (or lack thereof) on their platforms.
- Intermediaries should take effective measures to ensure that their users can both easily access and understand any policies and practices, including terms of service, they have in place for actions covered by paragraph 4(a), including detailed information about how they are enforced, where relevant by making available clear, concise and easy to understand summaries of or explanatory guides to those policies and practices.
- Smart regulation, not heavy-handed viewpoint-based regulation, should be the norm, focused on ensuring company transparency and remediation to enable the public to make choices about how and whether to engage in online forums.
- State regulation of social media should focus on enforcing transparency, due process rights for users and due diligence on human rights by companies, and on ensuring that the independence and remit of the regulators are clearly defined, guaranteed and limited by law.
- Companies should adopt clear, narrowly defined content and advertising policies on disinformation and misinformation that are in line with international human rights law and after consultation with all relevant stakeholders. (...) They should ensure that all policies are easily accessible and understandable by users and are enforced consistently, taking into account the particular contexts in which they are applied.
- Companies should provide clear and meaningful information about the parameters of their algorithms or recommender systems and ensure that those systems enable users to receive a diversity of viewpoints by default while also enabling them to choose the variables that shape their online experience.
- In addition to the principles adopted in earlier reports and in keeping with the Guiding Principles on Business and Human Rights, all companies in the ICT sector should: (…) (c) Define the category of content that they consider to be hate speech with reasoned explanations for users and the public and approaches that are consistent across jurisdictions.
- In order to meet their responsibility to respect human rights, business enterprises should have in place policies and processes appropriate to their size and circumstances, including: (a) A policy commitment to meet their responsibility to respect human rights; (b) A human rights due diligence process to identify, prevent, mitigate and account for how they address their impacts on human rights; (c) Processes to enable the remediation of any adverse human rights impacts they cause or to which they contribute.
- In order to identify, prevent, mitigate and account for how they address their adverse human rights impacts, business enterprises should carry out human rights due diligence. The process should include assessing actual and potential human rights impacts, integrating and acting upon the findings, tracking responses, and communicating how impacts are addressed.