Internet intermediaries should ensure transparency and easy access to their policies and practices regarding online content management, strategic dissemination, and automated processing.
- Therefore, the Venice Commission has issued two recommendations which remain highly relevant and need to be implemented: - Revising rules and regulations on political advertising, in terms of access to the media (updating broadcasting quotas, limits and reporting categories; introducing new measures covering internet-based media, platforms and other services; addressing the implications of micro-targeting) and in terms of spending (broadening the scope of communication channels covered by the relevant legislation; addressing the monitoring capacities of national authorities).
- Ensuring accountability of internet intermediaries, in terms of transparency and access to data, and enhancing transparency of spending, specifically for political advertising. In particular, internet intermediaries should provide access to data on paid political advertising, so as to avoid facilitating illegal (foreign) involvement in elections and to identify the categories of target audiences.
- Measures such as the adoption of corporate digital ethics codes and of self-regulatory mechanisms to resolve conflicts between companies and users would also allow greater regulatory flexibility in the interests of both users and companies, while easing tensions with government and promoting shared responsibility for online behaviour.
- Intermediaries should take effective measures to ensure that their users can both easily access and understand any policies and practices, including terms of service, that they have in place for actions covered by paragraph 4(a), including detailed information about how they are enforced, where relevant by making available clear, concise and easy-to-understand summaries of, or explanatory guides to, those policies and practices.
- Smart regulation, not heavy-handed viewpoint-based regulation, should be the norm, focused on ensuring company transparency and remediation to enable the public to make choices about how and whether to engage in online forums.
- State regulation of social media should focus on enforcing transparency, due process rights for users and due diligence on human rights by companies, and on ensuring that the independence and remit of the regulators are clearly defined, guaranteed and limited by law.
- Companies should adopt clear, narrowly defined content and advertising policies on disinformation and misinformation that are in line with international human rights law and after consultation with all relevant stakeholders. (...) They should ensure that all policies are easily accessible and understandable by users and are enforced consistently, taking into account the particular contexts in which they are applied.
- Companies should provide clear and meaningful information about the parameters of their algorithms or recommender systems and ensure that those systems enable users to receive a diversity of viewpoints by default while also enabling them to choose the variables that shape their online experience.
- In addition to the principles adopted in earlier reports and in keeping with the Guiding Principles on Business and Human Rights, all companies in the ICT sector should: (…) (c) Define the category of content that they consider to be hate speech with reasoned explanations for users and the public and approaches that are consistent across jurisdictions.
- In order to meet their responsibility to respect human rights, business enterprises should have in place policies and processes appropriate to their size and circumstances, including: (a) A policy commitment to meet their responsibility to respect human rights; (b) A human rights due diligence process to identify, prevent, mitigate and account for how they address their impacts on human rights; (c) Processes to enable the remediation of any adverse human rights impacts they cause or to which they contribute.
- In order to identify, prevent, mitigate and account for how they address their adverse human rights impacts, business enterprises should carry out human rights due diligence. The process should include assessing actual and potential human rights impacts, integrating and acting upon the findings, tracking responses, and communicating how impacts are addressed.
- In order to account for how they address their human rights impacts, business enterprises should be prepared to communicate this externally, particularly when concerns are raised by or on behalf of affected stakeholders. Business enterprises whose operations or operating contexts pose risks of severe human rights impacts should report formally on how they address them.