Regulating online safety and tackling online harms

When Tim Berners-Lee invented the World Wide Web in 1990, he envisaged a decentralised environment of free exchange of ideas and information. Fast forward to 2019, almost 30 years later, and that online environment has been polluted by disinformation, manipulation, harassment and privacy breaches. The growth of online pollution has prompted various regulatory responses such as the European Union’s General Data Protection Regulation1, Germany’s Network Enforcement Act2, Australia’s Abhorrent Violence Amendment Bill3 and California’s Consumer Privacy Act4, each one responding to an online safety problem. In a world first, however, the UK has signalled it will regulate online safety in a single and coherent way, including creating a statutory duty of care for online safety. This howtoregulate article will analyse the UK’s regulatory approach outlined in its April 2019 Online Harms White Paper, which is open for public consultation until July 2019, and propose ways to improve on regulatory enforcement of online safety.

A. UK Online Harms White Paper

1. Whether we like it or not, the internet has become an integral part of everyday life for many people. According to the Online Harms White Paper (UK Paper), nearly nine in ten UK adults and 99% of 12 to 15 year olds are online. The UK Paper recognises that online harms are widespread and can have serious consequences, and that decisive regulatory action is required because progress to improve safety online has been too slow. The proposed new regulatory framework for online safety will make clear companies’ responsibilities to keep UK users, particularly children, safer online, with the most robust action reserved for countering illegal content and activity.

2. The UK Paper is clear about the online harms outside the scope of regulatory action, namely harms experienced by:

  • organisations, such as companies, as opposed to harms suffered by individuals. This excludes harms relating to most aspects of competition law, most cases of intellectual property violation, and the organisational response to many cases of fraudulent activity;
  • individuals that result directly from a breach of data protection legislation, including distress arising from intrusion, harm from unfair processing, and any financial losses;
  • individuals resulting directly from a breach of cybersecurity or hacking. These harms are addressed through the government’s National Cyber Security Strategy; and
  • individuals on the dark web rather than the open internet. These harms are addressed in the government’s Serious and Organised Crime Strategy.

These online harms have been excluded, according to the UK Paper, because an effective response to these categories of harmful content or activity online already exists.

3. The online harms within the scope of regulatory focus outlined in the UK Paper, a list not intended to be static, include:

Harms with a clear definition:

  • Child sexual exploitation and abuse (Box 1, page 13).
  • Terrorist content and activity (Box 2, page 14).
  • Organised immigration crime.
  • Modern slavery.
  • Extreme pornography.
  • Revenge pornography.
  • Harassment and cyberstalking (Box 6, online anonymous abuse, page 17).
  • Hate crime (paragraph 7.16, page 68).
  • Encouraging or assisting suicide (Box 9, page 19).
  • Incitement of violence.
  • Sale of illegal goods/services, such as drugs and weapons, on the open internet (Box 5, page 15).
  • Content illegally uploaded from prisons (Box 3, page 15).
  • Sexting of indecent images by under 18s (creating, possessing, copying or distributing indecent or sexual images of children and young people under the age of 18) (Box 10, page 20).

Harms with a less clear definition:

  • Cyberbullying (including online abuse of public figures) and trolling (Box 6, online anonymous abuse, page 17; Box 14, online abuse of public figures, page 24).
  • Extremist content and activity.
  • Coercive behaviour.
  • Intimidation, including of public figures (Box 14, online abuse of public figures, page 24).
  • Disinformation (Box 12, page 23; see the howtoregulate article Countering “fake news”).
  • Violent content (paragraph 7.14, page 67).
  • Advocacy of self-harm (Box 9, page 19).
  • Promotion of Female Genital Mutilation.
  • Online manipulation: not included in the UK’s original table, but listed in the Paper as an online harm.

Underage exposure to legal content:

  • Children accessing pornography (see note 57, page 33).
  • Children accessing inappropriate material (including under 13s using social media and under 18s using dating apps; excessive screen time).

4. The approach of listing online harms is a good place to start in understanding what, if any, regulatory responses exist and the degree to which they are effective. The online harms outlined in the table concern “public” online spaces and not private channels, although private channels are the subject of public consultation in the UK Paper. The UK’s approach focuses on individual users in the UK, regardless of where the online platform company may be domiciled. However, consideration could be given to online harms caused to users outside the UK by UK-domiciled companies. Facebook’s role in the incitement to violence against the Rohingya minority in Myanmar is a case in point5. Following international condemnation of its entry into the Myanmar market and the subsequent ease with which bad actors used the platform to incite violence, Facebook took action to ban such actors.6 The UK could give some thought to how its regulatory approach could prevent a British regulated entity from being used as a tool for inciting violence, which was the subject of the International Criminal Tribunal for Rwanda trials against the Rwandan radio station and its owners7. This approach would be similar to the extraterritorial application of UK child exploitation laws to UK citizens outside the UK whose victims are not in the UK.8

5. Another harm that appears not to receive adequate treatment in the UK Paper concerns consumer harms. The UK Paper briefly mentions consumer enforcement by the UK Competition and Markets Authority, specifically looking at unfair practices in online gambling, online reviews and endorsements, “pressure selling” and misleading statements on hotel booking sites9. However, the consumer harm colloquially called “dark patterns” merits further thought for inclusion in the UK’s online harms regulatory framework. “Dark patterns” is the phrase coined for “interfaces designed to coerce users into taking actions that often result in giving up more information than the user realises”10. The US Senate is legislating to prohibit the use of “dark patterns” through the Deceptive Experiences To Online Users Reduction Bill, which would make the Federal Trade Commission the regulator enforcing best practices, as platforms evolve, to protect users’ interests.
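To make the mechanism concrete, here is a minimal sketch of one common “dark pattern”: a pre-checked consent box. The function name and scenario are illustrative assumptions, not drawn from the Bill or the UK Paper.

```python
def recorded_consent(pre_checked: bool, user_changed_box: bool) -> bool:
    """Consent a form records, given the checkbox default and the user's action."""
    if pre_checked:
        # Dark pattern: a passive user who never unticks the box
        # is counted as having consented.
        return not user_changed_box
    # Honest design: consent requires a positive act by the user.
    return user_changed_box

# The same passive user (who takes no action either way) is treated
# entirely differently depending on the interface default:
assert recorded_consent(pre_checked=True, user_changed_box=False) is True
assert recorded_consent(pre_checked=False, user_changed_box=False) is False
```

The harm lies in exploiting user inertia: the interface, not the user’s intent, determines the recorded outcome.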

I. Analyse issues of the sector

6. The methodology of the Handbook: how to regulate? provides a good basis on which to analyse the UK’s approach. Starting with understanding the issues in the sector to be regulated, the Handbook provides a list of questions the regulator ought to consider in analysing the sector and its issues. For the full list of questions see pages 13-17; a summarised version follows:

  • Upcoming developments – presenting the future: which new developments are to be expected for the sector in the years to come, e.g. in terms of the economy, technology, behaviour of actors?

  • Landscape of political operators: who has some control or influence, who are leaders, what are their interests, who are their followers?

  • State of the current regulation: is the regulation complete, up to date, precise, clear, simple to apply?

  • Knowledge, compliance, responsiveness: how well do operators/citizens know their rights and obligations, how well do operators comply with their obligations, and how responsive are they to soft steering (e.g. via guidance, recommendations and information)?

  • Verification: is the verification of compliance of operators and of citizens intense enough, targeting the right aspects, executed by the right authorities or entrusted to the right private bodies? Do the authorities or entrusted private bodies have enough staff, equipment and money at their disposal to efficiently execute their verification and supervision function?

  • Top problems and potentials: looking at the answers to the above dot points or other weak points identified: What are the top five problems of the sector? What are the top three development potentials for the sector, taking upcoming developments into particular account?

7. The UK Paper does not outline the detail of its analysis of the online safety environment or provide specific evidence of the failure of industry beyond its statement that:

There is currently a range of regulatory and voluntary initiatives aimed at addressing these problems, but these have not gone far or fast enough, or been consistent enough between different companies, to keep UK users safe online.

The omission of a detailed analysis of the online safety environment, or of specific evidence showing how existing regulatory and voluntary initiatives have been unsatisfactory, is a shortcoming of the UK Paper. Certainly, the UK has conducted various parliamentary inquiries and government investigations; a simple list of all the relevant documents touching on the issue of online safety could have been useful, particularly as a counterbalance to any industry response against regulatory action.

8. Notwithstanding the absence of a detailed analysis of the online environment in the UK Paper, it does contain a general analysis of the threat or problem in a numbered blue-box system. The range of online safety issues includes:

  • Illegal and unacceptable content and activity are widely prevalent online, and UK users are concerned about what they see and experience on the internet. Specific concerns include:

    • Threats to national security;

    • Threats to physical safety of children;

    • Spread of propaganda by terrorist groups;

    • Hostile actors using disinformation online to undermine democratic values;

    • Rival criminal gangs promoting gang culture and inciting violence, including the illegal sale of weapons; and

    • Other online behaviour that is not illegal per se but nonetheless causes serious harm, e.g. harassment, cyberbullying, self-harm content.

  • As the power and influence of large companies has grown, and privately-run platforms have become akin to public spaces, some of these companies now acknowledge their responsibility to be guided by norms and rules developed by democratic societies (e.g. Facebook CEO Mark Zuckerberg’s call for help11).

  • Although a range of regulatory and voluntary initiatives aimed at addressing online harms currently exist, these have not gone far or fast enough, or been consistent enough between different companies, to keep UK users safe online.

9. The UK Paper does acknowledge that, once an independent regulator is created to enforce online safety, the regulator’s first priorities will be to collect data about the scale and nature of the problem, in order to develop the codes of practice that will increase online safety and tackle particular online harms. This explains the absence of detailed analysis, but casts doubt on how creating a statutory duty of care for online safety would remedy what is, at this stage, a perceived rather than measured scale of online harms.

10. The UK vision for addressing these online harms is:

  • A free, open and secure internet.

  • Freedom of expression online.

  • An online environment where companies take effective steps to keep their users safe, and where criminal, terrorist and hostile foreign state activity is not left to contaminate the online space.

  • Rules and norms for the internet that discourage harmful behaviours.

  • The UK as a thriving digital economy, with a prosperous ecosystem of companies developing innovation in online safety.

  • Citizens who understand the risks of online activity, challenge unacceptable behaviours and know how to access help if they experience harm online, with children receiving extra protection.

  • A global coalition of countries all taking coordinated steps to keep their citizens safe online.

  • Renewed public confidence and trust in online companies and services.

II. Identifying policy goals

11. Clarifying the policy goals of regulation ensures there is a connection between the analysis of the sector, including its problems, and the objectives, requirements and measures derived from those goals. Policy goals can either be predetermined, e.g. by politicians or superiors in the administration, or still to be determined. Some goals are connected to other goals: for example, the goal of reducing tobacco smoking is connected to the goals of reducing casualties and reducing sickness. Without a clear understanding of the policy goals it is difficult to develop objectives, measures, requirements and incentives to achieve them.

12. The UK goals for regulating online safety can be summarised as:

  1. To protect individual users from online harms;
  2. To ensure the effectiveness of online safety rules;
  3. To provide clear standards to help companies ensure safety of users while protecting freedom of expression, especially in the context of harms in content or activity that may not cross the criminal threshold but can be particularly damaging to children and other vulnerable users; and
  4. To boost the tech-safety sector to develop technology that helps users manage their safety online.

13. These four policy goals for regulating online safety form a reasonable list, but other worthy policy goals could include ensuring that the safety rules are comprehensive and that they are effectively enforced. Although the focus of this consultation is on online harms to individual users, online harms could also be suffered at group level (and not at individual level per se) or by businesses; this could merit investigation further down the track.

III. Developing objectives, requirements, measures and incentives

14. The Handbook’s simple illustration of how to develop “objectives”, “requirements”, “measures”, and “incentives”, uses the political goal “to reduce the number of traffic accidents in the city”. In such an example one possible objective could be to ensure that vehicles do not drive too fast. One requirement to fulfil this objective could be: vehicles should not drive faster than 30 km/h in an area of 100 meters surrounding schools. Such a requirement could be ensured by these three measures:

  • Establishment of a zone with reduced maximum speed by speed limit indications;

  • Enforcement of the speed limit by radar surveillance, vehicle identification and financial sanctions;

  • Establishment of radars displaying: “Thank you for respecting the speed limit”.

The first two measures are based on the incentive “avoid sanctions”. The last measure uses the incentive “praise”.

15. Using this simple illustration we can understand the UK objectives and requirements for regulating online safety as:

a. Ensure that online platform companies fulfil their duty of care for the safety of their platforms by making that duty statutory, with the following requirements:

  • Establish codes of practice setting out how companies fulfil their statutory duty of care.

  • Regulated entities may adhere to internally developed codes of practice, provided they explain and justify to the regulator how this alternative approach will effectively deliver the same or a greater level of safety.

  • The government may direct the regulator in relation to codes of practice relating to terrorist activity or child sexual exploitation and abuse (CSEA) online.

  • The Home Secretary must sign off on all codes of practice concerning terrorist activity or CSEA online.

  • The regulated entity may be required to monitor specified targets where there is a threat to national security or the physical safety of children, such as CSEA and terrorism.

  • The regulator is required to be vigilant about the pace of threats that may affect codes of practice related to illegal harms, including incitement of violence and the sale of illegal goods and weapons.

  • Regulated entities are to anticipate and proactively combat threats to national security or to the physical safety of children.

  • Regulated entities are to cooperate with the investigations of an independent review mechanism and abide by its rulings.

b. Ensure that an independent regulator is created with effective powers to oversee and enforce the duty of care:

  • The government is required to provide the regulator with sufficient resources and the right expertise and capability to perform its role effectively.

  • Regulated entities may be required to pay a levy that funds the regulator initially.

  • The regulator is required to promote education and awareness-raising about online safety, and related technologies.


c. Harmonise and clarify rules tackling online content or activity that harms individual users (particularly children) or threatens our way of life in the UK, whether by undermining national security or by reducing trust and undermining our shared rights, responsibilities and opportunities to foster integration:

  • The regulator is required to define online harms clearly in codes of practice.

  • The regulator, in cooperation with relevant enforcement agencies such as the police, is required to ensure that laws against illegal harms apply online in the same way as offline.

  • The regulator is required to reduce the legal fragmentation surrounding these harms.

d. Ensure consistent enforcement of online safety rules:

  • The government is required to ensure stronger powers and adequate levels of resources for enforcement and control of the rules.

  • The regulator, in cooperation with relevant justice and law enforcement agencies and diplomatic officials, is to develop binding cooperative procedures and effective mutual assistance with other jurisdictions.

e. To develop a culture of transparency, trust and accountability:

  • Regulated entities are required to be transparent in their implementation of online safety rules.

  • Regulated entities are required to respond to users’ complaints within an appropriate timeline, and to provide effective remedies for their users and sanctions for non-compliant users, consistent with the codes of practice.

  • The regulator is required to empower third-party bodies to act on behalf of victims of online harm.

  • Regulated entities are required to publish annual transparency reporting so that:

    • The regulator can gain an understanding of the level of harms on online platforms and the mitigating action being taken by companies, to inform regulatory priorities;

    • Users can gain a greater understanding and awareness of whether and to what extent companies are taking positive steps to keep their users safe, and the processes different companies have in place to prevent harms;

    • Companies take responsibility for the impacts their platforms and products have on their users, with the aim of incentivising accountability within the industry (see page 44 of the UK Paper).

  • The regulator is required to encourage and oversee the fulfilment of companies’ commitments to improve the ability of independent researchers to access their data, subject to appropriate safeguards.

f. To ensure a proportionate approach and avoid being overly burdensome on companies within the regulatory scope:

  • The regulator is required to follow a risk based approach to tackling activity or content where there is the greatest evidence or threat of harm, or where children or other vulnerable users are at risk.

  • The regulator is to take a proportionate approach to companies depending on the nature of the harm and the resource and technology available to the companies.

B. Additional requirements and measures to strengthen the UK regulatory framework

1. The UK regulatory approach centres on the creation of a statutory duty of care and an independent regulator to enforce that duty. The UK Paper outlines a comprehensive approach to regulating online safety, but additional requirements, measures and incentives can be identified that could strengthen the UK’s approach.

I. Duty of care for online safety

2. Companies that provide services or tools that allow, enable or facilitate users to share or discover user-generated content, or to interact with each other online, will be subject to a new statutory duty of care for the safety of their users, and of non-users who may be harmed by the services or tools. The duty is focused on providers of services and tools that allow sharing or discovery of user-generated content; the providers may be large companies, start-ups, SMEs, or other organisations such as charities. The principle of proportionality will require those organisations that expose users to the greatest harm to take steps to reduce this harm. In particular, companies will be required to ensure that they have effective and proportionate processes and governance in place to reduce the risk of illegal and harmful activity on their platforms, as well as to take appropriate and proportionate action when issues arise.

3. The UK approach can be distinguished from the German approach which focuses on enforcing existing German criminal law online through additional measures in its Network Enforcement Act (see howtoregulate article Countering “fake news”). The UK approach will not only enforce existing British criminal law online through additional measures but other online activity which is not illegal but does cause harm (see table at Part A, paragraph 3 above concerning scope of online harms of regulatory focus). In this respect, the UK approach is more comprehensive because it recognises that online harms occur that fall short of the criminal threshold.

a. National security, terrorist activity and child safety

4. It is clear from the UK approach that companies are required to act quickly on harms relating to national security, terrorist activity and child safety, including monitoring activity of this nature. The UK Paper seeks views on the monitoring of private messages. In developing codes of practice for harms relating to national security, terrorist activity and child safety, the regulation needs to be clear about whether to set a full-safety principle or other fixed risk limits. Given that the UK intends for the Home Secretary to sign off on codes of practice for these harms, the monitoring of private messages may be required, noting that private messaging may be the preferred method of communication between bad actors. Requirements the regulator may wish to address include:

  • Articulating what constitutes a reasonable response time (minutes, hours, days) to particular instances of harms relating to national security, terrorist activity and child safety. For example, the New Zealand government viewed Facebook’s taking 24 hours to remove all copies of a 17-minute live-streamed recording of the mosque shooting as unacceptable12. Australia’s legislation in response to the New Zealand mosque shooting, the Abhorrent Violence Amendment, leaves “reasonable response” as an issue for the trier of fact13.

  • Applying requirements on harms relating to national security, terrorist activity and child safety equally to all regulated entities, because bad actors could simply move communication or harmful content to regulated entities with reduced requirements.

  • The number of employees engaged in monitoring.

  • Minimum skills level of employees engaged in monitoring, including language skills.

  • Protection of mental health and well being for employees engaged in monitoring.

  • Certification of algorithms used for monitoring; this will require some thought about the issue of proprietary protection. Nevertheless, items that could be regulated include: the provenance of data sets; an explanation of the verification process for data sets; the basis on which models were designed; and the chain of custody of the people involved in developing the algorithm.
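As an illustration of what such certification could capture, the sketch below records the items listed above (data-set provenance, verification process, design basis, chain of custody) in a single structure. All field names and example values are assumptions for illustration, not requirements from the UK Paper.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AlgorithmCertificationRecord:
    """Hypothetical record a regulator might require for a monitoring algorithm."""
    algorithm_name: str
    dataset_provenance: str      # where the training data came from
    verification_process: str    # how the data sets were checked
    model_design_basis: str      # the rationale on which the model was designed
    chain_of_custody: List[str] = field(default_factory=list)  # people involved

# An illustrative filing (every value here is invented):
record = AlgorithmCertificationRecord(
    algorithm_name="upload-screening-model",
    dataset_provenance="licensed hash list, 2019 snapshot",
    verification_process="dual human review of a sample of entries",
    model_design_basis="perceptual hashing with a tuned match threshold",
    chain_of_custody=["data engineer", "reviewing analyst", "certifying officer"],
)
assert record.chain_of_custody[-1] == "certifying officer"
```

Capturing these items in a fixed schema is one way a regulator could compare filings across companies without requiring disclosure of the proprietary algorithm itself.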

b. Protection of non-users

5. The duty applies broadly: it extends beyond users to victims who may not be users of the company’s service (e.g. victims of the sharing of non-consensual images). Some requirements for the protection of non-users could include:

  • Being informed of the sharing of an image;
  • Right to deletion;
  • Right to obtain corrections in case of damaging statements;
  • Access to legal aid;
  • Administrative / police support for investigations, e.g. by online-detectives engaged by the authorities;
  • Support by the company for investigations;
  • Right to financial compensation; and
  • Right to injunction.

II. Independent regulator with effective powers to oversee and enforce the duty of care

6. The UK Paper recognises that the regulator needs enforcement powers, and sanctions for non-compliance, that:

  • Incentivise companies to fulfil their obligations quickly and effectively.

  • Apply effectively across different types of online companies, which vary enormously in size and revenue and may be based overseas.

  • Are proportionate to the potential or actual damage caused and to the size and revenue of the company.

7. The UK Paper asks whether the regulator should be empowered to i) disrupt business activities, ii) undertake ISP blocking, or iii) implement a regime for senior management liability. Our view is that the regulator should be empowered to do all three; the German Network Enforcement Act, for example, implements a regime for senior management liability. It is prudent that the regulator have the full range of empowerments because the online environment in which the harms take place is complex, and ensuring the policy goal of online safety therefore requires a comprehensive arsenal. The policy goal of not being overly burdensome for companies helps keep the regulator’s power in check, as would developing an independent review mechanism for the regulator’s work. Another argument for a broad range of empowerments comes from the analysis of the sector (see Part A, Section I), which was not included in the UK Paper but would most certainly have recognised that:

  • monopolistic behaviour is a feature of the online world, particularly social media,

  • the global revenue of those monopolies could make for toothless penalties and sanctions if these are not designed correctly,

  • the lobbying resources of some companies are powerful14,

  • anonymity of bad actors is a common feature of perpetrators of online harms, and

  • self-regulation and voluntary initiatives have been too slow.

8. For a systematic list of empowerments, including analysis thereof, see the howtoregulate articles “Empowerments (Part I): typology” and “Empowerments (Part II): empowerment checklist”. The two empowerments articles can be used to verify that all necessary empowerments have been integrated into the draft regulation under development. In addition to the enforcement powers listed on pages 59 and 60 of the UK Paper, the following additional enforcement powers could be useful in enforcing online safety (see “Empowerments (Part II): empowerment checklist” for the full list):

Imposing costs on natural or legal persons

  • Imposing enforcement costs;
  • Taking fees for state action (e.g. for authorisation procedures);
  • Requesting securities (as guarantee for the fulfilment of financial obligations); or
  • For all financial obligations and financial sanctions or penalties: levying extra fees and interests for delayed payment.

Sanctions and penalties

  • Exclusions from public tenders;
  • Imposing a ban from participating in certain activities other than tenders;
  • Imposing a ban from receiving grants;
  • Public naming and shaming;
  • Withdrawing titles and rewards;
  • Withdrawing membership;
  • Extending sanctions to parent or sister companies and their agents, at least in cases where a company has been set up as a shield for illegal activities15;
  • Extending sanctions to partner companies and their agents where they contributed to illegal activities; or
  • Restraining orders against persons (obligation to stay in a certain perimeter).

Investigations and data

  • Conducting, or cooperating with persons conducting, research, development, tests, demonstrations and studies and publishing this research or test results;
  • Imposing obligations to cooperate without remuneration with the regulator and in particular to provide information, and to grant access to documentation and premises;
  • Arrest (and financial sanctions) in case these obligations are not fulfilled;
  • Visiting and inspecting offices, factories, warehouses, research institutions and other premises in which products are produced or kept, or where services are provided;
  • Entering and inspecting any vehicle used to transport or hold products or to provide services;
  • Investigating alleged violations of laws and regulations;
  • Seizing and taking possession of all documents, data and objects which might serve as means of proof for stating the non-conformity;
  • Compelling the attendance of witnesses and the production by third parties of evidence via a subpoena, where there is reason to believe, or first evidence for assuming, that an infringement has occurred;
  • Supervising the internet communication or telecommunication (meta-data or even content) in a personalised or generic way;
  • Acquiring data and documents from third parties, including against payment or providing advantages; or
  • Processing data.

Enforcement in general

  • Empowerment to request from Internet or telecommunication service provider blocking of certain content;
  • Confiscating and destroying illegal products or means to produce them or means to provide illegal services;
  • Forbidding the use of premises or establishments;
  • Closing premises;
  • Requesting securities (as guarantee for the fulfilment of non-financial obligations);
  • Publishing individual infringements;
  • Obliging operators to disclose each month statistics about content removed, complaints received, times from receipt of complaint to take down and other information that would aid users and the regulator;
  • Obliging operators to inform their clients of their rights in case of illicit practice; or
  • Imposing immediate, temporary obligations by provisional order (e.g. for reasons of imminent danger).
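The monthly disclosure obligation above can be made concrete with a sketch of the statistics an operator could be obliged to publish (content removed, complaints received, time from complaint to take-down). The field names and figures are hypothetical assumptions, not prescribed by the UK Paper.

```python
from statistics import median

# Illustrative figures only: hours from receipt of complaint to take-down.
complaint_to_takedown_hours = [2, 5, 26, 1, 12]

# A minimal monthly transparency disclosure an operator might be obliged
# to publish for users and the regulator.
monthly_report = {
    "period": "2019-06",
    "items_removed": 1240,
    "complaints_received": 980,
    "median_takedown_hours": median(complaint_to_takedown_hours),
}

assert monthly_report["median_takedown_hours"] == 5
```

Standardising even a handful of such fields across operators would let the regulator and users compare take-down performance between platforms.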


Information measures

  • Information dissemination to media, with or without data concerning natural or legal persons;
  • Information campaigns for the general public or specific target groups;
  • Blocking illegal information or information which endorses illegal activities; or
  • Informing the clients of non-compliant operators of their rights.

Cooperation, including exchange of data, with other jurisdictions or international organisations, for example:

  • Permitting foreign officers to take part in state operations;
  • Disclosing confidential information to authorities of other jurisdictions;
  • Establishing joint expert committees and data exchange needed for that purpose;
  • Investigating or enforcing on the territory of the other jurisdiction;
  • Empowering foreign authorities to investigate cases on one’s own behalf on the territory of the other jurisdiction;
  • Empowering foreign authorities to investigate cases on their own behalf on one’s own territory;
  • Recognising foreign certificates or approvals; or
  • Extradition of offenders for offences committed in other jurisdictions; and
  • Transfer of witnesses in custody for court procedures in other jurisdictions.

Intra-organisational matters

  • (Re-)Assigning of human resources (in case of changing situation);
  • (Re-)Assigning of financial resources (in case of changing situation);
  • Assigning of and cooperation with external experts as advisors;
  • Creating scientific or advisory bodies;
  • Establishing procedures regarding scientific or other advisory bodies;
  • Creating intra-organisational independence (e.g. for scientific or advisory bodies that must have independence to be credible);
  • Intra-organisational reporting obligations;
  • Intra-organisational control mechanisms (e.g. mandatory review by superior instance for difficult or important decisions, periodic control review, right to instruct); or
  • Whistle-blower protection mechanisms (see the howtoregulate article “Whistleblowers: protection, incentives and reporting channels as a safeguard to the public interest”).

III. Harmonise and clarify rules tackling online content or activity that harms individual users

9. Defining the online harms that are less clearly defined, and developing codes of practice to address them, is a difficult task. The regulator should be empowered to work with a broad range of actors to define these online harms, including international and foreign NGOs operating in other jurisdictions. Once the codes of practice have been determined, companies operating online platforms will be required to ensure online harms do not occur and to respond appropriately when they do. An initial step in harmonising and clarifying rules tackling online content or activity is to ask all regulated entities to submit a report on how the design of their online platform contributes to the well-being of individual users and what tools they use to optimise user well-being. The Centre for Humane Technology has created a Design Guide that helps designers of a platform or app take meaningful steps towards designing a more humane product and identify where investing in a deeper understanding of human nature will yield further benefits.

10. There is also the question of the responsibility of parents in relation to their children's online use. The UK is one of the few jurisdictions to have released government guidelines on screen time for children based on their age, and the World Health Organisation has released a related guideline. Beyond guidelines, other requirements specific to children could be considered:

  • Incentivising ISPs to create family services that automatically prevent devices in the home from accessing sites inappropriate for children; China's "Great Firewall" system of internet regulation is a case in point.

  • Incentivising mobile phone makers to create phones that are child-safe by design.

  • Funding and supporting research that would facilitate the first two points.

The UK Paper does foresee economic incentives to encourage the industry to develop such technological solutions.

IV. Consistent enforcement of online safety rules

11. The UK has analysed the enforcement of illegal harms both online and offline, and found that the enforcement of illegal harms should be strengthened. The UK Law Commission report outlines ways to do so and the British government is considering legislative amendments.16 Until the British government outlines the legislative amendments it intends to make, it is difficult to analyse the consistency of enforcement of online safety rules. On the question of ensuring that online rules are comprehensive, the future legislation should include provisions that could capture future harms not yet considered. One way to do this is to focus on the harm the individual suffered, i.e. activity that could reasonably be expected to cause distress. Another mechanism to future-proof against harms that may arise is to ensure that the online platform's user complaint mechanism: 1) responds quickly to complaints that distress the user, particularly those of children; and 2) sets a threshold for user complaints for which a code of practice does not exist, beyond which the complaints must be reported to the regulator to determine whether or not a code of practice is required.
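The complaint-threshold mechanism suggested in this paragraph can be sketched as follows. This is a minimal illustration, assuming hypothetical harm-category names, an invented reporting threshold of 100 complaints and invented response deadlines; in practice these values would be set by the regulator's codes of practice.

```python
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical categories already covered by a code of practice.
KNOWN_CODES = {"cyberbullying", "hate_speech", "terrorist_content"}

# Assumed values -- the real figures would be set by the regulator.
REPORT_THRESHOLD = 100        # uncategorised complaints before the regulator is notified
CHILD_RESPONSE_HOURS = 24     # faster target response for complaints involving children
DEFAULT_RESPONSE_HOURS = 72

@dataclass
class ComplaintTracker:
    uncategorised: Counter = field(default_factory=Counter)

    def response_deadline_hours(self, involves_child: bool) -> int:
        """Complaints involving children get the shorter response deadline."""
        return CHILD_RESPONSE_HOURS if involves_child else DEFAULT_RESPONSE_HOURS

    def log_complaint(self, category: str) -> bool:
        """Record a complaint; return True when the regulator should be asked
        whether a new code of practice is needed for this category."""
        if category in KNOWN_CODES:
            return False
        self.uncategorised[category] += 1
        return self.uncategorised[category] >= REPORT_THRESHOLD
```

The point of the sketch is that complaints falling outside any existing code of practice are counted rather than discarded, so emerging harms surface to the regulator once they recur often enough.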

12. The UK Paper envisages an independent review mechanism, which will help to ensure consistency: where the regulator's enforcement has been too heavy-handed, such enforcement can be independently reviewed through a complaints mechanism or a system for reviewing procedural fairness. Most public decisions in a common law system enable aggrieved parties to seek a review of procedural fairness through administrative courts.

13. Ensuring a consistent enforcement approach to the rules will require consistent data and consistent reporting. The UK Paper states that companies will be required to publish transparency reports, and it would be advisable to make clear the data requirements for reporting and to standardise reporting formats. It has already been noted that the initial reports about content removal following the coming into force of the German Network Enforcement Act have been difficult to analyse because the social media platforms did not report consistently; the German Act does not require standardised reporting.17
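A standardised reporting format of the kind suggested above could be as simple as a fixed schema that every regulated entity must populate. The sketch below assumes an invented field set (the regulator would define the real one); it illustrates how a fixed schema with basic validation makes monthly reports from different platforms directly comparable.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class TransparencyReport:
    """Illustrative monthly transparency report with a fixed field set."""
    platform: str
    month: str                    # reporting period, e.g. "2019-06"
    complaints_received: int
    items_removed: int
    median_takedown_hours: float  # time from receipt of complaint to take-down

    def validate(self) -> None:
        """Reject obviously malformed figures before submission."""
        if min(self.complaints_received, self.items_removed,
               self.median_takedown_hours) < 0:
            raise ValueError("counts and durations must be non-negative")

    def to_json(self) -> str:
        """Serialise in a standard format the regulator can aggregate."""
        return json.dumps(asdict(self), sort_keys=True)
```

With every platform emitting the same fields in the same format, the regulator (or researchers) can aggregate and compare take-down performance without the inconsistencies observed under the German Act.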

V. Proportionate approach and avoid being overly burdensome on companies

14. All regulation should consider the effects and burdens that enforcement of the rules places on regulated entities and on the regulator. The rules should be proportionate to the risk of the harm occurring. For guidance on risk classification see the howtoregulate article Research and Technology Risks: Part III – Risk Classification. The UK's regulatory approach for illegal harms will impose more specific and stringent requirements, compared with harms that may not reach the criminal threshold. The UK Paper foresees reasonable steps to ensure users are safe online, commensurate with the regulated entity's size and resources. The UK approach does not apply a metric based on the number of users, as the German Network Enforcement Act does. It is conceivable that a platform has many users while the company managing the platform is a start-up of five people. Such a start-up's platform may expose users to greater online harms because of the high number of users, yet as a start-up it would have limited resources. The UK should therefore look at the number of users a platform has in addition to the regulated entity's size and resources.
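The suggestion to weigh user numbers alongside size and resources can be illustrated with a hypothetical tiering function. The thresholds below are invented for illustration (the 2 million figure loosely echoes the German Network Enforcement Act's registered-user threshold) and the tier labels are not drawn from the UK Paper.

```python
def enforcement_tier(monthly_users: int, employees: int) -> str:
    """Illustrative tiering that weighs reach (users) against capacity
    (employees). All thresholds and labels are assumptions."""
    high_reach = monthly_users >= 2_000_000   # rough analogue of the German Act's metric
    well_resourced = employees >= 50          # invented capacity proxy
    if high_reach and well_resourced:
        return "full obligations"
    if high_reach:
        # large reach but a small team: same safety outcome expected,
        # but the regulator might offer transitional support
        return "full obligations, with transitional support"
    if well_resourced:
        return "standard obligations"
    return "baseline obligations"
```

The five-person start-up with millions of users from the example above would land in the second branch: its obligations track its reach, while the support it receives tracks its resources.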

15. The severity of online harms concerning national security, terrorist activity and the safety of children may well require regulated entities to enforce the same standard regardless of resources, but some measures to reduce burdens for regulated entities without the resources include:

  • tax relief for measures implemented for harms concerning national security, terrorist activity and safety of children; and

  • government-provided technology that ensures a minimum safety standard.

VI. Private channels

16. When the codes of practice are developed by the UK regulator, the standard for monitoring of private communication would need to be guided by existing precedent and guidance on the question of privacy. The European Court of Human Rights provides clear guidance on the limits to the right to privacy in relation to file or data gathering by security services or other organs of the State (page 38, paragraph 169) and police surveillance (page 39, paragraph 173). According to the UK Paper, scanning or monitoring content for tightly defined categories of illegal content will not apply to private channels, and the government is consulting on how to define private channels. It notes the difference between one-to-one messaging and a WhatsApp group of several hundred members. In seeking to define the private channels to be exempt from scanning and monitoring, the following requirements could be considered:

  • If the channel permits one-to-one messaging only, as opposed to one-to-multiple, its character would be more private and it would perhaps qualify as a private channel exempt from scanning or monitoring. However, one difficulty with this approach is that a user may send only one-to-one messages but send many of them, which has the same effect as one message sent to multiple users. It also provides little protection to the receiver of the one-to-one messaging in cases of cyber-bullying, trolling18 or catfishing19.

  • Facebook and WhatsApp offer both one-to-multiple and one-to-one messaging; the technical feasibility of excluding one-to-one private communications from scanning or monitoring could be investigated to determine whether it is a measure worth including.
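One possible exemption test combining the two considerations above can be sketched as follows. Both conditions and the broadcast limit are assumptions made for illustration, not rules from the UK Paper.

```python
def is_private_channel(member_count: int, identical_copies_sent: int) -> bool:
    """Sketch of a possible exemption test: a channel is treated as private
    (exempt from scanning) only when it is one-to-one AND the sender is not
    broadcasting the same content to many recipients via separate one-to-one
    messages. The threshold of 20 is a hypothetical value."""
    ONE_TO_ONE = member_count == 2   # sender plus a single recipient
    BROADCAST_LIMIT = 20             # hypothetical cap on identical one-to-one sends
    return ONE_TO_ONE and identical_copies_sent <= BROADCAST_LIMIT
```

Under this sketch, a genuine one-to-one conversation stays exempt, while both a large group and a user blasting the same message to hundreds of individual recipients would fall back within scope.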

17. Where companies have services and tools for private communication, an approach consistent with the limits to privacy in laws that protect privileged communications could be instructive. This varies from jurisdiction to jurisdiction, but generally communications with prescribed health professionals (doctors, psychiatrists, counsellors etc.), with a spouse, a lawyer, a journalist and their source, or a priest are protected subject to statutory limits. Examples of statutory limits include:

  • UK common law recognises that in a lawyer client relationship information of an iniquitous nature cannot be confidential20;

  • disclosures made during a court ordered physical or mental exam;

  • where a health professional believes that the person who made the disclosure poses a danger to him/herself or to another individual/s and there is a risk of imminent harm21;

  • where a health professional has reason to suspect or believe that a child or an elderly person, disabled or incompetent person is being abused, statutes prescribe mandatory reporting to the agency responsible for caring for these individuals22; and

  • between spouses in court proceedings concerning children or domestic violence.

Other statutes require disclosure based on what is disclosed, for example terrorist activity, infectious diseases or genetic information.23

18. Evidently, the professionals in the previous paragraph are directly disclosed information and are usually governed by mature codes of ethics; a company managing a private messaging service would not be disclosed information in the same way. However, creating a statutory duty of care for online safety means the company is responsible for protecting users and non-users and so must monitor for such harms, and in this way could be argued to be disclosed information. It is also not always clear how private messages are in reality, when key words in such messages may well be used to better target advertising.

C. Additional measures for ensuring compliance

Public service announcements

1. Many online platforms rely on advertising for revenue and regulated entities could be required to host a number of public service announcements commensurate with the risk and prevalence of that particular online harm. Regulators could consider public service announcements concerning media literacy, educating users about complaint mechanisms for online safety and educating users about online harms generally.

Complaint portals

2. The UK Paper will require regulated entities to provide adequate complaint portals; the Australian e-Safety Commissioner's online complaint portal is a very good example.

Legal action of competitors

3. Given that state authorities might have limited administrative capacities, it might be suitable to give competitors the option to sue their peers in case of infringement. For example, internet service providers could obtain the right to sue their competitors if they fail to act in a reasonable time to remove harmful content. It is in the interest of the policing operator that its peers ensure the same level of compliance. Evidently, this measure should be applied with the right dosage, taking into account the capacities of the courts in charge.

Legal aid for victims, victim compensation fund

4. Financial support and free advice to victims of online harms might enable the victims to take legal measures against perpetrators. Empowered victims will increase the level of compliance by sanctioning perpetrators.

Class actions and legal action by accredited organisations

5. The US technique of the "class action" is slowly spreading to other jurisdictions. It consists of permitting law firms to raise a legal claim on behalf of a group or "class" of individuals with similar legal claims, without being formally empowered by each of them individually. Class actions are usually allowed in civil cases only. Class actions are a powerful but limited tool: individuals can amplify their ability to litigate, negotiate and settle disputes; however, strength in numbers also limits choices and options. Depending on the jurisdiction, disadvantages could include conflicts between the different parties involved, settlement money or money from a successful claim being lower than if a claimant took their own case to court, limited ability to control proceedings, or extinguishing the right to later bring an individual claim to court.24

6. Another legislative technique is to empower accredited, mostly private organisations to take legal action in the public interest. This technique has been used by various jurisdictions in the field of environmental protection25 or animals rights. In international law the Aarhus Convention establishes a number of rights of the public (individuals and their associations) to information, to participate in decision-making and the right to challenge public decisions concerning the environment. These rights are laid out in the Convention on Access to Information, Public Participation in Decision-Making and Access to Justice in Environmental Matters.

Self regulating bodies and certification

7. The German Network Enforcement Act's26 regulatory approach of enforcing criminal law online through self-regulating bodies has been a useful technique complementing other compliance measures. In addition to the recognition of self-regulation institutions, we recommend self-regulating bodies also provide basic certification of the processes to be installed by internet service providers to take reasonable steps to ensure the online safety of users and non-users. Given the possible high number of infringements, the internal processes of service providers are of utmost importance, hence the importance of the UK's proposed codes of practice. Whilst it might go too far to require external quality system certification (big tech has come out resolutely against any type of external auditing in Australia's Digital Platforms Inquiry27), certification by a self-regulating body could be a proportionate way of verifying that suitable internal processes have been established.

Penal and administrative sanctions

8. Whilst penal sanctions are mostly on the radar of regulators, particularly sanctions calculated on global rather than domestic revenue, we recommend considering additional administrative sanctions against the legal bodies (the economic operators) which do not ensure compliance. The administrative sanction does not need to be based on the negligence of an individual: negligence of the entire organisation or bad management might suffice. Hence it is easier to prove the conditions for the sanctions, which ensures a higher degree of efficiency. The German Network Enforcement Act imposes regulatory fines on any person who, intentionally or negligently, fails to produce the reports outlined in the Act, fails to provide a procedure for complaints, fails to monitor complaints, or fails to offer training and support, amongst other failures listed at Section 4, subsection 1 of the Act. Regulatory fines may be up to five million euros (Section 4, subsection 2 of the Network Enforcement Act).

Funding research

9. It is conceivable that some research may not attract funding from private industry, so it is important that the UK develops a mechanism for funding valuable research into online safety or supports civil society organisations in doing so. There is a similar issue in the pharmaceutical industry, where the threat from antimicrobial resistance is rising and yet pharmaceutical companies are closing down their antibacterial and antiviral research due to unprofitability.28 Regulators have started looking at incentives to encourage research in antibiotics, including:

  • changing the way antibiotics are assessed to reduce barriers and costs,

  • using smaller clinical trials,

  • creating private-public projects where small and medium-sized enterprises and academics work with large companies to create an antimicrobial discovery hub,

  • research and development tax credit,

  • introducing rapid antibacterial approval pathway for serious or life-threatening infections,

  • extending data exclusivity for qualifying antibiotics/antifungals, and

  • pricing to ensure return on investment.29

Other links


Digital Platforms Inquiry:

Australia E-safety Commissioner:

Australia Cybercrime Online Reporting Network:


Cybertip CA:

House of Commons Standing Committee on Access to Information, Privacy and Ethics:


Most successful education approach to disinformation:


Media Policy Project Blog from London School of Economics, Researchers from the Alexander von Humboldt Institute for Internet and Society, on initial transparency reports from online platforms removal of online hate speech:


Go Safe Online:

Protection from Harassment Act (Chapter 256A) 31 May 2015 (contains simple illustrations of types of harassment conduct):

Stigler Centre at the University of Chicago: Digital Platforms Project “How to Mitigate the Political Impacts of Social Media and Digital Platform Concentration”. 

Oxford-Stanford University Report “GLASNOST! Nine ways Facebook can make itself a better forum for free speech and democracy”.

Centre for Humane Technology: Your Undivided Attention Podcast, which exposes the hidden designs that have the power to hijack our attention, manipulate our choices and destabilise our real world communities.

This article was written by Valerie Thomas, on behalf of the Regulatory Institute, Brussels and Lisbon.


5 Wong, J.C., “Overreacting to failure’: Facebook’s new Myanmar strategy baffles local activists”, Guardian Online, 7 Feb 2019, .



9 UK Paper pages 33-24.

10 Tausche, K. and Wellons, M.C., “A new Senate bill would ban a ´deceptive´ practice used by Facebook to get user data, such as phone and email contacts”, CNBC Online, 9 April 2019, .

11 BBC US & Canada, “Mark Zuckerberg asks governments to help control internet content”, BBC News Online, 30 March 2019, .

12 Gunia, A., “Facebook tightens live-stream rules in response to the Christchurch massacre”, Time Online, 15 May 2019, and Sonderby, C., “Update on New Zealand”, Facebook Newsroom, 18 Mar 2019,

14 (US lobbying) Brody, B., “Google, Facebook Set 2018 Lobbying Records as Tech Scrutiny Intensifies”, Bloomberg Online, 23 Jan 2019, . (EU lobbying) Heath, R., “Silicon Valley’s European Solution”, Politico Online, Jan 2019, .

15 See Section 7.18 of the Handbook “How to regulate?”.


17 Gollatz, K. et al, “Removal of online hate speech in numbers”, London School of Economics: Media Policy Project Blog, .

18 Cambridge Dictionary online definition of trolling, .

19 Cambridge Dictionary online definition of catfishing, .

20 UK Solicitors Regulation Authority, Disclosure of client´s confidential information,

22 Ibid.

23 Ibid.

24 Australian Securities and Investment Commission, “Class actions”, .

25 Section 487 of the Australian Environment Protection and Biodiversity Conservation Act is an example of a jurisdiction creating a legislative right to sue by outlining a less stringent threshold for legal standing to environmental groups who might not necessarily have standing because they are not affected (in a direct sense) by a decision made under the Environment Protection and Biodiversity Conservation Act.

26 Section 3, subsection 7 of the German Network Enforcement Act.

27 Australian Competition Consumer Commission Digital Platforms Inquiry, 4 March 2019, .

28 Hu, C., “Pharmaceutical companies are backing away from a growing threat that could kill 10 million people a year by 2050”, Business Insider, 21 July 2018, .

29 Sukkar, E., “Why are there so few antibiotics in the research and development pipeline?”, The Pharmaceutical Journal, 13 Nov 2013, .
