
HOW SHOULD WE REGULATE CONTENT

IN A CONVERGED WORLD?

Submission to the Communications Review

of the Department for Culture, Media & Sport

INTRODUCTION

This submission is informed by my six years as the first independent Chair of the Internet Watch Foundation, my one year as the Vice Chair of the Nominet Policy Stakeholder Committee, my seven years as the Member for England on the Communications Consumer Panel, and my three years on the Board of Consumer Focus, but it does not in any sense represent the views of any of these organisations. It is a personal submission.

The submission addresses only Question 3 from the Secretary of State's open letter of 16 May 2011 – but it is a fundamental question: "Is regulatory convergence across different platforms desirable and, if so, what are the potential issues to implementation?"

CURRENT REGULATION OF BROADCASTING AND THE INTERNET

Broadly speaking, I would suggest that in the UK, as in most democratic countries, broadcasting is regulated around some concept or definition of offence. So 'excessive' or 'inappropriate' bad language, violent behaviour, sexual activity, and such anti-social practices as smoking, drinking and drug-taking are prohibited or confined to certain times or certain channels. Therefore essentially the test of acceptability is offence.

Historically the justification for this approach has rested on relatively objective factors, like the scarcity of spectrum and the limited number of channels, and on more subjective considerations, such as the authorities' wish to control what citizens see and hear and citizens' wish that someone protect the vulnerable, most especially children.

Implicit in this approach has been an assumption that the society in question has a broad set of values shared by the overwhelming majority of citizens that enables one to determine without too much controversy what is and is not offensive and that the bodies making these decisions can be trusted to reflect this consensus in their decisions.

Even in democratic countries which might be expected to have very similar approaches to what is 'offensive', it is noticeable how the United States is much more relaxed about violence than sex and how France is much more casual about sex than say the UK. Also it is often asserted – rightly, I think - that values have changed and that we all see more sex and violence on television than we used to do.

I would suggest that, by contrast, in the UK and most democratic nations 'regulation' of the Internet simply borrows from general law and that, as far as is practical, what is illegal offline is regarded as illegal online. This would include criminal content such as child abuse images (what many – wrongly, I believe – call child pornography), 'extreme' adult pornography, race hate material, and inducement to violence or other activities which are of themselves illegal such as drug-taking, fraud or robbery. It would also include content such as libel or copyright infringement. Therefore the test of acceptability is our domestic law.

The main problem here is that, while most law is domestic, the reach of the Internet is truly global. Furthermore the Internet has a more opaque delivery mechanism than broadcasting and it is more difficult, but usually not completely impossible, to ascertain who is responsible for the content and who can be legally obliged to amend or remove it.

In theory, Internet content can be 'regulated' through a much lower test than illegality, since all providers of Internet connection and all hosters of Internet content have terms and conditions of use – often known as abuse policies – which would enable them contractually to remove all sorts of 'offensive' material. But, in practice, companies are reluctant to act on such complaints, both because they fear being accused of censorship (especially in the United States which ironically hosts more child abuse images than any other nation on the globe) and because such processes could be complicated and time-consuming to operate on any significant scale.

WHY CONTENT REGULATION NEEDS TO CONVERGE

Regulation of broadcasting and the Internet cannot be the same. It would be both technically impossible and socially unacceptable. The issue is whether the sharp differences in regulation and the fundamentally different tests of what is acceptable should continue. My own view is that, over time and with consumer education, we should move to a less differentiated model. Why?

First, the current broadcasting model is no longer appropriate. Effectively there is no scarcity of spectrum or channels – the volume of content and the range of choice are now enormous. In the UK as in many countries, there is no longer a real consensus about what constitutes offence – we are much more cosmopolitan and much more variegated in our tastes and values and what would outrage one family would be no problem at all to another.

Second, the current Internet model is no longer adequate. When the Net was used by a few thousand academics and nerds, maybe we did not need to worry too much about its content. But now the Internet is a mass medium – indeed it is the mass medium – with some two billion users. To limit 'regulation' simply to material which is illegal is not facing up to some serious challenges of Internet content – such as pro-anorexia, pro-bulimia, pro-self harm and pro-suicide sites – or to the wishes of consumers for some more protection and guidance.

Third, convergence now means that regulation based on device – one system for broadcasting because it is delivered on radio and television sets and another system for the Internet because it is delivered on a computer – is wholly inappropriate and unsustainable. Already one can have a split screen with the broadcasting of a television programme as the main picture and a live Twitter feed about the programme on a smaller section of the same screen. Tablet computers (like the iPad) and Internet Protocol Television (IPTV) are accelerating the convergence of content delivery.

It might be argued that convergence is not yet sufficiently advanced to require policymakers to address now the challenge to regulation. But the DCMS review is looking for legislation in 2015 and it is suggested that such legislation should be fit for purpose for around a decade, that is until 2025.

Relevant data can be found in the November 2010 report to the European Commission's DG Information Society and Media "Towards A Future Internet: Interrelation Between Technological, Social And Economic Trends". This report utilised a Delphi survey of opinions from some 235 experts. They were asked: "When will average Internet use exceed watching broadcast TV?" Some 6% said now; 36% said 2015; 31% said 2020; 18% said 2025; 6% said 2025+; and 2% said never. It will be seen, therefore, that a large majority of experts expect online time to exceed broadcast television by 2020 and that 42% believe it will actually happen much earlier, by 2015.
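The arithmetic behind these claims can be checked directly from the percentages quoted. A minimal sketch (the figures are taken from the survey as cited above; the variable names are illustrative only):

```python
# Delphi survey responses (235 experts): when will average Internet use
# exceed watching broadcast TV? Values are percentages of respondents.
responses = {"now": 6, "2015": 36, "2020": 31, "2025": 18, "2025+": 6, "never": 2}

# 42% expect the crossover by 2015 (i.e. "now" plus "2015" answers)
by_2015 = responses["now"] + responses["2015"]

# A large majority expect it by 2020
by_2020 = by_2015 + responses["2020"]

print(f"By 2015: {by_2015}%")   # 42%
print(f"By 2020: {by_2020}%")   # 73%
```

The cumulative figure of 73% by 2020 is what justifies the phrase "a large majority" in the text above.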

HOW CONTENT REGULATION COULD CONVERGE

In considering how regulation of broadcasting and the Internet could converge, I would suggest three basic principles:

  1. Regulation does not have to be perfect to be appropriate. We have a 70 mph speed limit on motorways but many drive faster than that and we still have accidents. The law is not there to spoil enjoyment of driving but to minimise harm. This is a sound principle to apply to content regulation too.
  2. We need to distinguish less between regulation of the same content delivered over different platforms or by different technologies. So differences between broadcasting regulation and Internet regulation should – over time – become less. This would mean a lot less regulation of broadcasting and a bit more regulation of the Internet. Overall the impact would be significantly deregulatory.
  3. We need to distinguish more between how we treat different types of content. Broadly speaking, we regulate broadcasting in terms of offensive content which – in my view – catches too much material and we regulate the Internet in terms of illegal content which – again in my view – catches too little material. My suggestion is that, as well as notions of offensive content and illegal content, we should develop an intermediate category of harmful content and focus new mechanisms on tackling this.

If we are going to have a more converged approach to content regulation, then essentially we have three broad choices.

First, we could regulate broadcasting the way we regulate the Internet, so that all content would be accessible unless it was illegal. This would throw open what was permissible on television to an extent which I believe would be politically and socially unacceptable. Our screens would be awash with sex and violence and not just when we 'pull' it down from the Net but when it is 'pushed' at us by broadcasters.

Second, we could regulate the Internet the way we regulate broadcasting, so that anything offensive on the Net would have to be blocked or limited in some way. In a global medium where every user has the opportunity to create content, this would be technically impossible (although it is attempted in totalitarian regimes like China or Iran). Furthermore it would change the whole concept of the Net and radically diminish the rich and varied content that we currently enjoy.

Third, we could seek some sort of middle way that uses a different test of acceptable content – one that is not so strict and subjective as offence or taste or decency, but one that is not so limited and difficult to enforce as illegality. What could such a test be? As I have indicated, I would suggest for debate the test of harm.

But how would one define harmful content? I offer the following definition for discussion and debate: "Content the creation of which or the viewing of which involves or is likely to cause actual physical or possible psychological harm." Examples of material likely to be ‘caught’ by such a definition might be glorification or trivialisation of violence, incitement to racial hatred or acts of violence, and promotion of anorexia, bulimia or suicide.

An obvious response to this proposal for a harm test is: how can you draw the line? In reality, in the UK (as in most other countries), people are drawing the line every day in relation to whether and, if so, how and when one can hear, see, or read various forms of content, whether it be radio, television, films, DVDs, video games, newspapers or magazines. Sometimes the same material is subject to different rules – for instance, something unacceptable for broadcast at 8 pm might well be permissible at 10 pm, or a film which receives a particular certificate in the cinema might be cut for broadcast television.

Therefore I propose, in relation to Internet content, that we consult those bodies which already make judgements on content, plus other relevant parties, about the creation of an appropriate advisory panel for Net content. We would then create an independent panel of individuals with expertise in physical and psychological health, who would draw up an agreed definition of harmful content, together with supporting advice, and be available to judge whether material referred to them did or did not fall within this definition.

What would one do about such harmful content on the Net?

Once we have effective regimes for illegal and harmful content respectively, one has to consider the material which is offensive – sometimes grossly offensive – to certain viewers of television programmes or users of the Internet. This is content which some users would rather not access or would rather that their children not access.

Now identification of content as offensive is subjective and reflects the values of the user, who must therefore exercise some responsibility for controlling access. The judgement of a religious household would probably be different from that of a secular household. The judgement of a household with children would probably be different from that of one with no children. The judgement of what a 12 year old could access might well be different from what it would be appropriate for an 8 year old to view. Tolerance of sexual images might be different from tolerance of violent images.

It is my view that, once we have proper arrangements for handling illegal and harmful content, it is reasonable and right for government and industry to argue that end users themselves have to exercise prime responsibility and control in relation to material that they find offensive BUT we should provide users with effective techniques and tools that they can use to exercise such control and actively educate users over the use of such techniques and tools.

For television, this would involve warning information in the electronic programme guide and PIN protection for adult channels and possibly use of warning symbols on screen during transmission. For the Internet, this would involve use of filtering software and better understanding of search engine options and privacy settings on social networking sites. In both cases, home hubs would probably provide more technological options. In both cases, there should be public programmes to promote media literacy.

Therefore regulation of broadcasting and the Internet would increasingly focus on illegal and harmful content, progressively leaving offensive content as a matter essentially for viewers and surfers to block if they thought that appropriate for their family or household. This would suggest a convergence of the regulation of broadcasting and the Internet to a model which, compared to the present situation, would involve a lot less regulation for broadcasting and a bit more regulation for the Internet.

Two issues are crucial here:

CONCLUSION

It is clear that:

I want to see a regulatory model for content that meets the following criteria:

Therefore I commend study of a model which has the same test for content that is broadcast and that is online, which has at the heart of that test the notion of harm, and which uses an expert panel to make quick, consistent and transparent decisions. This will not happen overnight and it will not be easy to win immediate and universal support – but this, I believe, should be the direction of travel.

The Secretary of State's open letter declares: "A deregulatory approach that deals with these developments to the benefit of both consumers and citizens, and also industry, is the aim." The model proposed in this submission is overall substantially deregulatory – and empowering.

The letter states: "A new Bill is the end point of whole process, but we are willing to take action sooner where primary legislation is not required." Most of the proposals in this submission could be implemented through non-statutory, self-regulatory means. Government should challenge the industry to set up the necessary modalities on the understanding that, if it does so, the new Bill will have little to say on content regulation but, if it fails to do so, then Ministers will seek the necessary legislative powers.

ROGER DARLINGTON

30 June 2011
