
RATING AND FILTERING OF THE NET

Contents

  • What Are Rating And Filtering?
  • What Are The Available Rating Systems?
  • What Are The Available Filtering Systems?
  • How Do Filtering Systems Work?
  • What Is The Case For Rating & Filtering?
  • What Is The Case Against Rating & Filtering?
  • How Should One Choose A Filtering System?
  • Where Can One Obtain Advice On Safe Surfing?
  • Further Reading

  • WHAT ARE RATING AND FILTERING?

    Rating - or labelling - involves assessing the content of a Web site in terms of the extent to which particular characteristics occur, notably violence, nudity, sex and language.

    Filtering involves limiting the material that may be downloaded to a particular computer at a particular time, in accordance with user selections that are based on such ratings of Web sites and applied through browser settings or filtering software.

    Rating and filtering therefore go together as a control or selection system based on the personal wishes of the end user or of the end user's supervisor.

    Link: Filtering FAQ from Computer Professionals for Social Responsibility click here

    WHAT ARE THE AVAILABLE RATING SYSTEMS?

    Currently there are three main rating systems in use.

    1) RSAC/ICRA

    The Recreational Software Advisory Council (RSAC) was originally set up in September 1994 and established a rating system for computer games in the USA. Subsequently the system was extended to web sites as RSACi. It rated material on five levels for each of nudity, sex, violence and bad language. Currently some 160,000 sites are RSACi-rated.
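    By way of illustration, an RSACi-style rating can be thought of as a simple vector of levels, one per category, which is then compared against the maximum levels a parent will accept. The category names below follow the RSACi scheme, but the sample values and limits are invented for the example.

        # An RSACi-style rating: each of the four categories is scored
        # on five levels, from 0 (none) to 4 (most explicit). The sample
        # values are invented purely for illustration.
        rating = {"nudity": 0, "sex": 0, "violence": 0, "language": 2}

        # A parent's maximum acceptable level per category.
        limits = {"nudity": 0, "sex": 0, "violence": 1, "language": 1}

        # The site is acceptable only if every category is within its limit.
        acceptable = all(rating[c] <= limits[c] for c in limits)
        print(acceptable)  # False - the language level of 2 exceeds the limit of 1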

    The RSAC is no longer operative and the rights to RSACi have been bought by the Internet Content Rating Association (ICRA). Although ICRA originated in the United States, it has obtained major funding from the European Commission and it is now headquartered in the UK.

    In December 2000, ICRA publicly launched its proposals for a successor rating system to RSACi. This new system - which has been extensively trialled - is a much more objective, yet sophisticated, system which takes account of the context of material and the very different cultures in the world. It has the active support of many of the leading players in the industry.

    In the UK, ICRA is supported by the Internet Watch Foundation [click here] and indeed the IWF's Chief Executive is the Secretary of ICRA.

    Link: Internet Content Rating Association click here

    2) SafeSurf

    This is a system developed by the SafeSurf Corporation. It is more sophisticated than RSACi in that it uses nine levels for a dozen or so characteristics including age range, gambling and intolerance.

    Link: SafeSurf click here

    3) NetShepherd

    This is a system based in Calgary in Canada. It assigns quality levels (1-5 stars) within maturity categories (general, child, pre-teen, teen, adult and objectionable).

    Link: NetShepherd click here

    All three of these systems are based on the same Internet protocol: the Platform for Internet Content Selection (PICS) which was developed by the World Wide Web Consortium (W3C) and first released to the public in March 1996. PICS works by embedding electronic labels in text or image documents so that their content is electronically checked and vetted before the computer displays them.

    Link: Platform for Internet Content Selection click here
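    To make this concrete, the following minimal sketch shows roughly what an RSACi-style PICS label looks like when embedded in a page, and how filtering software might extract the scores from it. The label and its values are invented for the example, not a rating of any real site.

        import re

        # An illustrative page fragment carrying an RSACi-style PICS label;
        # the rating values (n 0 s 0 v 0 l 2) are invented for this example.
        html = '''<head>
        <meta http-equiv="PICS-Label" content='(PICS-1.1
         "http://www.rsac.org/ratingsv01.html" l r (n 0 s 0 v 0 l 2))'>
        </head>'''

        # Pull out the label, then the four RSACi scores:
        # n = nudity, s = sex, v = violence, l = language.
        label = re.search(r'PICS-Label"\s+content=\'([^\']*)\'', html)
        scores = dict(re.findall(r'\b([nsvl])\s+(\d)', label.group(1)))
        print(scores)  # {'n': '0', 's': '0', 'v': '0', 'l': '2'}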

    The first two systems - ICRA and SafeSurf - rely on self-rating of Internet sites by Web publishers. While self-rating is voluntary and fair, so far it has not been widely implemented because there is no compelling economic incentive for organisations to rate their sites. By contrast, NetShepherd conducts third-party ratings of Web sites.

    WHAT ARE THE AVAILABLE FILTERING SYSTEMS?

    Both Internet Explorer and Netscape Navigator browsers have built-in content and ratings facilities. Furthermore many Internet service providers have parental controls or Web-filtering options. Also several of the major search engines can be set up so that they will not look for or will block access to various types of material. These include the use of Family Filter on AltaVista and Parental Control on Lycos.

    Currently there are around 40 specific blocking and filtering programs, most of them US-based. They include Chibrow, Cyber Patrol, CyberSentinel, CYBERsitter, Cyber Snoop, Eye Guard, Family Connect, Gulliver's Guardian, Hedgebuilders, I-Gear, The Internet Filter, KidDesk Internet Safe, Kliki, Net Nanny, PlanetWeb Parental Control, PureSight, SafeSurf, supananny, SurfControl, SurfWatch, We-Blocker, Xcheck, X-Stop. For links to the sites of these products, see note 1.

    Typically such software costs around $30-40 or £20-25. As an indication of the degree of sophistication that is possible, Cyber Patrol covers 12 different categories and can be set up for up to nine users.
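    Cyber Patrol's actual configuration format is proprietary, but a hypothetical sketch suggests what 'categories' and 'users' mean in practice: each user profile simply bars a different subset of the product's categories. The category names and profiles below are invented for illustration.

        # Hypothetical per-user filter profiles; the categories and the
        # profiles are invented, not taken from any real product.
        CATEGORIES = {"violence", "nudity", "sex", "language",
                      "drugs", "gambling", "hate", "weapons"}

        profiles = {
            "parent": set(),                        # nothing blocked
            "teen":   {"nudity", "sex", "gambling"},
            "child":  set(CATEGORIES),              # everything blocked
        }

        def is_blocked(user, site_categories):
            # A site is blocked if any of its categories is barred for the user.
            return bool(profiles[user] & set(site_categories))

        print(is_blocked("teen", {"gambling"}))    # True
        print(is_blocked("parent", {"gambling"}))  # False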

    Links:
    Browsers and Filters click here
    List of filters click here
    Family Guide Book click here

    HOW DO FILTERING SYSTEMS WORK?

    Since so few web sites (and no newsgroups or chatrooms) are rated, filtering systems essentially work in one of three ways, each illustrated in the sketch after this list:

    1. They have an 'allow' list of sites so that users can only access the sites permitted by the system. Such lists are also known as whitelists or 'good' lists.
    2. They have a 'deny' list of sites so that users can access all sites except those 'blocked' by the system. Such lists are also known as blacklists or 'bad' lists.
    3. They have 'keyword matching' which means that sites are 'blocked' if they contain previously determined unacceptable words or phrases.
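    The following minimal sketch shows the three approaches side by side; the example lists and keywords are invented placeholders, not drawn from any real product.

        # A minimal sketch of the three filtering approaches; the lists
        # and keywords below are invented placeholders.
        ALLOW_LIST = {"bbc.co.uk", "nasa.gov"}      # 1. whitelist
        DENY_LIST  = {"adult-site.example.com"}     # 2. blacklist
        KEYWORDS   = {"gambling", "violence"}       # 3. keyword matching

        def allowed(host, page_text, mode):
            if mode == "allow":    # only listed sites may be viewed
                return host in ALLOW_LIST
            if mode == "deny":     # everything except listed sites
                return host not in DENY_LIST
            if mode == "keyword":  # block pages containing banned words
                return not (set(page_text.lower().split()) & KEYWORDS)
            raise ValueError(mode)

        print(allowed("bbc.co.uk", "news headlines", "allow"))            # True
        print(allowed("adult-site.example.com", "", "deny"))              # False
        print(allowed("anysite.com", "online gambling tips", "keyword"))  # False
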
    Filtering products can have one of two bases:
    1. They can be client-based which means that filters have to be installed and maintained on each individual computer or work station used to access the Internet. Such systems are more prone to tampering and are more difficult to keep up-to-date.
    2. They can be server-based which means that they are installed either at a central location maintained by the network administrator of the organisation or company or at the server of the Internet service provider (ISP). Such systems are effectively immune from tampering and are generally kept current.

    In March 2000, two computer experts - Eddy L. O. Jansson of Sweden and Matthew Skala of Canada - were issued temporary restraining orders by a US District Judge because they had reverse-engineered the Cyber Patrol filtering software, using a utility called "cphack", to reveal the product's list of more than 100,000 Web sites deemed unsuitable for children and to enable disclosure of the parent's password.

    WHAT IS THE CASE FOR RATING & FILTERING?

    1) There is considerable evidence that users - especially parents and teachers - wish to adopt rating and filtering systems. This is clear from the extensive take-up of the wide range of market offerings. Furthermore it is reinforced by many studies and polls carried out both in the USA and the UK [see note 2].

    2) It is a mechanism that gives control to the owner or supervisor of the PC and therefore rests on personal choice and individual responsibility. It is a very flexible approach because it enables account to be taken of the norms of local cultures or the values of particular households.

    3) There is very wide acceptance indeed of rating and filtering by Governments, regulators, users, and Internet service providers. It is a world-wide approach which is well-fitted to the global nature of the Internet.

    Link: Bertelsmann Foundation's self-regulation project click here

    WHAT IS THE CASE AGAINST RATING & FILTERING?

    1) It is argued that rating and filtering is in effect a form of censorship since it denies access to certain material for certain people.

    BUT: This is really to misuse the word censorship. Censorship involves the permanent denial of access to material for people who have a legitimate right and a personal wish to see it. In the case of rating and filtering, no adult who wishes to access any material is denied it. Instead the individual adult decides what type of material he or she wishes to access and what kind of content children or other vulnerable individuals - such as violent prisoners or the mentally ill - should be allowed to access.

    2) It is argued that filtering systems are imperfect and sometimes over-inclusive. For example, a block on the word “sex” could deny access to information on Middlesex (where I live) or Sussex University (where my son went to university) or the filtering out of material on sex or drugs could block access to educational materials on AIDS or drug abuse prevention [for some examples, see note 3].
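    The over-blocking problem can be seen in a couple of lines of code: a naive substring match catches 'Middlesex' and 'Sussex', whereas matching only whole words does not. This illustrates the principle rather than how any particular product works.

        import re

        pages = ["Welcome to Middlesex", "Sussex University admissions",
                 "An explicit sex site"]

        # Naive substring matching over-blocks all three pages...
        print([p for p in pages if "sex" in p.lower()])

        # ...whereas matching "sex" only as a whole word blocks just the third.
        print([p for p in pages if re.search(r"\bsex\b", p, re.I)])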

    BUT: Insofar as there have been such problems, these instances are occurring less frequently as the relevant software becomes more sophisticated and takes account of the particular context of material. In any event, the absence of a perfect solution is not an argument against the best available solution.

    3) It is argued that rating and filtering provide a false sense of security to those who use such systems.

    BUT: This is not an argument against the use of rating and filtering, but rather a warning - and a sensible one - not to over-rely on such an approach. At the end of the day, parents, teachers and guardians cannot absolve themselves of personal responsibility for the conduct of children in their charge and - whether or not they choose to use filtering software - they should do everything possible to monitor the activities of their charges and to inculcate a sense of responsibility in them.

    4) It is pointed out by critics that, in almost every case, the suppliers of filtering software will not disclose the lists of sites blocked by their software.

    BUT: Most users of such software probably do not want to second-guess filtering software companies and are prepared to 'trust' them. This makes independent surveys of the effectiveness and appropriateness of such software especially useful. Also my personal view, as a consistent advocate of openness, is that suppliers of such software should be prepared to disclose their lists and debate the appropriateness of inclusion or exclusion.

    HOW SHOULD ONE CHOOSE A FILTERING SYSTEM?

    The "Superhighway Safety" pack issued in Britain by the Department for Education & Employment (DfEE) and the British Educational Communications & Technology agency (BECTa) recommends that, when one is buying a filtering product, one should ask the following questions:

    Links:
    American review of filtering software click here
    Australian review of filtering software click here


    WHERE CAN ONE OBTAIN ADVICE ON SAFE SURFING?

    Two excellent British sites are NCH Action For Children, which has a parents' guide to children on the Internet, and Childnet International, which makes awards to sites which particularly cater for children. At the European level, Childnet International has collaborated in a Netaware programme to promote safe surfing.

    In the USA, the GetNetWise coalition runs a very good site and a broad-based coalition called America Links Up runs the Net Parents site, while other useful American sites are Disney's SurfSwell Island, Family Guide Book, Safe Kids and Safe Surfing With Doug. "Safe And Smart: Research And Guidelines For Children's Use Of The Internet" is an American study published by the National School Boards Foundation, the Children's Television Workshop and Microsoft.

    Finally, although Singapore may be a small nation, it takes Internet safety very seriously and there is a Parents Advisory Group for the Internet (PAGi) which has an active programme around child safety.

    Links:
    Website Safety for Kids & Teens click here
    Childnet International click here
    GetNetWise click here
    Family Guide Book click here
    Safe Kids click here
    Safe Surfing With Doug click here
    Child Safety On The Internet click here
    Washington State office of the Attorney General click here

    FURTHER READING

    "Protecting Our Children On The Internet" edited by Jens Waltermann & Marcel Machill (Bertelsmann Foundation Publishers, 2000)

    ROGER DARLINGTON

    Last modified on 15 June 2013


    Note 1:

    Links to filtering products:

    Note 2:

    In May 1999, the Annenberg Public Policy Center in the USA published the findings of a survey conducted by Roper Starch which revealed that 78% of US parents are concerned about the type of content their children can access on-line and one-third of them use software to control access by their children.

    In October 2000, the Digital Media Forum in the USA published the findings of a survey of 1,900 individuals about the use of computers in schools which showed that 92% wanted pornography to be blocked and 79% wanted filters to be used to bar hate speech.

    In November 2000, the Center for Communication Policy of the University of California Los Angeles published "Surveying The Digital Future", the results of a survey of 2,096 American households which showed that 32.8% used filtering software to control access to the Internet by children.

    Note 3:

    In the UK, a "Which?" magazine survey of filtering software - published in the May 2000 issue - found that three of the filters tested blocked access to an advice and information site run by Lancashire Council's Youth and Community Service because of a small, factual paragraph about safe sex.

    In the USA, the Digital Freedom Network made 'awards' in October 2000 for examples of over-exuberant filtering software which included:

    British wildlife artist Richard Bell found that his Web site was banned by a US Internet directory after an automatic checking system picked up captions such as "a pair of Great Tits" and "a magnificent cock pheasant".

    In August 2000, a Los Angeles attorney called Sherril Babcock was prevented from entering the Web site BlackPlanet.com because the filtering software would not allow her to register under her real name, although shortly afterwards she was able to register as "Babpenis".

    In the February 2001 issue of "the net" magazine, it was reported that Philadelphia's prestigious Beaver College had been forced to change its name after seeing applications fall by some 30%. Apparently prospective students were being deterred from visiting the US college's site by filtering software blocking the word "beaver" on the grounds of its sexual connotations.
