
MY INFORMATION TECHNOLOGY
ARTICLES IN 2007

Since 2003, I have written regular articles on information technology for Connect, which used to be a separate trade union and in January 2010 became a section of the larger union Prospect. Originally the magazine was called "The Review" and then in April 2004 it was renamed "Connected". The text of all these articles, with relevant hyperlinks, is filed on my web site and this page brings together all those from 2007. If you would like to comment on any of them, e-mail me.

Jan/Feb 2007 How You Became The Web
March 2007 The Network Of The Future
April/May 2007 The Changing Picture Of Television
June 2007 What Is It With Being Connected?
July/Aug 2007 Could The Net Fall Over?
September 2007 More Digital Divides Open Up
Oct/Nov 2007 Is Wikipedia The Best Site On The Web?
December 2007 Are You Safe On-line?


In Britain, at least until recently, UGC was a reference to a cinema chain (it's now called Cineworld [click here]). But these days the acronym means user-generated content – the phenomenon that is transforming the web. Our Internet correspondent Roger Darlington explains ...

HOW YOU BECAME THE WEB

"Time" magazine's 'Man of the Year' for 1982 was not a man at all but a machine - the computer. In those days, computers were in very few homes, hardly anyone had heard of the Internet, and the World Wide Web did not exist.

In 1999, "Time" magazine's 'Person of the Year' (notice the gender change) was Jeff Bezos, the founder of the e-commerce web site Amazon. By then, PCs were ubiquitous and the web commonplace but dominated by major commercial interests.

In 2006, "Time" [click here] chose as its 'Person of the Year': 'You. Yes, you. You control the Information Age.' For this edition of the magazine, seven million pieces of reflective Mylar were ordered for sticking on the front cover, so that you saw yourself on the page.

So, what was all this about? Why did the magazine then devote no fewer than 27 pages to examining how you and I are now shaping the Net [for main article click here]?

In a sense, the Internet - first created in 1969 - started as a reservoir of user-generated content (UGC). It initially consisted mainly of e-mail and bulletin boards.

Then, with the invention of the World Wide Web in 1989, the Net became dominated by information and transactional sites created by old and new corporations.

What has happened in the last few years is that increasingly the largest volumes of material on the Net – and the fastest-growing and the most interesting – are coming not from organisations but from individuals.

It started with blogging and, as the free software became simpler, this spread like wildfire. Then it received a whole new impetus from social networking sites where users posted personal details (see Facebook [click here]) or photographs (see Flickr [click here]) or video clips (see YouTube [click here]) for all of us to access.

2006 was the year we knew this was big when Rupert Murdoch's News Corporation snapped up MySpace [click here] for a cool $580 million and Google acquired YouTube [click here] for an eye-watering $1.65 billion. But just how big is this development?

Today there are around one billion Net users globally. Now, of course, not all of these have a web site or blog or even put details on a social networking site or comments on a blog posting. But, compared to old media (for instance, the number who send letters to newspapers), the level of participation on the web is breathtaking.

The number of blogs worldwide is now over 60 million. Wikipedia [click here] now has around 1.6M articles in English alone. Every day, some 65,000 new video clips are uploaded to YouTube and every day over 100 million videos are watched on this site.

The level of activity of some users is astonishing.

For instance, there is a 25-year-old Vietnamese-American called Tila Nguyen – known professionally as Tila Tequila – whose profile on MySpace [click here] has been viewed more than 50 million times and who receives between 3,000 and 5,000 new friend requests a day. Or take the 25-year-old Canadian Simon Pulsifer, who has authored over 2,000 articles on Wikipedia [click here] and edited roughly 92,000 others.

Of course, Tila and Simon are very exceptional. But consider 45-year-old South Korean housewife Kim Hye Won. She has authored about 60 pieces for the on-line newspaper OhMyNews [click here], which has 47,000 such contributors and obtains between 1 and 1.5 million page views a day.

Much of this UGC is – like much newspaper and magazine content – inconsequential, but sometimes it is explosive.

When a video clip filmed by S. R. Sidarth of American politician George Allen making a racist remark about him was posted on-line, it had repercussions that arguably led to the Democrats winning the US Senate in November 2006 [for explanation click here].

When Iraqi dictator Saddam Hussein was hanged in December 2006, the world became aware of the taunts and insults he received because someone there filmed the event with a mobile phone and put it on the Net [click here].

I guess that I am part of this revolution. I created a web site in 1999 and started blogging in 2003. Today my site receives around 3,000-4,000 visits a day. So maybe “Time” is right and I am “Person of the Year” after all – but then so are you.


Our current communications infrastructure is unsatisfactory and unsustainable – but rebuilding it will take money and imagination, explains our Internet correspondent Roger Darlington.

THE NETWORK OF THE FUTURE

In January 1980, I wrote a research report entitled “Optical Fibre Technology” for what was then called the Post Office Engineering Union (POEU). This examined trials of optical systems in the network of what was then the Post Office Telecommunications Business.

In May 1989, I organised a public conference called “The Network Of The Future” for what by then was the National Communications Union (NCU). The event promoted the case for a national broadband network carrying both telecommunications and broadcasting on optical fibre.

Almost two decades later, where do we find ourselves?

In a sense, Britain now has a national broadband network, since over 99% of homes are connected to exchanges that can provide bandwidth of half a megabit per second. In a way which we never envisaged in 1980 or 1989, the copper network has proved able to deliver broadband through the use of Asymmetric Digital Subscriber Line (ADSL) technology.

In another sense, though, we are no further forward, since we have no optical fibre in the local loop and, unlike other countries, there are no current plans to develop what is now called Next Generation Access (NGA).

Certainly the core network will be transformed, as all network operators invest heavily in what are called Next Generation Networks (NGNs) which use the Internet Protocol (IP), of which the most important in the UK is BT's 21st Century Network (21CN). But, as far as Next Generation Access is concerned, this country is in a state of paralysis.

In the UK, we are going to develop ADSL technology using those old trusty and rusty copper wires. Theoretically ADSL2+ technology will deliver bandwidth of up to 24Mbps, but in reality the state of the copper lines and the distance from the exchange will ensure that most customers will receive much slower speeds.
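
To make the distance effect concrete, here is a minimal sketch in Python. The attenuation figures in the table are purely illustrative assumptions for this sketch, not any operator's official rate chart, but they show the general shape of the relationship between line length and achievable ADSL2+ speed.

```python
# Rough, illustrative mapping of copper line length to likely ADSL2+ downstream speed.
# The figures are indicative assumptions for this sketch, not measured or official data.
ILLUSTRATIVE_RATES = [
    (0.5, 20.0),   # km from exchange, approximate Mbps
    (1.0, 17.0),
    (2.0, 12.0),
    (3.0, 8.0),
    (4.0, 4.0),
    (5.0, 2.0),
]

def estimate_adsl2plus_speed(line_km: float) -> float:
    """Linearly interpolate an indicative downstream speed for a given line length."""
    if line_km <= ILLUSTRATIVE_RATES[0][0]:
        return ILLUSTRATIVE_RATES[0][1]
    for (d1, r1), (d2, r2) in zip(ILLUSTRATIVE_RATES, ILLUSTRATIVE_RATES[1:]):
        if line_km <= d2:
            # Interpolate between the two nearest points in the table.
            return r1 + (r2 - r1) * (line_km - d1) / (d2 - d1)
    return ILLUSTRATIVE_RATES[-1][1]  # beyond 5 km, assume the floor value

if __name__ == "__main__":
    for km in (0.5, 1.5, 3.0, 4.5):
        print(f"{km} km from the exchange: roughly {estimate_adsl2plus_speed(km):.1f} Mbps")
```

Even on these generous assumptions, a customer a few kilometres from the exchange sees only a fraction of the theoretical 24Mbps.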

Meanwhile, other countries are taking a much more imaginative approach. The USA, Japan, South Korea, France, Germany and the Netherlands are all making significant investments in NGA.

In this country, all that has happened so far is that in November 2006, the regulator Ofcom issued a public discussion document – not a formal consultation document – entitled “Regulatory challenges posed by next generation access networks”.

The key regulatory issue here is whether NGA will create what the economists call “an enduring economic bottleneck”, that is a stranglehold on competition through total control of a key element of the network.

The regulatory answer to such a bottleneck is to promote downstream competition through mandated access. The problem is that BT will not want to make heavy and risky investments in optical fibre if it is immediately forced to open up access to these networks to its competitors.

The regulatory answer to that dilemma is called forbearance - that is, permitting the investor in NGA sole use for a period of time sufficient to make the investment worthwhile.

In principle, I would favour a measure of forbearance but limited in both time and location. There is a case for forbearance in rural areas and downstream competition in urban areas.

In its discussion document, Ofcom states: "It is not for Ofcom to determine when or how public policy is employed with respect to next generation access deployment. However, the wider social implications are a key feature of the debate on next generation access."

This leads me to assert that next generation access is too big an issue to be left to the regulator. Ultimately NGA must be a public policy issue for Government. There is a strong case therefore for arguing that there should be a Government review of the case for this 'network of the future'.

In the summer, we will have a new Prime Minister looking for some big forward-looking ideas. I offer him this one.

Links:
BT's 21st Century Network click here
Next Generation Networks UK click here
Ofcom discussion document on next generation access networks click here
What's happening in the Netherlands click here
What's happening in Sweden click here
What's happening in Norway click here
CWA's Speed Matters campaign click here


Do not adjust your sets. The strange things happening to television are what you should expect from a digital revolution, explains our regular columnist Roger Darlington.

THE CHANGING PICTURE OF TELEVISION

In 1996, Andy Grove, then Intel’s Chief Executive, wrote a book called “Only The Paranoid Survive” [for my review click here]. This work was based on his concept of the strategic inflection point which he defined as “a time in the life of a business when its fundamentals are about to change”.

If you were working in the television industry today – an £11 billion business in the UK – you would have no doubt that you are smashing right into such a strategic inflection point because all the fundamentals of television are being utterly transformed by digital technologies.

Let's consider these one-time eternal verities one by one.

We long ago moved from the single set in the living room, so that now most households have smaller, even portable, sets in the kitchen or bedroom. However, broadband now means that we are starting to watch television material on our PC or laptop, and the BBC's iPlayer and Internet Protocol television (IPTV) will give a big boost to this trend. Other devices – such as games consoles and mobile phones – are increasingly offering access to television.

In the 1980s, the video cassette recorder (VCR) started the process of time-shifting, but half of viewers never managed to master these consumer-unfriendly devices. Now, though, we have the personal video recorder (PVR), a hard-disk device that is much simpler to use and has much greater storage capacity; if (like me) you have a Sky+ box, recording a programme is a one-click operation and recording a whole series is only a two-click operation.

There was a time when the most popular programmes would attract audiences of around 20 million and you could chat to many people at work about the item you viewed the night before. This was great for advertisers who knew what they were getting for their money. The current fragmentation of audiences over many hundreds of channels restricts 'water cooler TV' to programmes like “Big Brother”.

If we take the broadest view of public service broadcasting or PSB (BBC 1 & 2, ITV 1, Channel Four & Five), then we find that only two-thirds of viewing is of the five main channels as viewers switch rapidly to the proliferation of digital channels. As the 'big five' respond to this challenge with more popular programming, this trend is undermining the whole concept of PSB.

In fact, advertising revenues are actually falling, which presents a serious threat to ITV especially. Meanwhile subscription revenues are rising rapidly and subscriptions now raise more money for television than advertising. For the moment, the BBC licence fee is safe but, as viewers watch less and less BBC television, the sustainability of this funding model becomes ever more strained.

We used to have documentaries and drama, but many programmes must now be classed as drama-documentaries as the styles are mixed. Similarly comedy and drama used to be quite different but a series like “Desperate Housewives” cleverly embraces both genres. Programming and advertising used to be completely separate whereas we now have a lot of shopping channels in which the programme is the advert.

In the beginning, the BBC and ITV made most of the British-produced programmes, but the independent production companies – the so-called 'indies' – are now strong not just on the commercial channels but on the BBC channels too. The value chain is becoming more complicated as one company screens a programme, a second makes it, and a third provides the interactive elements such as voting.

Increasingly we will all become television-makers since events can be filmed with relative ease and little expense using new digital technologies. Community Channel [click here] is a voice for community groups, charities of all sizes and not-for-profit organisations. A new channel called Information TV [click here] carries material from government departments, public bodies and other public service institutions. Current TV [click here] – promoted by Al Gore - offers a mix of self- and viewer-made short shows.

Most of these changes are good news for consumers: more choice and more empowerment. But, if you are Mark Thompson of the BBC or Michael Grade of ITV, it must make management and funding little short of a nightmare.


You may think that you're already bombarded with electronic messages, but many of us are seeking more and the technology is certainly going to deliver many more. Our Internet columnist Roger Darlington considers the question:

WHAT IS IT WITH BEING CONNECTED?

It is not so long ago that each day most of us received simply a few letters and several dozen phone calls. Today typically many of us also receive a dozen or so text messages on our mobile and several hundred e-mails on our PC, laptop or PDA.

Many of us think that we have reached our capacity to absorb and handle messages and that we are already suffering from information overload – and yet consider some current behaviours and some future technologies and you will appreciate that this is just the beginning.

When you wake up in the morning, what is the first thing you do? For many of us, one of the first things we do is to switch on the mobile (if it was even off) and to check our e-mail.

When you are at a meeting, are you paying full attention to the discussion? Chances are that you're checking your e-mail or surfing web pages on your laptop or BlackBerry (not for nothing known as a 'crackberry').

When you go on holiday, do you leave it all behind? A lot of us are regularly texting family and friends and some of us are even checking our e-mails on the laptop or PDA.

Young people – and, even if Connect does not have many young members, we have our children and know their friends – are even more compulsive. You might be surprised at how many Instant Messenger or text messages your daughter sends – even to school friends that she sees every day.

People in their 20s and 30s are signing on big time to a host of new social networking sites that marry the web and the mobile and enable them to share the details of their everyday life with a close circle of friends, or anyone who wants to connect with them, or even any web user – and all in real time at very low cost.

These sites have weird names like Twitter [click here], Kyte [click here], Radar [click here] and Jaiku [click here]. Most people have never heard of them, but they are symptomatic of how technologies are changing our patterns of communication and of our seemingly insatiable appetite for messages.

Why is this happening?

In previous societies, people obtained a strong sense of identity from their family, village, class or religion. All these social units have become fragmented and less important to many people. But all of us still have a strong desire to belong and to be recognised - perhaps especially young people who are still establishing their role in life.

This is where new communications technologies are so appealing and even addictive. They give people a sense of belonging, involvement and participation or even – to become a bit more psychological for a moment – a feeling of validation, inclusion and desirability. And the networks are operating 24/7, the feedback is instantaneous, the usage is simple, and the cost is minimal.

Even for those of us not looking for more messages, the technology is going to make communications more useful and desirable. This is because things like radio frequency identification (RFID) tags, satellite navigation chips, and new radio technologies are going to enable millions and eventually billions of everyday objects to communicate with each other and with us.

Miniature devices in a medicine container or special bracelet could tell you whether your ageing grandmother has taken her tablets today or failed to rise from bed. Similar devices in your food packaging could tell you whether the 'eat by' date has passed, or in your clothes could set the washing machine to the correct programme.

And, of course, distance will be no obstacle at all. Even on holiday in Australia, you will be able to alter the heating and lighting in your home or to record a radio or television programme.

As well as issues of information overload and personal control, there will be a profound question of privacy. Currently there is a pact – often unwritten - between citizen and state and between consumer and company. But, as new communications technologies become ubiquitous, we will need much more transparency and openness about how information is collected, communicated and controlled.


We are now critically dependent on the Internet, but the network is under pressure both internally and externally, as Roger Darlington explains.

COULD THE NET FALL OVER?

Every government department, almost every business of any size, and over 60% of homes in the UK now use the Internet. Over one billion people worldwide are now connected. Every day, the numbers increase. Last year, China alone added 26 million users.

As more organisations and individuals make more intensive use of the Net, we become ever-more dependent on the network to work fast and efficiently every second of every day. But can we totally rely on this?

There are two main threats to the effectiveness of the Internet: an internal one which is that it could run out of capacity and an external one which is that it could be struck by a malicious attack.

Let's start with the capacity problem. Originally the Net only carried messages: e-mails and bulletin boards. Then we saw the development of picture-rich web sites. Now we have sites such as YouTube with streaming video and surfers using the likes of BitTorrent to download huge files.

This has required the capacity of the Internet to increase year on year on year. Indeed Verisign, the American firm which provides the backbone for much of the Net, including the .com and .net domain names, is investing $100M (£51M) over the next three years to increase bandwidth 10-fold for new services.

Early in the year, a report from Deloitte said 2007 could be the year the Internet approaches capacity, with demand outstripping supply. It predicted bottlenecks in some of the Net's backbones as the amount of data overwhelms the size of the pipes.

In fact, there is virtually unlimited capacity in the Net's backbone infrastructure because so much optical fibre has already been laid. A problem is more likely to occur with the routers, although Cisco is now manufacturing routers that can handle a staggering 92 terabits a second.
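
To put 92 terabits a second into perspective, here is a back-of-the-envelope calculation. The 2 Mbps per-stream figure is an illustrative assumption for a standard-definition video stream, not a measured value.

```python
# Back-of-the-envelope: how many simultaneous video streams could a 92 Tbps router carry?
# The 2 Mbps per-stream figure is an illustrative assumption, not a measured value.
ROUTER_CAPACITY_BPS = 92e12      # 92 terabits per second (figure quoted in the article)
STREAM_RATE_BPS = 2e6            # assumed bitrate of one standard-definition video stream

simultaneous_streams = ROUTER_CAPACITY_BPS / STREAM_RATE_BPS
print(f"Roughly {simultaneous_streams / 1e6:.0f} million simultaneous 2 Mbps streams")
# -> Roughly 46 million simultaneous 2 Mbps streams
```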

Much more serious is the weakness of the so-called 'last mile' – the copper cables that connect almost all Net users to their ISP. This is why ISPs are putting caps on usage or charging more for extra usage or using techniques like bandwidth shaping.

The other threat to the operation of the Net is external: either an incident which takes out capacity or an attack which damages the proper running of systems or sites.

It was hardly reported outside Asia but, a few weeks ago, Vietnam lost half of its Internet connections because some fishermen off Ca Mau (at the southern tip of the country) dived into the sea and cut out about 98km of optical fibre cable in order to sell it. Apparently it will take $4 million and some four months to repair the lines.

My man in Hanoi e-mailed me: “From this morning, it's very difficult for us to load international web pages. The Internet turns very slow and idle. That's really a disaster to me. Hope that things will be OK soon and those fishermen should be punished for their actions."

Even more serious is the risk of attacks on web sites for political reasons or terrorist purposes.

In May, a significant number of web sites in Estonia - especially ones of Government departments and political parties - were under heavy attack for weeks and the strong suspicion was that Russia was behind the assault because of strained relations between the two countries. NATO dispatched some of its top cyber-terrorism experts to Tallinn to investigate and to help the Estonians beef up their electronic defences [for more information click here].

But this was merely the visible tip of a much more extensive series of unreported attempts to compromise systems and sites.

Why should Hamas and Hezbollah content themselves with sending suicide bombers and rockets into Israel if they could disable the IT networks of Mossad or the Israeli Defence Ministry? You can be sure that, even as you're reading this, there are people in Syria and Iran attempting such cyber-attacks.

The sort of electricity outage we saw in the north-east of the USA a few years ago could look like a minor inconvenience compared to a major collapse of, or attack on, a section of the Net. Every government agency and sizable company should have a contingency plan for such an eventuality.

It may never happen – but don't be too surprised if it does.

Links:
Could the UK face a cyber attack? click here
Chinese attacks on US and UK sites click here
Attacks on New Zealand's sites click here
DoS attack on Verizon click here


From some of the media hype, you could be forgiven for thinking that everyone is now on-line and generating their own content, but our Internet correspondent Roger Darlington studies the latest survey and finds that the picture is more variable than one might think.

MORE DIGITAL DIVIDES OPEN UP

The Oxford Internet Institute (OII), headed by Professor William H Dutton, has now run three biennial Oxford Internet Surveys and the 2007 report gives not just a detailed profile of UK Net users and usage but an indication of some broad trends.

We tend to think that everyone is now on-line, but the survey finds that, while two-thirds of homes have access to the Net, 34% still do not. The OII notes that “the increase in access and use of the Internet from 2003 to 2007 has been slow, if it has not yet reached a plateau”.

Age is a major factor: while 90% of those under 18 in the survey use the Net, only 24% of those aged over 75 did so. Lifestyle is another important factor: 97% of students and 81% of the employed are on-line, but only 31% of the retired are.

Perhaps the biggest differentiator is income: of those earning less than £12,500 a year, only 39% use the Net but, of those taking home more than £50,000, the figure is 91%. A final factor worth mentioning is disability: the survey found that only 36% of those with a disability are connected compared to 77% of those without a disability.

So, are these digital divides going to dissolve any time soon? The OII records that, among households without the Internet, the proportion that say they are planning to get access in the next year has dropped dramatically. Less than one-fifth (18%) plan to obtain access, compared with 44% in 2005.

Just why do those off-line not wish to become connected? Expense is far from being the major reason (51%). Mostly it is all about not knowing how to use the Internet (81%) or a computer (77%) or feeling that it is not for people like them (60%).

There is a major challenge to Government here. Something like a third of households are unlikely to go on-line anytime soon unless there are some significant support programmes.

So, what about those who are connected? Are they making full use of the Net?

Well, e-commerce is doing fine with 79% of Net users saying that they buy products and services on-line, although 38% report that it is difficult to return or exchange goods which have been purchased on the Internet. Also there is slow growth in use of government web sites: 29% access local government sites and 26% go to central government sites.

However, active civic participation is very, very low: only 7% have signed an on-line petition and a mere 2% have used the Net to contact a politician.

But politics is boring, right? Instead we are all running our own web site or blog and social networking like crazy. Well, not exactly.

Passive production such as posting photographs is fairly common (28%), but only 15% maintain a personal website and only 12% (actually down from 17% in 2005) run a blog. Less than one-fifth (17%) of Net users have created a profile on a social networking site; students are almost three times as likely (42%) as employed users (15%) to have a profile, and almost no retired users (2%) have one.

Among the rich data in the OII survey, one other subject is particularly intriguing, namely the divided views on the idea of Internet regulation. There is simply no consensus about whether the Government should regulate the Net. Just over a third (36%) think that it should, just under a third (30%) think that it should not, and the remaining third (33%) are undecided.

Non-users and ex-users think the Internet should be regulated by Government to a larger extent than users. 51% of non-users support such regulation, compared to 31% of users.

So it would appear that usage of the Internet alleviates some of the fears about being on-line. Nevertheless, a massive majority (85%) believe that children's content should be restricted, a third (34%) have received a virus on their computer, and 24% complain about too much spam.

Link: Oxford Internet Surveys click here


We all have our favourite web sites and here our Internet columnist Roger Darlington looks at the background to one of the sites he uses most.

IS WIKIPEDIA THE BEST SITE ON THE WEB?

It has over 8 million pages in more than 250 languages. It is by far the biggest encyclopedia ever written. And it's all done by volunteers and free to all users.

It is of course Wikipedia – the web site that shouldn't be possible. So how has it happened?

The site was founded by the American Jimmy Wales [click here] and his then partner Larry Sanger and it was originally called Nupedia. The concept then was to invite experts to contribute articles and, by the end of the first year, they had a grand total of 22 articles. The next year was not that much better.

The plan changed dramatically when the founders decided to use the idea of the wiki which enables any Net user to contribute an article or to edit one. In the first two weeks of the new approach, they had more articles than in the two years of Nupedia.

The whole enterprise seems to defy the laws of business and economics. The Wikimedia Foundation is run as a charity on a budget of £700,000 a year provided by donations, mostly of around £20. It takes no advertising.

It employs only seven people with an office in St Petersburg, Florida and one-room outposts in California and Poland. Its main servers are in Tampa, Florida with additional servers located in Amsterdam and Seoul.

And yet it is among the top ten most visited web sites in the world and, at peak times, has around 15,000 visitors a second. It is probably worth more than £2 billion.

Merely to summarise the content of the site is to stretch the imagination. In its main language, English, there are almost two million articles. There are another 630,000 articles in German and 550,000 in French. There are over 400,000 in Polish and Japanese and well over 300,000 in Italian and Dutch.

And so it goes on. The top 14 languages have over 100,000 articles and the top 139 have over 1,000 articles. At the last count, there was some material in 253 languages and the total number of pages stood at 8.2 million.

Every page contains links to other pages. And amendments are being made and pages are being added every second of every day. A typical page has been edited 16 times and the site is growing at the rate of 1,700 articles a day.

Wikipedia already has a range 20 times greater than the entire 17 volumes of the “Encyclopaedia Britannica”. The grandiose mission of Jimmy Wales is “to bring the sum of human knowledge to every single person on the planet, free, in their own language”.

It is not just the pages of factual content that are impressive. For each page, there is a record of all the amendments made and a discussion forum to debate the content. The democratic nature of the original authorship and subsequent amendments, together with the transparency of the whole process and the invitation to improve and debate the material, make Wikipedia a truly radical project.

The contributors to the site number in the hundreds of thousands. Some 75,000 of them are active contributors, there is a core of around 4,000 people who make more than 100 edits a month, and there are 1,000 official 'admins' who arbitrate when 'trolls' try to vandalise pages and who, if necessary, block unruly users from the site.

But does the process work? Can one rely on the accuracy of Wikipedia?

Like most information sources – both on and off-line – there are mistakes. But the evidence suggests that the level of accuracy is as good as other comparable sources.

The scientific journal “Nature” ran an exercise to compare the accuracy of Wikipedia with that of “Encyclopaedia Britannica” across 43 randomly selected articles [click here]. It found 162 mistakes in Wikipedia compared to 123 in “Britannica”.
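
Using the figures quoted above, a quick calculation shows how close the two sources were on a per-article basis (a rough comparison only, since the articles varied in length):

```python
# Average errors per article in the Nature comparison, using the figures quoted above.
ARTICLES = 43
WIKIPEDIA_ERRORS = 162
BRITANNICA_ERRORS = 123

print(f"Wikipedia:  {WIKIPEDIA_ERRORS / ARTICLES:.1f} errors per article")   # ~3.8
print(f"Britannica: {BRITANNICA_ERRORS / ARTICLES:.1f} errors per article")  # ~2.9
```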

Also Wikipedia pages are open about material which needs to be checked or sourced.

After years of using the site literally every day, I am a huge fan. It is not perfect, it is not brilliantly written, but it is hugely informative and very user-friendly. As a starting point to learn about a topic, it is currently unbeatable.

Links:
Wikipedia search page click here
Wikipedia main page click here
Wikipedia's About Wikipedia page click here
Wikipedia's Wikipedia FAQ page click here
John Naughton on Wikipedia click here


Something many don't like to talk about too openly is the issue of Internet security, but our columnist Roger Darlington lifts the curtain on a darker side of the Net.

ARE YOU SAFE ON-LINE?

Let's start with what we know. More people are connecting to the Net and they are using ever faster speeds. They are spending more time on-line, doing more things, and carrying out more transactions.

What we don't know is the true level and sophistication of e-crime. We don't know whether proportionately it's worse than off-line crime, how badly companies and individuals are being hit, and how the various players are coping.

This is because there is no central collation – or even an agreed definition – of e-crime. Furthermore many companies are reluctant to admit problems because this could damage their brand and deter visitors to their site.

However, in the first half of 2007, the issue of personal Internet security was examined by the House of Lords Science and Technology Committee led by Lord Broers (formerly of IBM and now a Vodafone board member). In August 2007, it published a 121-page report [for text click here].

While being very clear that “the Internet is a powerful force for good”, the Committee insisted that “the Internet is now increasingly the playground of criminals” and that these bad guys are “highly skilful, specialised, and focused on profit”.

The Committee looked at problems like denial of service attacks, malicious code (malware), phishing, identity theft, and on-line fraud and theft. It insisted that “there is a growing perception, fuelled by media reports, that the Internet is insecure and unsafe”.

In the course of the inquiry, a major difference emerged between the Government and the Committee.

In its evidence to the Lords, the Government insisted that the responsibility for personal Internet security ultimately rests with the individual. However, the Committee – rightly, in my view – took a different stance.

It argued that the Government's position is “no longer realistic” and argued that this attitude “compounds the perception that the Internet is a lawless 'wild west'". The Lords asserted that “it is clear to us that many organisations with a stake in the Internet could do more to promote personal Internet security”.

Therefore the Committee had messages and recommendations for a wide range of players: manufacturers, retailers, Internet service providers, businesses that operate on-line, police and the criminal justice system, Government, and Ofcom.

The prime challenge, though, was to the Government: “Government leadership across the board is required. Our recommendations urge the Government, through a flexible mix of incentives, regulation, and direct investment, to galvanise the key stakeholders".

Seem sensible? So, how did the Government react?

In its response [for text click here], it insisted: "The Government does not agree with the implication within the report that the public has lost confidence in using the Internet". Instead it argued that there is “an acceptable level of comfort with the technology”. The response asserted that “we would refute the suggestion that the public has lost confidence in the Internet and that lawlessness is rife".

Ministers stated: "Legislation will be kept under review but the Government does not consider that imposing additional burdens on business is the best way forward".

Unsurprisingly, the House of Lords Committee feels that the Government is taking too relaxed a view. Committee member the Earl of Erroll - who has a long track record in IT - was reported in the press saying: "The Government's response is a huge disappointment".

So, who is right? Are people worried about being on-line?

In the latest Oxford Internet Institute survey [for access click here], one subject was particularly intriguing, namely the divided views on the idea of the Government regulating the Internet, which must be one measure of the level of concern. Just over a third (36%) think that it should, just under a third (30%) think that it should not, and the remaining third (33%) are undecided.

Significantly non-users and ex-users think the Internet should be regulated by Government to a larger extent than users. 51% of non-users support such regulation, compared to 31% of users.

A massive majority (85%) believe that children's content should be restricted, a third (34%) have received a virus on their computer, and 24% complain about too much spam.

A report from the Ofcom Consumer Panel on “Consumers And The Communications Market: 2007” [for text click here] found that 61% had worries or concerns about using the Internet, while more specifically 26% expressed anxiety about Internet security.

So this debate is going to run and run ...

Links:
Response to Lords report from Ofcom click here
Response to Lords report from Children's Charities' Coalition on Internet Safety click here
EURIM workshop on e-crime click here

