Since 2003, I have written regular articles on information technology for Connect, which used to be a separate trade union and in January 2010 became a section of the larger union Prospect. Originally the magazine was called "The Review" and then in April 2004 it was renamed "Connected". The text of all these articles, with relevant hyperlinks, is filed on my web site and this page brings together all those from 2008. If you would like to comment on any of them, e-mail me.
Jan/Feb 2008: Is Your Broadband Up To Speed?
March 2008: The New Plan For The 21CN
April/May 2008: Our Fragmenting Television Picture
June 2008: Who Is Bringing Us NGA?
July/Aug 2008: Should We Be Worried By Behavioural Targeting?
September 2008: What's Next For The Net?
Oct/Nov 2008: The Mobile Revolution
December 2008: The Internet Of Things
Are you getting enough? Broadband speed, that is. Our Internet columnist Roger Darlington explains the problems and offers some solutions.
IS YOUR BROADBAND UP TO SPEED?
In the four years that I have been a member of the Ofcom Consumer Panel, none of the issues we have raised has excited more consumer and media interest than that of broadband speeds.
My own experience is typical: I signed up for a service offering up to 8 Mbit/s; I was told that I could only receive about 4 Mbit/s; and most of the time I obtain just over 2 Mbit/s. Apparently I live too far down the line from my local exchange.
But I am far from being the only one with problems. A report published by Which? in August 2007 concluded that, while many packages now advertise speeds of up to 8 Mbit/s, the average speed for such connections was 2.7 Mbit/s.
When the Consumer Panel went public on the issue, the BBC web site opened an on-line discussion which was flooded with angry comments. They closed the discussion after receiving almost 2,000 submissions.
Now there are many reasons why an individual customer may not receive the maximum possible broadband speed. Some relate to physics: the state of the copper cable varies around the country, some of it is decades old and, the further one lives from the exchange, the worse the speed.
Some relate to the network design of ISPs: the more people using the service at any given time - known as the contention ratio - the worse the speed.
Some relate to the consumer's own premises or equipment: the internal wiring may be of low quality, the PC may be inadequate, and there could be interference from other electrical appliances.
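The combined effect of these factors can be sketched with a toy model. The attenuation rate, contention behaviour and percentages below are invented for illustration - real ADSL line rates depend on detailed line physics and each ISP's actual network engineering - but the sketch shows why an 'up to 8 Mbit/s' product can deliver far less:

```python
# Illustrative only: the attenuation rate and contention model below are
# invented assumptions, not real line physics or any ISP's actual engineering.

def estimated_adsl_speed(headline_mbits, distance_km, contention_ratio, active_fraction):
    """Rough sketch of the speed a customer might actually experience."""
    # Assume the full sync rate up to 1 km from the exchange, then roughly
    # a quarter of the rate lost per further kilometre, with a floor of 10%.
    sync = headline_mbits * max(0.1, 1 - 0.25 * max(0.0, distance_km - 1))
    # A contention ratio of, say, 50:1 means up to 50 customers share the
    # same backhaul capacity; if 5% of them are active at once, throughput
    # at busy times is divided accordingly.
    sharing = max(1.0, contention_ratio * active_fraction)
    return round(sync / sharing, 1)

# A customer 3 km from the exchange on an 'up to 8 Mbit/s' product,
# with a 50:1 contention ratio and 5% of users active at once:
print(estimated_adsl_speed(8, 3, 50, 0.05))  # 1.6
```

Under these (assumed) numbers, distance alone halves the sync rate and contention at busy times cuts it further still - close to the 2 Mbit/s experience described above.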
But none of this excuses ISPs from failing to be more open and honest with consumers about what they can reasonably expect and what can be done if their expectations are not fulfilled.
In October 2007, the Chairman of the Ofcom Consumer Panel wrote to the top six UK ISPs about the problem and had a series of meetings with them. Then, in December, the Panel submitted its proposals to Ofcom.
The Panel wants to see Ofcom leading discussions with industry to produce an enforceable code of practice that would be mandatory for ISPs. This code would establish agreed processes to give the customer the best information during and after the sales process. Also it would give them flexibility to move freely to different packages that reflect the actual speeds with which their ISPs are able to provide them.
The code of practice should include a commitment from ISPs to:
The Panel also wants the advertising of broadband speeds to be tightened up. It will be requesting that the Advertising Standards Authority, working with industry, considers how the range of factors affecting broadband speeds can be given much greater prominence in advertising material.
In an initial response to the Panel's proposals, Ofcom Chief Executive Ed Richards stated that the regulator's initial proposals are very much in line with those of the Panel. He announced that Ofcom has already started initial discussions with leading ISPs and wants to see that any measures are implemented in the shortest time frame possible.
The latest Ofcom survey of The Consumer Experience generally shows sustained high levels of satisfaction with communications services. The exception is broadband services. Overall satisfaction has fallen from 92% to 88%, while the figure for those very satisfied has fallen from 44% to 38%.
The writing is on the wall: ISPs need to up their broadband game and be more open with their customers.
Link: letter from Ofcom Consumer Panel to Ofcom Chief Executive click here
BT's next generation network project was always ambitious, so maybe it's no surprise that something of a rethink has now taken place on implementation, as Roger Darlington reports.
THE NEW PLAN FOR THE 21CN
Around the globe, backbone or core telecommunications networks are being replaced by new networks which use the Internet Protocol (IP) that is at the heart of the Internet. Such a new core network is called a Next Generation Network (NGN).
The one currently being developed by BT is called the 21st Century Network (21CN) and the public face of this programme is a web site called Switched On.
The purpose of the project is to move all the company's networks dedicated to particular services (around 12 depending on definitions) to a single network deploying the Internet Protocol. Total investment will be of the order of £10 billion, although a proportion of that money would have been invested even without 21CN.
BT has presented itself as something of a first mover and world leader in NGN.
When an indicative timetable was first announced in June 2004, the plan was that mass migration would commence in 2005, up to half of PSTN customers would have migrated by the end of 2007, and the whole programme would be completed by the end of 2011.
The intention then was that customers would notice nothing as they were switched from the old to the new network with no noticeable change in quality and no introduction of any new services.
Since then, BT has consulted with industry, gained real-life experience of customer migrations in the field, and reviewed the programme in the light of industry input and early customer feedback.
Consequently, in November 2007, BT proposed to industry (via the Consult21 process) and Ofcom that it would allow for extended periods of voluntary customer upgrades to new services rather than an early forced migration and make available the offer of a new broadband service when a customer switches.
The company insists that the fundamental objectives of the programme are the same. Also it believes that it is still a year or two ahead of other leading telcos around the world, although it concedes that some niche players are ahead in some respects.
The proposed new approach has largely come about as a result of the experience of the initial test phase known as Pathfinder that has been taking place in the Cardiff area, together with consultation with industry and recognition of evolving new technologies. The test phase commenced at Wick on 28 November 2006 when the first customers were put on the new network.
The programme gradually expanded to include all the analogue telephone lines connected to the Wick exchange and subsequently the Bedlinog exchange - a total of around 1,100 customers. However, BT had originally intended to migrate about 350,000 customers in South Wales by December 2007.
The new 21CN roll-out proposal is fundamentally different from the one originally conceived and announced:
Originally the plan was to migrate customers forcibly one geographical location after another in accordance with a national plan. Now BT is proposing that Communications Providers would be invited - for an extended period - to switch or have their customers switch to the new network voluntarily where these new services are available, although at some point down the line there will need to be a national migration date, followed by a planned legacy platform and service withdrawal.
Originally moving from the old to the new network would have involved no difference in services. Now the plan is to offer those switching to the new network ADSL 2+ which will provide broadband service up to 24 Mbit/s compared to the current ADSL offering of up to 8 Mbit/s (in both cases, of course, actual speeds depend very much on distance from the exchange and other factors).
The launch of the new ADSL2+ service will be from April 2008 when the new service will be available commercially to around 5% of the UK marketplace, rising over the next 12 months to around 50%. This compares to the original plan when large scale migration was intended for 2006. BT will not give an end date for the new timetable as it is still the subject of consultation with industry, but it might reasonably be assumed to be somewhere after 2011 (the original end date).
NGN UK click here
BT's 21st Century Network click here
Switched on click here
We're no longer all watching the same television programmes at the same time but does this matter? Roger Darlington explores the issues.
OUR FRAGMENTING TELEVISION PICTURE
As a teenager, I lived in a district of south Manchester where a converted church housed the BBC studios which broadcast the immensely popular programme Top Of The Pops. The show began in 1964 and ran for over 2,200 performances and an incredible 42 years, only finishing in 2006 as its audience shrank and shrank.
In the 1960s, a programme like Top Of The Pops - or a popular soap or a dramatic documentary - could attract an audience of between 15 and 25 million. People would watch the same programme at the same time and often discuss it at work the following morning.
But no more. Most television viewers now have a choice of hundreds of channels through digital technologies, whether delivered through cable, satellite or terrestrially. Even if some of them are watching the same programmes, they may not be watching them at the same time thanks to personal video recorders, catch-up TV on broadband, and the wide availability of DVD sets.
Some commentators describe the simultaneous viewing of a programme by large numbers as 'common spaces' and argue that the retention of such common spaces is vital to social cohesion and our sense of community.
It is part of the argument for public service broadcasting as opposed to leaving everything to commercial forces. It is part of the argument for retention of the BBC licence fee even when fewer people are watching BBC channels.
But does this fragmentation of television viewing matter and, even if it does, is there anything we can do about it?
Point one: why should broadcasting be different from other media in terms of common spaces?
After all, we don't all read the same newspapers or books or view the same films or plays. And, when we were all watching very similar programmes, arguably they represented a rather narrow, white, middle-class view of Britishness.
Mass audiences for TV programmes were only a phenomenon of the 1960s and 1970s: before then we didn't all have TVs, and after then we saw the growing impact of multi-channel TV.
So one could argue that common spaces in broadcasting were a sector-specific, culturally-specific and time-specific phenomenon that is no longer appropriate or desirable.
Point two: is the loss of common spaces in public service broadcasting that complete?
We still watch lots of television - on average, some three and a half hours per person per day. And we still watch a lot of the main channels - even in multi-channel households, two-thirds of all viewing is on the five major channels.
No programme regularly wins the viewing figures that Top Of The Pops - or Coronation Street - used to clock up in its heyday, but each evening the BBC and ITV news programmes at 10 pm have a combined viewing audience of around 8M and, at times of national crisis (like the London bombings of 7/7), viewers still flock to the BBC.
Point three: are we wrong to think of traditional broadcasting in isolation?
People now watch what we still call television not just on TV sets but on PCs, mobiles, and iPods and the arrival of faster broadband and IPTV will accelerate this trend.
Furthermore, the boundaries between media are blurring. Popular television programmes have their own web sites and even blogs, and they place or stimulate coverage in newspapers and magazines. Think about the Shilpa Shetty row on Celebrity Big Brother: few people saw it as it happened, but millions learned about it through replays on television or discussions on the radio or coverage in the newspapers.
Point four: ultimately don't consumers want greater choice about what and when and how they view audiovisual material?
Consumers are voting through the remote control and the mouse by selecting what to watch and when and how it suits them. Consumers are choosing through the marketplace by taking up digital television in ever greater numbers and purchasing new devices that give new forms of access to ever greater volumes of material.
In doing so, consumers are asserting their individualism. While I am recording Lost, my wife is recording Location, Location. Neither of us is satisfied with the volume of coverage on the BBC or ITV of the American presidential primaries, but we can watch CNN or Fox News and access the web sites of the New York Times or the Washington Post. On the whole, it is good news.
The pioneer of next generation broadband in the UK is not who you expect. Our columnist Roger Darlington reveals a dark horse leading the way.
WHO'S BRINGING US NGA?
The technical term is next generation access (NGA), but a more user-friendly term is super-fast broadband. We're talking here of download speeds of up to 100 Mbit/s.
The debate on NGA for the UK has picked up tempo but there's still very little happening on the ground compared to many other countries.
In March, Connect made a valuable contribution to the debate by publishing its booklet: Connecting Britain's Future: The Slow Arrival Of Fast Broadband [for text click here]. This is the most intelligible guide to all the main issues that has so far been produced.
In June, several pieces of new research are to be published to coincide with a one-day conference organised by the Broadband Stakeholder Group (BSG). One research report by Plum Consulting is on the economic and social value of NGA, while a further report from the Analysys Mason consultancy addresses the case for public sector intervention.
Two further studies are in progress: Analysys is looking at the case for fibre to the cabinet (FTTC) vs fibre to the home (FTTH), while Ofcom is reviewing the prospects for duct sharing. Following a six-month review commissioned by the Government, a major report will come from former Cable & Wireless CEO Francesco Caio some time in the Autumn.
So we have plenty of reports - but still very little action. Why?
One of the reasons why next generation access is not happening anywhere near as fast in the UK as in many other countries is that this is a disruptive technology and none of the existing actors has a real incentive to move.
Take BT. It owns the current copper network and it wants to 'sweat' those assets by extracting the maximum potential from them. ADSL technology has enabled it to provide up to 8 Mbit/s over copper and now ADSL2+ promises to deliver up to 24 Mbit/s (for a few anyway).
What about the alternative network operators (altnets)? Well, in the last few years, they've invested millions in local loop unbundling (LLU) and the move to NGA would undermine the value of those investments. Unbundling in the NGA world would not be at the exchange, but at the cabinet, with considerable technical and economic implications.
Then there is the regulator Ofcom. It has spent years constructing a model of competition based on the functional separation of BT through Openreach. If we moved to NGA, this separation model would need to be fundamentally revisited and completely new regulatory remedies - sub-loop unbundling and active line access - would have to be negotiated.
So, who is going to pioneer NGA in the UK?
Virgin Media - which owns the cable networks passing around half of UK homes - has announced an upgrade to its local networks that will enable the launch of a 50 Mbit/s broadband service (plus an upstream speed of around 1.5 Mbit/s). It is intended that this will be available to around 70% of its customer base by the end of 2008. But this is not fibre to the home (FTTH).
The immediate prospect for the deployment of FTTH is in the Ebbsfleet Valley part of the Thames Gateway project in Kent. BT Openreach will supply the infrastructure, but BT Retail and its competitors will be offered access to the high speed lines on a wholesale basis.
The top available speed will be 100 Mbit/s. This is expected to start in August 2008. However, it will initially be limited to around 600 new houses. The development will eventually have some 10,000 homes but the project could take until 2020 to complete.
But, if you want to see an early and significant use of fibre, you have to look to an unusual source.
H2O Networks Ltd, the pioneer of providing fibre connectivity via 360,000 miles of sewers, has announced that the UK's first Fibrecity will be Bournemouth. Work will begin on the deployment of the fibre within the next few months.
This will be the largest Fibrecity project in Europe and the company will be funding and providing the network at a cost of around £30 million. The fibre will provide ultra high bandwidth to all Bournemouth's businesses and more than 88,000 homes at speeds far exceeding current DSL or cable modem speeds.
Regulatory Challenges Posed By Next Generation Access Networks, Ofcom discussion document, November 2006 click here
Pipe Dreams? Prospects For Next Generation Broadband Deployment In The UK, Broadband Stakeholder Group, April 2007 click here
Future Broadband - Policy Approach To Next Generation Access, Ofcom consultation document, September 2007 click here
Users of the web want high privacy and low prices, but sometimes there might be a trade-off between the two, as our Internet columnist Roger Darlington explains.
SHOULD WE BE WORRIED ABOUT BEHAVIOURAL TARGETING?
As broadband prices continue to fall but there remains a need for new infrastructure investment, Internet service providers (ISPs) continue to look for new sources of revenue. One of the newest possible sources is also proving to be one of the most controversial: behavioural targeting.
The idea is that your ISP will install special software in its network which will intercept web site requests that you make as you roam around the Net. The software will then scan these pages for key words in order to build up a profile of your interests and then use this information to target you more accurately with online advertising that you are likely to find of interest.
So, for example, you might access sites which include the words 'Cyprus', 'hotel' and 'flight'. When you later look at sites that carry online advertising, you are more likely to see offers of flights to Cyprus or hotels on the island.
Such online advertising - because it is targeted - will be more effective, so more can be charged for it and ISPs using behavioural targeting software will receive a share of the extra revenue.
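The mechanism described above - scanning pages for key words to build an interest profile - can be sketched in a few lines. To be clear, this is not how Phorm's or any other vendor's system actually works internally; the categories, keywords and function names below are invented purely for illustration:

```python
from collections import Counter

# Toy sketch of keyword-based behavioural profiling. The categories and
# keywords are invented for illustration, not taken from any real system.
CATEGORIES = {
    "travel":  {"cyprus", "hotel", "flight", "holiday"},
    "finance": {"mortgage", "loan", "savings"},
}

def update_profile(profile: Counter, page_text: str) -> Counter:
    """Scan a visited page for category keywords and update the user's profile."""
    words = set(page_text.lower().split())
    for category, keywords in CATEGORIES.items():
        profile[category] += len(words & keywords)
    return profile

def best_ad_category(profile: Counter) -> str:
    """Pick the category with the strongest interest signal for ad targeting."""
    return profile.most_common(1)[0][0]

profile = Counter()
update_profile(profile, "Cheap flight and hotel deals for your Cyprus holiday")
update_profile(profile, "Compare savings accounts")
print(best_ad_category(profile))  # travel
```

Note that even this toy version only needs a category label and a running count per user - which is the basis of Phorm's claim, discussed below, that such systems need not store browsing histories.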
The companies leading the way in providing such behavioural targeting software are Phorm (previously 121Media) [click here], NebuAd [click here] and FrontPorch [click here]. Phorm has managed to sign up the three biggest ISPs in Britain: BT, Virgin Media and TalkTalk. Between them, these three ISPs account for around 70% of UK Internet subscribers.
So what's the problem? Privacy campaigners worry about how much information ISPs will have on our surfing behaviour and how they will use that information.
Here in Britain, computer security expert Richard Clayton is not happy and the Information Commissioner has queried aspects of the system. The European Commission is monitoring the situation. In the USA, 15 pro-privacy organisations have written to the House of Representatives demanding public hearings on the use of such technologies.
Consumers were not thrilled to learn that BT had trialled Phorm without advising the relevant customers. Some 36,000 customers were included in this first trial in September-October 2006 which may technically have breached the law. A second trial involving some 10,000 customers is due shortly and this time BT promises to make an announcement.
However, Phorm claims the technology does not gather personally identifiable information, does not store IP addresses, search terms or browsing histories, and only sees users as a unique, random number. So it argues, unlike search engines or most websites, Phorm's technology cannot know who users are or where they have browsed.
Phorm states that its privacy claims have been validated under best industry practices, both through an independent audit conducted by Ernst & Young and a Privacy Impact Assessment undertaken by Simon Davies, MD of 80/20 Thinking and Director of Privacy International.
Providers of behavioural targeting software insist that search engines like Google already hold far more data on users without there being an outcry.
A key part of the debate is whether consumers should have to 'opt out' of the system or 'opt in' to it. In the United States, two Congressmen have questioned the 'opt out' approach used by the NebuAd system that is being used by Charter Communications, the country's fourth-largest ISP. Here in the UK the Information Commissioner has suggested that such systems should be 'opt in'.
While the 'opt in' model would obviously be welcome to privacy campaigners, it would probably reduce the use of the software to levels that render it uneconomic, unless consumers can be incentivised to opt in, for instance by lowering prices. BT is following this route by offering users of its free Webwise service [click here] extra levels of security, such as anti-phishing protection, in return for accepting more targeted advertising.
There seems little doubt that behavioural targeting is here to stay and will grow in use. The solution to the current controversy is for those companies using such systems to be much more open and honest with their customers and to communicate much more fully what the systems do and do not do. Ultimately customers need to be in control if they are to be content and this points the way to 'opt in' systems.
"Economist" article "Watching while you surf" click here
Federal Trade Commission testimony click here
In such a short period of time, the Internet has become a central part of our lives. But, as our columnist Roger Darlington explains, there is a lot more to come.
WHAT NEXT FOR THE NET?
There is no historic precedent for the speed and influence of the development of the Internet. Although originally conceived as a military communications network called the ARPAnet in 1969, the World Wide Web - the graphical part of the Internet - was only invented (by a British scientist) in 1989 and arguably the Internet really 'took off' in 1993 when its use doubled to more than 25 million people.
Yet future developments promise to transform the Net as we understand it today. Consider - in no particular order - just seven changes that we know are just around the corner (no doubt there will be others that take us by surprise).
1) The reach of the physical infrastructure will become genuinely global.
East Africa remains the only large, inhabited coastline cut off from the global fibre-optic network that is the heart of the modern Internet. Reliant entirely on expensive satellite connections, people on the world's poorest continent pay some of the highest rates for logging on. But, in the next couple of years, three new undersea cable systems will give millions of African Net users the sort of experience and prices that we take for granted. [For more information click here].
2) The number of users will explode.
Today there are around 1.4 billion users of the Net. But the world population is currently 6.7 billion, so only just over a fifth of the global citizenry is on-line. By 2020, the world population is expected to be over 7.7 billion. So there is plenty of scope for growth in the number of Net users and, thanks to what economists call a network externality, every new user benefits every existing user. [For statistics on world population click here].
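The externality point can be made concrete with one common (and much-debated) formalisation, Metcalfe's law, which values a network by the number of possible pairwise connections between its users. The figures below simply re-use the article's own user and population estimates:

```python
# Metcalfe's law: a network of n users has n*(n-1)/2 possible pairwise
# connections, so each new user adds a potential link to every existing user.

def possible_connections(users: int) -> int:
    return users * (users - 1) // 2

# The connection count grows far faster than the user count itself:
for n in (1_400_000_000, 7_700_000_000):
    print(f"{n:,} users -> {possible_connections(n):,} possible connections")

# Share of world population online in 2008: just over a fifth.
print(round(1.4e9 / 6.7e9, 2))  # 0.21
```

Whether network value really grows quadratically is contested, but the direction of the argument - that growth in users compounds the value to everyone already online - holds under much weaker assumptions too.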
3) Broadband speeds will leap.
Basic broadband with an always-on connection could be thought of as 512 Kbit/s, but already BT's ADSL2+ service is providing up to 24 Mbit/s and Virgin Media's cable network can offer 50 Mbit/s. Next generation broadband systems using fibre to the cabinet (FTTC) or fibre to the home (FTTH) will soon routinely provide speeds of 100 Mbit/s and more. BT's recent announcement of a £1.5 billion investment to bring next generation access to up to 10 million households by 2012 should kick-start the NGA roll-out in the UK. [For information on next generation access click here].
4) Much more content will be available.
We think that all information is now on-line but this is far from true. The bulk of human knowledge (one estimate is 85% of published information) remains off-line and the challenge is to change that as quickly as possible. I was recently contacted by a researcher who wanted on-line access to a telecommunications study I wrote in 1979 and, of course, it is simply not available on the Net. More significantly, many public records and scientific studies are still not accessible.
5) Many more services will be on-line.
Ten years ago, Google was merely two guys in a garage; now it is the world's most used search engine (and much more), employing 20,000 people and worth tens of billions of dollars. The next decade will see many similarly transformative services being developed. As the Web gets much bigger, perhaps we will have a search engine that better understands our personal needs and interests and the context and meaning of search terms.
6) The Net will go mobile.
Currently most people access the Net via a PC but worldwide many more have a mobile than a computer. Therefore Internet-enabled mobile phones will allow millions in Africa, Latin America, China and the Indian sub-continent to gain access to the Net for the first time. Even those with a PC will find that smart mobiles provide faster, easier and more ubiquitous access and, after less than a year with an iPhone, I could not imagine not having the Net in my pocket.
7) The Net will become more multilingual.
Most of the content of the Net is still in English which is fine for me but useless for the majority of the world's population. Over the next few years, there will be much more content in Russian, Mandarin, Arabic and other languages. Then we will see the development of much better automatic language translation tools so that any of us can read any language quickly and accurately.
In short: we ain't seen nothing yet.
Link: views of the 'Father of the Internet' click here
Mobile has totally revolutionised the communications marketplace and further dramatic changes are on the way, explains our columnist Roger Darlington.
THE MOBILE REVOLUTION
My first published piece on mobile was a study called Telephones On The Move written for the then Post Office Engineering Union (POEU) in May 1984. The booklet majored on the award of two new cellular licences to a British Telecom-Securicor consortium (now O2) and a Racal-Millicom consortium (now Vodafone) and pointed out that, at that time, the number of users of what we then called radio telephones was a mere 315,000.
The changes since then have been truly breathtaking. In so many respects, the UK mobile industry has been an outstanding success story.
The mobile story is far from over.
Link: Ofcom's mobile sector assessment click here
Well before everyone in the world is connected to the Net, a much more ambitious project is under discussion. Our columnist Roger Darlington explores.
THE INTERNET OF THINGS
One way of looking at the evolution of the Internet is to see it in three stages: first, a fixed Net essentially connecting desktop PCs; second, a mobile Net connecting hand-held mobiles; third, what we call the Internet of things.
This is not a new concept - it goes back to a body called the Auto-ID Center [click here], which was founded in 1999 and based at the time at MIT. At that time, the vision was the widespread use of Radio Frequency Identification (RFID) chips to locate products in a company's supply chain.
What gives the idea new potency is the increasing adoption of a new version of the addressing system at the heart of the Internet. The current one is called Internet Protocol version 4 (IPv4) [click here] and the limited number of addresses that it supports has led many to forecast that it will hit its limit as soon as 2010.
What will replace it is something called IPv6 [click here] which will represent a paradigm shift in the addressing system. While IPv4 'only' supports 2 to the power of 32 addresses, IPv6 provides 2 to the power of 128 which is over seventy nine billion billion billion times more than IPv4.
Put another way, this is roughly 2 to the power of 95 addresses for each of the 6.5 billion persons on the planet. In other words, every human on the globe could have a personal network the size of today's Internet.
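The address-space arithmetic above is easy to check directly (the 6.5 billion population figure is the article's own estimate; Python integers are arbitrary-precision, so the full values can be computed exactly):

```python
# Verifying the IPv4 vs IPv6 address-space comparison made in the text.

ipv4 = 2 ** 32
ipv6 = 2 ** 128

ratio = ipv6 // ipv4             # exactly 2**96
print(f"{ratio:,}")
# ... which is indeed over seventy nine billion billion billion (79 * 10**27):
print(ratio > 79 * 10 ** 27)     # True

# Addresses per person for 6.5 billion people - roughly 2**95 each:
per_person = ipv6 // 6_500_000_000
print(per_person > 2 ** 95)      # True
```

Since 2**32 (about 4.3 billion) is today's entire IPv4 space, each person's 2**95-address allocation really would dwarf the whole of the current Internet.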
In practice, the Internet of things might well encode up to 100,000 billion objects and follow the movement of those objects. It is estimated that every human being is surrounded by 1,000 to 5,000 objects.
Who on earth would want to use such a system?
Well, in September 2008, a range of powerful companies founded something called the Internet Protocol for Smart Objects (IPSO) Alliance [click here] to promote just this vision. Currently it has almost 30 members, including such players as Cisco, Ericsson and Sun Microsystems.
But what would an Internet of things actually do?
Given the twin challenges of climate change and energy supply, the early adoption of the Internet of things is likely to be driven by utility companies to assist energy management. So, for instance, electricity companies might offer you discounts if they are allowed to access your washing machine or dishwasher so that they can shut them down for a few minutes to manage peak energy usage.
Retail companies will then be massive users of the Internet of things to encode batches of products or even individual high-value products to assist stock management and to deter theft.
The state will find many uses - whether for traffic management or theft control, because all cars are encoded, or for flood control, because sensors are placed along river banks and flood plains.
Individuals will then embrace the idea so that they can restock basic food items automatically or identify food that has passed its 'eat by' date or control the heating, lighting, or security of their homes from any remote location - even on the other side of the world during a business trip or a holiday.
Imagine if the activation of a smoke detector automatically turned off all your gas appliances and signalled to your mobile or a fire station. Eventually it will not just be your valuables - purse, wallet, mobile, car - that will be encoded; so will all your clothes and all your books.
We already put RFID chips in animals so farmers can track their cows or a person can locate their pet. In time, we might even think of implanting RFID chips in people. If this seems bizarre, imagine if a worried parent could locate her child anywhere at any time; imagine if a doctor could identify your blood group and allergies if you passed out in the street or abroad.
The implications of the Internet of things are profound. Like the current Net, no single authority will design or control it and no individual will be able fully to opt out.
There will be enormous benefits - on the global scale in terms of things like energy conservation, at the government level for control of crime and enhancement of security, and on the individual level including for those with special needs or disabilities. But there will also be risks especially concerning privacy and data protection. So we should start debating the idea now.
Wikipedia page click here
"Guardian" article click here
International Telecommunication Union report click here
European Commission consultation document click here