Since 2003, I have written regular articles on information technology for Connect, which used to be a separate trade union and in January 2010 became a section of the larger union Prospect. Originally the magazine was called "The Review" and then in April 2004 it was renamed "Connected". The text of all these articles, with relevant hyperlinks, is filed on my web site and this page brings together all those from 2010. If you would like to comment on any of them, e-mail me.
February 2010: Taking Faster Broadband Further
March 2010: The Data Deluge
May 2010: Should The Net Be Neutral?
July/Aug 2010: More Pictures On A New Canvas
September 2010: The Wind To Cloud Computing
Oct/Nov 2010: What Now For Digital Britain?
December 2010: How The Web Works
Super-fast broadband will never reach all parts of the country without some sort of public sector intervention, explains our columnist Roger Darlington
TAKING FASTER BROADBAND FURTHER
In March 2008, Connect published a booklet entitled The Slow Arrival Of Fast Broadband which I drafted for the union in a consultancy capacity [for up-dated version click here]. Two years later, real progress has been made in the roll-out of what is technically called next generation access (NGA) but is more popularly called super-fast broadband (SFB).
Virgin Media has up-graded the whole of its network to provide 50 Mbit/s services, while BT has announced a £1.5 billion investment to cover 10 million premises by 2012, mainly through fibre to the cabinet (FTTC), which would provide services of up to 40 Mbit/s.
However, Virgin's cable networks only cover half the country in population terms and BT's current plans would only reach around 40% of homes. Research conducted by consultants Analysys Mason for the Broadband Stakeholder Group suggests that the market alone is unlikely to take NGA to more than 60-70% of homes [for text of report click here].
So what is the answer? Various Regional Development Authorities, local authorities and community groups are developing local initiatives. On behalf of the Communications Consumer Panel, I have produced a report pulling together information on these and, while there are over 40, almost all are very small-scale [for text of report click here].
The Government's answer was a surprise feature of the Digital Britain Final Report published in June 2009 [for text click here]. It is what is popularly known as the next generation levy but the Government now calls the landline duty. This is a charge of 50 pence per month on each fixed line to raise money for a fund that will be used to stimulate investment in the so-called 'final third' of the country that otherwise might not see NGA for decades, if ever.
The Conservative Party is firmly opposed to this levy, but the Government seems determined to press ahead with the measure. It was mentioned in the Pre-Budget Report and is expected to be in the Budget. The problem is that a General Election is likely to be called before the Finance Bill has reached the Statute Book and, in those circumstances, there will be tough negotiations between Government and Opposition over what stays in the Bill and what is dropped.
Meanwhile the Government has issued two consultation documents over the measure - effectively one on how the money will be raised [for text click here] and another on how it will be spent [for text click here]. So we have a fairly good idea what it will look like.
The duty will apply to fixed line phone and broadband services, not to mobile, satellite or other wireless communications.
Broadband has been included to prevent market distortion arising from phone users moving to Voice over Internet Protocol (VoIP) services for their voice needs. Mobile is excluded because it has developed in a different environment to fixed (that is, without the monopoly incumbent) and it would be difficult to apply fairly (for instance, ensuring equal burden for prepaid and contract users).
The duty will be payable on all local loops that are made available for use regardless of whether they are actually used, regardless of technology (copper, cable and fibre), and regardless of whether voice or data services are delivered over the connection. Liability for the duty lies with the owners of the physical assets, such as BT, because a small number of network owners is easier and less costly to identify than registering retailers or the local loops themselves. This approach will also prevent customers being charged twice where two retailers provide services over one line.
The decision whether or how network owners will pass on the tax to retailers is a commercial decision for the network owners to make. The Government expects that the tax will be passed on to retailers and, subsequently, to consumers. Obviously consumers will not welcome such a charge if it is passed on but Ofcom figures show that average bills have fallen by more than 50p per month over the last three years.
The Government intends to implement the duty on 1 October 2010. It is expected to raise £175 million per year (£150 million from the duty, £25 million from VAT on the duty) and the intention is that it will be used to help fund the roll-out of next generation broadband to 90% of the UK population by 2017.
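The figures above can be sanity-checked with a little arithmetic. The £175 million total and 50p monthly charge are from the consultation documents; the number of liable lines below is derived from them for illustration, not an official figure:

```python
# Back-of-the-envelope check on the landline duty figures quoted above.
# £150m per year from a charge of 50p per line per month implies the
# number of liable local loops shown below (a derived, unofficial estimate).

def implied_lines(annual_yield_gbp: float, monthly_charge_gbp: float) -> float:
    """Number of local loops needed to raise the stated annual yield."""
    return annual_yield_gbp / (monthly_charge_gbp * 12)

duty_yield = 150_000_000  # £150m from the duty itself (VAT adds the other £25m)
lines = implied_lines(duty_yield, 0.50)
print(f"Implied liable fixed lines: {lines / 1e6:.0f} million")  # → 25 million
```

Twenty-five million lines is broadly in keeping with the size of the UK fixed-line market, which is why the duty was expected to be such a reliable revenue-raiser.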
I support the measure and hope to see it implemented but watch this space.
The Internet has utterly transformed how individuals and organisations access and use information, as explored by our columnist Roger Darlington.
THE DATA DELUGE
Communications technologies have changed our world so much that it's hard to comprehend just how transformative this revolution has been. A useful prism through which to examine some of these changes is information or data.
First, consider the mind-blowing volume of data that is now available and the scary pace at which this volume is increasing.
According to The Economist, the total amount of information in existence this year is 1.2 Zettabytes (ZB), where a Zettabyte is 10 to the power of 21 bytes - that is, a billion Terabytes. The same publication suggests that data is growing at a compound annual rate of 60%, which means that the amount of digital information increases tenfold every five years.
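The two growth figures quoted above are consistent with each other, as a quick compound-growth calculation shows:

```python
# Sanity-check on the growth figures quoted above: a 60% compound annual
# growth rate multiplies the volume of data roughly tenfold over five
# years, because 1.6 to the power of 5 is about 10.5.

def growth_factor(annual_rate: float, years: int) -> float:
    """Overall multiplication factor after compounding for `years` years."""
    return (1 + annual_rate) ** years

factor = growth_factor(0.60, 5)
print(f"Five-year growth factor at 60% per annum: {factor:.1f}x")  # ≈ 10.5x
```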
These figures are incomprehensible to most of us, so let's try some smaller but still huge numbers around sites you know, so that you can grasp something of the scale of the Net.
Take Wikipedia, a wonderful online encyclopedia: today it has over 3.2 million articles in English alone and more than 100,000 articles in 30 other languages. Or take YouTube which hosts short videos posted by anyone: it is estimated that there are more than 140 million such videos and that more than 20 hours of video is uploaded every minute worldwide.
Storage and classification of such exponentially growing volumes of data present huge challenges of both technology and cost.
Next, think about how all this data is used and how it can be abused. Companies use data-mining or business intelligence to decide how best to meet our needs as customers. So, if you use Amazon, it will track which books or videos you order and even which pages on the site you look at in order to make targeted recommendations and offers. And, if you use Gmail, then the entire content of your mail will be scanned using content extraction to enable targeted adverts to be presented to you.
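The "customers who bought X also bought Y" style of data-mining described above can be illustrated with a toy co-occurrence counter. Amazon's real system is far more sophisticated, and the order data below is invented purely for illustration:

```python
# A toy sketch of co-occurrence-based recommendation: rank items by how
# often they appear in the same order as the item of interest.
# The baskets below are made up; real systems also weight by recency,
# popularity and much else.
from collections import Counter

orders = [
    {"networking handbook", "python primer"},
    {"networking handbook", "broadband atlas"},
    {"python primer", "broadband atlas", "networking handbook"},
]

def recommend(item, baskets):
    """Rank other items by how often they were bought alongside `item`."""
    co_counts = Counter()
    for basket in baskets:
        if item in basket:
            co_counts.update(basket - {item})
    return [other for other, _ in co_counts.most_common()]

print(recommend("networking handbook", orders))
```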
Wherever we go on the web, we leave what is called data exhaust, a trail of clicks that show where we've been, how long we stayed there, and what we did online.
The ability to track and trace online is necessary if law enforcement is to tackle a host of abuses such as spam and scams, hacking and malware, phishing and identity theft, circulation of child abuse images, or planning of terrorist activities.
On the other hand, in cyberspace privacy has become almost meaningless. Already social networking sites - there are more than 400 million active users of Facebook alone - have put a ton of very personal data out there for anyone to see or hack. The move to cloud computing, with applications hosted externally rather than on our PCs, will put much more sensitive data on the web.
Finally, contemplate how we access the data that we want to see and block out the data that we don't want to see.
If there is more and more data available to us, how do we decide where to go for it and which of it is most accurate or useful? Most of us use implicit or explicit recommendation tools. Implicit tools include Google's ranking of web pages by the number of links to that page or Wikipedia's facility for anyone to create or amend an entry. Explicit recommendations come from our use of trusted sources which may be organisations - such as the BBC or the Guardian - or individuals such as someone's blog or Twitter feed.
These techniques for managing data make sense but they tend to reduce the wonderful serendipity that is presented by the web. Following links can take you to interesting new sources of information or insight, but so many people tend to visit a very limited number of web sites or blogs.
An opposite problem is how we block out information that we don't want to see or which we don't want our children to see. Somehow we have to control the volume of e-mails, text messages, telephone calls and other messages that increasingly bombard us. Filtering software can limit exposure of children to inappropriate material online but such software has its limits.
Let's give the last word to an academic researcher who is a friend of mine, Professor Sonia Livingstone at the London School of Economics: "Data, data everywhere, but never time to think."
Our columnist Roger Darlington examines one of the most divisive debates around the Internet.
SHOULD THE NET BE NEUTRAL?
Net neutrality, as well as being nicely alliterative, sounds as comforting as motherhood and apple pie. How could anyone be opposed to, or even anxious about, something neutral? On the other hand, if you call the issue traffic management, then the debate becomes less clear.
The terms used (and abused) mean different things to different players, but essentially what we are talking about is whether and where there should be a principle of non-discrimination regarding different forms of Internet traffic carried across networks.
At the extreme, net neutrality means that there should be no prioritisation of any types of traffic by network operators, so that all bits would be treated as equal and there would be no charging of content providers. In practice, it is about whether communications providers should be allowed to block, degrade or charge for prioritising different applications or different content providers' traffic, or whether network operators should be able to charge consumers, service providers or both for tiered quality of service.
The reality is that, in its pure form, net neutrality does not and cannot exist because all network operators have to use a variety of network management techniques to ensure that their networks do not become congested and that types of application that are sensitive to latency and jitter (like voice and video) are treated differently from others (like e-mail).
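The sort of traffic management described above - serving latency-sensitive traffic ahead of delay-tolerant traffic - can be sketched with a simple priority queue. The traffic classes and their priorities below are illustrative only, not any operator's actual policy:

```python
# A minimal sketch of class-based traffic scheduling: packets from
# latency-sensitive classes (voice, video) are dequeued before
# delay-tolerant ones (e-mail), regardless of arrival order.
# The class priorities here are invented for illustration.
import heapq
from itertools import count

PRIORITY = {"voice": 0, "video": 1, "email": 2}  # lower number = served first
_seq = count()  # tie-breaker preserves arrival order within a class

queue = []

def enqueue(kind, payload):
    heapq.heappush(queue, (PRIORITY[kind], next(_seq), payload))

def dequeue():
    return heapq.heappop(queue)[2]

enqueue("email", "newsletter")
enqueue("voice", "call frame 1")
enqueue("video", "stream chunk")
print(dequeue())  # → "call frame 1": voice jumps ahead of the earlier e-mail
```

Even this toy version shows why "pure" neutrality is unworkable: with no prioritisation at all, a burst of bulk traffic would delay every voice packet behind it.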
Where the debate becomes really anguished is when there is any suggestion that traffic management goes further to prioritise some service providers' content or applications over others or even to block access to a rival's content or applications.
The last time that I addressed the issue of net neutrality in this series of columns (September 2006), the debate was raging in the USA but hardly featured in Europe. Why has it always been such a big issue in the United States? The explanation is that, in most parts of America, effectively there is a duopoly between the telephone company and the cable company and little real competition.
Here, in the UK, we have more network competition and net neutrality has not been the issue that it has on the other side of the Atlantic. However, the debate here is becoming more high-profile for several reasons: the growth of bandwidth-hungry services like iPlayer and YouTube, the explosion of Internet traffic over mobile networks, a review by the body representing regulators throughout the European Union, and implementation of a revised EU Framework for telecoms.
So far, the debate has been highly polarised. For instance, BT has argued that the BBC and other content providers should not expect a free ride over its network while, for its part, the BBC has complained that BT throttles iPlayer at peak times.
One might think that, from a consumer point of view, net neutrality is obviously desirable and its strongest proponents want it to be mandated by the regulator or even (in the USA) enforced by legislation. However, the current position - effectively a 'best effort' approach in the face of limited capacity and rising volumes of bandwidth-demanding traffic - presents several dangers.
The most obvious threat is increasing congestion which will degrade quality of service for all consumers and most applications. More particularly, delay-sensitive applications, such as voice over Internet Protocol (VoIP) or video services, could be degraded or even lost. Most worrying of all, if we do not agree a consensual and realistic approach to net neutrality, the net could fragment into two (or more) tiers with business users, who are ready to pay for it, receiving a better service.
Later this year, therefore, our regulator Ofcom will issue a discussion document and then a consultation document before setting out its policy and recommendations. The two key issues are likely to be discrimination and transparency.
On discrimination, we need to think of options for Internet traffic management as a continuum and determine what forms of discrimination are fair and reasonable and what forms are anti-competitive and unacceptable and should be the subject of regulatory intervention. Ofcom will need to decide when intervention would be appropriate and what form it should take.
On transparency, consumers need to be fully informed of any traffic prioritisation, degradation or blocking policies being applied by their ISP, so that they can take this into account when choosing a service provider. These policies need to be prominent, accessible and intelligible. Canada and Norway have examples of good practice here.
Our regular columnist Roger Darlington looks at a new initiative that will bring together television and broadband.
MORE PICTURES ON A NEW CANVAS
You may not have heard of it, but it's coming soon, it will transform our television offerings, and it may provide a further boost to broadband take-up. It's called Project Canvas but, when it is launched on the market, it will have a different brand name.
The central proposition is the delivery of video-on-demand (VOD) and other interactive services to the television set via Freeview and broadband. In effect Canvas is an attempt to replicate the success of Freeview for Internet television.
Behind the core proposal are three specific elements:
So what would Canvas mean for consumers?
The benefits of Canvas will be:
But possible downsides to the project are:
When the Canvas device is first launched - probably in early 2011, in good time for the Olympic Games - it will be expensive, somewhere between £200 and £249. But the 'base case' scenario is that, by 2015, there will be 4 million Canvas devices out there and 23% of digital terrestrial television (DTT) households will have Canvas on their primary set. The 'high case' scenario would see 8.3 million devices sold and 50% of DTT households with at least one set covered.
An appealing feature of the Canvas project is the amount of thought that has gone into the issue of accessibility regarding the new device. While the set top box will not at first have all the features that disability groups want, the plan is that the boxes will be upgradeable, enabling new accessibility solutions to be added in the future.
A particularly interesting consequence of the launch of the Canvas offering is the likelihood that it will stimulate some further take-up of broadband, since one will need a broadband line to access the services, and some further use of the Internet, since the consumer will be able to access the Net without needing a computer.
The BBC estimate is that, over five years, between 500,000 and 870,000 homes which are not currently online will take broadband as a result of Canvas. Such a development would have positive implications for the online delivery of public services such as NHS Direct.
Project Canvas info site click here
BBC Q & A page click here
Our Internet columnist Roger Darlington explains why the current buzz about 'the cloud' is more than hot air.
THE WIND TOWARDS CLOUD COMPUTING
When it comes to the development of the Internet, there's always something new. Not everything new is important or lasting, but cloud computing is a very significant trend that will impact both users and networks.
The 'cloud' here is really just a fancy word for the Internet, but it's rather appropriate because it's large, it's out there, and it's fuzzy at the edges. Cloud computing is an umbrella term but essentially it refers to putting more material and software on the Net itself rather than on the computers and servers that businesses or users have themselves.
An analogy could be with electricity. At the beginning of the industrial revolution, companies generated their own electricity locally but, once the grid became truly national and reliable, the provision of electricity was outsourced to specialised companies and networks.
Similarly today many companies are questioning why they need to own sophisticated computers and servers which need constant and expensive maintenance and regular up-dating with new software when they could access and pay for software and services just when they need them.
For years, we've talked of putting the intelligence in the network and making terminals cheaper and dumber. Now a version of this vision is starting to happen. So, what's the point of cloud computing?
Options include software-as-a-service (SaaS), platform-as-a-service (PaaS), and infrastructure-as-a-service (IaaS), depending on what the customer wants. Suppliers of such services include some of the very biggest names on the Net, including Microsoft, Google and Amazon.
Users of cloud computing include many familiar names like the Guardian newspaper and the supermarket giant Tesco. But the largest potential user is yet to come on board. At the beginning of 2010, the then Labour Government talked of bringing in cloud-based infrastructure and services across government departments. The new coalition government is likely to drive the idea faster as a means of saving money. The so-called 'G-Cloud' strategy will probably come online in 2011 and could eventually save central government £3.2 billion of the £16 billion annual IT budget - an impressive 20%.
If the project works for Whitehall, expect local government and the wider public sector to follow suit soon afterwards.
The biggest roadblock in the rush to cloud computing is something which is rarely discussed: the capacity of what we used to call the information superhighway. While new investments in the local loop - mainly fibre to the cabinet (FTTC) but some fibre to the home (FTTH) - are receiving attention, too little thought is being given to the need for heavy investments in the backhaul: the circuits that take traffic from the local exchange back to the routers and servers of Internet service providers (ISPs), which in the UK are normally located in the Docklands area of London.
There are only six or seven backhaul networks in the country - the largest being BT Openreach - and, unless the necessary investments are made, the heavy traffic requirements of cloud computing will lead to growing contention and the inability of networks to self-heal. Networks could slow down to the point of not being usable, or usage could be capped or charged for, which might lead to a two-tier system.
Wikipedia page on cloud computing click here
Discussion of the 'G cloud' click here
Our regular columnist Roger Darlington examines the Coalition Government's communications policies so far.
WHERE NOW FOR DIGITAL BRITAIN?
In June 2009, the then Communications Minister Stephen Carter in the then Labour Government launched the Digital Britain Final Report. It ran to 238 pages and it weighed in at 1 kg (2.2 lb).
But Labour is no longer in office and Carter is no longer a minister, so what's happened to the ambitious proposals in Digital Britain as regards the telecommunications sector? Well, the report did lead to legislation in the form of the Digital Economy Act, but the Bill was caught in the 'wash up' process of the last Parliament which meant that the Conservative Opposition blocked a number of proposals even at that late stage.
So the regulator Ofcom obtained new responsibilities to combat illegal file-sharing and review the adequacy of the nation's communications infrastructure, but the suggested duty to encourage investment was dropped.
Meanwhile the new Coalition Government is planning huge cuts in public expenditure and a savage reduction in so-called quangos and it has Ofcom in the firing line. The debates are still raging but, almost certainly, Ofcom will finish up with a reduced role and a smaller budget that will have major implications for the industry which ironically funds a large part of the regulator's work.
For consumers and businesses, even more important than having an effective regulator is having access to a communications network that meets their needs and ensures the nation's international competitiveness. So what's the new government doing here?
Well, it's saying the right things. The new Culture Secretary Jeremy Hunt assured an industry conference in July 2010: "I hope you are in no doubt whatsoever about how important the Government considers broadband as a part of our economic infrastructure." He asserted: "All of us share the ambition that, by the end of this Parliament, this country should have the best superfast broadband in Europe and be up there with the very best in the world."
Of course, it's one thing to make a declaration of intent; it's another thing to deliver policies that make a positive difference on the ground - or, in this case, in the ground. The first decisions have been exceedingly disappointing.
As far as current generation broadband is concerned, the last government planned to roll out a universal broadband commitment of 2 Mbit/s - a speed lambasted by the Conservative Opposition as paltry - by 2012, but the current government has retained the 2 Mbit/s speed and delayed implementation until 2015.
As far as next generation broadband is concerned, the last government planned a 50p a month levy on all fixed lines to fund delivery of next generation access to the 'final third' of the country that would not receive it under private sector provision, but the present government immediately dropped the levy idea and has now announced that it will not decide whether public funding is needed until January 2012 and, if the decision is affirmative, funding - probably from the BBC licence fee - will not commence until 2013.
Another way that the government might have assisted private sector investment in next generation broadband is through an amendment to the present taxation rules. In Opposition, the Conservatives undertook to conduct a review. Now Communications Minister Ed Vaizey has abandoned that promise to look again at the taxation regime for new fibre networks.
The current taxation regime works to the advantage of BT and Virgin Media, but potential competitors to these companies claim that the current arrangement adds 10% to their costs compared to the situation faced by BT and Virgin.
So, is there any good news from government these days?
DCMS Secretary of State Jeremy Hunt has announced that there will be three trials of the delivery of next generation access in rural and hard-to-reach areas. Broadband Delivery UK - the organisation which will be the delivery vehicle for the Government's broadband policies - will manage the procurement of these trial projects.
Also the Coalition Government has at last backed a simplified version of a package of proposals on spectrum that the Labour Government failed to get through the Commons before the General Election. This should in time enable mobile networks to make a better contribution to the delivery of new high-speed services.
But the truth is that the Coalition Government is still a long way from having the necessary detailed and credible set of policies to deliver a truly digital Britain. Watch this space.
"Digital Britain Final Report" click here
Jeremy Hunt's speech of 8 June 2010 click here
Jeremy Hunt's speech of 15 July 2010 click here
Ever wondered how web sites obtain their names? Our Internet columnist Roger Darlington explains the system.
HOW THE WEB WORKS
The World Wide Web - the graphical part of the Internet - was invented by a British scientist, Tim Berners-Lee, in 1989 while he was working at the European Centre for Nuclear Research (CERN). Every day it becomes bigger and bigger: the Wikipedia site alone has some 3.5 million pages in English, each of which has its own unique address so that all users can find it and link to it.
Each web page address can be thought of as having three elements. The first element is everything up to the first dot. This usually involves the letters 'http', which stand for Hypertext Transfer Protocol, and the letters 'www', which of course stand for World Wide Web.
The second element is everything after the first single forward slash, which identifies the particular page on a web site with more than one page - which is almost every site. The third element is everything in between the other two elements. It is called the domain name and is the subject of the rest of this column.
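The three elements described above can be pulled apart programmatically. Python's standard urllib.parse module, for instance, splits an address into its scheme, its domain portion and its page path (the URL below is a made-up example):

```python
# Splitting a web address into the elements discussed above, using
# Python's standard library. The example URL is invented.
from urllib.parse import urlparse

url = "http://www.example.co.uk/articles/2010.html"
parts = urlparse(url)

print(parts.scheme)  # → 'http'                 (the transfer protocol)
print(parts.netloc)  # → 'www.example.co.uk'    (contains the domain name)
print(parts.path)    # → '/articles/2010.html'  (the page on the site)
```

Note that a parser treats the protocol ('http') and the host ('www…') as separate pieces, whereas the informal description above lumps everything before the first dot together; the substance is the same.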
There are two sorts of domain name. Generic Top Level Domains, abbreviated as gTLDs, are those ending in terms like .com or .org (currently there are around 20). Then there are Country Code Top Level Domains, abbreviated as ccTLDs, such as .uk for Britain or .de for Germany (a wide definition of the term 'country' means that there are over 240).
Generic Top Level Domains, which tend to be used by companies or organisations that operate beyond a single nation, are overseen by an organisation called the Internet Corporation for Assigned Names and Numbers (ICANN), which is a not-for-profit private sector organisation based in California.
Country Code Top Level Domains, which tend to be used by bodies operating solely within a particular nation state, are managed by a separate body for each country, with ICANN providing a coordination role. In the UK, that national registry is called Nominet, which was founded in 1996.
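The gTLD/ccTLD split described above amounts to looking at the last label of a domain name. A minimal sketch, using tiny illustrative samples rather than the full registries:

```python
# A sketch of classifying a domain name as generic or country-code,
# based on its final label. These sets are small illustrative samples;
# the real registries hold around 20 gTLDs and over 240 ccTLDs.
GENERIC = {"com", "org", "net", "info"}
COUNTRY = {"uk", "de", "fr", "jp"}

def classify_tld(domain: str) -> str:
    """Return 'gTLD', 'ccTLD' or 'unknown' for the given domain name."""
    tld = domain.rsplit(".", 1)[-1].lower()
    if tld in GENERIC:
        return "gTLD"
    if tld in COUNTRY:
        return "ccTLD"
    return "unknown"

print(classify_tld("example.com"))    # → gTLD
print(classify_tld("example.co.uk"))  # → ccTLD
```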
Nominet has its offices at the Oxford Science Park where it employs some 115 staff. It is the second largest national registry (after Germany) with around 9 million .uk domain names, the very first of which was issued 25 years ago (domain names preceded the web). In fact, some four-fifths of these names are not actively used.
Nominet itself 'sells' very few domain names directly. Instead it operates through a large body of registrars who sell domain names to web site owners and often provide other web-related services such as hosting. The UK has a very large community of registrars - many more than Germany - running to some 3,000, but the largest 20 account for about 75% of .uk domain names.
Domain names are cheap: all registrations are for two years and Nominet charges a 'wholesale' price of just £5. Registrations are renewable on the same terms without limit. I am one of the 9 million with a .uk domain name for my web site.
Disputes over the use of domain names tend to be of two main types.
One type of case concerns intellectual property: a company might complain that its name has been hijacked or misused in a domain name, and Nominet has a system of 40 independent arbitration experts to resolve around 700 such disputes a year.
Then there are cases of sites responsible for misrepresentation and fraud and on occasion Nominet will suspend domain names in response to a police request because the name is being used for criminal activity.
Nominet is a not-for-profit organisation with a turnover of around £20 million which makes a healthy operating surplus, but much of this is ploughed into underpinning the security and resilience of the system.
It is very important to the government and to stakeholders that Nominet operates in an open and transparent way and it has recently carried out a major review of governance. Following this review, the approach to making policy in the .uk space has been changed to a stakeholder-led process based on issues and anybody who wishes can now set up an appropriate Issue Group to address a perceived problem.
The new .uk policy process is supported by a Stakeholder Committee appointed by the Nominet Board. This committee has four Nominet members and four with a wider background. I have been selected to reflect the interests of consumers - hence your columnist's knowledge.
Internet Corporation for Assigned Names and Numbers click here
Nominet click here
Nominet .uk policy process click here
Nominet Stakeholder Committee click here