Since 2003, I have written regular articles on information technology for Connect, which used to be a separate trade union and in January 2010 became a section of the larger union Prospect. Since 2013, I have contributed such articles to both the Connect section newsletter "DigitalEye" and the Prospect magazine "Profile". The text of all these articles, with relevant hyperlinks, is filed on my web site, and this page brings together all those from 2013. If you would like to comment on any of them, e-mail me.
January 2013: Prediction Not An Exact Science
February 2013: Does Britain Really Have The Best Broadband In Europe?
May 2013: What Is Money Anyway?
July 2013: A Vision For TV's Future
September 2013: Have We Become Too Dependent On The Internet?
October 2013: How Not To Conduct A Communications Review
December 2013: Should Ofcom Have A Role On Internet Content?
Forecasting technological change is not just about whether, but when and how, such change will happen, explains ROGER DARLINGTON.

PREDICTION NOT AN EXACT SCIENCE

Some predictions of new technology are just plain wrong - at least for now. For instance, in 1968 the film "2001: A Space Odyssey" represented a future three decades away in which the moon would host a working community. This seemed like a realistic prospect but, in fact, nobody has even walked on the surface of the moon since 1972. We have assumed that flight times for travellers will become shorter and shorter but, since Concorde stopped flying in 2003, no civilian aircraft flies faster than the speed of sound. BT and a couple of banks launched a trial of something called Mondex - an electronic wallet or purse - in Swindon in 1995 but, two decades later, we are still waiting for a ubiquitous system of e-cash. Many technological predictions of "the end of ..." variety have been totally off the mark. It is not that the new technology did not happen; it is just that the old technology has managed to coexist with the new. So television did not eliminate radio or the cinema, both of which are still going strong. The cheap and easy availability of recorded music has not taken away the popularity of live music. In 1980, I visited the infamous Watergate building in Washington DC to view one of the world's first set-ups of a totally paperless office but, even with current tsunamis of e-mails and text messages, you may have noticed that your workplace still has paper. However, most predictions of technological change are accurate in form but misplaced in timing. The American scientist Roy Amara is credited with the astute observation: "We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run."
When two cellular mobile networks - then called Cellnet and Racal Vodafone - were first launched in the UK in 1985, it was thought that usage would be confined to vehicles and a major retailer called itself Carphone Warehouse (anachronistically, it still does). Yet today more people have a mobile than have a fixed line. When Bill Gates co-founded Microsoft in 1975, he had a vision of a computer in every home which, at the time, seemed utterly fanciful. Now, with the explosive growth of smartphones, we can realistically talk soon of a computer in every pocket or purse. In the late 1990s, the notion of e-commerce was introduced and commentators talked of "the new economy", but market share grew very slowly because most consumers were not on the Net and those that were connected via dial-up. Now that 80% of British homes are on the Net and have increasingly fast broadband connections, e-commerce is ravaging traditional bricks-and-mortar commerce. In the last few months alone, Comet, Blockbuster, HMV and Jessops have all gone to the wall, in large part because digital products and online delivery have transformed the retail industry. Obviously those products that come in digital form or can be converted to it - such as software, music, film, books, newspapers - are going to be at the forefront of this revolution, but what has happened with e-commerce will eventually come to pass with e-education, e-health and e-government. This is partly because current technologies have still not run their course, so remember Moore's Law (computers double in power every two years) [click here] and Kryder's Law (the amount of data that can be fitted onto a disk of a given size doubles every year) [click here]. It is partly because a raft of even newer technologies is on the way, including biotechnology, nanotechnology and 3D printers.
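The compounding implied by those two laws is easy to underestimate. Here is a minimal sketch in Python (my own illustration, not from the article; the function name is hypothetical) of what doubling every one or two years means over a decade:

```python
# Sketch of compound doubling under Moore's Law (power doubles roughly
# every two years) and Kryder's Law (storage density doubles roughly
# every year). Illustrative only - the real-world rates are approximate.

def growth_factor(years: float, doubling_period: float) -> float:
    """Multiplicative growth after `years`, with one doubling
    every `doubling_period` years."""
    return 2 ** (years / doubling_period)

print(f"Moore's Law over 10 years:  {growth_factor(10, 2):.0f}x")   # 32x
print(f"Kryder's Law over 10 years: {growth_factor(10, 1):.0f}x")   # 1024x
```

A small difference in doubling period compounds into a thirty-fold gap over a single decade, which is one reason forecasts so often get the direction right but the timing wrong.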
These technologies threaten current business models, companies and jobs, but they also offer us the prospect of longer and healthier lives and genuine lifelong learning, and they offer the world easier ways of feeding a growing population, creating cleaner sources of energy, and tackling climate change. So what is the lesson of this review? You need to consider from time to time how technological change is likely to affect your life - including your job - and be prepared for transformation, because sooner or later change will happen.
The Government has laudable aims for broadband but needs to be held to account, insists ROGER DARLINGTON.

DOES BRITAIN REALLY HAVE THE BEST BROADBAND IN EUROPE?

Government policy is as succinct as it is grand: “Our ambition is to deliver the best superfast broadband network in Europe by 2015”, declares the web site of the Department of Culture, Media & Sport. But who is going to assess whether Government is meeting this aim and what measures will be used to determine whether the ambition has been achieved? Government has outsourced this monitoring task to the regulator Ofcom, which was to produce a balanced scorecard assessing broadband provision in the UK against five metrics – coverage, speed, price, choice and take-up – to show how the UK is progressing against other European nations. This was originally intended to form part of the Communications Infrastructure Report, published by Ofcom in November 2012 [click here]. However, it was not included, mostly because of the lack of comparable data across Europe, particularly on speed: other EU member states measure speeds in far less reliable ways. Meanwhile the Broadband Stakeholder Group has published its work programme for 2013, which contains a narrative overview of where the BSG thinks we are and where we are going and briefly addresses the Ofcom metrics [click here]. The UK has led Europe in coverage of first-generation fixed broadband availability and this looks set to continue in terms of superfast coverage, though the UK is unlikely to lead the league tables in fibre-to-the-home – as opposed to fibre-to-the-cabinet – provision. For wireless services, other countries have had a head start on the UK for 4G, but we can expect competition to be fierce amongst operators on driving coverage and take-up of 4G services from 2013.
In terms of speed, Ofcom’s latest Communications Infrastructure Report [click here] states that the average fixed-line broadband speed is now 12.7 Mbps, an increase of 69% on the previous year. The average speed across superfast connections is 45.5 Mbps. Speeds on mobile services are more difficult to measure. Taking price and choice together, Ofcom’s International Communications Market Report 2012 shows the UK coming second only to Poland in having the cheapest average fixed broadband revenue per person in 2011, at £54. This reflects the pressure on pricing created by vigorous competition between cable, BT Retail and Local Loop Unbundling (LLU) providers. Choice and price will need to be closely monitored as superfast services become increasingly available and are taken up more by consumers. Ensuring an effective transition from high levels of competition in the first-generation world to a superfast environment is incredibly important and it is right and proper that Ofcom has identified this as a priority in 2013. On the issue of take-up, the UK is fairly strong on first-generation take-up at 80% of homes, but that still leaves a fifth of homes not on the Net at any speed. Ofcom reports take-up of superfast services to be only 7.3% of the 65% of UK premises which have access to them. This might seem a low figure, but demand for superfast services has built gradually in all markets, including those markets in the Far East that pursued major network upgrades a decade ago. So in benchmarking the UK against other European countries, we look set to do well on take-up of current-generation and mobile services, coverage of superfast services, and competitive pricing and choice of products for consumers. However, it is likely that the UK will feature less strongly on coverage of fibre-to-the-home infrastructure and the tricky issue of speed (depending on how this is benchmarked).
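As a quick back-of-envelope check on those speed figures (my own arithmetic, assuming the 69% rise is a simple year-on-year increase):

```python
# If the average fixed-line speed is now 12.7 Mbps after a 69% rise on
# the previous year, the earlier average follows by simple division.
current_mbps = 12.7
rise = 0.69  # 69% year-on-year increase
previous_mbps = current_mbps / (1 + rise)
print(f"Implied previous-year average: {previous_mbps:.1f} Mbps")  # ~7.5 Mbps
```

In other words, average speeds have gone from roughly 7.5 Mbps to 12.7 Mbps in a single year.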
Meanwhile, the BSG points out that the five metrics currently used by Ofcom miss out the key variable of how broadband networks are actually used by homes and businesses and usage is going to be an important part of the organisation’s work plan for this year. So don’t be surprised if, at the next General Election, the Government claims success for its broadband ambition. There are so many metrics making up the scorecard and the comparable data from other European countries – especially on speed – just won’t be available.
They say that money makes the world go round, but increasingly this is not notes and coins but simply the binary digits nought and one, as our columnist ROGER DARLINGTON explains.

WHAT IS MONEY ANYWAY?

I recently returned from a month off work travelling literally around the world and I used my credit card everywhere on numerous occasions. It was as easy physically and (almost) as painless emotionally to pay for a helicopter flight over Uluru/Ayers Rock as to buy a cappuccino in an airport departure lounge. So, why are we still using physical money? What is the future of virtual money? And what is money anyway? We need to start by understanding the different uses of money: to buy, to save, and to invest. As far as purchases are concerned, we seem to have been talking forever about the notion of an e-wallet or e-purse. As a Londoner, I use an Oyster card for journeys on the underground, but the facility to make small-value payments for other goods and services using similar technology is still available only through very limited schemes such as Barclaycard OnePulse. However, an obvious idea is to deploy the mobile smartphone which so many of us now have and use for so many purposes. So a new approach to micro-payments is coming from UK mobile operators EE, O2 and Vodafone. The joint venture was originally called Project Oscar and is now being rolled out under the more consumer-friendly name Weve, since the notion is that you will simply ‘wave and pay’ [for more information on Weve click here]. The Weve service will use a technology called near field communication (NFC) which will put an e-wallet on your phone, with deductions made by passing the mobile over an NFC-enabled reader at a till or checkout. Weve itself will not be a consumer brand but a platform for a whole range of loyalty cards and networks. Now consider money as a means of saving. Most modern economies suffer from inflation and all currencies fluctuate in value, so saving in pounds, or euros or dollars is problematic.
Also, most currencies – notably the mighty dollar – are no longer backed by gold reserves, and the price of gold itself has been falling quite dramatically. This has led Internet entrepreneurs to experiment with the notion of virtual currencies. Perhaps the most successful to date has been Bitcoin [for more information on Bitcoin click here]. Bitcoin is technologically very sophisticated. It uses a software system that enables people with access to powerful computers to "mine" Bitcoins and then securely use the resulting "coins" for online trading. The system is set up so that the number of Bitcoins can never exceed 21 million and so that they will become progressively harder to "mine" as the years go by. However, nobody knows who is behind Bitcoin – the best guess is that it is a Japanese guy in his 30s. Also the valuation of the "coins" seems to be the subject of a speculative bubble. It fell to a low of $7 in August 2011; then it hit a new high of $266 in early April 2013 before falling by $160 in a day. Buying and saving are minor uses of money compared to transactions – officially investments – on the world's stock markets. Here the volume and speed of electronic transactions are so great that human beings have a limited role. Instead algorithms make split-second decisions and powerful computers make vast numbers of sales and purchases of stock. This manic system is wide open to manipulation. In April 2013, the Syrian Electronic Army successfully attacked the Twitter account of the Associated Press news agency and issued bogus messages announcing that President Obama had been injured in a bomb attack. The Dow Jones fell 143 points before sense prevailed [for more information on such cyber-attacks click here]. But what is money anyway? Economists define money as a medium of exchange or a store of value, but it can be anything – shells, coins, gold, binary digits – in which users have a sufficient degree of trust and confidence.
If that trust goes – as we saw in Germany in the 1920s or at Northern Rock in 2007 – all bets are off. The future of money seems to be increasingly digital, but it remains to be seen whether consumers and citizens will place more trust in algorithms and programmers than in banks and governments.
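The 21 million cap on Bitcoin mentioned above is not an arbitrary constant but the limit of a halving schedule: in the publicly documented Bitcoin design, the reward for mining a block starts at 50 coins and halves every 210,000 blocks. A minimal Python sketch (my own illustration; the function name is hypothetical) shows how total issuance converges just below 21 million:

```python
# Sum Bitcoin's geometric issuance schedule: 50 coins per block for the
# first 210,000 blocks, then 25, then 12.5, and so on, stopping once the
# reward drops below one satoshi (10**-8 coins, the smallest unit).

def total_bitcoin_supply() -> float:
    reward = 50.0             # initial block reward in coins
    blocks_per_era = 210_000  # blocks between halvings
    total = 0.0
    while reward >= 1e-8:
        total += reward * blocks_per_era
        reward /= 2
    return total

print(f"Maximum supply: about {total_bitcoin_supply():,.0f} coins")
```

The real protocol rounds each reward down to a whole number of satoshis, so the exact figure differs very slightly, but the point stands: the supply approaches 21 million and can never exceed it.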
Our technology expert ROGER DARLINGTON puts us in the picture on the future of television.

A VISION FOR TV'S FUTURE

Last autumn, Britain completed the nationwide switchover to digital television but, if you thought that augured a period of calm in the television marketplace, think again. What we used to call simply ‘the box’ is set to continue transforming how we consume audio-visual content in an age of increasing convergence. The Digital TV Group (DTG) - which represents all the UK's major broadcasters - has launched a project with the snappy title of the Future of Innovation in Television Technology (FITT) Taskforce [click here]. The Taskforce has been split into five work streams and I am a member of the Advisory Group on Consumer Trends In The Next 10 Years, so I have been thinking recently about the future of television. First, a few facts:
As the Internet mediates more and more human activity, we become ever more exposed to the vulnerabilities of the network. Our IT columnist ROGER DARLINGTON asks:

HAVE WE BECOME TOO DEPENDENT ON THE INTERNET?

In just one month – August 2013 – we had a succession of dramatic reminders that modern societies need the Net to work faultlessly if we are not to suffer, at a minimum, inconvenience and financial loss and, at worst, major societal breakdown. In order of occurrence, the month saw the following events:
“When you try to achieve great scale with automation and the automation exceeds the boundaries of human oversight, there is going to be failure. That goes for governments, for consumer companies, for Google, or a big insurance company. It is infuriating because it is driven by unreasonable greed. In many cases, the systems that tend to fail, fail because of an attempt to make them run automatically with a minimal amount of human oversight.”

A weakness in human oversight is only one form of vulnerability that we now face. Another is a deliberate attempt to hack into systems by competitors, terrorists, or states. In their book “The New Digital Age” [for my review click here], Eric Schmidt and Jared Cohen – both top executives at Google – talk of the Cold War having been replaced by the Code War. They write: “The logical conclusion of many more states coming online, building or buying cyber-attack capability and operating within competitive spheres of online influence, is perpetual, permanent low-grade cyber war.”
And what happens when code war becomes real war? Today the Internet is the fifth battlefield after land, sea, air and space. George Friedman in his book “The Next 100 Years” [for my review click here] insists that “electricity will be to war in the twenty-first century as petroleum was to war in the twentieth century”.

No electricity means no Internet. No Internet increasingly means no way of life as we currently understand it. There are some important lessons here. Every home, business, organisation and government should have protection against malware and malcontents appropriate to the volume and the sensitivity of the data on their systems and automatic back-ups of data to secure off-site locations. Also they should have practical policies for how they would operate if they were off-line for any significant period of time and effective contingency plans for getting back online as quickly as possible. Oh, and try not to get into a war.
You'd expect a communications review by the government to be an impressive operation. You'd be wrong, insists our IT columnist ROGER DARLINGTON.

HOW NOT TO CONDUCT A COMMUNICATIONS REVIEW

On 16 May 2011, the then Secretary of State for Culture, Media & Sport Jeremy Hunt launched a Communications Review with an "open letter" [for text of the letter click here] which posed 13 questions and asked for submissions within six weeks in the form of no more than four or five pages. The open letter gave no clear direction to the review but made these ridiculous demands on stakeholders. It was not until December 2011 that the Department finally published the first 168 submissions to the review, including my own on the regulation of convergence [for text of my submission click here]. The Hunt letter concluded:

So how did it all work out? There was no Green Paper; one was drafted but nothing was published. There was no White Paper; instead, on 30 July 2013, a Strategy Paper was finally published. Hardly anyone noticed its publication (the House of Commons had already risen for the summer recess) and there was virtually no media coverage (although the content was all about the media). This was probably just how the Government wanted it, because it had become increasingly embarrassed about a review that never knew where it was going and never got anywhere interesting. So what about the Draft Bill? The Strategy Paper states: “Given the pace of change, it is clear that we will need to keep the legislative framework for the sector under review. Rather than making sweeping changes to legislation, instead we are proposing to make incremental changes – updating the framework of 2003 as necessary." So no new Communications Bill either. After over two years, almost 200 submissions and five seminars, what do we have? We have a 46-page Strategy Paper entitled "Connectivity, Content And Consumers" [for text click here], which is a less than riveting read.
Part 1 of the paper deals with "World class connectivity and digital inclusion". There is nothing new on broadband. The paper confirms the messages in the Spending Review that Government is no longer talking of the UK having the best superfast broadband network in Europe by 2015; instead it is talking of extending superfast broadband to 95% of homes by 2017 and at least 99% by 2018 – both dates well after the General Election. There is nothing new on digital inclusion. The Government simply confirms its intention to work with Go On UK and the Big Lottery Fund (which will be awarding £15M early in 2014 for a digital skills programme) and mentions favourably the work of the Online Centres Foundation, recently renamed the Tinder Foundation (on whose Board I sit). Part 2 of the paper deals with "World beating content". A review is promised of how the prominence of Public Service Broadcasters (most notably the BBC) can be maintained as viewers move away from standard-definition formats, viewing at the time of broadcast, and traditional numerical channel lists, to a world of high-definition, catch-up TV with more dynamic and tailored menus. That will be a challenge. Part 3 of the paper deals with "Consumer confidence and safety". This is a really interesting section of the report, which addresses the complicated issues of illegal, harmful and inappropriate content, especially in the online environment. It is ironic that it was the finalisation of this section that caused the latest delays in publication of the report, because the "open letter" launching the Communications Review made no specific reference to this problem. Part 4 of the paper is entitled "Cost of living", but this section might as well have been called "Odds And Sods". Some of the issues covered are certainly about cost to the consumer – switching, bill shock, premium rate services.
Other issues, however, are hardly about consumer costs – policy on Internet traffic management, a review of the broadcasting competition regime (welcome to BT), and a new communications regulatory appeals process (welcome to Ofcom). In short: too little, too late, too weak.
The growing debate on Internet content raises questions about the responsibility of everyone from Government to parents. ROGER DARLINGTON wonders about the role of the regulator.

SHOULD OFCOM HAVE A ROLE ON INTERNET CONTENT?

When the Communications Act 2003 was enacted, five regulators were merged to create the new regulator Ofcom with responsibility for broadcasting, telecommunications and spectrum. Subsequently regulation of posts has been added to Ofcom’s duties. At the time, some people were surprised that one word was missing from the Communications Act. There was no mention of the Internet. In many ways, this was understandable. The Net was still a relatively new phenomenon and most UK homes were still not online; content comes from multiple sources around the globe, including users themselves through social media; and the Government did not want to be accused of state control or censorship. However, today around 80% of UK homes are online and, according to the latest survey from the Oxford Internet Institute, 44% of Net users want the Government to regulate it more. Government itself is constantly – but haphazardly – entering the debate whenever there is a new scandal or a controversial court case whipped up by the media. In the last few months alone, DCMS Secretary of State Maria Miller and Prime Minister David Cameron have both held ‘summits’ with Internet service providers and the likes of Google and Facebook. In her invitation to the first of these events, Maria Miller said: “Recent horrific events have again highlighted the widespread public concern over the proliferation of, and easy access to, harmful content on the internet.” [For full text of the Miller letter click here] But the ‘summits’ focused mainly on illegal content and neglected the issues of potentially harmful or grossly offensive content. Is it time for a less political and more evidence-based review of controls on Net content and services?
A year ago, the House of Lords Select Committee on Communications conducted an inquiry into media convergence and its report was published in March 2013 [for text of the report click here]. The Committee recommended: “The next Communications Bill should establish a more pro-active role for Ofcom regarding the Internet than has been the case to date, to be reflected in Ofcom’s general duties.” It added: “Specifically, Ofcom should be required, in dialogue with UK citizens and key industry players, to establish and publish on a regular basis the UK public’s expectations of major digital intermediaries such as ISPs and other digital gateways.” In the Committee’s view, Ofcom should not have any powers or sanctions in relation to the Internet, but: “Should these reviews reveal a major concern on the part of the UK public, which the industry repeatedly and without reason fails to respond to, Ofcom would then be required to advise the Secretary of State.” So what happened to this most interesting recommendation? When the Government response to the Lords report was published by the DCMS in June 2013 [for text click here], it omitted to make any mention of the recommendation. Indeed the whole thrust of the DCMS response was that these issues would be addressed in the promised Strategy Paper concluding the Communications Review – but they were not. In “Connectivity, Content And Consumers” published in July 2013 [for text click here], there was simply a general exhortation: “In preparation for a more converged future, we want industry and regulators to work together on a voluntary basis to ensure that there is a common framework for media standards.” Interestingly Ofcom itself has produced a response to the Select Committee’s report – also in June 2013 – which specifically mentioned the recommendation on Ofcom and the Internet [for text click here]. 
The thoughtful letter from Claudio Pollack said: “We can see merit in the approach recommended by the Committee.” Obviously there would be a number of issues to address. As Ofcom put it: “Consideration would need to be given to the legislative changes required in order to carry out such a duty effectively, including the ability to request information from a set of stakeholders which do not fall under the scope of Ofcom’s remit, such as Google and Facebook, and how the review function might be funded.” These are not insurmountable obstacles but Ofcom warned: “As these are matters relating to Ofcom’s duties, such a role is for the Government and Parliament to determine.” So how do we take forward this interesting idea?