Sunday 22 July 2012

Steady as she goes…

‘Poked with a twig’ was just one of several relatively benign metaphors that the Lex column in last week’s FT used to describe the reaction of Europe’s telecom incumbents to Neelie Kroes’ decision not to impose price reductions on access to their legacy copper networks.  Predictably, that view was not shared by the competitor fraternity: a swift response from ECTA argued that “the direction…will harm the competitive conditions of the broadband markets and will eventually harm consumer interests without fostering investments”.  But the incumbents’ sense of a ‘welcome respite’ stemmed not just from the favourable pricing decision but also from its durability: Kroes said she wanted the regulatory guidance ‘to apply at least until 2020’.

The idea of regulatory stability is routinely trotted out as a precondition of private sector investment, but Kroes was right to give this factor renewed prominence in her policy statement.  From the investor’s perspective, Sean Williams, Group Strategy Director of BT, provided a very tangible illustration of the fear of instability in the evidence he gave to the House of Lords Select Committee on superfast broadband:

“One of the most supportive things that Ofcom could do is make a long-term commitment to its regulatory policy towards fibre networks. At the time when we made our fibre investments, we had a conversation with Ofcom about how it was seeking to regulate this risk investment that we were making. It made some very supportive remarks but it did not feel the need to regulate this investment. In fact, it was so uncertain that it would be very difficult to regulate, and we completely agree with that. But every three years it has to make that choice again. That is a very difficult environment for us to make a 20-year-view investment with a 12-year payback if the regulatory regime can change every three years…” 

He has a point!

Thursday 12 July 2012

The eye of the beholder

Ofcom’s Annual Report was published yesterday and, guess what, the regulator gives itself a largely glowing end-of-term assessment.  To be fair, the Report does at least dissect Ofcom’s individual goals and attempts to monitor progress on each, but some of the progress statements take a rather blinkered view of regulatory achievement.  For example, as regards broadband, Ofcom identifies one of its aims as “to create an environment that gives confidence to potential investors, enabling them to make a case to roll out new superfast networks”. Here’s the first action review:

“After we required BT to offer access to its network of underground ducts and telegraph poles to allow companies to offer superfast broadband services, BT published prices for these in October 2011. These prices are among the lowest, if not the lowest, for comparable products elsewhere in Europe….In areas where BT has no commercial plans to invest, access to these ducts and poles will allow other providers to bid for the funding which will be made available by Broadband Delivery UK (BDUK)…” 

Sounds plausible enough but, in the real world, things look a little less rosy.  We heard just this week that Fujitsu, BT’s only qualified rival under the BDUK tendering framework, has withdrawn from two further broadband contests. Indeed, there’s a growing consensus that BDUK’s whole approach to subsidy allocation unreasonably favours the incumbent.  As to competitive access to BT’s passive infrastructure, there is abundant anecdotal evidence from competitive providers that the PIA product is simply not fit for purpose.  No wonder, then, that BT itself acknowledges a lack of interest: in evidence to the House of Lords Select Committee on superfast broadband, Sean Williams, BT’s Group Strategy Director, admitted that “there is no demand for PIA”.

The success of other facets of Ofcom’s broadband policy can (and will) be questioned on another day but, for current purposes, a more realistic judgement in Ofcom’s report card might be: ‘could do better’.

Monday 2 July 2012

Plumbing matters

For anyone who believes that the open Internet and cloud computing are simply dependent on the exchange of ideas and content services, June 30th provided a timely reminder.  On that day, as blogger TechnoLlama reports, a significant part of the ‘cloud’ failed when a key Amazon EC2 data centre in Virginia was hit by a violent storm, knocking out its power.  This particular cloud centre hosted important content for various Internet services, including Netflix and Instagram, putting them offline for up to six hours.  Nor was this a wholly isolated incident: the same data centre had crashed earlier in the month, and it seems that cloud outages are now becoming so frequent that websites and Twitter accounts have sprung up to document them.

A central assumption behind the growth of cloud services is that storage and processing are distributed amongst different data centres, in theory making outages less likely. The whole principle of distribution rests on the assumption of network resilience: if one server is knocked down, others can take up the load. Indeed, this is one of the founding pillars of the Internet as we know it.  However, the reality is that, from a distributed model, we have been migrating content to more and more centralized services – in terms of both geography and industry concentration.  It’s reported that the top 10 cloud providers are now all based in the US, and that Amazon alone holds an estimated 15% of the cloud market.
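The arithmetic behind that founding pillar is worth spelling out. If replicas of a piece of content sit in data centres that fail independently, the probability of losing all of them at once shrinks rapidly with each extra replica; but if the ‘replicas’ actually share one centre or one provider’s correlated infrastructure, the benefit evaporates. A minimal back-of-envelope sketch (the per-centre failure rate is purely illustrative, not a real figure for any provider):

```python
# Back-of-envelope resilience arithmetic for replicated content.
# Assumption (hypothetical): each data centre is unavailable with
# probability p_single, and failures are independent across centres.

def outage_probability(p_single: float, replicas: int) -> float:
    """P(all replicas down at once) under independent failures."""
    return p_single ** replicas

p = 0.01  # illustrative: each centre down 1% of the time

for n in (1, 2, 3):
    print(f"{n} centre(s): joint outage probability = {outage_probability(p, n):.6f}")

# The catch: if the replicas in fact share a centre (or correlated
# power/network infrastructure), independence fails and the effective
# replica count collapses back toward 1 -- which is roughly what a
# single storm in Virginia demonstrated.
```

With the illustrative 1% figure, a second independent centre cuts the joint outage probability from one in a hundred to one in ten thousand; concentration in one location forfeits exactly that gain.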
For TechnoLlama, the danger is that this growing reliance on fewer providers has made legal or regulatory control of the Internet an easier task.  For our purposes, the simpler, but no less vital lesson is that the ‘information superhighway’ relies crucially on the resilience of its road (network) infrastructure. Put another way, plumbing matters!