Friday, 31 October 2014

The weakest link


Not long ago, I confessed that I had become a subscriber to BT’s Infinity (FTTC) service. The upgrade in capacity was substantial – in my case, increasing download speeds from about 8 Mbps to more than 70 Mbps. I’m obviously pleased to have so much more power under the bonnet, and I’m still amazed BT managed to squeeze that extra power out of an old-fashioned village network. And my turbo-charged system certainly does some things a great deal faster. And yet… if I’m honest, the majority of my web browsing seems little affected by the extra capacity. I still see frequent messages like ‘Waiting for XYZ website…’ and ‘Loading…’ This is all just a bit unexpected – and a trifle disappointing.

Fortunately, there is some analysis of this phenomenon in an article in this week’s TechPolicy Daily, itself based on data from Akamai’s respected ‘State of the Internet’ reports.  The thrust of the article is that ‘browsing speed’ is the product of both network capacity and non-network factors – the responsiveness of web servers, web browsers and end user devices.  According to the author, this means that “increasing network capacity has a minimal effect on the user experience of surfing the web.  In fact, upgrading from a 10 Mbps broadband connection to a faster one is unlikely to produce a perceptible effect most of the time”.
 
While the fact that network capacity is not the only determinant of a faster web seems pretty obvious, the surprise is that those non-network factors are far more influential in the browsing experience – up to 15-20 times more important. Akamai has logged these ratios internationally for the first quarter of 2014 and some of their results are shown below:
 
Nation     Ratio
Japan        7.4
Canada       8.9
Germany      8.3
US          11.2
Italy        6.4
France       8.3
UK          16.0

Ratios of non-network to network factors in web page load time. (Source: Akamai)

Two results from this analysis are particularly striking. First, the dominant contribution of non-network factors applies equally in markets with high-speed networks – such as Japan – and in those, like Italy, where broadband speeds are generally lower. Second, the UK appears particularly susceptible to non-network delays in browsing: the ratio of 16.0 reflects the fact that while UK network delays average only about 0.3 seconds, non-network factors add a whopping 4.8 seconds.
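Incidentally, anyone curious about the equivalent split on their own connection can get a rough approximation from the browser itself. The sketch below (TypeScript, run from a page script or the developer console) uses the standard Navigation Timing API; the simple network/non-network division is my own crude approximation for illustration – it is emphatically not Akamai’s methodology.

```typescript
// Rough split of page load time into "network" and "non-network" components,
// using the browser's Navigation Timing API. Illustrative only: subresource
// fetches also cross the network, so the non-network figure is overstated.

const [nav] = performance.getEntriesByType(
  "navigation"
) as PerformanceNavigationTiming[];

if (nav) {
  // Fetching the main document: DNS lookup, TCP connect, request/response.
  const networkMs = nav.responseEnd - nav.fetchStart;

  // Everything after the bytes arrive: parsing, script execution, layout,
  // rendering -- the "non-network" factors the article talks about.
  const nonNetworkMs = nav.loadEventEnd - nav.responseEnd;

  console.log(`network:     ${networkMs.toFixed(0)} ms`);
  console.log(`non-network: ${nonNetworkMs.toFixed(0)} ms`);
  console.log(`ratio:       ${(nonNetworkMs / networkMs).toFixed(1)}`);
}
```

On Akamai’s UK averages, that last line would print 4.8 / 0.3 ≈ 16 – precisely the UK ratio in the table above.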

Sadly, the reasons for the UK’s extreme result are not explored in this US publication, but the overwhelming message from the analysis is clear: a faster internet is going to rely not only on network capacity but, even more crucially, on “faster servers, faster laptops and smartphones, better organised web pages and faster browsers”. Our current infrastructure planning, dominated as it is by network and broadband issues, perhaps needs to acknowledge that more clearly.

Thursday, 9 October 2014

No more heroes

Who are the heroes and villains of the internet?  In any contemporary discussion, it’s a fair bet that the consensus view on villainy will be based on something like this: 

“The open Internet’s founding principle is under attack. Policymakers in the U.S. are considering rules that would erase net neutrality, the principle that all data on the Internet should be treated equally. If these rule changes go through, many fear it will create a ‘two-tier’ Internet, where monopolies are able to charge huge fees for special fast lanes while everyone else gets the slow lane”. 

While it could have come from any number of neutrality lobbyists, this particular clarion call originates with Mozilla, the not-for-profit developer of the Firefox browser. For the avoidance of doubt, Mozilla helpfully identify the villainous ‘monopolies’ here as the US cable companies and major telcos. More recently, the company has joined forces with the Ford Foundation to launch what they call the Open Web Fellows Program, “a global initiative to recruit the heroes of the Open Internet”. Here’s some of their rhetoric:

“We are at a critical point in the evolution of the Internet. Despite its emergence as an integral part of modern life, the Internet remains a contested space. Far too often, we see its core ethos – a medium where anyone can make anything and share it with anyone – undermined by forces that wish to make it less free and open”.

“We are building a global movement to protect the free and open Web.  Mozilla supports the heroes of the Web – the developers, advocates and people who fight to protect and advance the open Internet”. 

So does all that resolve the identity of the good guys and the bad guys…? Well, in the case of Mozilla, not entirely. It turns out that the company (“We exist to protect the free and open web”) is almost entirely dependent on search engines for its income, and the vast majority of that ($300m in 2012) comes, of course, from Google – an operator not known for its advocacy of the ‘open web’. And even Mozilla’s new ‘heroes’ initiative has faced criticism. As Glyn Moody points out, with much regret, Mozilla has endorsed the incorporation of Digital Rights Management (DRM) into the latest version of the web’s core standard, HTML5. DRM is, he says, “the very antithesis of openness and of sharing”. On that basis, he argues that:

“Mozilla's Firefox is itself a vector of attack against openness and sharing, and undermines its own lofty goals in the Open Web Fellows programme”.

Not so clear, then. In practice, the ecosystem of the internet is deceptively complex – one that defies easy characterisations like heroes and villains.

Thursday, 18 September 2014

Betting on the future

The dog days of summer appear to be with us still... And, finding so little UK news of interest, I’ve yet again been forced to look to the US for inspiration. The problem there is that so much of the useful editorial comes laden with large dollops of political lobbying – probably more so than on this side of the Atlantic. But sometimes the propaganda can itself be enlightening, and what soon caught my eye was the report recently compiled by the Progressive Policy Institute (PPI) on its ‘US Investment Heroes of 2014’.


In its third year of publication, this report focuses on identifying the U.S.-based corporations with the highest levels of domestic capital expenditure.  The authors, Diana Carew and Dr Michael Mandel, hope that their listing “can help inform good policy for encouraging continued and renewed investment domestically.”  Well, Amen to all that but first, here’s the Top-10 from the latest spending list (excluding energy companies):
 
RANK   COMPANY          EST. 2013 US CAPEX ($m)
1      AT&T             20,944
2      Verizon          15,444
3      Walmart           8,652
4      Intel             8,442
5      Comcast           6,596
6      Google            4,697
7      General Motors    4,591
8      Apple             3,807
9      Union Pacific     3,496
10     Ford Motor        3,392
The top Investment Heroes of 2014 actually look very similar to those of last year. The continued strength of domestic investment by telecommunications and cable companies remains very clear: together, telecom and cable companies recorded $46bn of domestic CAPEX, just over 30% of the US total. Within the sector, AT&T, which has invested significantly in expanding its U-verse fibre-optic network, has remained top spender in all three annual reports. Similarly, Verizon has focused its investment on building out its 4G LTE wireless network, and remains in the runner-up spot. Comcast moves up from 10th place last year to 7th this year, on the strength of investment in its X1 cable platform equipment, wireless gateways, and network capacity.
 
It’s hard to scope UK comparisons against these figures – given huge differences in addressable markets, the technology used, accounting conventions and so on. Nonetheless, according to a learned colleague, writing earlier this year, BT's infrastructure spend has been running at about $1.86 billion (£1.15bn) a year, out of total CAPEX (including non-UK) of $4bn. Using a very broad brush, and a similar methodology to the PPI Report’s, we might end up with relevant domestic CAPEX of $2.5-3bn annually. For the dominant UK player, that seems rather lightweight against AT&T’s $21bn (I’ve sketched the per-head arithmetic at the end of this post). On the other hand, the BT figure excludes a sizeable chunk of government (BDUK) subsidy. Indeed, some UK commentators – and I include myself in this group – have seen the existence of these government funds as a basis for criticising BT for not investing more from its own resources. At any rate, we tend to see such private sector spending as a predictable response to a rational business case. That attitude contrasts sharply with that of the authors of the PPI Report, who firmly believe that the maintenance of discretionary investment merits some species of ‘compensation’ through government or regulatory concessions. The Report dwells on these incentive measures at some length but they can essentially be summed up in biblical terms, i.e.
 
 
The Eight US Commandments* (* per PPI Report):

1. First, investment heroes should be commended publicly for their willingness to ‘bet on the nation’s future’.
2. Policymakers must be aware that all regulations impact on investment appetite – so take care!
3. The unintentional accumulation of past regulations can also impede investment flow, so keep these constantly under review.

(For the FCC)

4. Ensure the next spectrum auction proceeds as scheduled in mid-2015.
5. IP-transition trials must form part of a gradual, complete transition to new network technology.
6. Local governments should be deterred from deploying their own broadband networks.
7. Utility-style, ‘Title II’ regulation of the internet must be avoided.
8. New data privacy measures could also adversely affect the investment climate.
 
It’s easy enough to scoff at such a daunting ‘shopping list’, but it might actually provide some clues for UK policy. As we continue to seek private sector investment in sustainable (fibre-based) broadband networks, perhaps the UK will have to accept that commitment to such investment may well require some form of regulatory incentive – probably not the generic concessions suggested in the above ‘Commandments’, but possibly some form of franchising or geographic exclusivity. After all, we desperately need more ‘heroes’.
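As promised, here is the back-of-envelope arithmetic behind my ‘rather lightweight’ remark, as a short sketch. The PPI-style ‘domestic’ slice of BT’s spend and the per-head normalisation are my own illustrative assumptions, not figures from the PPI Report.

```typescript
// Back-of-envelope comparison of BT and AT&T domestic capex, normalised
// per head of population. All adjustments are illustrative assumptions.

const btUkInfraCapexUsd = 1.86e9;  // BT UK infrastructure spend (from the text)
const attUsCapexUsd = 20.944e9;    // AT&T estimated 2013 US capex (PPI Report)

// Assumption: a PPI-style "domestic capex" slice of BT's $4bn total lands
// somewhere between infrastructure-only and the whole figure.
const btDomesticCapexUsd = 2.75e9; // midpoint of my $2.5-3bn guess

// Approximate mid-2010s populations, purely for normalisation.
const ukPopulation = 64e6;
const usPopulation = 316e6;

console.log(`BT:   ~$${(btDomesticCapexUsd / ukPopulation).toFixed(0)} per head`); // ~$43
console.log(`AT&T: ~$${(attUsCapexUsd / usPopulation).toFixed(0)} per head`);      // ~$66
```

On that crude per-head basis the gap is real, but rather less dramatic than the raw $2.75bn-versus-$21bn comparison suggests.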

 

Thursday, 4 September 2014

Broadband aspiration: our missing link..?

In the dog days of July and August, my holiday reading was largely dominated by stories from the US about initiatives to facilitate municipal broadband projects, and the corresponding attempts by incumbents (and state laws) to suppress these - see here, for example.  The debate has become so heated that even content providers have felt obliged to muscle in.  Netflix, for instance, has lobbied the FCC in these resonant terms: 

“[A] single fiber-optic strand the diameter of a human hair can carry 101.7 terabits of data per second, enough to support nearly every Netflix subscriber watching content in HD at the same time. When municipalities harness that technology to extend new opportunities to new communities, federal and state laws should encourage that initiative, or at the very least, get out of the way. The Commission can and should take a hard look at state laws that facilitate the efforts of incumbents to artificially constrain broadband availability and capacity. [B]roadband is not a finite resource. No statute – state or federal – should make it one”.
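Netflix’s arithmetic is easy enough to sanity-check. In the sketch below, the per-stream bitrate is my own assumption (roughly what an HD stream needed at the time), not a figure from the filing:

```typescript
// Sanity check of the "nearly every Netflix subscriber" claim.

const fibreCapacityBps = 101.7e12; // 101.7 terabits/second, from the quote
const hdStreamBps = 5e6;           // ASSUMED ~5 Mbps per HD stream

const concurrentStreams = fibreCapacityBps / hdStreamBps;
console.log(`~${(concurrentStreams / 1e6).toFixed(1)} million simultaneous HD streams`);
// ≈ 20.3 million -- the right order of magnitude for Netflix's US
// subscriber base, so the claim is at least plausible.
```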

The American system of local government is clearly very different to ours but I wonder whether we couldn’t learn some lessons from their predicament.  Recall, for instance, the fiasco in Birmingham when BT and Virgin successfully thwarted the plans by the City Council to provide ‘ultrafast’ broadband connectivity to businesses in some previously unserved areas of the city. That situation seems to me very similar to the battles now being fought across a number of metropolitan areas in the US. Indeed, this summer’s report by the City Growth Commission had some harsh words to say about the UK environment for state-aided broadband investment in urban areas, e.g. 

“Government should commission a comprehensive review on how our current and future needs for digital infrastructure can be met, especially in the face of strict EU State Aid rules and a highly concentrated high-speed broadband market in which major players such as BT and Virgin can constrain supply and market competition.” 

Susan Crawford, Visiting Professor at Harvard Law School and an arch-critic of the US cable incumbents, believes the answer is enhanced regulation – not at the federal level, but in the hands of local mayors. These mayors would act as digital champions, as some have done already, thus fostering competition between metropolitan areas over their levels of broadband access – interview here.

Hard to say whether a comparable model might work in this country, but there has to be some risk that our cities’ broadband infrastructure – now and for future upgrades – could be caught between competition/overbuild policies at the macro level and bottom-up, infill projects at the very local, micro level.

Thursday, 21 August 2014

A guilty admission

I readily admit that I’ve long been a cynic about BT’s reliance on copper-based FTTC (VDSL) technology for its roll-out of superfast broadband access. And I’m certainly not alone in grumbling about the limited shelf-life of copper as a network solution. But then there comes the real-world matter of expediency…

I live in a small rural village in which BT has been the only viable ISP on offer – providing typical, and surprisingly respectable, download speeds of 6-8 Mbps. I watched with wry amusement the promises by my County Council that superfast availability would be “here soon” (under the aegis of a BDUK scheme), doubting whether the village’s older age profile would make it a BT priority. But, in the fullness of time, Openreach vans started to become prominent and the old BT cabinets were gradually replaced with their newer, much chunkier versions. Still, the fact that my house sits outside the village centre and is served by overhead wires made me sceptical about its superfast prospects. I was therefore hugely surprised to see that BT’s online checker suggested I might qualify for its Infinity service at speeds of up to 76 Mbps. Bah!!

A couple of weeks later, an unsuspecting BT engineer arrived on a sunny morning to install the service in my 15th century home.  He faced not only the property’s “bizarre” telephone wiring but also an electrical system that bears all the scars left by the ‘enthusiastic amateur’ who sold me the house.  The poor man visibly wilted but settled in for what was clearly going to be a long day… He finally left at 6.30pm.

And the outcome? Well, I’ve run several online speed checks and the results are pretty consistent: I’m getting download speeds of 73-75 Mbps and uploads of around 12 Mbps. Interim technology it may be, but not half bad for a country bumpkin! Hats off to BT.

Wednesday, 16 July 2014

God bless Canada

I recently mentioned the possibly suspect broadband statistics that Prof. Christopher Yoo derived to demonstrate the superiority of US regulatory policies over those of Europe. Well, the Professor has been at it again, using the same US/EU data to make some even more sweeping claims. For instance:
“…the U.S. focus on private investment and competition has placed it far ahead of Europe in terms of Internet speed and access….

…U.S. broadband was cheaper for all speed tiers below 12 megabits and is comparably priced at speeds between 12 and 30 megabits, which makes it easier for low-income families to become broadband users….

…if the FCC were to impose European-style regulation, these studies indicate that the investments that have enabled such a healthy and vibrant U.S. broadband infrastructure may wane”.
As the above extract shows, Yoo’s argument rests heavily on his assessment of the (mostly) lower pricing of US broadband.  His own metrics concede that U.S. broadband costs are higher for services above 30 megabits but he argues that ‘that cost differential is justified by the fact that average U.S. households consume more than 50 percent more bandwidth than their European counterparts’.   
Does higher usage really justify a higher unit cost? Anyway, I was interested to see that an entirely independent cost study came to a rather different conclusion on the pricing issue. The Canadian telecoms regulator, the CRTC, recently published its own retail cost study, including a comparison against the other G7 countries. With apologies to the CRTC, the extract below excludes Canada but includes the US, the UK and our nearest European neighbours. All reported prices are expressed in purchasing power parity (PPP) adjusted Canadian dollars.
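(A quick aside on mechanics before the numbers: a PPP adjustment re-prices each plan in the number of Canadian dollars that would buy the same basket of goods locally. The sketch below illustrates the idea; the conversion factor is invented purely for illustration and is not a CRTC figure.)

```typescript
// Illustration of a purchasing-power-parity (PPP) price conversion.
// pppFactorPerCad = units of local currency with the same purchasing
// power as one Canadian dollar. The factor below is MADE UP for
// illustration; it is not the figure the CRTC used.

function toPppCad(localPrice: number, pppFactorPerCad: number): number {
  return localPrice / pppFactorPerCad;
}

// Hypothetical example: a UK plan at £29/month, with an assumed PPP
// factor of £0.62 per $CDN.
console.log(`PPP-adjusted: $${toPppCad(29, 0.62).toFixed(1)} CDN`); // ≈ $46.8
```

Now, the CRTC’s comparison: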

Average monthly prices in PPP-adjusted $CDN (2014)

Broadband – fixed access              US       UK      France   Germany
Level 1 (≤ 3 Mbps, 7.5 GB/month)    $62.5     n/a      n/a      n/a
Level 2 (4–15 Mbps, 30 GB/month)    $72.9    $30.2     n/a      $26.1
Level 3 (16–40 Mbps, 75 GB/month)   $79.8    $46.9    $51.2     $38.3
Level 4 (≥ 40 Mbps, 120 GB/month)   $103.2   $47.8    $56.0     $58.5

Broadband – mobile access (≥ 3G)
Level 1 (2 GB/month)                 $63.7    $21.9    $18.5     $34.4
Level 2 (5 GB/month)                 $69.1    $45.9    $43.0     $49.7

(Source: CRTC)
While the UK prices might not be the best in Europe in every case, the comparison with the US looks pretty decisive in favour of Europe.  Slam-dunk…?