Friday 31 October 2014

The weakest link


Not long ago, I confessed that I had become a subscriber to BT’s Infinity (FTTC) service. The upgrade in capacity was substantial – in my case, increasing download speeds from about 8 Mbps to more than 70 Mbps. I’m obviously pleased to have so much more power under the bonnet, and I’m still amazed BT managed to squeeze that extra power out of an old-fashioned village network. And my turbo-charged system certainly does do some things a great deal faster. And yet… and yet, if I’m honest, the majority of my web browsing seems little affected by the extra power available. I still see frequent messages like ‘Waiting for XYZ website…’ and ‘Loading…’ This is all just a bit unexpected – and a trifle disappointing.
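A rough back-of-envelope sketch (my own illustrative numbers, not measurements) shows why the upgrade feels smaller than the headline figures suggest – once a few seconds of server and rendering time are fixed, the line speed only shaves the transfer portion:

```python
# Illustrative only: why an 8x faster line doesn't make browsing 8x faster.
# Assumed numbers: a ~2 MB web page and ~3 seconds of fixed server /
# rendering / script delay that the line speed cannot touch.

PAGE_SIZE_MBITS = 2 * 8   # a ~2 MB page expressed in megabits
OTHER_DELAYS = 3.0        # assumed non-network delay in seconds

for speed_mbps in (8, 70):
    transfer = PAGE_SIZE_MBITS / speed_mbps   # time on the wire
    total = transfer + OTHER_DELAYS           # what the user actually waits
    print(f"{speed_mbps:>2} Mbps: transfer {transfer:.1f}s, total wait {total:.1f}s")
```

On these assumed figures the transfer time drops from 2.0s to about 0.2s, but the total wait only falls from roughly 5s to just over 3s – a far less dramatic change than the jump from 8 to 70 Mbps implies.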

Fortunately, there is some analysis of this phenomenon in an article in this week’s TechPolicy Daily, itself based on data from Akamai’s respected ‘State of the Internet’ reports.  The thrust of the article is that ‘browsing speed’ is the product of both network capacity and non-network factors – the responsiveness of web servers, web browsers and end user devices.  According to the author, this means that “increasing network capacity has a minimal effect on the user experience of surfing the web.  In fact, upgrading from a 10 Mbps broadband connection to a faster one is unlikely to produce a perceptible effect most of the time”.
 
While the fact that network capacity is not the only determinant of a faster web seems pretty obvious, the surprise is that those non-network factors are far more influential in the browsing experience – up to 15-20 times more important. Akamai has logged these ratios internationally for the first quarter of 2014 and some of their results are shown below:
 
Nation     Ratio
Japan        7.4
Canada       8.9
Germany      8.3
US          11.2
Italy        6.4
France       8.3
UK          16.0

Ratios of non-network to network factors in web page load time. (Source: Akamai)

Two results from this analysis are particularly striking: first, the dominant contribution of non-network factors applies equally in markets with high-speed networks – such as Japan – and in those, like Italy, where broadband speeds are generally lower. Secondly, the UK appears particularly susceptible to non-network delays in browsing: the ratio of 16.0 reflects the fact that while UK network delays average only about 0.3 seconds, non-network factors add a whopping 4.8 seconds.
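The UK arithmetic can be checked directly from the reported figures – the ratio, the network delay, and the non-network delay are consistent with one another:

```python
# Check of the reported UK figures: ratio = non-network delay / network delay.
network_delay = 0.3   # seconds, average UK network delay (reported)
ratio = 16.0          # reported UK ratio of non-network to network factors

non_network_delay = ratio * network_delay     # 4.8 seconds
total_load_time = network_delay + non_network_delay

print(f"Non-network delay: {non_network_delay:.1f}s")
print(f"Total page load:   {total_load_time:.1f}s")
```

In other words, of a typical five-second page load in the UK, barely a third of a second is down to the network itself.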

Sadly, the reasons for the UK’s extreme result are not explored in this US publication, but the overwhelming message from the analysis is clear: a faster internet is going to rely not only on network capacity but, even more crucially, on “faster servers, faster laptops and smartphones, better organised web pages and faster browsers”. Our current infrastructure planning, dominated as it is by network and broadband issues, perhaps needs to acknowledge that more clearly.

Thursday 9 October 2014

No more heroes

Who are the heroes and villains of the internet?  In any contemporary discussion, it’s a fair bet that the consensus view on villainy will be based on something like this: 

“The open Internet’s founding principle is under attack. Policymakers in the U.S. are considering rules that would erase net neutrality, the principle that all data on the Internet should be treated equally. If these rule changes go through, many fear it will create a ‘two-tier’ Internet, where monopolies are able to charge huge fees for special fast lanes while everyone else gets the slow lane”. 

While it could have come from any number of neutrality lobbyists, this particular clarion call originates with Mozilla, the not-for-profit developer of the Firefox browser. For the avoidance of doubt, Mozilla helpfully identifies the villainous ‘monopolies’ here as the US cable companies and major telcos. More recently, the company has joined forces with the Ford Foundation to launch what they call the Open Web Fellows Program, “a global initiative to recruit the heroes of the Open Internet”. Here’s some of their rhetoric:

“We are at a critical point in the evolution of the Internet. Despite its emergence as an integral part of modern life, the Internet remains a contested space. Far too often, we see its core ethos – a medium where anyone can make anything and share it with anyone – undermined by forces that wish to make it less free and open”.

“We are building a global movement to protect the free and open Web.  Mozilla supports the heroes of the Web – the developers, advocates and people who fight to protect and advance the open Internet”. 

So does all that resolve the identity of the good guys and the bad guys…? Well, in the case of Mozilla, not entirely… It turns out that the company (“We exist to protect the free and open web”) is almost entirely dependent on search engines for its income, and the vast majority of that ($300m in 2012) comes, of course, from Google, an operator not known for its advocacy of the ‘open web’. And even Mozilla’s new ‘heroes’ initiative has faced criticism. As Glyn Moody points out, with much regret, Mozilla has endorsed the incorporation of Digital Rights Management (DRM) in the latest version of the web’s core standard, HTML5. DRM is, he says, “the very antithesis of openness and of sharing”. On that basis, he argues that:

“Mozilla's Firefox is itself a vector of attack against openness and sharing, and undermines its own lofty goals in the Open Web Fellows programme”.

Not so clear, then. In practice, the ecosystem of the internet is deceptively complex, and it defies easy characterisations like heroes and villains.