Thursday 30 December 2010

Three cheers for pragmatic traffic management regs

When Ofcom published its Discussion Document on ‘Traffic Management and Net Neutrality’ earlier this year, one of my concerns was that the regulator was being too prescriptive about what it would regard as ‘reasonable’ traffic management practice. I pointed out at the time that this was in contrast to the case-by-case approach being advocated by the FCC (notably in its earlier adjudication of the Comcast dispute with BitTorrent), and I endorsed the FCC’s reasoning for this policy.  First, the Commission had noted that the Internet is still at a formative stage and it therefore hoped to provide some guidance to consumers and the industry “without unduly tying our hands should the known facts change.” Second, the FCC explained that because Internet networks are so complex, it was not confident that a “one-size-fits-all approach is good policy.” Finally, the FCC argued that the restraint of a case-by-case approach best suited the wider recognition that “broadband services should exist in a minimal regulatory environment that promotes investment and innovation in a competitive market.”

Following last week’s (narrow) FCC vote to approve new Net Neutrality rules, I’m glad to see that the earlier pragmatism has been preserved in the detailed provisions of the FCC’s Order. As before, the document stresses the need to ‘balance clarity with flexibility’, and this remains the approach towards regulation of traffic management policies.  For instance: “A network management practice is reasonable if it is appropriate and tailored to achieving a legitimate network management purpose… Broadband providers should have flexibility to experiment, innovate, and reasonably manage their networks… We will further develop the scope of reasonable network management on a case-by-case basis, as complaints about broadband providers’ actual practices arise”.  Bravo!

Monday 20 December 2010

Net neutrality off target - again

I’m grateful to Chris Marsden once more for some of the links he provides in his blog.  As someone who sees net neutrality in policy, rather than political, terms, I don’t pay too much attention to the factional rhetoric of politicians on either side of the debate.  However, in connection with tomorrow’s FCC vote on new Net Neutrality rules, Chris has introduced me to the views of Senator Al Franken of Minnesota.  Apparently, Senator Franken has written to FCC Chairman Julius Genachowski, urging him to make sweeping changes to the Commission’s draft Order on net neutrality – ‘If this Order is adopted as drafted, it would be the first time in the Commission's history that it effectively legitimated blatantly discriminatory conduct on the Internet - against lawful applications, content, and devices.’

Now, I happen to disagree with the Senator about the FCC’s proposals – as I made clear a couple of weeks back – but there’s obviously room for debate here, and he’s entitled to his view.  What I do find objectionable are the Senator’s attempts to conflate net neutrality arguments with matters of antitrust law and industry structure.  For example, incensed by Comcast’s proposed acquisition of NBC, the Senator said in an August speech:

‘I believe that preventing media consolidation is a big part of the fight for a free and open Internet…. I urge you to oppose any and all efforts to undermine net neutrality or impede the free flow of information. This means opposing the Comcast/NBC merger – because it will hurt competition and the marketplace of ideas that has made the Internet what it is today.’

I’m sure even Senator Franken would concede that vertical integration is a rational tendency in many industries, one that can increase efficiency and promote consumer welfare; the economic literature has shown this tendency to be particularly strong in industries facing ‘asset specificity’ – industries like telecoms.  Of course, vertical integration may also confer market power, and it’s not impossible that this power might later come to be abused.  But tackling such abuse is precisely what the competition authorities were created for: please, please don’t pretend that we need net neutrality rules in order to prevent anti-competitive behaviour.

Thursday 16 December 2010

Christmas reading

Here’s a challenge for all of those who have asked Santa to put an iPad or a Kindle in their Christmas stocking.  Use the enforced leisure of the festive break to browse Ofcom’s latest magnum opus, Simplifying Non-Geographic Numbers.  It’s obviously not that simple: the document runs to 482 pages!

Wednesday 15 December 2010

No more fruit salad?

An astute follower, Downbytheriverside, has pointed out that my already rather weary fruit analogy for call termination between BT and other fixed networks might be stretched a little further…

In its September consultation on ‘Fair and reasonable charges for fixed geographic call termination’, Ofcom again tackled the question of what ‘reciprocity’ should actually mean in practice.  The nub of the problem, as Ofcom explains, is that call termination on non-BT networks has been seen as a different beast because of ‘the larger area typically covered by a local switch in BT’s competitors’ networks relative to the area typically covered by a local exchange in BT’s network’. On that basis, non-BT termination charges should lie somewhere between BT’s single tandem and local exchange rates – and the old ‘reciprocity’ agreement offered one method of deriving that blend (although not a very good one).  Put another way, a banana ought to cost more than a kiwifruit but less than a pineapple.  But maybe not in Ofcom’s eyes…

Having looked at several charging options, Ofcom delivers the killer blow thus:

We have reviewed this position and question whether such a blended approach continues to be appropriate. While it may be true that a local switch in BT’s competitors’ networks often covers a greater geographic area than a typical BT DLE, in the current environment of diverse technologies and multiplicity of networks this is not necessarily a reliable indicator either of the actual costs of terminating a call in the network in question or that such costs are necessarily efficiently incurred.

So that’s it.  Forget the underlying differences: in future, for bananas, read kiwis.

Monday 6 December 2010

A tale of fruit

Imagine, if you will, two fruit-growing islands somewhere in the Pacific – LARGE and SMALL.  LARGE produces pineapples and kiwifruit, while SMALL grows bananas.  A new competition authority for the Pacific islands, ‘PICA’, is set up to regulate the prices of LARGE because of its dominance in the fruit market.  It sets cost-based prices for both pineapples and kiwifruit.  It sees no reason to regulate banana prices but asks both LARGE and SMALL to suggest a methodology for establishing ‘fair and reasonable’ pricing.

In the interests of simplicity (sic), LARGE suggests that banana pricing should be linked to the regulated prices of pineapples and kiwifruit, and the volume of these that SMALL chooses to buy from LARGE.  So, if SMALL buys only pineapples from LARGE, bananas would be priced at the same level; similarly for kiwis.  If SMALL requires a mixture of pineapples and kiwis (as it does in practice), bananas are priced at the weighted average of the two regulated prices, weighted by SMALL’s purchase volumes.

SMALL says it would prefer to establish a free-standing (exogenous) price for bananas.  It objects to the LARGE algorithm on two grounds, i.e.

  • SMALL is minded to buy its pineapples from another Pacific island but recognises that doing so would, using the LARGE algorithm, reduce the value of its own banana exports (because its purchases from LARGE would be weighted more towards kiwis).

  • SMALL knows that some of the retailers and restaurants on the island organise their own sourcing of pineapples, and that these are not counted in SMALL’s official import figures.  LARGE has resisted attempts to consolidate the purchasing data.

PICA has said it wants to establish a long-term pricing strategy for bananas but has so far declined to comment on the fairness (or otherwise) of the LARGE algorithm.  DISCUSS.

Well, that discussion is currently going on – or should be.  With a bit of creative licence, the algorithm described above bears a striking resemblance to the old BT ‘reciprocity agreement’ for call termination charges on other fixed networks.  At its heart, this agreement provides that what BT pays to terminate voice traffic on another fixed network is a weighted average of its own single tandem and local exchange conveyance rates, the weights being derived from the other operator’s chosen mix of termination at BT’s tandem and local switches.  For fixed operators other than BT, this methodology creates very much the same problems as those faced by SMALL in the fruit trade, i.e.

  • An operator which contemplates buying less single tandem termination (pineapples) from BT, perhaps because it wants to route some traffic via a transit operator, will see its own termination revenue (banana price) fall;

  • Similarly, an operator that is already using transit to terminate calls on BT’s network (pineapples sourced directly by restaurants) might well argue that this traffic should be ‘counted’ in the calculation of its tandem/local mix.

The latter phenomenon formed part of a recent dispute between BT and COLT; the former has been a central issue in numerous disputes over non-BT call termination.  In all these cases, Ofcom has ultimately ruled that deviating from the (so-called) reciprocity agreement would not be ‘fair and reasonable’ – hence the agreement continues as the default solution for setting termination charges.   However, because the agreement was originally negotiated by the industry itself, and not imposed by the regulator, Ofcom (and Oftel before it) has never felt it necessary to judge whether that reciprocity agreement is itself fair to the other fixed networks.  Maybe a story of fruit will change that…
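
For anyone who wants to see the arithmetic, here is a minimal sketch (in Python, with entirely invented rates and traffic shares – these are not BT’s actual charges) of how a reciprocity-style blend behaves when an operator changes its purchasing mix:

    # Minimal sketch of a reciprocity-style blended termination rate.
    # All figures are invented for illustration; they are not BT's actual charges.

    def blended_rate(tandem_rate, local_rate, tandem_share):
        """Weighted average of the two conveyance rates ('pineapples' and 'kiwis'),
        weighted by the share of traffic handed over at each level."""
        return tandem_share * tandem_rate + (1 - tandem_share) * local_rate

    TANDEM = 0.40  # pence/min, hypothetical single tandem rate (pineapples)
    LOCAL = 0.25   # pence/min, hypothetical local exchange rate (kiwis)

    # Today: the operator hands over 60% of its traffic at BT tandem switches.
    today = blended_rate(TANDEM, LOCAL, tandem_share=0.6)

    # Tomorrow: it routes some traffic via a transit operator instead, so only
    # 30% of its direct purchases from BT are single tandem termination.
    tomorrow = blended_rate(TANDEM, LOCAL, tandem_share=0.3)

    print(f"Blended ('banana') rate today:    {today:.3f} p/min")     # 0.340
    print(f"Blended ('banana') rate tomorrow: {tomorrow:.3f} p/min")  # 0.295

    # The operator's own termination rate falls simply because it changed how it
    # buys termination from BT, which is the first of the two objections above.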




Friday 3 December 2010

Wake-up call for Ofcom

Ofcom rightly prides itself on being an evidence-based regulator, but that house style of cool detachment can sometimes appear rather self-satisfied – even smug.  Nowhere is this presentational problem more evident than in the UK’s relative performance in broadband.  The concern of some observers is that, despite numerous upbeat and back-slapping statements from Ofcom, the UK’s broadband infrastructure has increasingly been slipping behind the demand curve.  The latest edition of Ofcom’s annual International Communications Market report appears to justify that concern.

This report makes it clear that, on the demand side, the UK leads the field in many ways:

  • We are way ahead in terms of internet commerce – over a six-month period, the average UK consumer spent over £1,000 online, double that of some other countries.
  • The UK saw the highest growth in smartphone take-up among the countries surveyed by Ofcom, with a 70% rise in subscriber numbers in 2009.
  • The UK also leads when it comes to take-up of digital TV.

There is clearly a healthy appetite for advanced services in this country but the Ofcom report concedes that, by international standards, our infrastructure is not up to the job: “Only a tiny percentage of UK homes have super-fast broadband and mobile speeds are slow compared to other nations. The latter is a particular concern – the theoretical maximum mobile download speed in the UK is 7.2Mbps, compared to 28Mbps in Germany, 30Mbps in the US, 42Mbps in Japan and 100Mbps in Sweden”.

Ed Richards of Ofcom tends to blame this broadband mismatch on factors outside his control, specifically the delays in releasing suitable spectrum.  But his defensiveness also takes more quirky forms, e.g.

"I think sometimes the cycles of these things are different in different countries. It is not just a matter of flicking a switch,"

No, it’s not, but maybe the time has come for the regulator to take a more proactive position on the UK’s broadband networks.  The risk is that, by the time we have gathered compelling evidence of market failure, it may already be too late.

Thursday 2 December 2010

Shock horror! FCC’s net neutrality plans sound pretty sensible

Following months of intensive negotiations aimed at finding middle ground in the net neutrality policy dispute, the FCC has finally come up with some ‘draft rules of the road’ for internet governance.  The new proposals, which are subject to an FCC vote later this month, were outlined by Chairman Julius Genachowski in a speech yesterday.

Importantly, the FCC has decided not to reclassify broadband as a Title II telecommunications service, meaning that it will continue as a lightly regulated ‘information service’, rather than be subject to more onerous ‘common carrier’ rules.

The draft Framework includes many of the protections sought by neutrality advocates.  For instance, broadband providers will be barred from slowing down or blocking content from competitors. They will also have to be transparent about how they manage congestion on their networks.  On the other hand, the new rules would not preclude all traffic management practices, nor would they necessarily block the introduction of new business/pricing models:

“The proposed framework also recognizes that broadband providers must have the ability and investment incentives to build out and run their networks...To this end, broadband providers need meaningful flexibility to manage their networks - for example, to deal with traffic that’s harmful to the network or unwanted by users, and to address the effects of congestion...The record also demonstrates the importance of business innovation to promote network investment and efficient use of networks, including measures to match price to cost such as usage-based pricing”.

There’s much detail yet to be debated, and the political makeup of the Commission means the vote (on 21st December) could go either way, but at least we have what appears to be a workable basis for resolving the hitherto polarised positions in the net neutrality debate.

Tuesday 30 November 2010

Vertical separation as cure for adversarial regulation?

My digging around in the Telstra story has unearthed some interesting perspectives on the UK’s own experience with vertical separation.  The link comes courtesy of Kip Meek, former eminence grise at Ofcom, who was commissioned by Telstra to report on comparative experience of separation in the UK and Australia and to give a view on whether something like the Openreach model would be appropriate for the Australian market. (‘Operational Separation in Australia and the UK’.  Report by Kip Meek, Chairman, Ingenious Consulting Network, 24 June 2008).  Given the scale of Australia’s broadband ambitions, he thought not, i.e. ‘The demand risks and uncertainties associated with building an NGN, especially where it is intended to replace the PSTN, seem to me to raise doubts about whether a non-vertically integrated approach would be able to achieve the necessary level of investment co-ordination’.

Of more interest, perhaps, are Kip Meek’s reflections on why separation was seen as a suitable ‘remedy’ for the UK.  At the time, most of us thought this resulted from BT’s disappointing progress with local loop unbundling - and Ofcom’s ‘Strategic Review’ of telecoms policy certainly placed heavy emphasis on the UK’s comparative performance in broadband penetration.  In fact, the UK’s relative broadband performance turned out to be pretty good, and Kip Meek recollects that the real objective of (negotiated) separation was a paradigm shift in behavioural regulation, e.g.
a highly adversarial and intrusive relationship had emerged between incumbent and regulator. Oftel maintained a welter of price regulation at the retail level and also felt compelled to reach into the heart of BT’s business…. The Communications Act of 2003 and the creation of Ofcom did however present an opportunity to break the attritional cycle of mutual mistrust that had become embedded in the relationship between incumbent and regulator

That worked well, then…

Vertical separation down under

Have been enjoying the histrionics in the Australian Senate as the government finally passed the enabling legislation to structurally separate Telstra.  (Good background to the story here).  Amidst all the excited rhetoric surrounding the creation of the country’s National Broadband Network (NBN), the jury is still out on whether such vertical separation of incumbents is actually a good idea.  The economics literature on vertical integration would certainly suggest otherwise.  For example, a 2009 paper by Bob Crandall and others concluded that ‘there is both theoretical and empirical support for the proposition that forced vertical separation of telecommunications networks will reduce economic efficiency, slow innovation, and impede performance in markets where it is imposed’.  The paper goes on to consider evidence of whether broadband development has been enhanced in the five OECD countries where some form of vertical separation has been mandated, i.e. Australia (functional, rather than structural separation), Italy, New Zealand, Sweden and the UK.  Again, this empirical analysis is less than supportive, e.g. ‘the evidence shows no increase in either investment or broadband penetration in nations that have mandated vertical separation; indeed, the evidence suggests that vertical separation has impeded the rollout of next generation networks’.

Good luck with the experiment, guys!

Tuesday 23 November 2010

Free at last??

The end-game of free voice telephony finally appears to be in sight…

Sometimes, you have to do a reality check: it’s not very long ago that I was engaged in some intense commercial battles over the division of spoils in call termination. Now, a study for the European Commission by French consultants, Tera, has concluded that, by the end of 2012, fixed and mobile termination charges will converge at around 1 Eurocent/minute under the ‘improved’ calling-party-pays regime being mandated by the EU (this being based on the controversial adoption of “pure” LRIC network costing).  In that event, Tera believes, a move to a true Bill & Keep interconnection regime becomes highly likely – if only because the cost of maintaining complex billing systems is no longer justified.
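
To see why, here is a back-of-the-envelope sketch (in Python, with invented traffic volumes and billing costs – not figures from the Tera study) of the net settlement between two operators once termination rates converge at around 1 Eurocent per minute:

    # Back-of-the-envelope: why converged, symmetric termination rates make
    # Bill & Keep attractive. All volumes and costs below are invented.

    RATE = 0.01                   # EUR/min: roughly the 1 Eurocent level projected
    A_TO_B_MINUTES = 95_000_000   # minutes operator A terminates on B's network
    B_TO_A_MINUTES = 100_000_000  # minutes operator B terminates on A's network

    a_pays_b = A_TO_B_MINUTES * RATE
    b_pays_a = B_TO_A_MINUTES * RATE
    net_settlement = abs(a_pays_b - b_pays_a)

    BILLING_OVERHEAD = 250_000  # EUR/year each side spends metering and billing
                                # interconnect traffic (hypothetical)

    print(f"A pays B: EUR {a_pays_b:,.0f};  B pays A: EUR {b_pays_a:,.0f}")
    print(f"Net settlement:           EUR {net_settlement:,.0f}")
    print(f"Billing overhead (total): EUR {2 * BILLING_OVERHEAD:,.0f}")

    # With low, symmetric rates and roughly balanced traffic, the net payment
    # can end up smaller than the cost of the billing machinery needed to
    # calculate it, at which point Bill & Keep starts to look sensible.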

(But a few final skirmishes in the voice market still seem likely...).


Monday 22 November 2010

Net Neutrality off target?

Chris Marsden, passionate defender of net neutrality, bemoaned last week that the European ‘competitive ISP model’ – much cherished by Ofcom, Ed Vaizey and others – is illusory.  As he says, even if you don’t like your ISP’s traffic management policies, and choose to switch provider, “you'll still be dealing with the underlying cable-telco duopoly…”

Chris is right, of course, but the focus on network infrastructure is overdue. One of the weaknesses of much net neutrality rhetoric is that it has focused on the wrong competitive problem, i.e. it is aimed at preserving competition in applications and content, sections of the industry that are already highly competitive and the least protected by entry barriers (and likely to remain that way).  Arguably, the real focus should instead be on the impact network neutrality regulation would have on the competitiveness of the broadband access market.  In this context, academics such as Professor Christopher Yoo have concluded that mandating network neutrality can have the perverse effect of reinforcing sources of market failure by frustrating the introduction of differentiation in the access market, thereby restricting networks to competing on the basis of price and network size, factors that favour the existing providers.

Pete

Friday 19 November 2010