Archive for September, 2007

It is that time of the year again. Subex Azure and Analysys have issued their 2007 attitudes survey. The word attitude is important because, although the survey asked people to give them numbers, as far as we can tell most of those numbers are just opinions, not hard facts. But they get quoted as if they were derived from solid, irrefutable data. Which, if you think about it, makes no sense: if you had solid data about this level of leaks, you would be more effective at fixing the leaks! Despite the reservations that people like me have about the survey, Subex Azure, who have continued the survey that was started by Azure, know they are on to a good thing. The survey is regularly quoted all over the place, earning them plenty of kudos and helping to reinforce the strength of their brand in revenue assurance circles.

Here are the highlights that they picked out. The bits in italics are me interpreting the data (in my own inimitably sarcastic style).

The survey findings are based on responses from 96 operators from around the world.

Subex Azure boasts over 150 “installations” in over 60 countries, so a fair few of the operators must be Subex Azure customers.

Average revenue leakage has increased dramatically from 12.1% to 13.6% of turnover.

Oh dear! Does this mean leakage is going up for Subex Azure’s customers too? But surely if you buy their products then leakage should go down?

If 13.6% of turnover is the mean average, then for every telco with 0% leakage there is another losing 27.2% – over a quarter of all revenues. But hang on… no CEO ever got sacked because of revenue leakage. Does this mean shareholders do not share the same attitudes as people working in revenue assurance?

Operators’ beliefs about what is an ‘acceptable’ amount of revenue loss have risen to 1.8% from 1.1% last year.

That explains 0.7 percentage points of the 1.5 point rise in leakage then. Some telco revenue assurance departments just decided to make their own targets easier. Perhaps shareholders should sack the CEOs that let revenue assurance departments decide what is and what is not acceptable.

The increases in revenue leakage are predominantly due to external fraud, internal fraud and fraud by other operators.

Operator-to-operator fraud – nice to see someone admitting it takes place. But presumably none of the operators that responded to the survey volunteered figures on how much they made by committing fraud. For the purposes of the survey, should the money operators make from defrauding others be netted off against the money they lose from being the victim of fraud?

Average fraud losses have grown to 4.5% this year from 2.9% of turnover last year.

Very interesting. So these telcos can confidently measure how much money they allow somebody else to steal from them. If they are so good at measuring the theft, which means they can detect it when it has happened, then why are they not better at preventing theft from happening again?

Subex Azure makes a lot of money from selling fraud detection systems. Perhaps some Subex Azure customers do not know how to set up the fraud detection systems they purchased.

In this year’s survey, we specifically asked about the impact of convergence. We found that increased network convergence means that more revenue assurance managers have responsibilities for both fixed and mobile networks.

Increased network convergence? This does not sound like anything to do with convergence to me. It sounds like more revenue assurance managers are finding they cannot continue to put off the big chunks of work they previously never had time for.

One final trend is that the mid-size operators (between 100k and 1m subscribers) tend to lose the most revenue. Smaller operators tend to have simple products and processes which result in lower losses, whilst large operators have the resources to approach the problems systematically.

There is some sloppy language here, and my guess is it leads to sloppy maths. When they say that mid-size operators lose the “most” revenue, they probably mean these operators have the highest proportionate losses, i.e. losses stated as a % of revenues. That is different to saying they have the highest absolute losses, i.e. losses stated in dollars. Chances are that the headline “average” leakage for the industry is also misleading. You cannot take an “average” of different percentages of revenue leaked: 5% of a big telco’s revenues may be a lot more money than 15% of a mid-size telco’s revenues, so you cannot say the “average” loss of the two companies is 10% of their combined revenues. But that error is made over and over again – taking the industry average % leakage quoted by this survey, multiplying it by total industry revenues, and claiming the answer is the total absolute leakage in dollars. That is bad maths. If large operators leak lower proportions of revenue, then the mean average leakage across the whole telco industry may well be a lot less than the attention-grabbing headline stated here. Instead of just employing Analysys to ask people’s opinions, they should also employ someone to do some proper stats.
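To make the maths concrete, here is a minimal sketch in Python, using entirely made-up figures (the revenues and leakage rates below are illustrative assumptions, not survey data). It shows how a simple mean of leakage percentages can nearly double the figure you get by weighting each telco by its revenues:

```python
# Three hypothetical telcos: (name, annual revenue in US$m, leakage as a
# fraction of that telco's own revenue). All figures are invented.
telcos = [
    ("big",   10000, 0.05),
    ("mid",     500, 0.15),
    ("small",    50, 0.10),
]

# The survey-style "average": a simple mean of the percentages.
simple_mean = sum(leak for _, _, leak in telcos) / len(telcos)

# The figure that matters: total dollars leaked divided by total revenue.
total_revenue = sum(rev for _, rev, _ in telcos)
total_leaked = sum(rev * leak for _, rev, leak in telcos)
weighted_mean = total_leaked / total_revenue

print(f"simple mean of percentages: {simple_mean:.1%}")   # 10.0%
print(f"revenue-weighted leakage:   {weighted_mean:.1%}") # 5.5%
```

Multiplying the 10.0% “average” by total industry revenues of US$10,550m would claim US$1,055m of leakage, when the telcos in this made-up example only leak US$580m between them.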

By the way, it says on the survey webpage that if I discuss the numbers, I have to give credit by stating these exact words: “Source: Subex Azure Operator Attitudes to Revenue Management Survey 2007.” Now that I have, I assume there is little danger of a lawsuit. So in case you missed it before, it is the Subex Azure survey I am making fun of ;) I hope that is clear to everyone!


Subex Azure has slashed its revenue and profit forecasts. For the financial year ending March 2008, the business is now forecasting profits after tax of US$26m, down from US$38m. Revenues are now expected to be US$130m; previous guidance had been US$150m. See here for the details from Subex Azure. The drop is blamed on the delay of US$20m of sales to a single customer.

This manifest dependence on a few big customers belies the rosy stories that analysts tell about the growth of the revenue assurance market. If Subex Azure, which boasts of having 32 of the 50 biggest operators amongst its clients, can be this vulnerable to the decisions of a single operator, then its rival suppliers are at even greater risk of erratic swings in earnings. But the information that even Subex Azure depends on a very few big contracts is already out there. Their Q1 guidance made that plain: 20% of revenues from a single customer, and over two thirds of revenues from just five customers.

The good news, depending on who you are, is that Subex Azure is moving at full speed to shift jobs to India. Aggressively cutting costs is one way to generate shareholder value, but it is pretty depressing for an industry that likes to promise it can deliver sky-high returns on minimal investment. Whilst cost management is the message for shareholders, it was not enough to stop a 13% drop in its stock.

For now, buying into revenue assurance is risky. Whether you are a telco buying tools and services, or an investor looking for good picks in the software industry, be warned that the returns on revenue assurance may not be as great as promised.


Here is the fourth and final part of the series of blogs about net neutrality. In parts one, two and three I discussed the implications if the net stopped being neutral. But there is reason to believe the US Government will support neutrality.

The US Department of Justice’s submission to the FCC may have landed a hard blow in support of allowing telcos to put short-term profits ahead of everything else – including long-term profits and the good of the wider economy. But the submission is by no means the end of the line for the net neutrality argument. The FCC may well pay more heed to the arguments raised by internet founders like Vint Cerf, one of the men who created the TCP/IP protocols, than to ideologically dogmatic lawyers in cushy government jobs. Cerf recently gave an excellent summary of the net neutrality hazards, amongst other things, whilst speaking to the Financial Times – watch the video here.

Even if the FCC screws up, it may not mean the complete end of US domination of the internet. Cerf is also one of the most outspoken advocates for the transition from IPv4 to IPv6. By providing vastly more IP addresses, migrating to IPv6 will avoid a rapidly-emerging bottleneck as the possible IPv4 addresses get used up. This is vital for the developing world, which will need an increasing share of IP addresses in line with its rapid and increasing take-up of internet devices, including mobile phones. Access to the internet will be key to enabling those economies to communicate, access information, and compete fairly with the rest of the world, hence fuelling growth and prosperity.

So a conspiracy theorist might fear that a laggardly approach to IPv6 migration may help the US to continue to dominate the internet. There is little cause to entertain conspiracies yet. In fact, the US government set June 2008 as the deadline by which all US government agencies must have transitioned to IPv6. How ironic that whilst one branch of the US government helps to open up the internet all around the world, another branch, if it gets its way, would impede the benefits the internet brings to US citizens.
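As a footnote on the scale of the IPv4 bottleneck, the arithmetic is simple enough to check in a few lines of Python (nothing assumed here beyond the 32-bit and 128-bit address widths of the two protocols):

```python
# IPv4 addresses are 32 bits wide; IPv6 addresses are 128 bits wide.
ipv4_space = 2 ** 32    # about 4.3 billion - fewer than one per person alive
ipv6_space = 2 ** 128   # about 3.4e38 - effectively inexhaustible

print(f"IPv4 address space: {ipv4_space:,}")    # 4,294,967,296
print(f"IPv6 address space: {ipv6_space:.2e}")  # 3.40e+38
print(f"IPv6 multiplies the space by 2**96, about {2 ** 96:.1e}")
```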


This blog continues the response, in parts one and two, to the US Department of Justice’s recent arguments that net neutrality should not be protected.

Paying so that certain kinds of traffic get a higher class of service on the internet is like paying extra to be the first passenger to get off a plane. That may save a few minutes, but as the hostess in Snakes On A Plane points out, economy customers land at the same time as the ones in first class. If you keep getting off planes and boarding new ones, then maybe those saved minutes can add up to something significant (ignoring the fact that the next plane may be delayed or leave without you… but you get the idea). However, as soon as you change to a carrier where you have not paid for priority treatment, or to one which treats all customers the same, you stop getting any advantage.

So how much is priority treatment on a network really worth to the customer? Probably not that much. It only becomes worthwhile if the standard level of service becomes so shabby, so slow and unreliable, that nobody wants to suffer it unless there is a very large price differential, and if the customer’s traffic stays on the one or few networks where the customer is prepared to pay for priority treatment. If the price differential is small, or if the traffic goes elsewhere, it all becomes rather uncompelling as a sales proposition.

If the differentials are small, then everyone can pay for priority treatment. Which means everybody just paid a little bit more for the same class of service they would have got if nobody had paid a little bit more. Remember, the only thing worth paying for here is the right to jump ahead in the queue. If everybody has the right to jump ahead in the queue, then everyone just ends up in the same place in the queue. So the price differential has to be severe enough that some will not be prepared to pay it, and are willing to wait whilst others go first. On the other hand, the value of jumping ahead is exactly determined by how many people you jump ahead of. To be worth a lot, you need to jump ahead of a lot. Which means not many customers will be paying for premium treatment. That only raises the question of just how much extra revenue can really be generated for network carriers this way.
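The queue-jumping point can be illustrated with a toy simulation – my own sketch, not anything from the carriers or the DoJ filing. If every packet is flagged as priority, the service order is exactly what it would have been if none were:

```python
def service_order(n_packets, priority_flags):
    """Order in which packets are served when flagged packets jump the queue.
    Within each class, packets keep their arrival order (a stable sort)."""
    return sorted(range(n_packets), key=lambda i: (not priority_flags[i], i))

n = 10
nobody_pays = [False] * n
everybody_pays = [True] * n

# If nobody pays for priority, or if everybody does, the order is identical.
assert service_order(n, nobody_pays) == service_order(n, everybody_pays) == list(range(n))

# Priority is only worth something when others stay behind you.
one_payer = [False] * n
one_payer[7] = True
print(service_order(n, one_payer))  # [7, 0, 1, 2, 3, 4, 5, 6, 8, 9]
```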

Think of the possible class of service price differentials that networks could introduce much like the price differentials between flying first class or economy. Introducing that kind of differential to pricing and service may help the network carriers make more money, but it would be an overall negative for the economy. Commerce thrives on open lines of communication, which is why the internet plays host to big business. Damaging those lines of communication through inefficient economic models for distributing resources and encouraging investment would ultimately hurt the wealth and standard of living enjoyed by all citizens. The damage will be two-fold if it discourages new entrepreneurs from entering the market and encourages entrepreneurial businesses to establish themselves in other countries that offer a high standard of undifferentiated internet service.

Throughout history, relying on unfettered free markets to deliver an efficient communication and transport infrastructure, be it roads, rail, telegraph, mail, airlines, telephony or the internet, has failed as often as it has succeeded. In these sectors, even government monopolies can be more efficient than free market competitors. The lack of innovation and of a competitive spur is offset by the way a comprehensive and reliable service encourages more total throughput. Not many people travel on US trains any more because they do not join up, do not run many services and do not go to many destinations. In contrast, a high quality telephone service that is always available encourages people to make many more calls than would a service which worked well only sporadically, or where the quality depended on who you called. The internet is popular because it consistently works well and is consistently getting better. If, on the other hand, the internet fails customers sometimes, chances are the customers will use it less often in general, and not discriminate between the parties that are poorly supported and those that are well supported. So the short-term advantages enjoyed by network operators that differentiate service may well be outweighed by the reduction in long-term growth.

The relative importance placed on short-term over long-term profits is one of the reasons why free markets are sometimes poor at providing adequate investment for large infrastructure projects. This is where governments, with their overriding concern for the health of the whole economy, and not just of one business, often need to intervene. That does not mean the US government needs to start paying for network investment straight from taxpayers’ pockets. But it does mean that the FCC needs to consider the interests of its nation, the USA, and not just of some businesses, when reviewing the arguments for and against a neutral net. The FCC needs the freedom to judge that there may be no overall harm in forcing the network operators to carry the burden of investment, especially if all that means is that customers pay more for a top-grade undifferentiated internet. That price will be worth paying if those same customers also benefit from lower prices for everything else – movies, food, accounting services, information, shoes… anything that can be sold over the internet – as a result of fostering a more efficient marketplace. The danger is that the FCC will look at the costs to telcos and to customers from a narrow perspective. If they look at internet services in isolation from the goods and services sold over the internet, then they may help to force down the cost of internet services, but force up the price of everything sold over the internet.

Whole nations suffer the cost of poor infrastructure, which slows the pace of economic growth and encourages entrepreneurs to move to more lucrative parts of the world. The US DoJ is right to focus its concern on the level of investment in the internet, as higher levels of investment should help the economy at large. However, the DoJ is wrong to assume that a hands-off approach from government automatically leads to the highest levels of investment. Part of the DoJ argument is that the lobby for net neutrality is not able to show a current need for intervention. The argument is entirely circular; as nobody currently differentiates service, the net is currently neutral. That hardly demonstrates that there is no need to intervene to keep it neutral. Net neutrality is an argument to ensure the status quo, of undifferentiated service, does not change. The DoJ is capricious when it argues a change to differentiated services would be good for the customer and for investment, whilst also arguing that there is no need to intervene because there have been no problems with the undifferentiated services so far. That is like saying the current scenario works well, so things will be better if we change it! I much prefer the argument that if it ain’t broke, don’t fix it.

Governments should be willing to intervene to ensure the best possible infrastructure, as was the case with the US Postal Service, and its founder, Benjamin Franklin. In the case of net neutrality, the best decision for the US economy would be to enforce it, irrespective of the way this distorts the market for the network carriers. Free market ideology does not always guarantee the best financial returns – just look at Enron. If the US turns its back on net neutrality, it risks making an error that will erode the economic advantage it has gained from its pre-eminence on the internet. It may further reinforce a shift in the worldwide balance of power and wealth already being driven by dwindling energy resources, off-shoring, and the flight of capital due to simultaneous over-regulation (think Sarbanes-Oxley) and under-regulation (think subprime crisis). The US can ill-afford to also lose its dominance over the 21st century marketplace for communication, media and trade that is the internet. However, narrow considerations in the debate over the neutral net pose just that threat to the economic well-being of the USA.

The next blog will be the fourth and final part of this series on the neutral net. It discusses why a healthy and neutral internet is vital not just for the US economy, but for the whole world.


Yesterday’s blog covered why the US Department of Justice (DoJ) had got it completely wrong by arguing that neutral net laws might stifle investment and hurt customers. Today we should take a look at why they came to the wrong conclusion.

If you read the full DoJ submission to the FCC on net neutrality, it is not hard to find the main principle that sits behind all their arguments. Unfortunately, it is a crude example of political ideology, rather than the well-considered economic analysis befitting an institution supposedly working in the best interests of the populace. Here is the key paragraph in full:

“The Department submits, however, that free market competition, unfettered by unnecessary governmental regulatory restraints, is the best way to foster innovation and development of the Internet. Free market competition drives scarce resources to their fullest and most efficient use, spurring businesses to invest in and sell as efficiently as possible the kinds and quality of goods and services that consumers desire. Past experience has demonstrated that, absent actual market failure, the operation of a free market is a far superior alternative to regulatory restraints.”

There you go. Simple really. Free market good, regulation bad. There is no need to do any fancy maths or economics equations, or take into account the views of free market champions like Google, Amazon and eBay ;) Do not get me wrong. I am no communist. The free market is the right answer, most of the time. Not all of the time. Here are some examples where the free market was not so successful for the US economy:

  1. The biggest mobile handset manufacturer in the world, Nokia, comes from Finland, not the US. The reasons are pretty simple: Finland is an affluent society with high levels of education but a geographically distributed population, and Europe got its act together and agreed excellent common standards and rules for mobile networks whilst the US was encouraging an unfettered free market.
  2. The smartest guys in the room, the ones running Enron, loved the free market. Kenneth Lay, CEO and Chairman of Enron, was a fervent campaigner for deregulation of the energy market, despite being employed in the 1970s by the federal energy regulator. He was so keen on trading in free markets that Enron diversified from trading energy into trading communications bandwidth. Their free market instincts won them many accolades in the US; Fortune magazine named Enron “America’s Most Innovative Company” for six years in a row. Only one thing was wrong: the Enron business model was based on a complete fiction. Cue blackouts in California, financial collapse, and many workers who lost their pensions.
  3. The US Postal Service… whoops, no, that is a government monopoly.

How perverse, then, that the DoJ cites the US Postal Service as an example of the benefits of being unregulated:

“The U.S. Postal Service, for example, allows consumers to send packages with a variety of different delivery guarantees and speeds, from bulk mail to overnight delivery. These differentiated services respond to market demand and expand consumer choice.”

Well, for a start, this rather seems to contradict the DoJ’s own mantra that the free market is best. The US postal market is highly regulated. The US Postal Service is a monopoly for many of the services it provides and is a branch of the US government. However, it is easy to understand the DoJ’s analogy. They are equating packets of data sent over a network with the physical packets sent through the post. The DoJ hence concludes that offering a variety of classes of service, at a variety of prices, is best for the customers sending and receiving those packets, whether they be physical or data. There is only one thing wrong with the analogy. The US Postal Service is a network. It can control the quality of service from the time a letter is posted to the time it is delivered. The internet is not a network. It is an inter-network (the clue is in the name). No one network can control the quality of service experienced by a user of the internet, any more than the US Postal Service can punish the Postal Corporation of Kenya for losing a letter sent from New York to Nairobi.

Crude political ideology and inappropriate argument by analogy: these are tell-tale signs that the US Department of Justice lacks the competence to understand the implications of its recommendations. But perhaps that is no surprise. Should we really expect lawyers to think hard about what works well in practice, as opposed to what words provide the best cover for legal backsides? The DoJ is probably more concerned with the fact that it lacks the competence to argue with the big network players. The strange thing here, though, is that it does not need to. All it needs to do is balance the arguments of the big network players against the arguments of the big businesses that favour intervention to preserve a neutral net. For some reason, those arguments, from the Googles and Amazons who also have a lot at stake, have not been held in high regard. One aspect of this is strange: if the DoJ has a political motivation, it is clearly not just erring on the side of some big business interests versus other big business interests. It is also erring against the consumer. Like it or not, network carriers are not popular with customers in the way that the Googles and Amazons are. That is hardly reason to believe that the networks are automatically wrong, but it does play into the hands of cynics who accuse government agencies of serving narrow interest groups instead of society as a whole. That argument is simplistic, but in this case I think it is pretty straightforward to show how it is right.

In the next part of this multi-part blog, it is time to discuss the wider economic implications of making some internet traffic appear faster by putting the brakes on other internet traffic.
