Eric

Eric Priezkalns is a founder of talkRA. He is a widely recognized expert on risk management and business assurance for communications providers. After a successful full-time career, Eric now splits his time between occasional consulting projects for trusted customers, and his many other passions. Eric was Director of Risk Management for Qatar Telecom and has worked with Cable & Wireless Group, T-Mobile UK, BSkyB, Worldcom UK, and Nawras, as well as advising various software developers and system integrators.

Eric is a qualified chartered accountant; he trained whilst employed by the Enterprise Risk Services division in Deloitte's London office. His Masters in Information Systems was earned with distinction, and he holds a first-class degree in Mathematics and Philosophy.

In 2006, Eric was already a popular speaker at conferences, but he decided to reach out to a broader audience with the first blog dedicated to revenue assurance. Many have since copied him, but none have matched his output.

Eric was the first leader of the TM Forum’s Enterprise Risk Management team, a founding member of the TM Forum’s Revenue Assurance team, and he developed the original Revenue Assurance Maturity Model. In the UK, Eric is known for his critique of billing accuracy regulations. In Qatar, Eric was a founding member of the National Committee for Internet Safety. Eric currently serves on the committee of the Revenue Assurance Group, and he is an editorial advisor to Black Swan.

Do you ever share files for work? Most of us do, and most of us find it can be an annoyingly convoluted process. You might try to email the files, but discover that no amount of zipping will pack them small enough. I know of one technology firm in our sector that is too devoted to Microsoft, with the result that every time they email me an attachment, my Mac receives a useless winmail.dat file instead. To send large files, we could use FTP, but only a minority of telco employees will have FTP clients installed on their work computer (or know how to use them). Some of us sign up with internet middlemen like Dropbox, even though we know that any free service will eventually start asking for money. Google like to keep their services free, but if you try to share files using Google Docs, you discover that some people do not want to set up a Google account, whilst others are using browsers that are too old for Google’s interface to work properly. In the end, many resort to using a USB stick and carrying it from one computer to another – even though USB sticks are a proven security risk. What an extraordinarily bad advert this is for telecoms. We work in the telecoms sector, but we struggle with a basic and regular necessity – sending a file from one person to another.

So what is the way forward? Well, the answer is pretty straightforward, and the fact we struggle to find it tells us something about what is wrong with the business of electronic communication. My computer is on the internet. Your computer is on the internet. How hard can it be to implement a method for the two computers to talk to each other securely, just to send a file? It is not hard at all. The problem was a lack of motivation. The open source community has stepped up to fill the gap, as they so often do. OnionShare is a peer-to-peer (P2P) tool to share files of any size. It supports encryption, and because it runs over the Tor network, the people who send and receive files remain anonymous even if somebody is trying to spy on them.
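For the technically curious, the code below is a minimal sketch of the underlying idea: a direct peer-to-peer transfer between two machines using nothing but a TCP socket. It is emphatically not OnionShare – there is no Tor, no encryption and no anonymity – and the host address, port and file names are hypothetical placeholders. The point is simply that two computers on the internet can exchange a file with a handful of lines of code; everything else is a question of motivation.

import socket

# Minimal sketch: direct file transfer between two peers over a TCP socket.
# This is NOT OnionShare – there is no Tor, no encryption, no anonymity –
# it only shows how little plumbing the basic task needs.
# HOST, PORT and the file names below are hypothetical placeholders.

HOST, PORT = "0.0.0.0", 9000  # the sending peer listens here

def send_file(path):
    """Run on the sending machine: wait for one peer and stream the file."""
    with socket.create_server((HOST, PORT)) as server:
        conn, _addr = server.accept()
        with conn, open(path, "rb") as f:
            while chunk := f.read(64 * 1024):
                conn.sendall(chunk)

def receive_file(peer_host, out_path):
    """Run on the receiving machine: connect and write everything to disk."""
    with socket.create_connection((peer_host, PORT)) as conn, open(out_path, "wb") as f:
        while chunk := conn.recv(64 * 1024):
            f.write(chunk)

# Hypothetical usage: call send_file("report.zip") on one machine, then
# receive_file("203.0.113.5", "report.zip") on the other.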

I suppose that last part also contributes to the problem. On the one hand, we want people to be able to communicate privately. On the other hand, we do not. The telecoms industry is torn. Large groups like Vodafone are taking a lead by disclosing how they try to protect human rights, whilst doing what is legally required to support surveillance. But telcos suffer a lot of government scrutiny on a lot of fronts; challenging a government’s snooping might lead to adverse government decisions when it comes to taxes, or price controls, or a hundred other areas where governments can mess with telecoms businesses and tilt the competitive playing field. And telco ‘partners’ might not like privacy for their own reasons. The music industry have nothing to gain from people being able to share files – which might be music files – without anyone being able to monitor who is sending what. P2P is far more problematic for media businesses than centralized services like YouTube, because it is easier to impose control over a service that has a centre and which is run for profit.

Should business assurance practitioners care? I suspect many would think not. The temptation is to think in terms of operations, rather than of the corporate strategy. Herein lies the major difficulty as some business assurance people seek to bridge the gap to risk management. If they cannot become more strategic in focus, they will fail. Trying to deliver effective assurance by solely managing operational risks is like installing better brakes, better seat belts, better bumpers, and better airbags in your motor car, and then handing the keys to a driver whose strategy is to run the car over a cliff. Whatever you were trying to accomplish at the operational level can be rendered insanely redundant by what occurs at the strategic level. Of course, the people who sell the brakes, seat belts, bumpers and airbags might not care that your efforts are doomed, which is why they will happily take over a strategic role and use it to define your job as purely operational in scope. They have nothing to lose by taking this approach. In contrast, you might lose your job, when a risk takes you over the cliff edge. In fact, if you go over a big enough cliff, everyone in your business might lose their job. The telco’s relationship to the government, to its customers, to its partners, and the proliferation of free software that encourages network traffic are all factors that will influence profits. If we are not gathering and understanding the data about aspects of ordinary routine communications activities – like sharing a file – then we are not really assuring anything. We are like passengers who assure the car is travelling within the speed limit, whilst failing to notice the wild-eyed stare of the driver, and how he has just turned the car off-road…

P2P traffic is both a source of profit and of loss for telcos like ISPs. The desire to connect one computer to another computer will motivate customers to sign up with ISPs. On the other hand, the heaviest P2P users dominate the narrow band of customers whose combined bandwidth consumption is greater than that of all other customers put together. Understanding how P2P services compete with other kinds of services is hence a factor in setting tariffs, monitoring usage limits, and planning for future network traffic. It would be better to anticipate changes in customer behaviour, rather than merely reacting to changes after we see them. More than this, P2P might be a competitor to services offered by our competitors, by our partners, and even by our own business. Telcos keep talking about moving up the value chain, and about the dreaded fear of becoming ‘just’ a dumb bitpipe. OTT players may be our enemies, unless we try to partner with them. And whether we are doing it as a partnership, or by mimicking their offerings with a service created in-house, we also have an interest in network usage which competes with the value-added services being offered. Dropbox is a business. Netflix is a business. There are many business models threatened by the adoption of free P2P services, lawful or otherwise. Should a telco intervene in the flow of traffic over its network, in order to protect its own revenue streams, and those of other businesses which run services over the top of that network? This leads us to the sticky subject of net neutrality. And it also raises a question: if P2P services are the enemy of the OTT business, and the OTT business is the enemy of my telco, then should I count my enemy’s enemy as a friend?
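To make that concrete, here is a hedged sketch of how an analyst might quantify the share of traffic attributable to the heaviest users. The per-customer volumes and the 5 percent cut-off are invented; the point is that the arithmetic behind a tariff or fair-usage decision is trivial once the consumption data has been gathered.

# Hypothetical sketch: what share of total traffic comes from the heaviest users?
# 'usage' stands in for whatever per-customer consumption data the telco holds;
# the figures and the 5% cut-off are invented for illustration.

usage = {
    "cust-001": 812.0,  # gigabytes in the month
    "cust-002": 640.5,
    "cust-003": 12.3,
    "cust-004": 7.8,
    "cust-005": 3.1,
}

def top_share(usage, top_fraction=0.05):
    """Share of total traffic consumed by the top `top_fraction` of customers."""
    volumes = sorted(usage.values(), reverse=True)
    top_n = max(1, int(len(volumes) * top_fraction))
    total = sum(volumes)
    return sum(volumes[:top_n]) / total if total else 0.0

print(f"Top 5% of customers account for {top_share(usage):.0%} of traffic")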

If my telco is able to charge for use in such a way that heavy exploitation of P2P is financially rewarding, and not a burden, there is no good reason for telcos to want to limit P2P traffic. P2P services are used by the consumer free of charge, except for the price levied by the network provider. In contrast, if the network is carrying the traffic of an OTT service, the consumer must pay for both the OTT service and the network use, suggesting the telco will receive a smaller share of revenues generated by a smaller volume of traffic. One solution to this latter problem is for the network provider to seek revenues from the OTT business – but is this likely to be as beneficial as preferring P2P to OTT traffic? Furthermore, the relative success of P2P makes the customer more reliant on one relationship with one supplier: the network provider. Trying to make money from OTT implies the network has already become dependent on a business that sits between the telco and the end consumers who ultimately pay for everything. Why would telcos want to encourage the emergence of large, powerful businesses that can exercise significant bargaining power to drive down the price charged for network use? By their nature, P2P services cut out the middlemen. As a result, P2P spares networks from the headache of having to recover some of their shortfall in revenues by trying to obtain money from a business which is motivated to drive up telecoms costs (by driving up network traffic) whilst sharing the least revenue possible with the network provider.

And this is before I mention compliance with government diktats, and the costs of compliance. People say there is a lot of money to be made from data, but data leads to a lot of costs too. We should avoid behaving like bankers: they took the money up front, and relied on taxpayers to cover the costs which came later. If telcos get themselves into a bad situation, there might not be a bailout for a second catastrophe caused by the misadventures of big business. Of course, a lot of bankers behaved well, which is why it is wrong to generalize. But if some telcos – or some OTT providers, whether in cooperation with telcos or independently of them – go too far with exploiting customer data, then all might suffer the backlash. And that is before I mention the risk that governments are also drawn to data, for reasons which might be moral and justified, or might not be. Though there may be less revenue to be made from traffic which is distributed, encrypted, and secure – just like a lot of P2P traffic is – there will also be less cost, and less risk, because there will be less reason to engage in various kinds of data gathering and surveillance. Some will argue that centralized control is preferable, because it makes it easier to counter the worst abuses, including the transmission of child pornography and the coordination of terrorist activity. However, my observation would be that criminals and terrorists will choose to utilize P2P anyway, as will journalists reporting from inside repressive countries, and freedom fighters who want to counter official propaganda. There is a deep flaw in any logic which says we should push most ordinary people to use centralized modes of communication as a way to detect and control the activities of extremists. And so telcos need to be rational, and strategic, in deciding how best to encourage P2P traffic as a means to disrupt the business models of our competitors, whilst maximizing the revenues and minimizing the costs created by the network traffic.

One possible future involves business assurance becoming more forward-looking, and shifting the emphasis from detection of historic faults towards analysing data in order to accurately predict network use, customer behaviours etc. This creates the opportunity to increase revenues whilst being more efficient with expenditure. But you cannot predict the future without first constructing theories for how the future might play out, and understanding how you want it to play out. We are in danger of becoming like very clever car mechanics, sitting blindfold in the passenger seat: we know the engine is in perfect working order, but have no idea if the driver is going in the right direction. We risk becoming what the Germans would call a Fachidiot; our narrow view allows us to absorb lots of data, but we cannot see anything else that is happening in the world. And so the implications of a simple task that many of us perform routinely – like how to get a file from one place to another, with all the consequences for cost, security, efficiency etc – are lost to us, even though the data travels over our own networks. I must admit that I was blind to this too, until somebody pointed me toward OnionShare, and by implication, everything I had been missing about how people could and should transmit files.
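As a deliberately naive illustration of what ‘forward-looking’ means in practice, the sketch below fits a straight-line trend to some invented monthly traffic totals and projects one month ahead. Real predictive assurance would demand richer models and real data; this only shows the shift from reporting history to projecting forward.

# Hypothetical sketch: a naive straight-line forecast of monthly network traffic.
# Real predictive assurance would use far richer models and real data; these
# monthly totals are invented, and only illustrate projecting forward in time.

monthly_tb = [410, 425, 447, 462, 480, 501]  # terabytes per month (illustrative)

def forecast_next(series):
    """Fit a least-squares straight line and extrapolate one period ahead."""
    n = len(series)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(series) / n
    denominator = sum((x - x_mean) ** 2 for x in xs)
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, series)) / denominator
    intercept = y_mean - slope * x_mean
    return intercept + slope * n  # value at the next period

print(f"Projected traffic next month: {forecast_next(monthly_tb):.0f} TB")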

With the rise of Big Data, we should soon be in a position to know if a telco is better off having customers that provide for their own needs via freebie P2P technology, or customers whose needs are satisfied by OTT businesses. Answering the question, with all it entails, will still involve a lot of hard work. But the hardest job is identifying which questions should be asked, and why. Those teams that ask the right questions will find how effective machines can be, when they work for people. Those which ask the wrong questions, or who ask none, will be like my metaphorical car mechanic: possibly highly-skilled, but employed to service machines.


The vendor formerly known as Mara-Ison Connectiva has changed its official name to Connectiva Insights and Analytics Ltd, and has adopted the new (but very similar) brand name of iConnectiva; see their press release here.

The change of name is partly motivated by a desire to re-position themselves as a supplier of analytics solutions, in contrast to offering revenue assurance and fraud management tools. As the press release noted:

This is in line with the transformation of the company from a Telecom Revenue Assurance/Fraud Management product provider to an analytics solutions company.

This shift in focus is also enabled by underlying changes in technology:

The company is in process (sic) of porting its existing products RA (Affirm) and FMS (Sentry) to its big data technology based platform CMETRICA

What does this mean for the business assurance market? Put simply, it means the people running the new Connectiva think they will make more money by selling general-purpose analytics than by selling narrowly-defined RA and FMS tools. This is despite them claiming to run the “world’s largest revenue assurance deployment”.

I think they are right to change tack, given the decline in business assurance sales in recent years, including the original Connectiva’s collapse in 2012. Business assurance is an overcrowded market. Some of the current competitors need to look elsewhere for future revenues. It is right for Connectiva to change strategy – though the change has come much later than it should have.


Does your telco have a SOC? That was the most fundamental question raised during the pre-conference training workshop at the WeDo WWUG14 user event, earlier this year. The SOC is a new addition to the family of xOCs. All network operators have a NOC, to monitor their network. Telcos have somewhat adopted the idea of a ROC, which is meant to monitor revenues, though the popularity of the concept may have been constrained by Subex’s decision to trademark the term ‘ROC’. Praesidium, the consulting unit of Mainroad, a sister company to WeDo, now say that telcos will increasingly need a SOC – a Security Operations Centre. So why do telcos need a SOC, and why is this being discussed at a conference for business assurance people?

Where a NOC ensures the service is being provided to customers, the ROC ensures those services are generating a financial return for the telco. Complexity is being driven by the convergence of networks and IT, by the increasing sophistication of services, and by the range and power of the devices belonging to end users. This complexity makes security more challenging, and opens more security gaps that might lead to financial loss if left unclosed. These motivations suggest a solution similar to one which has been used before – implement an xOC, to continuously monitor the relevant internal and external intelligence feeds and information sources.

WWUG14 keynote speaker Robert Strickland, former CTO of Leap Wireless and former CIO of T-Mobile US, also talked about RA, fraud and security coming together. Whilst the end consequences might differ, the root causes of security loopholes, fraud weaknesses and revenue leaks will often be connected. This partly explains why a conference of business assurance people is being told about the need to implement a SOC.
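As a toy illustration of what connected root causes might look like in data, the sketch below groups alerts from hypothetical RA, fraud and security feeds by account and time window, so that incidents spanning more than one domain stand out. The feed structure, field names and alert values are all invented for the purpose of the example.

from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical sketch: group alerts from separate RA, fraud and security feeds
# by account and time window, so incidents spanning more than one domain stand
# out as candidates for a shared root cause. Feeds, fields and values are invented.

alerts = [
    {"domain": "security", "account": "A42", "time": datetime(2014, 10, 1, 9, 5)},
    {"domain": "fraud",    "account": "A42", "time": datetime(2014, 10, 1, 9, 40)},
    {"domain": "ra",       "account": "A42", "time": datetime(2014, 10, 1, 10, 15)},
    {"domain": "ra",       "account": "B77", "time": datetime(2014, 10, 2, 14, 0)},
]

def correlate(alerts, window=timedelta(hours=2)):
    """Return accounts whose alerts span more than one domain within the window."""
    by_account = defaultdict(list)
    for alert in alerts:
        by_account[alert["account"]].append(alert)
    incidents = {}
    for account, items in by_account.items():
        items.sort(key=lambda a: a["time"])
        domains = {items[0]["domain"]}
        for previous, current in zip(items, items[1:]):
            if current["time"] - previous["time"] <= window:
                domains.add(current["domain"])
        if len(domains) > 1:
            incidents[account] = sorted(domains)
    return incidents

print(correlate(alerts))  # {'A42': ['fraud', 'ra', 'security']}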

However, I am not entirely convinced that simple trend analysis, big bold metaphors and the repetition of established themes leave any of us knowing what we are talking about, when we talk about the ‘convergence’ of RA, fraud and security, or the need for a SOC. The more operation centres you create, and the more they monitor disparate things, the more you raise the question of whether you could and should implement monitoring in a more holistic fashion. At the same time, saying that RA, fraud and security are converging sounds wonderful, until you wonder what a ‘converged’ practitioner looks like. There is not a single human being alive who is master of every topic that sits under the category of security. What are the chances that we might educate someone to do ‘converged’ RA, fraud and security? In fact, was there not some fundamental disagreement exhibited at this event, because we had a keynote speaker talking about convergence, whilst there was a workshop calling for another, specialized, operations centre to perform different, separate monitoring?

I think the root of this contradiction lies in complexity itself. When dealing with a complex problem, we need a big view that incorporates all aspects, or there is a risk that we misunderstand the problem, and fail to identify some elements of the causes, or some of the consequences that flow from them. This pushes us towards a ‘converged’ view, because we need to see and understand everything at once. However, complexity means an increase in detail, and there is a limit to how much detail any individual human can cope with. So as the volume of detailed information grows, it becomes necessary to create sub-divisions and sub-categories, compartmentalizing information and relying upon ever more narrowly-defined experts to manage each compartment. And that encourages us to establish yet more new, and specialized, teams.

In the past, I have written about ‘the zoom’, the ability to shift your mental perspective from one where you work at incredibly low levels of detail, to one where you stand right back and see the big picture, to then zoom into detail elsewhere, and so appreciate all the connections. The ability to mentally ‘zoom’ is becoming more and more important, but that does not make it easier for people to master (or for some people to understand the point I am trying to make).

Whilst the call for both a converged view of security with business assurance, and for a SOC, are simultaneously both right, they are simultaneously both wrong. We need an appropriate level of resources to be deployed in managing all risks faced by telcos, and those resources may need to increase if risk profiles deteriorate. But we also need to understand the limits in coordinating resources. Efficiency degrades with scale, and eventually we reach a point where no amount of resources will help us to monitor more effectively, because the organization is unable to prioritize and to make the right decisions.

To put it another way, more monitoring is a viable strategy if there is a sensible limit to how much more monitoring is needed. But endless monitoring just leads to wasted resources – things are monitored for no good reason – whilst creating a logjam for decision-makers when nobody is able to prioritize the conflicting messages from all the data being monitored by different people around the business. So one strategy to deal with increasing complexity is not just to trumpet the mitigation of risks – in ways that specialist suppliers usually do – but to actively reduce complexity, by being less complex! And that might involve resisting the temptation to keep adding new technology to an already overly complicated architecture, aggressively decommissioning technology and services of declining importance, and splitting the telco into separate businesses.

Business assurance practitioners should welcome the convergence with security, but they would be wise to fear it too. It is true that the root causes of security exploits, frauds and leaks will be increasingly intertwined. But business assurance practitioners will not be able to rise to the challenge by taking the same happy-go-lucky, few-days-here-and-there, scam-training-but-who-cares-as-long-as-the-certificate-has-the-right-words-on-it, learn-by-trial-and-error approach to education that they have taken before. It was never fit for purpose, but we got away with it because nobody expected more, and nobody did better. However, this lax attitude to education would prove disastrous if applied to the coming challenges in security. Somebody needs to invest in people, to raise their knowledge and skill levels to the standard necessary to deal with the converged challenges of business assurance and security – and we know that privately-owned telcos tend to be lousy at making this kind of investment in their people.

Governments have realized the significance of the shortfall in private enterprise, and increasingly they are taking the lead by investing in cybersecurity, which includes a crucial investment in educating people. But these governments will rightly focus taxpayers’ money on the narrower dimensions of security, and not on the broader and related commercial challenges concerning fraud and loss. If business assurance practitioners do not find a way to improve their education, the convergence of business assurance with security might prove to be nothing like a marriage of equal partners; it will be the takeover of business assurance by highly-trained security professionals.


I have a very serious point to make today. But in order to make it, I will start with a comic daydream…

Usain Bolt struck his lightning pose for the TV cameras. He smiled, and it felt like the world smiled back. A hush descended around the stadium. The Jamaican shook each long limb in turn. He was a sprinter, built for speed like no man had ever been. He was relaxed. He was ready.

On your marks.

Bolt sauntered forward. He was casual, but deliberate too, carefully placing each foot into position, soles pressed firmly against the starting blocks. Bolt then knelt upright, and made the sign of the cross. Now was the time. He looked up at the sky, and blew a kiss to the heavens. And then he faced down, as if in prayer.

Set.

His mind suddenly raced ahead, imagining himself at the finish line. This was the 2016 Rio Olympics, and Bolt was going to win his third consecutive gold for 100 metres. And in doing so, he would run faster than ever before, faster than anyone had ever run. Bolt was sure of it.

The starting gun fired.

Bolt exploded from the blocks. His legs unwound, consuming the ground before him. And it was over, finishing so soon there was no time for thought, nor words. There was only speed, and sound. The crowd roared louder than a hurricane, overawed by Bolt’s power, blown away by what they had witnessed.

Bolt raised his hands, praising the crowd as they praised him. Somebody handed him a Jamaican flag, and he held it aloft, flying it behind him as he toured the track. Then, halfway round, he stopped, to watch the replay on the stadium’s big screen. Bolt waited to see the official time. He knew he had broken his record. The only question was whether he had done something previously unimaginable. Was Bolt the first man to run 100 metres in under 9 seconds?

Bolt waited. He raised an eyebrow. He waited. He smiled, but less broadly. He waited. Why was it taking so long to show the official time for his race?

“Is everything alright, Mr. Bolt? We need you to move on, the high jumpers use this part of the track for their run-up.”

“Where’s my time?” quizzed Bolt. “I want to know my time for the 100 metres.”

“Oh.” The diminutive Olympic official tugged the front of his blazer, and looked nervously around him. “Didn’t anyone tell you? We’re not bothering to measure the times in this Olympics. There’s no need.”

“What do you mean, there’s no need?” asked Bolt, perplexed.

“You had the other guys racing alongside. What’s the point of measuring the time you ran? You definitely came first!”

“I want to see if I did better or worse than before. I might have gone under 9 seconds in that final.”

“Yes, but targets like that promote the wrong sort of competition. You won’t become a faster runner by trying to beat a target like that.”

All trace of a smile was erased from Bolt’s face. His eyes widened. He stepped closer to the official, leaning over him. He uttered one word: “What?”

“The Olympic Committee has brought in a much better way to improve athletic performances. Instead of using targets that get objectively measured, we now have some sports scientists, coaches and doctors who will review the video tape of your performance. They’ll write you a report, explaining how you could improve your running process further.”

“They’re going to write some papers that will improve my running process? Nobody timed my run?”

“That’s right.”

Bolt bared his teeth again, but he was not grinning. He raised his hands again, but not to the sky. They were wrapped around the neck of the Olympic official, throttling him.

Okay, so that was a silly story, and so far I have not mentioned telecoms once. But sometimes I get confronted by supposedly serious people acting in such ridiculous ways that it is difficult to describe the enormity of their silliness. So imagine the following. You want to be the Usain Bolt of revenue assurance. You want your telco’s bills to be more accurate than they have ever been before. You want to be a real champion, delivering bills which are more accurate than anyone’s bills have ever been. You want this, because you really strive to always improve your performance. Now answer one question:

Is it better to set accuracy targets and objectively measure your performance against those targets? Or not?

I think you know which conclusion I am expecting you to draw. There is a reason why we record the performance of athletes using numbers, why we measure companies in terms of profits, and why even governments set themselves targets (even if they always miss them). So you will be as bemused as I am, that a bunch of ‘experts’ in the United Kingdom believe the exact opposite to you and me. The UK has a long tradition of believing its telecoms billing is more accurate than everyone else’s. They believed this because people wrote pieces of paper saying they met accuracy targets which were tougher than anyone else’s. But now, to become even better, they have decided the way forward is to not have any targets at all.

Common sense. Logic. Past experience. Real data. None of these apply, when regulators make decisions like these. Here is the rationale, point by point, per Britain’s regulator:

The removal of the target-based requirements and the retention of the existing process-based requirements should ensure that the approval and audit processes, and ongoing reporting by CPs to ABs, were focussed on CPs identifying and analysing all billing errors. Instead of having targets which envisaged an ‘acceptable’ error rate, the remaining provisions, while recognising that errors may occur, would aim to achieve ongoing improvement in CPs’ systems and processes to ensure that, where errors occur, corrective measures are put in place that address the risk of repetition. This would more closely align the Direction with the provisions of GC11.1 which requires CPs to ensure that all bills represent and do not exceed the true extent of any such service actually provided, rather than setting targets for billing accuracy.

Let us break this down. To begin with, CPs are comms providers i.e. the telcos, whilst ABs are approval bodies i.e. a kind of ‘specialist’ auditor that checks bill accuracy. Some of them believe, and the regulator agrees with them, that the previously mandated level of accuracy somehow discouraged people from measuring and responding to all the billing errors that occurred. The target somehow prevented them from doing these utterly sensible things, which they are legally and morally obliged to do anyway, even though the target was for the total billing error. Words fail me already. What have they been attempting to measure so far, if not the sum total of all billing errors? Why would the absence of any target make somebody work harder to detect and resolve errors?

GC11.1 is clause 1 of General Condition 11, a requirement to be satisfied by every UK telco. It says, in short, that telcos should not overcharge customers. Duh. And yet, as even the regulator admits, mistakes happen. Previously, this reality was dealt with by stating how much error would be tolerated, before a telco was punished. This was the accuracy ‘target’, so to speak, though it should properly be called a tolerance, as it was supposedly mandatory to stay within the target. The theory was that telcos who break the limits deserve punishment. This does not preclude punishing telcos for behaving badly even if they do not break the limits. Governments want people to drive safely, and whilst it is very possible to drive dangerously at low speeds, it also makes sense for law enforcement authorities to automatically punish people who exceed speed limits. But the UK regulator has somehow reversed that kind of thinking. They now believe that having a limit somehow encourages telcos to behave badly, because they will drive up to that limit. Already we are in fanciful territory, as if telcos routinely discover very very very small overcharging errors and then decide they should do nothing about correcting them. But the regulator now thinks that tolerances encourage telcos to believe they will not get punished if they stay within the tolerance, though nothing currently stops the regulator from enforcing GC11.1 more strictly, if it wanted to. However, the regulator’s new approach is to do away with any tolerance, and to leave it completely vague when anybody will be punished for any bad behaviour.

CPs point out that they use process-based requirements for their own internal audits and for ensuring billing accuracy, so compliance costs could be reduced. This could also encourage voluntary compliance with the Direction by CPs with annual relevant revenues under £40 million not covered by its scope.

I was an auditor, once. I mean, I was a real auditor, once. In the real audit world, fees have to be earned, budgets managed, profits made, and there are a fair few other audit firms competing for the same work. In that real world, there is a simple but important rule of thumb: process-based audits are cheap, substantive auditing is expensive. Auditing a process is cheap because it is subjective. You contemplate whether anything might go wrong, and if you can think of nothing that might go wrong, then you conclude that nothing will go wrong. Substantive auditing is expensive because you have the hard work of actually checking real data, to see if it really is right or wrong. As there might be a lot of complicated data to check, this can get expensive, even on a sample basis. But whilst substantive auditing is hard and expensive, you must do some, or else your audit might be a total crock.
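To make the distinction concrete, here is a hedged sketch of what substantive testing involves: re-rating a sample of charged events against reference prices and comparing the aggregate error to a tolerance. The sample records and the tolerance figure are invented; a real audit would also need to worry about sample design, stratification and materiality.

# Hypothetical sketch of substantive testing: re-rate a sample of charged events
# against reference prices and compare the aggregate error to a tolerance.
# The sample records and the 0.02% tolerance are invented for illustration.

sample = [
    # (charged_amount, reference_amount) per sampled event, in pence
    (120.0, 120.0),
    (89.5, 89.5),
    (45.0, 44.0),   # a one penny overcharge
    (300.0, 300.0),
]

TOLERANCE = 0.0002  # 0.02% of billed value, purely illustrative

def error_rate(sample):
    """Absolute billing error as a fraction of the total reference value."""
    total_error = sum(abs(charged - reference) for charged, reference in sample)
    total_reference = sum(reference for _, reference in sample)
    return total_error / total_reference

rate = error_rate(sample)
verdict = "within" if rate <= TOLERANCE else "exceeds"
print(f"Measured error rate: {rate:.4%} ({verdict} tolerance)")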

Contrast that real-world insight with the paragraph from the regulator. It implies that telcos, with the full knowledge of some kinds of auditors, have already been favouring the cheaper kind of audit work, which involves more of the subjective stuff, and less of the actual checking of data. The regulator notes that the less real checking done by telcos, the cheaper their audits will be. This is correct. And if audits get cheaper, the regulator hopes telcos will start volunteering to do audits even if they are not mandatory. This is pure speculation, and hardly a desirable goal anyway. Why encourage the voluntary proliferation of weak auditing practice, instead of focusing stringent audits on business practices that really need auditing? And how is any of this a valid argument for ending a requirement that was designed to protect customers from too much overcharging?

The arrangements should be more adaptable and future-proof as they would be based solely on processes rather than targets which might need to be changed as usage and services changed.

Remember, this regulation is supposedly about protecting real customers of real comms providers. This clause implies customers will be better protected in future, if no objective targets are set. Why? Because setting objective targets is difficult. But if it is difficult to set objective targets, how much harder would it be to decide the appropriate punishment for a telco that demonstrably overcharges its customers, when there is not even a guideline target to compare their performance to? And what, if anything, is objectively measured by a process-based audit?

Although we would be removing elements of the current requirements, we believe the remaining requirements of the Direction would be adequate to protect consumers by ensuring that CPs processes were focussed on ensuring the accuracy of bills. Indeed, for the reasons explained above, we consider that the focus on the process-based requirements should result in a closer alignment with the objectives of GC 11.1 and should therefore be more effective at protecting consumers.

In other words, though they are making the regulations easier, they claim this does not really make them any easier. In fact, they somehow claim it makes the regulations tougher, and hence more effective at protecting customers. Telcos will no longer need to do things they previously had to do. By telcos not doing as much, somehow customers will be protected more than before.

There is a simple counter-argument to the regulator’s position. Hardly any countries mandate targets for bill accuracy. The regulator continues to be challenged by telcos who complain British bill accuracy regulations are much tougher than those found anywhere else. For many years, the British regulator insisted it was necessary to set targets, even though they knew other countries did not. Either they were wrong to set targets before, or they are wrong to do away with targets now. Which one is it? What else changed, that might justify this u-turn? How is it that the UK regulator argues it was right, when they said it was necessary to adopt the toughest billing accuracy tolerances that the world has ever seen, and the UK regulator argues it is still right, when they say there is no need to measure performance against any kind of tolerance at all?

In a recent podcast with Mike Willett, I said the UK’s bill accuracy scheme was prone to ‘shenanigans’. Whilst it can be long and boring to explain what is involved in these shenanigans, I believe this latest regulatory twist provides ample evidence of the fundamental problems with the UK’s billing accuracy regime. The historic source of these problems is straightforward: the regime began with unrealistic and unworkable targets, endorsed by people who simply never accumulated enough data to realize how badly they had underestimated the true extent of billing error. From that point on, nobody was allowed to lose face. The auditors could not lose face, for being less knowledgeable and experienced than they pretended. The telcos could not lose face, because any telco that did would unfairly suffer relative to every other telco. And the regulator could not lose face, because they would have to admit they implemented a regulation that delivered collective delusion instead of consumer protection.

Now UK customers are reaping the rewards of all that face-saving. Instead of protecting customers with realistic targets, backed by tough data-driven audits and a genuine desire to penalize the worst offender, the UK’s regulator has ensured nobody lost their job. To do this, they needed to deliver a through-the-looking-glass explanation for why the only thing better than having an insanely narrow accuracy target is to not have any target at all!

After his disappointment in Rio, Usain Bolt called a press conference, and said he would run just one more race, with the intention of setting a new 100 metres world record that would never be broken.

“Wow Usain!” shouted one of the press corps. “What’s your target for this unbreakable new record? Is it 0.01 seconds?”

“What are you smoking?” replied Bolt. “Nobody can run 100 metres in 0.01 seconds. That target would be madness.”

Another member of the press corps shouted out. “If you’re not going to run it in less than 0.01 seconds, why bother having a target at all?”

Bolt put his head in his hands. Mo Farah, his friend, sat alongside. Farah put his hand on Bolt’s shoulder, consoling his old pal. Bolt slowly turned to face Farah. “Don’t be down,” said Farah, “they just don’t understand what people like us are trying to accomplish.”

“Maybe they don’t even care,” mused Bolt.


Basset, the Swedish revenue management software firm, has been purchased by Enghouse Systems for USD10mn. You can see the press release here.

Though its strengths lie in managing wholesale and roaming revenues, Basset has long been a minor player in the business assurance market. Formerly called Bassetlabs, the company has offered FMS and RA tools for over 10 years – but with little sales success. Rival software developers invested more heavily in their assurance products, and soon eclipsed Basset in terms of market share, riding a wave of growth that eluded Basset. Nevertheless, Basset continued to pitch itself to the revenue assurance and fraud management community, though its misreading of the market was exemplified when it became the leading vendor sponsor of GRAPA. Basset’s current product portfolio makes little mention of assurance; we shall have to see if the new owners decide to exit the market.

Enghouse Systems is a publicly-traded Canadian software conglomerate, whose strategy appears to be based around acquiring underperforming enterprise software firms. Basset is just one of three firms it has already bought this year. Enghouse is growing rapidly, and is on course to earn over USD200mn in revenues by year end. Purchasing Basset should help Enghouse to increase sales in Europe.
