Archive for the Opinion Category
Posted by: Eric in Opinion
Do you ever share files for work? Most of us do, and most of us find it can be an annoyingly convoluted process. You might try to email the files, but discover that no amount of zipping will pack them small enough. I know of one technology firm in our sector that is too devoted to Microsoft, with the result that every time they email me an attachment, my Mac receives a useless winmail.dat file instead. To send large files, we could use FTP, but only a minority of telco employees will have FTP clients installed on their work computer (or know how to use them). Some of us sign up with internet middlemen like Dropbox, even though we know that any free service will eventually start asking for money. Google like to keep their services free, but if you try to share files using Google Docs, you discover that some people do not want to set up a Google account, whilst others are using browsers that are too old for Google’s interface to work properly. In the end, many resort to using a USB stick and carrying it from one computer to another – even though USB sticks are a proven security risk. What an extraordinarily bad advert this is for telecoms. We work in the telecoms sector, but we struggle with a basic and regular necessity – sending a file from one person to another.
So what is the way forward? Well, the answer is pretty straightforward, and the fact we struggle to find it tells us something about what is wrong with the business of electronic communication. My computer is on the internet. Your computer is on the internet. How hard can it be to implement a method for the two computers to talk to each other securely, just to send a file? It is not hard at all. The problem was a lack of motivation. The open source community has stepped up to fill the gap, like they so often do. OnionShare is a peer-to-peer (P2P) tool to share files of any size. It supports encryption, and because it runs over the Tor network, the people who send and receive files will remain anonymous even if somebody were trying to spy on them.
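OnionShare's design is simple to sketch: it starts a throwaway local web server for the file, gives it an unguessable secret URL, and (in the real tool) exposes that server as a Tor onion service, so that neither sender nor recipient needs an account with any middleman. Setting the Tor layer aside, the ephemeral-server idea can be illustrated in a few lines of Python — all names here are my own, not OnionShare's actual code:

```python
import http.server
import secrets
import threading
import urllib.request
from pathlib import Path

def make_one_shot_handler(path: Path, slug: str):
    """Serve a single file at a secret, unguessable URL, as OnionShare does."""
    data = path.read_bytes()

    class Handler(http.server.BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path != f"/{slug}":
                self.send_error(404)  # wrong slug: reveal nothing
                return
            self.send_response(200)
            self.send_header("Content-Type", "application/octet-stream")
            self.send_header("Content-Length", str(len(data)))
            self.end_headers()
            self.wfile.write(data)

        def log_message(self, *args):  # keep the demo quiet
            pass

    return Handler

if __name__ == "__main__":
    payload = Path("shared.txt")
    payload.write_text("the file being shared")
    slug = secrets.token_urlsafe(16)  # the 'secret link' sent to the recipient
    server = http.server.HTTPServer(
        ("127.0.0.1", 0), make_one_shot_handler(payload, slug))
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_port}/{slug}"
    # The recipient 'clicks the link' and the file arrives directly:
    print(urllib.request.urlopen(url).read().decode())
    server.shutdown()
```

The real tool wraps exactly this kind of server in Tor, which is what provides the anonymity; nothing about the core idea is technically difficult.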
I suppose that last part also contributes to the problem. On the one hand, we want people to be able to communicate privately. On the other hand, we do not. The telecoms industry is torn. Large groups like Vodafone are taking a lead by disclosing how they try to protect human rights, whilst doing what is legally required to support surveillance. But telcos suffer a lot of government scrutiny on a lot of fronts; challenging a government’s snooping might lead to adverse government decisions when it comes to taxes, or price controls, or a hundred other areas where governments can mess with telecoms businesses and tilt the competitive playing field. And telco ‘partners’ might not like privacy for their own reasons. The music industry have nothing to gain from people being able to share files – which might be music files – without anyone being able to monitor who is sending what. P2P is far more problematic for media businesses than centralized services like YouTube, because it is easier to impose control over a service that has a centre and which is run for profit.
Should business assurance practitioners care? I suspect many would think not. The temptation is to think in terms of operations, rather than of the corporate strategy. Herein lies the major difficulty as some business assurance people seek to bridge the gap to risk management. If they cannot become more strategic in focus, they will fail. Trying to deliver effective assurance by solely managing operational risks is like installing better brakes, better seat belts, better bumpers, and better airbags in your motor car, and then handing the keys to a driver whose strategy is to run the car over a cliff. Whatever you were trying to accomplish at the operational level can be rendered insanely redundant by what occurs at the strategic level. Of course, the people who sell the brakes, seat belts, bumpers and airbags might not care that your efforts are doomed, which is why they will happily take over a strategic role and use it to define your job as purely operational in scope. They have nothing to lose by taking this approach. In contrast, you might lose your job, when a risk takes you over the cliff edge. In fact, if you go over a big enough cliff, everyone in your business might lose their job. The telco’s relationship to the government, to its customers, to its partners, and the proliferation of free software that encourages network traffic are all factors that will influence profits. If we are not gathering and understanding the data about aspects of ordinary routine communications activities – like sharing a file – then we are not really assuring anything. We are like passengers who assure the car is travelling within the speed limit, whilst failing to notice the wild-eyed stare of the driver, and how he has just turned the car off-road…
P2P traffic is a source of both profit and loss for telcos like ISPs. The desire to connect one computer to another computer will motivate customers to sign up with ISPs. On the other hand, the heaviest P2P users dominate the narrow band of customers whose bandwidth consumption is far greater than that of all other customers put together. Understanding how P2P services compete with other kinds of services is hence a factor in setting tariffs, monitoring usage limits, and planning for future network traffic. It would be better to anticipate changes in customer behaviour, rather than merely reacting to changes after we see them. More than this, P2P might be a competitor to services offered by our competitors, by our partners, and even by our own business. Telcos keep talking about moving up the value chain, and the dreaded fear of becoming ‘just’ a dumb bitpipe. OTT players may be our enemies, unless we try to partner with them. And whether we are doing it as a partnership, or by mimicking their offerings with a service created in-house, we also have an interest in network usage which competes with the value-added services being offered. Dropbox is a business. Netflix is a business. There are many business models threatened by the adoption of free P2P services, lawful or otherwise. Should telcos intervene in the flow of traffic over a network, in order to protect their own revenue streams, and those of other businesses which run services over the top of our networks? This leads us to the sticky subject of net neutrality. And it also begs a question: if P2P services are the enemy of the OTT business, and the OTT business is the enemy of my telco, then should I count my enemy’s enemy as a friend?
If my telco is able to charge for use in such a way that heavy exploitation of P2P is financially rewarding, and not a burden, there is no good reason for telcos to want to limit P2P traffic. P2P services are used by the consumer free of charge, except for the price levied by the network provider. In contrast, if the network is carrying the traffic of an OTT service, the consumer must pay both the OTT provider and for the network use, suggesting the telco will receive a smaller share of revenues generated by a smaller volume of traffic. One solution to this latter problem is that the network provider seeks revenues from the OTT business – but is this likely to be as beneficial as preferring P2P to OTT traffic? Furthermore, the relative success of P2P makes the customer more reliant on one relationship with one supplier: the network provider. Trying to make money from OTT implies the network has already become dependent on a business that sits between the telco and the end consumers who ultimately pay for everything. Why would telcos want to encourage the emergence of large, powerful businesses that can exercise significant bargaining power to drive down the price charged for network use? By their nature, P2P services cut out the middlemen. As a result, P2P spares networks from the headache of having to recover some of their shortfall in revenues by trying to obtain money from a business which is motivated to drive up telecoms costs (by driving up network traffic) whilst sharing the least revenue possible with the network provider.
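The trade-off described above can be made concrete with a toy model. Every number below is invented purely for illustration; the point is only the shape of the comparison — with P2P the telco keeps all of the usage revenue it charges, while with an OTT service it typically carries less chargeable traffic and must share the customer's spend:

```python
def telco_revenue(gb: float, price_per_gb: float, telco_share: float = 1.0) -> float:
    """Telco's take from carrying `gb` of traffic at a retail price per GB,
    keeping `telco_share` of the revenue (1.0 when the customer pays the
    network directly, as with P2P; less when a powerful OTT intermediary
    bargains the network's share down)."""
    return gb * price_per_gb * telco_share

# Invented illustrative numbers:
p2p = telco_revenue(gb=50, price_per_gb=0.10)                    # customer pays the network only
ott = telco_revenue(gb=30, price_per_gb=0.10, telco_share=0.6)   # smaller volume, shared revenue
print(f"P2P: {p2p:.2f}  OTT: {ott:.2f}")
```

Under these made-up assumptions the P2P scenario leaves the telco better off; the real question for business assurance is whether the telco's actual tariffs and traffic data bear the same relationship.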
And this is before I mention compliance with government diktats, and the costs of compliance. People say there is a lot of money to be made from data, but data leads to a lot of costs too. We should avoid behaving like bankers: they took the money up front, and relied on taxpayers to cover the costs which came later. If telcos get themselves in a bad situation, there might not be a bailout for a second catastrophe caused by the misadventures of big business. Of course, a lot of bankers behaved well, which is why it is wrong to generalize. But if some telcos – or some OTT providers, whether in cooperation with telcos or independently of them – go too far with exploiting customer data, then all might suffer the backlash. And this is before I mention the risk that governments are also drawn to data, for reasons which might be moral and justified, or might not. Though there may be less revenue to be made from traffic which is distributed, encrypted, and secure – just like a lot of P2P traffic is – there will also be less cost, and less risk, because there will be less reason to engage in various kinds of data gathering and surveillance. Some will argue that centralized control is preferable, because it makes it easier to counter the worst abuses, including the transmission of child pornography and the coordination of terrorist activity. However, my observation would be that criminals and terrorists will choose to utilize P2P anyway, as will journalists reporting from inside repressive countries, and freedom fighters who want to counter official propaganda. There is a deep flaw in any logic which says we should push most ordinary people to use centralized modes of communication as a way to detect and control the activities of extremists.
And so telcos need to be rational, and strategic, in deciding how best to encourage P2P traffic as a means to disrupt the business models of our competitors, whilst maximizing the revenues and minimizing the costs created by the network traffic.
One possible future involves business assurance becoming more forward-looking, and shifting the emphasis from detection of historic faults towards analysing data in order to accurately predict network use, customer behaviours etc. This creates the opportunity to increase revenues whilst being more efficient with expenditure. But you cannot predict the future without first constructing theories for how the future might play out, and understanding how you want it to play out. We are in danger of becoming like very clever car mechanics, sitting blindfold in the passenger seat: we know the engine is in perfect working order, but have no idea if the driver is going in the right direction. We risk becoming what the Germans would call a fachidiot; our narrow view allows us to absorb lots of data, but we cannot see anything else that is happening in the world. And so the implications of a simple task that many of us perform routinely – like how to get a file from one place to another, with all the consequences for cost, security, efficiency etc – is lost to us, even though the data travels over our own networks. I must admit that I was blind to this too, until somebody pointed me toward OnionShare, and by implication, everything I had been missing with how people could and should transmit files.
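As a trivial illustration of that forward-looking shift, even the crudest prediction of network use starts from a model of past behaviour. The sketch below uses a naive moving average over invented monthly traffic figures; a real assurance team would use far richer models, but the principle — projecting forward from data rather than only detecting historic faults — is the same:

```python
def moving_average_forecast(history: list[float], window: int = 3) -> float:
    """Naive forecast: next period's traffic = mean of the last `window` periods."""
    recent = history[-window:]
    return sum(recent) / len(recent)

# Invented monthly traffic figures (GB) for one customer segment:
monthly_gb = [410, 430, 455, 470, 500, 540]
print(moving_average_forecast(monthly_gb))
```

The interesting work, as the paragraph above argues, is not the arithmetic but choosing what to forecast and why.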
With the rise of Big Data, we should soon be in a position to know if a telco is better off having customers that provide for their own needs via freebie P2P technology, or customers whose needs are satisfied by OTT businesses. Answering the question, with all it entails, will still involve a lot of hard work. But the hardest job is identifying which questions should be asked, and why. Those teams that ask the right questions will find how effective machines can be, when they work for people. Those which ask the wrong questions, or who ask none, will be like my metaphorical car mechanic: possibly highly-skilled, but employed to service machines.
Posted by: Eric in Opinion
Does your telco have a SOC? That was the most fundamental question raised during the pre-conference training workshop at the WeDo WWUG14 user event, earlier this year. The SOC is a new addition to the family of xOCs. All network operators have a NOC, to monitor their network. Telcos have somewhat adopted the idea of a ROC, which is meant to monitor revenues, though the popularity of the concept may have been constrained by Subex’s decision to trademark the term ‘ROC’. Praesidium, the consulting unit of Mainroad, a sister company to WeDo, now say that telcos will increasingly need a SOC – a Security Operations Centre. So why do telcos need a SOC, and why is this being discussed at a conference for business assurance people?
If a NOC ensures the service is being provided to customers, the ROC ensures these services are generating a financial return to the telco. Complexity is being driven by the convergence of networks and IT, by the increasing sophistication of services, and by the range and power of the devices belonging to end users. This complexity makes security more challenging, and opens more security gaps that might lead to financial loss if left unclosed. These motivations suggest a similar solution to one which has been used before – implement an xOC, to continuously monitor the relevant internal and external intelligence feeds and information sources.
WWUG14 keynote speaker Robert Strickland, former CTO of Leap Wireless and former CIO of T-Mobile US, also talked about RA, fraud and security coming together. Whilst the end consequences might differ, the root causes of security loopholes, fraud weaknesses and revenue leaks will often be connected. This partly explains why a conference of business assurance people is being told about the need to implement a SOC.
However, I am not entirely convinced that the simple trend analysis, and the big bold metaphors and the repeating of established themes, leave any of us knowing what we are talking about, when we talk about the ‘convergence’ of RA, fraud and security, or the need for a SOC. The more operation centres you create, and the more they monitor disparate things, the more you raise the question of whether you could and should implement monitoring in a more holistic fashion. At the same time, saying that RA, fraud and security are converging sounds wonderful, until you wonder what a ‘converged’ practitioner looks like. There is not a single human being alive who is master of every topic that sits under the category of security. What are the chances that we might educate someone to do ‘converged’ RA, fraud and security? In fact, was there not some fundamental disagreement exhibited at this event, because we had a keynote speaker talking about convergence, whilst there was a workshop calling for another, specialized, operations centre to perform different, separate monitoring?
I think the root of this contradiction lies in complexity itself. When dealing with a complex problem, we need a big view that incorporates all aspects, or there is a risk that we misunderstand the problem, and fail to identify some elements of the causes, or some of the consequences that flow from them. This pushes us towards a ‘converged’ view, because we need to see and understand everything at once. However, complexity means an increase in detail, and there is a limit to how much detail any individual human can cope with. So as the volume of detailed information grows, it becomes necessary to create sub-divisions and sub-categories, compartmentalizing information and relying upon ever more narrowly-defined experts to manage each compartment. And that encourages us to establish yet more new, and specialized, teams.
In the past, I have written about ‘the zoom’, the ability to shift your mental perspective from one where you work at incredibly low levels of detail, to one where you stand right back and see the big picture, to then zoom into detail elsewhere, and so appreciate all the connections. The ability to mentally ‘zoom’ is becoming more and more important, but that does not make it easier for people to master (or for some people to understand the point I am trying to make).
Whilst the calls for both a converged view of security with business assurance, and for a SOC, are simultaneously both right, they are simultaneously both wrong. We need an appropriate level of resources to be deployed in managing all risks faced by telcos, and those resources may need to increase if risk profiles deteriorate. But we also need to understand the limits in coordinating resources. Efficiency degrades with scale, and eventually we reach a point where no amount of resources will help us to monitor more effectively, because the organization is unable to prioritize and to make the right decisions.
To put it another way, more monitoring is a viable strategy if there is a sensible limit to how much more monitoring is needed. But endless monitoring just leads to wasted resources – things are monitored for no good reason – whilst creating a logjam for decision-makers when nobody is able to prioritize the conflicting messages from all the data being monitored by different people around the business. So one strategy to deal with increasing complexity is not just to trumpet the mitigation of risks – in ways that specialist suppliers usually do – but to actively reduce complexity, by being less complex! And that might involve resisting the temptation to keep adding new technology to an already overly complicated architecture, aggressively decommissioning technology and services of declining importance, and splitting the telco into separate businesses.
Business assurance practitioners should welcome the convergence with security, but they would be wise to fear it too. It is true that the root causes of security exploits, frauds and leaks will be increasingly intertwined. But business assurance practitioners will not be able to rise to the challenge by taking the same happy-go-lucky, few-days-here-and-there, scam-training-but-who-cares-as-long-as-the-certificate-has-the-right-words-on-it, learn-by-trial-and-error approach to education that they have taken before. It was never fit for purpose, but we got away with it because nobody expected more, and nobody did better. However, this lax attitude to education would prove disastrous if applied to the coming challenges in security. Somebody needs to invest in people, to raise their knowledge and skill levels to the standard necessary to deal with the converged challenges of business assurance and security – and we know that privately-owned telcos tend to be lousy at making this kind of investment in their people.
Governments have realized the significance of the shortfall in private enterprise, and increasingly they are taking the lead by investing in cybersecurity, which includes a crucial investment in educating people. But these governments will rightly focus taxpayers’ money on the more narrow dimensions of security, and not on the broader and related commercial challenges concerning fraud and loss. If business assurance practitioners do not find a way to improve their education, the convergence of business assurance with security might prove to be nothing like a marriage of equal partners; it will be the takeover of business assurance by highly-trained security professionals.
Posted by: Eric in Opinion
I have a very serious point to make today. But in order to make it, I will start with a comic daydream…
Usain Bolt struck his lightning pose for the TV cameras. He smiled, and it felt like the world smiled back. A hush descended around the stadium. The Jamaican shook each long limb in turn. He was a sprinter, built for speed like no man had ever been. He was relaxed. He was ready.
On your marks.
Bolt sauntered forward. He was casual, but deliberate too, carefully placing each foot into position, soles pressed firmly against the starting blocks. Bolt then knelt upright, and made the sign of the cross. Now was the time. He looked up at the sky, and blew a kiss to the heavens. And then he faced down, as if in prayer.
His mind suddenly raced ahead, imagining himself at the finish line. This was the 2016 Rio Olympics, and Bolt was going to win his third consecutive gold for 100 metres. And in doing so, he would run faster than ever before, faster than anyone had ever run. Bolt was sure of it.
The starting gun fired.
Bolt exploded from the blocks. His legs unwound, consuming the ground before him. And it was over, finishing so soon there was no time for thought, nor words. There was only speed, and sound. The crowd roared louder than a hurricane, overawed by Bolt’s power, blown away by what they had witnessed.
Bolt raised his hands, praising the crowd as they praised him. Somebody handed him a Jamaican flag, and he held it aloft, flying it behind him as he toured the track. Then, halfway round, he stopped, to watch the replay on the stadium’s big screen. Bolt waited to see the official time. He knew he had broken his record. The only question was whether he had done something previously unimaginable. Was Bolt the first man to run 100 metres in under 9 seconds?
Bolt waited. He raised an eyebrow. He waited. He smiled, but less broadly. He waited. Why was it taking so long to show the official time for his race?
“Is everything alright, Mr. Bolt? We need you to move on, the high jumpers use this part of the track for their run-up.”
“Where’s my time?” quizzed Bolt. “I want to know my time for the 100 metres.”
“Oh.” The diminutive Olympic official tugged the front of his blazer, and looked nervously around him. “Didn’t anyone tell you? We’re not bothering to measure the times in this Olympics. There’s no need.”
“What do you mean, there’s no need?” asked Bolt, perplexed.
“You had the other guys racing alongside. What’s the point of measuring the time you ran? You definitely came first!”
“I want to see if I did better or worse than before. I might have gone under 9 seconds in that final.”
“Yes, but targets like that promote the wrong sort of competition. You won’t become a faster runner by trying to beat a target like that.”
All trace of a smile was erased from Bolt’s face. His eyes widened. He stepped closer to the official, leaning over him. He uttered one word: “What?”
“The Olympic Committee has brought in a much better way to improve athletic performances. Instead of using targets that get objectively measured, we now have some sports scientists, coaches and doctors who will review the video tape of your performance. They’ll write you a report, explaining how you could improve your running process further.”
“They’re going to write some papers that will improve my running process? Nobody timed my run?”
Bolt bared his teeth again, but he was not grinning. He raised his hands again, but not to the sky. They were wrapped around the neck of the Olympic official, throttling him.
Okay, so that was a silly story, and so far I have not mentioned telecoms once. But sometimes I get confronted by supposedly serious people acting in such ridiculous ways that it is difficult to describe the enormity of their silliness. So imagine the following. You want to be the Usain Bolt of revenue assurance. You want your telco’s bills to be more accurate than they have ever been before. You want to be a real champion, delivering bills which are more accurate than anyone’s bills have ever been. You want this, because you really strive to always improve your performance. Now answer one question:
Is it better to set accuracy targets and objectively measure your performance against those targets? Or not?
I think you know which conclusion I am expecting you to draw. There is a reason why we record the performance of athletes using numbers, why we measure companies in terms of profits, and why even governments set themselves targets (even if they always miss them). So you will be as bemused as I am, that a bunch of ‘experts’ in the United Kingdom believe the exact opposite to you and me. The UK has a long tradition of believing its telecoms billing is more accurate than everyone else’s. They believed this because people wrote pieces of paper saying they met accuracy targets which were tougher than anyone else’s. But now, to become even better, they have decided the way forward is to not have any targets at all.
Common sense. Logic. Past experience. Real data. None of these apply, when regulators make decisions like these. Here is the rationale, point by point, per Britain’s regulator:
The removal of the target-based requirements and the retention of the existing process-based requirements should ensure that the approval and audit processes, and ongoing reporting by CPs to ABs, were focussed on CPs identifying and analysing all billing errors. Instead of having targets which envisaged an ‘acceptable’ error rate, the remaining provisions, while recognising that errors may occur, would aim to achieve ongoing improvement in CPs’ systems and processes to ensure that, where errors occur, corrective measures are put in place that address the risk of repetition. This would more closely align the Direction with the provisions of GC11.1 which requires CPs to ensure that all bills represent and do not exceed the true extent of any such service actually provided, rather than setting targets for billing accuracy.
Let us break this down. To begin with, CPs are comms providers i.e. the telcos, whilst ABs are approval bodies i.e. a kind of ‘specialist’ auditor that checks bill accuracy. Some of them believe, and the regulator agrees with them, that the previously mandated level of accuracy somehow discouraged people from measuring and responding to all the billing errors that occurred. The target somehow prevented them from doing these utterly sensible things, which they are legally and morally obliged to do anyway, even though the target was for the total billing error. Words fail me already. What have they been attempting to measure so far, if not the sum total of all billing errors? Why would the absence of any target make somebody work harder to detect and resolve errors?
GC11.1 is clause 1 of General Condition 11, a requirement to be satisfied by every UK telco. It says, in short, that telcos should not overcharge customers. Duh. And yet, as even the regulator admits, mistakes happen. Previously, this reality was dealt with by stating how much error would be tolerated, before a telco was punished. This was the accuracy ‘target’, so to speak, though it should properly be called a tolerance, as it was supposedly mandatory to stay within the target. The theory was that telcos who break the limits deserve punishment. This does not preclude punishing telcos for behaving badly even if they do not break the limits. Governments want people to drive safely, and whilst it is very possible to drive dangerously at low speeds, it also makes sense for law enforcement authorities to automatically punish people who exceed speed limits. But the UK regulator has somehow reversed that kind of thinking. They now believe that having a limit somehow encourages telcos to behave badly, because they will drive up to that limit. Already we are in fanciful territory, as if telcos routinely discover very very very small overcharging errors and then decide they should do nothing about correcting them. But the regulator now thinks that tolerances encourage telcos to believe they will not get punished if they stay within the tolerance, though nothing currently stops the regulator from enforcing GC11.1 more strictly, if it wanted to. However, the regulator’s new approach is to do away with any tolerance, and to leave it completely vague when anybody will be punished for any bad behaviour.
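Mechanically, the old tolerance-based regime amounts to a very simple test: sum the period's billing errors and compare the error rate against a mandated maximum. The threshold and figures below are invented purely for illustration — the actual UK tolerances were different and more involved — but the sketch shows why a tolerance, far from discouraging measurement, requires you to measure the sum of all errors in the first place:

```python
def billing_error_rate(errors: list[float], total_billed: float) -> float:
    """Total absolute billing error as a fraction of the revenue billed."""
    return sum(abs(e) for e in errors) / total_billed

def within_tolerance(errors: list[float], total_billed: float,
                     tolerance: float) -> bool:
    """The old target-based test: stay under the mandated maximum error
    rate, or face sanction. Illustrative only; not the real UK rules."""
    return billing_error_rate(errors, total_billed) <= tolerance

# Invented period: £2m billed, a few over/undercharges, 0.02% tolerance
errors = [120.0, -45.5, 310.0]
print(within_tolerance(errors, total_billed=2_000_000, tolerance=0.0002))
```

Note that you cannot even run this test without first detecting and quantifying every error — which is exactly the work the regulator now claims the target discouraged.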
CPs point out that they use process-based requirements for their own internal audits and for ensuring billing accuracy, so compliance costs could be reduced. This could also encourage voluntary compliance with the Direction by CPs with annual relevant revenues under £40 million not covered by its scope.
I was an auditor, once. I mean, I was a real auditor, once. In the real audit world, fees have to be earned, budgets managed, profits made, and there were a fair few other audit firms competing for the same work. In that real world, there is a simple but important rule of thumb: process-based audits are cheap, substantive auditing is expensive. Auditing a process is cheap because it is subjective. You contemplate whether anything might go wrong, and if you can think of nothing that might go wrong, then you conclude that nothing will go wrong. Substantive auditing is expensive because you have the hard work of actually checking real data, to see if it really is right or wrong. As there might be a lot of complicated data to check, this can get expensive, even on a sample basis. But whilst substantive auditing is hard and expensive, you must do some, or else your audit might be a total crock.
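To show what ‘substantive’ means in practice, here is a minimal sketch of a sample-based substantive test: pick a random sample of billed records and independently re-rate each one from the raw usage data, flagging any disagreement. The records and rates are invented, and real CDR re-rating is vastly more complicated, but this is the essential move — actually checking real data — that a purely process-based audit skips:

```python
import random

def recompute_charge(minutes: float, rate_per_min: float) -> float:
    """Independently re-rate a call record from raw usage: the 'substantive' check."""
    return round(minutes * rate_per_min, 2)

def substantive_sample_audit(records: list[dict], sample_size: int,
                             seed: int = 0) -> list[dict]:
    """Draw a random sample of billed records, re-rate each one, and
    return those where the billed amount disagrees with the recomputation."""
    rng = random.Random(seed)
    sample = rng.sample(records, min(sample_size, len(records)))
    return [r for r in sample
            if recompute_charge(r["minutes"], r["rate"]) != r["billed"]]

# Invented records: the third one is overbilled (7 min at 0.05 should be 0.35)
records = [
    {"minutes": 10, "rate": 0.05, "billed": 0.50},
    {"minutes": 4,  "rate": 0.05, "billed": 0.20},
    {"minutes": 7,  "rate": 0.05, "billed": 0.42},
]
print(substantive_sample_audit(records, sample_size=3))
```

The expense the paragraph above describes lies in doing this at realistic scale, against complicated tariffs and millions of records; the cheapness of a process review lies in never doing it at all.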
Contrast that real world insight to the paragraph from the regulator. It implies that telcos, and with the full knowledge of some kinds of auditors, have already been favouring the cheaper kind of audit work, which involves more of the subjective stuff, and less of the actual checking of data. The regulator notes that the less real checking done by telcos, the cheaper their audits will be. This is correct. And if audits get cheaper, the regulator hopes telcos will start volunteering to do audits even if they are not mandatory. This is pure speculation, and hardly a desirable goal anyway. Why encourage the voluntary proliferation of weak auditing practice, instead of focusing stringent audits on business practices that really need auditing? And how is any of this a valid argument for ending a requirement that was designed to protect customers from too much overcharging?
The arrangements should be more adaptable and future-proof as they would be based solely on processes rather than targets which might need to be changed as usage and services changed.
Remember, this regulation is supposedly about protecting real customers of real comms providers. This clause implies customers will be better protected in future, if no objective targets are set. Why? Because setting objective targets is difficult. But if it is difficult to set objective targets, how much harder would it be to decide the appropriate punishment for a telco that demonstrably overcharges its customers, when there is not even a guideline target to compare their performance to? And what, if anything, is objectively measured by a process-based audit?
Although we would be removing elements of the current requirements, we believe the remaining requirements of the Direction would be adequate to protect consumers by ensuring that CPs processes were focussed on ensuring the accuracy of bills. Indeed, for the reasons explained above, we consider that the focus on the process-based requirements should result in a closer alignment with the objectives of GC 11.1 and should therefore be more effective at protecting consumers.
In other words, though they are making the regulations easier, they claim this does not really make them any easier. In fact, they somehow claim it makes the regulations tougher, and hence more effective at protecting customers. Telcos will no longer need to do things they previously had to do. By telcos not doing as much, somehow customers will be protected more than before.
There is a simple counter-argument to the regulator’s position. Hardly any countries mandate targets for bill accuracy. The regulator continues to be challenged by telcos who complain British bill accuracy regulations are much tougher than those found anywhere else. For many years, the British regulator insisted it was necessary to set targets, even though they knew other countries did not. Either they were wrong to set targets before, or they are wrong to do away with targets now. Which one is it? What else changed, that might justify this u-turn? How is it that the UK regulator argues it was right, when they said it was necessary to adopt the toughest billing accuracy tolerances that the world has ever seen, and the UK regulator argues it is still right, when they say there is no need to measure performance against any kind of tolerance at all?
In a recent podcast with Mike Willett, I said the UK’s bill accuracy scheme was prone to ‘shenanigans’. Whilst it can be long and boring to explain what is involved in these shenanigans, I believe this latest regulatory twist provides ample evidence of the fundamental problems with the UK’s billing accuracy regime. The historic source of these problems is straightforward: the regime began with unrealistic and unworkable targets, endorsed by people who simply never accumulated enough data to realize how badly they had underestimated the true extent of billing error. From that point on, nobody was allowed to lose face. The auditors could not lose face, for being less knowledgeable and experienced than they pretended. The telcos could not lose face, because any telco that did would unfairly suffer relative to every other telco. And the regulator could not lose face, because they would have to admit they implemented a regulation that delivered collective delusion instead of consumer protection.
Now UK customers are reaping the rewards of all that face-saving. Instead of protecting customers with realistic targets, backed by tough data-driven audits and a genuine desire to penalize the worst offender, the UK’s regulator has ensured nobody lost their job. To do this, they needed to deliver a through-the-looking-glass explanation for why the only thing better than having an insanely narrow accuracy target is to not have any target at all!
After his disappointment in Rio, Usain Bolt called a press conference, and said he would run just one more race, with the intention of setting a new 100 metres world record that would never be broken.
“Wow Usain!” shouted one of the press corps. “What’s your target for this unbreakable new record? Is it 0.01 seconds?”
“What are you smoking?” replied Bolt. “Nobody can run 100 metres in 0.01 seconds. That target would be madness.”
Another member of the press corps shouted out. “If you’re not going to run it in less than 0.01 seconds, why bother having a target at all?”
Bolt put his head in his hands. Mo Farah, his friend, sat alongside. Farah put his hand on Bolt’s shoulder, consoling his old pal. Bolt slowly turned to face Farah. “Don’t be down,” said Farah, “they just don’t understand what people like us are trying to accomplish.”
“Maybe they don’t even care,” mused Bolt.
Posted by: Guest in Opinion
Today’s guest post is by Ahmad Nadeem Syed, Director of Revenue Assurance and Fraud Management at Mobilink. When long-standing talkRA contributor David Leshem argued why people are more valuable than tools, he prompted a flurry of replies, both agreeing and disagreeing with David’s opinions. Ahmad was amongst the respondents, leaving a thoughtful comment that said neither is more valuable than the other, because value is generated when tools complement people. He finished his comment by asking that the next talkRA blog on the topic should discuss that theme of the complementary combination of people and tools – so I invited him to write it! Ahmad graciously agreed, and here is his article on why we should focus on the combination of people and tools, rather than dwelling on each element separately.
In the Stone Age, human needs were limited to eating and sleeping. Food comprised whatever wild animal meat, fruits and vegetables were available. People used stones or wooden arrows for hunting, and long sticks for plucking fruit that was out of reach. We can see apes doing something similar even today, when they break open a coconut.
The activity incorporates three things: (1) the goal – the instinct to survive; (2) the people – the Stone Age humans; and (3) the tools – stones and wooden arrows. Imagine various situations. There are hungry people with tools available, but lacking the skill to aim and throw a stone or arrow at their prey. Or there are hungry people with skill and plenty of food around, but no tools available. In both situations, the ultimate goal of survival is not met, because one essential element is missing.
With the passage of time, the goals kept moving and growing, from survival to comfort to luxury, both vertically and horizontally. This necessitated more and better tools, coupled with people having the advanced skills needed to use those tools to achieve the desired goals.
Come to today’s modern age – if it can still be called modern in the eyes of the generation living fifty years from now, given how rapidly human needs and technology are changing. Humans have become ever more dependent upon technology with every passing moment, whether in personal, social or business life. The mobile phone, for example, has become an essential part of our lives. Some use the latest smartphones, while some are still content with voice and SMS alone. Both groups have developed the skills to use these tools to meet their varying needs. Can anybody imagine today’s life without mobile phones – or, in reverse, lots of mobile phones but no people?
Computers are an integral part of any business, but then we need people with varying skills to develop the applications and run the computers. Can anybody imagine a business today with lots of people but no computers, or vice versa? Some may argue that small businesses still rely on pen and paper, but those are also tools.
I call this the PNT (People-Need-Tool) phenomenon. People are meant to live, and so they have needs, and tools allow them to meet those needs. Once the first set of needs is met, another set of needs becomes a necessity, and therefore another set of tools is required.
Let me mention a busted myth here. It was commonly said that automation would cause widespread unemployment as computers replaced humans. But the reality is different. As business and social needs increased, the way of managing these sectors changed, generating and using high volumes of data. Handling these data volumes requires better, higher-performance and sometimes specially designed computers and tools. Operating these tools needs skilled people, so automation, instead of creating unemployment, paved the way for IT education and thus generated new opportunities.
Let us take the example of telecommunications. I work for a GSM operator, where network elements produce over a billion CDRs on a daily basis. These CDRs are processed by mediation, IN and billing systems. As the head of the Revenue Assurance and Fraud Management team, I lead a highly skilled team of analysts and IT professionals, and I use very high-performance RA and FM systems to ensure that no revenue leakage occurs. I would be completely stuck the day either my key people were absent or any of my key systems went offline.
I therefore believe in people and tools being a complementary combination, without any preference.
Posted by: Eric in Opinion
The relationship between people and machines has been a recurring theme on talkRA. I discuss the tension between people and machines a lot. But I think the tension is not really between people and machines. The tension is between people who treat other people like human beings, and people who treat other people like machines. People are varied, difficult, unpredictable, individual, and demanding. It would be convenient for a business to have 500 employees who all behave the same way, and 5 million customers who all behave the same way. But people are not like that. Whether we talk about software, controls, or processes, there is a danger that somebody with little empathy for real people, and divorced from the consequences of their decisions, will make terrible choices that then become hard coded into the ‘rules’ of the business. These decisions may look good from the cold, abstract perspective of a spreadsheet, but can be terrible for the human beings affected by them. The end result is the kind of customer experience shared below.
American technology journalist Ryan Block called Comcast to cancel his internet service. After ten minutes of arguing with Comcast’s ‘retention specialist’, he decided to record the remainder of the call, capturing the final eight minutes. Afterwards, he shared the recording on SoundCloud, where it took just two days to reach 4 million people. Why did 4 million people take an interest in Block speaking to Comcast about cancelling his service? Because Comcast’s representative repeatedly demanded an explanation for why Block wanted to cancel his service – in the obvious hope that Block would simply give up and remain a customer. Listen for yourself…
I wanted to talk about this incident because examples like this must be balanced against any data-centric analysis of how to boost revenues, reduce churn, and so on. This recording is also data. Unfortunately, it is the kind of data that is hard to compress into numbers and spreadsheets. But it is still vitally important data if we want to understand how well the business is performing. And this data says: “avoid Comcast as your service provider, because they treat customers badly.”
The recording also says that Comcast treats its staff like machines. Whilst the customer thinks they are talking to a human being, who has some discretion over how they behave, the customer might as well be speaking to an IVR. Comcast’s representative behaves like a slave to the rigid rules he is expected to follow. That means the employee is required to ‘save’ the customer by any means possible, even if the customer is absolutely determined to leave.
A statement issued by Comcast puts the blame solely on their representative, saying:
The way in which our representative communicated with them is unacceptable and not consistent with how we train our customer service representatives.
However, others have questioned Comcast’s corporate attitude. When Comcast tweeted to say they would take ‘quick action’, Block tweeted back:
I hope the quick action you take is a thorough evaluation of your culture and policies, and not the termination of the rep.
And somebody claiming to be a former employee of Comcast used Reddit to share a much more comprehensive analysis of why Comcast’s representatives would behave like this:
If I was reviewing this guys calls I’d agree that this is an example of going a little too hard at it, but here’s the deal (and this is not saying they’re doing the right thing, this is just how it works). First of all these guys have a low hourly rate. In the states I’ve worked in they start at about 10.50-12$/hr. The actual money that they make comes from their metrics for the month which depends on the department they’re in. In sales this is obvious, the more sales you make the better you do.
In retention, the more products you save per customer the better you do, and the more products you disconect the worst you do (if a customer with a triple play disconnects, you get hit as losing every one of those lines of business, not just losing one customer.) These guys fight tooth and nail to keep every customer because if they don’t meet their numbers they don’t get paid.
Comcast uses “gates” for their incentive pays, which means that if you fall below a certain threshold (which tend to be stretch goals in the first place) then instead of getting a reduced amount, you get 0$. Let’s say that if you retain 85% of your customers or more (this means 85% of the lines of businesses that customers have when they talk to you, they still have after they talk to you), you get 100% of your payout – which might be 5-10$ per line of business. At 80% you might only get 75% of your payout, and at 75% you get nothing.
The CAEs (customer service reps) watch these numbers daily, and will fight tooth and nail to stay above the “I get nothing” number. This guy went too far, you’re not supposed to flat out argue with them. But comcast literally provides an incentive for this kind of behavior. It’s the same reason peoples bills are always fucked up, people stuffing them with things they don’t need or in some cases don’t even agree to.
I find this account of Comcast’s rules to be credible. Comcast may have a rule saying their reps should not argue with customers. However, nobody is this overzealous unless they are motivated to be like this. In other words, something in Comcast’s rules, procedures and incentives is motivating this human being to be so dogged at retaining customers. Without a financial incentive, it would be normal for the rep to just do as Block asked, cancelling the service and ending the call as quickly as possible. Arguing for nearly 20 minutes shows that the rep has something personally at stake. In this case, the rep has too much at stake.
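The gated incentive scheme the former employee describes can be sketched in a few lines of code. This is purely an illustration built on the approximate figures quoted in the Reddit comment – the thresholds, the per-line rate, and the function itself are hypothetical, not Comcast’s actual formula.

```python
def retention_payout(lines_saved, lines_at_risk, rate_per_line=7.50):
    """Illustrative 'gated' incentive payout.

    Hypothetical tiers, loosely based on the figures in the Reddit
    comment: full payout at an 85% save rate, a reduced payout at 80%,
    and nothing at all below that.
    """
    if lines_at_risk == 0:
        return 0.0
    save_rate = lines_saved / lines_at_risk
    if save_rate >= 0.85:
        multiplier = 1.0      # met the stretch goal: 100% of payout
    elif save_rate >= 0.80:
        multiplier = 0.75     # slipped below the goal: reduced payout
    else:
        multiplier = 0.0      # fell through the 'gate': nothing
    return multiplier * rate_per_line * lines_saved

# One extra cancellation near the gate swings the entire payout:
print(retention_payout(80, 100))  # reduced tier
print(retention_payout(79, 100))  # below the gate: zero
```

The cliff between a save rate of 80% and 79% shows why a rep sitting near the gate has every incentive to fight tooth and nail over a single cancellation.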
Whilst Comcast’s motivational techniques might deliver good results on their spreadsheet – there is no doubt this kind of high-energy ‘retention’ strategy will influence some customers – there are also downside consequences for real people which may not be shown by the data that management looks at. No matter how much data we think we have, when it comes to marketing analysis, customer service, satisfaction and loyalty, we need to remember how difficult it is to reduce people’s attitudes and behaviour to numbers which computers can calculate. Decision-makers who ignore human consequences do not deserve respect, whether they intend to disconnect a batch of old services and wait to see if any customers complain that they have been affected, or whether they give a salesman a big bonus for results, then plead ignorance of the salesman’s unethical tactics.
Data can be clean and straightforward, making it pleasant to work with. Much of business assurance is rightly oriented around data. Manipulating and managing data contrasts with the messy business of how people think and act, which is difficult to record, measure and describe using rules and formulae. But telcos exist to serve people, and business assurance professionals should always keep that in mind.