Archive for the Opinion Category
Posted by: Eric in Opinion
There is an old English saying: “look after the pennies, and the pounds will look after themselves.” If you are careful with even the smallest amounts of money, you will never need to fear that you are poorer than you realized. The same thinking can be applied to time, and to revenue assurance. When I worked in revenue assurance, I looked after milliseconds, as well as pennies. The challenge of revenue assurance is often treated as a question of data in vs. data out, where the data that comes in has already been simplified to a whole number of seconds. But if you only think in terms of abstract data, whizzing between computers, then you have already lost some data, and failed to see some opportunities for fraud and error. Events in the real world do not have durations that are measured in whole numbers of seconds. There is money in milliseconds, and where there is money, we should take an interest in the possibility of error and fraud.
To illustrate my point, let us ask how we feel about the following scenario. Suppose some unscrupulous business distributed a smartphone program which makes all calls last 0.5 seconds longer, by introducing a delay between when a user presses the button on their touchscreen, and when the phone signals that it wants to end the call. No matter how call durations are rounded, half of all calls will be recorded as lasting one second longer than previously. If calls are charged per second, and the average call has a duration of 90s, then revenues from affected customers will rise by 0.56%. Whilst 0.56% does not sound very much, revenue assurance has shown that there is very significant profit to be made by chasing these small fractions of percentages.
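For anyone who wants to check the arithmetic, here is a rough simulation of the scenario. This is my own sketch, not anything from a real billing system: I have assumed call durations follow an exponential distribution with a 90-second mean, and that billing rounds durations up to whole seconds. Different assumptions would shift the number slightly, but the order of magnitude holds.

```python
import random

# Sketch (assumed model, not real billing data): estimate the billing
# uplift caused by padding every call with an extra 0.5 seconds.
random.seed(42)

N = 200_000
MEAN_DURATION = 90.0  # average call length in seconds, per the scenario

def billed_seconds(duration):
    """Round up to whole seconds, a common per-second billing rule (assumed)."""
    return -(-duration // 1)  # ceiling without importing math

base = 0.0
padded = 0.0
for _ in range(N):
    d = random.expovariate(1 / MEAN_DURATION)  # assumed duration model
    base += billed_seconds(d)
    padded += billed_seconds(d + 0.5)

uplift = (padded - base) / base
print(f"revenue uplift: {uplift:.2%}")
```

The simulated uplift lands close to 0.5 / 90 ≈ 0.56%: roughly half of all calls cross a whole-second boundary and gain one billed second.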
Having given the scenario, can I now ask who is responsible for ensuring this does not happen? Is it revenue assurance? Is it fraud management? Would we be secretly glad if somebody did this? How would we feel if handset manufacturers improved their code, and reduced the delay between when a user presses the button and when the phone signals the termination of the call?
Let me now translate this into a different problem that nobody seems to be taking that seriously. (Apologies to anyone who is taking this seriously – if you are one of them, then please respond, and set a good example for me and everyone else!) Everybody has a naive understanding of when a phone call starts, and when it ends, even if they do not really understand the technical details and propagation delays involved. But who has an understanding of how much data needs to be transmitted in any given situation? How would you be able to tell, if the data was deliberately padded to increase the volume? This might seem like a crazy scenario for telcos, who worry about the cost of servicing increasing demand for data services. But on the other hand, there are various ways that telcos currently get paid for data volumes, and there might be more ways in future. On the flip side, if networks are worried about capex spending on infrastructure, what more can they do to improve the efficiency of the data they carry? To get some feel for the numbers involved, take a look at the blog I wrote about accuracy expectations in relation to IPDRs and DOCSIS.
As you may have noticed by now, I am banging at doors, asking questions, without offering answers. My interest is in challenging existing practitioners about whether they have looked behind those doors. To be brutally frank, all of us will suffer from a lack of imagination about the ways that complex networks and systems can be subject to error or manipulation.
I have no desire to open every door, and explain what lies behind them all. For a start, that would involve a lot of hard work. Also, it would be thankless. But most importantly, many people should already be facing these challenges, so they have as much duty to share their insights as I do. With that in mind, let me ask some other questions that are perhaps left outside of the scope of our work, but need to be inside the scope of somebody’s work.
- Using your phone to make international money transfers sounds like a great idea. If somebody can manipulate the time when a currency transaction takes place, they can manipulate the foreign exchange rate that is applied. What is best practice to ensure customers are not cheated?
- So you want to charge Netflix, and businesses like them, for all the burden they place on your network. Fine. The internet being what it is, sometimes packets need to be resent. Who pays for that slice of the burden?
- 14-year-old Suvir Mirchandani has identified how the US Government could save hundreds of millions of dollars by printing documents in typefaces that consume less ink. Telcos are also getting smarter in similar ways; they even hold conferences about energy efficiency. But who is responsible for analysing the cost of things like data which is stored in the cloud, but then never used?
Admittedly, some of these questions are a little odd. But that is partly my point. You cannot expect me to have three insights as good as Suvir’s idea about saving ink. We need to work together, and share ideas, to identify those which will have most impact. And if we do not, then cheaters and fraudsters will come up with similar ideas, with the hope we remain ignorant of what they do! History tells us to look for money in unexpected places. We may look in the wrong places many times, but one surprise finding may yield a jackpot. It is our responsibility to look in those places, even if we are often wrong, or find it difficult to solve the mysteries we have identified. With that in mind, take a look at this recent story from 60 Minutes, a news program on the American CBS Network. It explains how a smart, persistent man identified the way some stock market dealers convert just a few milliseconds’ advantage in their fibre optic network into millions of dollars of profit.
Posted by: Eric in Opinion
I am no expert on utilities. But if, like me, you worked in enterprise risk in a country where the energy sector dominates the national economy, you have to be conscious of the international market in energy. You need to be aware of how energy markets are affected by all sorts of things, including scientific advice about the environment, availability of transportation, industrial change, and consumer protection, in order to understand the knock-on effects on government revenues and business activity, and hence on telecoms use. When I worked in Qatar, I also took an interest in the business of supplying water. As a desert country with a growing population, the availability of water is a constraint on the economy, and a failure in supply could cause a crisis. So, based on what I learned about these sectors, I find my mind is blown by the condescension and ignorance of cVidya. They now regularly spam telcoprofessionals.com, a website which advertises job vacancies in the telecoms sector, and which is run by the same business which handles cVidya’s public relations. cVidya’s most recent opinion piece is attributed to CEO Alon Aginsky, and discusses assurance for utilities. Please indulge me, as I tear his ill-informed nonsense apart.
Time for Smarter, Stronger Utilities
In the next decade, the utilities sector will change beyond recognition.
No it will not. This is possibly the stupidest thing I have ever read.
Consider the timelines that impact utilities. If you think building a telecoms network is slow, consider that the North American Keystone Pipeline for crude oil was first proposed in 2005; Canadian approval was given in 2007, US approval in 2008, and phase 1 construction took two years, completing in 2010. Since then, there have been repeated and continuing delays in approving and constructing subsequent phases of the pipeline. And even after you pump the crude oil and move it, it still needs to be refined, moved again, sold, and put into the power station that will finally burn it, if and when that power station is fired up, because oil-fuelled stations tend to be used on a short-term basis, to cover peaks in demand or shortfalls in supply from other kinds of station. Or consider the new nuclear reactor that will be built in the UK. After years of government indecision, the go-ahead was finally given in 2013. The reactor will become operational in 2023. Like other nuclear power plants, when it comes on, it will stay on, generating a fairly consistent amount of electricity for the following 60 years, barring the kinds of accidents that everyone wants to avoid. To reach a deal with the consortium that will build the reactor, the British government have agreed the price to be paid for its electricity during the first 35 years of use.
When it comes to energy, or water, nothing changes rapidly. Planning timelines are long. Utilities are the one business sector we can be sure will not change dramatically in a time period as short as ten years. And every stock market investor knows this, which is why utilities have low betas (a measure of systematic risk). In fact, even somebody with only a passing knowledge of the telecoms industry should know this, because the single greatest strategic risk facing publicly-listed telcos is that they will increasingly be treated like utilities, forcing them to pay higher dividends to sustain their share price. Like utilities, telcos may become defensive, low growth, low beta suppliers of an undifferentiated product where they can only compete on price, thus generating boring but predictable returns compared to the more rapid ‘dotcom’ style growth they have previously delivered.
As governments attempt to curb climate change and rising fuel bills, energy companies are increasingly having to work with their customers to make more efficient use of electricity and gas.
This is true, up to a point. But bear in mind that Margaret Thatcher was the first world leader to talk about the need to address the greenhouse effect, giving an important speech on the topic at the United Nations; you can see the video here. That speech was given in 1989. Since then, environmentalists complain that the pace of change has been too slow. And there has been a significant backlash from climate sceptics, who question the costs and benefits of rapidly reducing use of carbon-based fuels.
Even more importantly, fuel bills are not rising everywhere. In the US, the wholesale cost of natural gas is now just one-third of what it was in 2008, thanks to the exploitation of fracking and other new technologies for extracting previously inaccessible reserves. This is an unusually dramatic change, but it is exacerbated by the time it will take the US to agree to, invest in, and construct facilities to liquefy and export natural gas, even though many American politicians have called for an acceleration of such projects following recent tensions between Ukraine and Russia.
Indeed, as a proud Israeli, Aginsky should be aware that Israel is investing in its navy in order to safeguard the discovery of 36 trillion cubic feet of gas off the Israeli coast.
As well as supplying energy, utilities now need to supply information – information their customers can use to adjust their usage patterns and get the biggest bang for their buck.
In practice that means the days of quarterly, or even annual, meter readings are drawing to a close.
This is false. Even when smart meters are implemented, physical visits to customer premises will continue. The visits will partly be motivated by the need to assure the data received is consistent with what the correct meter has recorded at source. They will also be motivated by the need to identify and deter meter tampering. Aginsky’s comment reveals a deep ignorance of risk management in utilities.
Aginsky is also wrong to imply that smart metering will inevitably lead to increased use of economic incentives to change consumption patterns. There is a countervailing trend that governments protect customers by forcing utilities to offer only a few simple tariffs. Whilst a simple distinction between peak and off-peak rates may motivate some changes in behaviour, customers will not need to analyse lots of data to understand why they should prefer off-peak consumption.
In some cases, utilities are rolling out smart meters hand-in-hand with dynamic pricing – different tariff rates for different times of the day – in a bid to smooth out the peaks and troughs in demand.
Yes, but ‘dynamic’ pricing is nothing new. On the contrary, many utilities have long offered different rates for different times of day. And just as in telecoms, the prices paid by large wholesale customers will have been individually negotiated, with attention paid to opportunities to balance load relative to other customers.
And some consumers are also becoming producers of energy, feeding electricity generated by the solar panels on their roof, or a wind turbine on their land, back into the grid.
True, but this is very minor. The EU has the most aggressive targets for renewable energy, and is currently contemplating adopting a new target that 27% of energy should come from renewables by 2030. Microgeneration will only account for a tiny fraction of that target. Better insulation and improved energy efficiency of household devices will have a far larger impact than microgeneration.
These fundamental changes mean the complexity of the utilities business is rising fast. And, with rising complexity, comes greater risk.
Duh. But cVidya clearly cannot be trusted to give a fair or balanced analysis of risk. They foolishly believe that remote transmission of data makes it unnecessary to mitigate risks by inspecting meters at the customer’s premises. Meanwhile, they imply that an upswing in the popularity of solar panels will be a significant driver of risk.
Although the introduction of smart grids should improve energy companies’ operational efficiency, new infrastructure and new processes also open the door to new kinds of revenue leakage and fraud. In developed markets, billing errors, meter tampering and other forms of fraud mean between 1% and 5% of utilities’ revenue is already leaking away, while in developing markets that figure can be as high as 20%.
No explanation is given of these numbers, or where they come from. They seem very round, and very vague. This is unsatisfactory, because utilities already perform simple reconciliations of the total electricity/gas/water they supply, versus the total they bill. Consider gas and water in comparison to phone calls. Unlike the digital ephemera of telecoms networks, gas and water are physical things – you can literally compare how much you put in one end of a pipe to how much comes out of the other end. I do not believe Aginsky’s numbers are derived from actual revenue protection work already performed by utilities. That is not to suggest Aginsky is alone in stating this kind of nonsense. But repeating somebody else’s bombast will not turn it into truth.
The funny thing about these measures of leakage is that many countries have transparent laws that state expectations for how accurate utility meters must be. Even though I work in telecoms, I know that UK law requires my gas meter to be accurate to +/- 2%, and my electricity meter to be accurate to within +2.5% and -3.5%. If those are the accuracy tolerances of the meter, before considering anything else, then why is Aginsky talking as if a 1% ‘leakage’ is a big thing? And exactly how bothered are customers about being overcharged, when the law says my gas meter may over-record my usage by 2%, and my electricity meter can over-record my usage by 2.5%? I cannot tell if the numbers given by Aginsky include or exclude these unavoidable metering variances. Aginsky should have quoted these kinds of accuracy expectations, in order to explain the leakage numbers he gives. Instead, Aginsky has recycled the same kind of numbers that he has often used to frighten telcos. But like the boy who cried wolf, Aginsky’s technique stopped being effective after a while, and I do not believe utilities will be influenced by it.
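To make the comparison concrete, here is a trivial worked example, using an invented consumption figure of 1,000 units. The point is simply that the legal tolerance bands quoted above are wider than the 1% leakage figure under scrutiny.

```python
# Illustration with invented numbers: compare a claimed 1% 'leakage'
# figure against the UK legal accuracy tolerances quoted above.
true_usage = 1000.0  # units actually consumed (assumed figure)

gas_band = (true_usage * 0.98, true_usage * 1.02)     # gas: +/- 2%
elec_band = (true_usage * 0.965, true_usage * 1.025)  # electricity: -3.5% / +2.5%

claimed_leakage = true_usage * 0.01  # the 1% figure under scrutiny

# The width of either tolerance band exceeds twice the claimed leakage,
# so metering variance alone could swallow a 1% discrepancy.
gas_width = gas_band[1] - gas_band[0]
print(f"gas tolerance width: {gas_width} units vs leakage: {claimed_leakage} units")
```

In other words, a 1% discrepancy can sit entirely inside the variance that the law already permits the meter itself.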
Of course, the real issue for most utilities is not the current supposed accuracy or inaccuracy of what they measure, but the difficulty in estimating charges for usage between meter readings, and the difficulties in chasing payment from delinquent customers when there are legal restrictions on disconnecting services. But then, cVidya does not sell software that can help with either of those challenges.
With the transition to smart meters, the rate of leakage could rise even higher as utilities implement new systems and processes.
Leakage could rise? Of course it could. Or maybe it could fall. Leakage will fall, if utilities implement good systems and processes. I thought cVidya wanted assurance people to think proactively, and to prevent leakage before it occurs? It says a lot about cVidya’s disjointed strategy that they try to sell transformation assurance to telcos, but do not think to offer utilities a service that assures the new systems and processes they implement, at the time of implementation.
As well as damaging the utility company’s reputation, errors in bills may result in dissatisfied customers deferring payment, creating new credit risks, or even removing their smart meter, as has been the case in North America.
This is supposed to be about utilities, but Aginsky is just repeating his telco sales pitch without any thought. Utilities have always managed credit risk. Because most utilities are older than most telcos, they have been managing credit risk for longer than most telcos. If anything, telcos learned about credit risk from utilities, by hiring managers of credit risk from utilities. Utilities face credit risks, and will continue to face credit risks, irrespective of smart meters. In fact, one key argument is that smart meters will reduce credit risk, thanks to the continuous supply of usage information, and the end of estimated bills. Other credit risks will remain. People already complain about their bills. People already refuse to pay. Utilities already manage their bad debt, and manage their public image, but Aginsky writes as if utilities managers are simpletons who need to be told about the importance of both.
Widespread negative publicity about smart meters in a particular market could put an energy company’s entire smart grid programme in jeopardy.
This is hyperbole. Nobody is going to jeopardize billions of dollars of government-mandated infrastructure investment. In fact, negative publicity cannot even stop cVidya from producing marketing claptrap like this.
Moreover, deregulation has made it straightforward for dissatisfied customers to switch to another supplier.
Good point. But set this against the trend for increased regulation that only allows utilities to offer a small number of easily-understood tariffs. Utilities may prefer the choice to offer more complex tariffs, but if forced to keep tariffs simple, then bills will be easier for customers to check, and less likely to be incorrectly calculated.
If utilities can capture and analyse the data generated by smart meters effectively they will have a comprehensive picture of customers’ changing energy needs, enabling them to develop compelling new services…
Except that is very unlikely if governments restrict the utility’s freedom to offer varied tariffs.
But the energy sector doesn’t need to reinvent the wheel. In effect, utilities are becoming more like telcos
Everyone working in telecoms, or investing in telecoms, is worried that telcos will become more like utilities. Aginsky optimistically states that utilities will be more like telcos. Who is Aginsky trying to fool? Utilities spend a lot of time negotiating with governments. Do the CEOs of utilities think their companies will suddenly be freed from government oversight, and allowed to behave like telcos did during the dotcom boom?
In many cases, utilities’ existing systems lack the sophisticated methodology and smart algorithms that underpin the commercial Revenue Assurance and Fraud Management systems used by leading telcos.
Maybe Aginsky should explain why utilities need such sophisticated systems. So far, he has completely failed to do so. In fact, he has only revealed that cVidya has a very unsophisticated understanding of what utilities do, and the conditions under which they operate. He says they will change beyond recognition, implying they will be like the newer telcos in liberalized Western markets around the year 2000. Those were the conditions that gave rise to telecoms revenue assurance, and those conditions would suit cVidya. But there is no good reason to believe that smart metering is going to lead to a sudden reconfiguration of utilities, and their business model. Governments in the countries that will first adopt smart meters are also showing an increased willingness to determine the retail prices that utilities will charge. Where governments are more relaxed about retail prices, it is because wholesale energy prices are falling.
Utilities are shaped by long-term forces that override Aginsky’s short-term thinking about crunching data. Those forces include the availability of natural resources, geopolitics, big infrastructure, social welfare, climate change, and public opinion. I think anyone who works in utilities already knows this. But then, I suspect this piece is not written for anyone who actually works in a utility, given I found it published on telcoprofessionals.com.
Posted by: Eric in News, Opinion
A former insider has leaked some good news for fans of professionalism in revenue assurance, and bad news for fans of the Global Revenue Assurance Professional Association. The reliable source told talkRA that dwindling revenues have forced Papa Rob Mattison to cut GRAPA’s costs to the bone. As suspected, the departure of former GRAPA tutor Louis “I LOVE Revenue Assurance” Khor was a sign of GRAPA’s financial difficulties. Former GRAPA customers have contracted a bad case of ‘once bitten, twice shy’, leaving Papa Rob unable to secure repeat business. In turn, Khor had to leave because he was only paid on a piecemeal basis for the classes he taught. GRAPA’s Marketing Director (the woman responsible for all their email spam) has also left, and even members of Papa Rob’s family have decided to get real jobs elsewhere.
However, we should not rejoice too soon. Papa Rob and his wife Brigitte are like a pair of zombies. When one career/scam comes to its unnatural end, they revive themselves by proclaiming Rob to be a world-renowned expert at something else. Let us not forget that this couple have also described Papa Rob as: an internationally recognized expert in databases, data warehousing, objective technology and data mining; a sought-after speaker at database conferences around the world; a leading international authority on knowledge management; and a best selling author. This makes me wonder how many copies need to be sold, before Rob and Brigitte consider a book to be a ‘best seller’. Neither I nor my talkRA colleagues are best selling authors. But on Amazon, our revenue assurance book has consistently outsold Mattison’s RA manual.
Papa Rob’s brain was too small for a satisfying meal, said these GRAPA pupils
It must be admitted that there are still some signs of continued life at GRAPA. An advert for a replacement member of staff was recently posted on Craigslist, paying USD10 per hour. However, those of us with long memories will recall that similar GRAPA jobs used to pay USD12 per hour.
GRAPA is not dead, but it should be buried. This so-called association, a marketing front for Mattison’s pre-existing consulting and training business, has done irreparable harm to real revenue assurance professionals. Papa Rob spread the irresponsible lie that anyone, with only a bare minimum of training, should expect extravagant pay raises and promotions in return for performing basic revenue assurance reconciliations. In truth, by setting absurdly low standards for qualification, and allowing inexperienced chancers to describe themselves as masters of the topic, they have encouraged an oversupply of under-qualified candidates, chasing an inadequate number of low-paid jobs.
There is one lesson we should learn from Rob Mattison and GRAPA. Papa Rob made some quick cash for himself, but he did not build anything which generated sustainable value in the long run. He has never done the hard work to educate himself, which is why he believes he can be a world-class expert on everything. In turn, he never expected hard work from his students. At GRAPA, he created a few low-grade, short-lived jobs for people without any relevant qualifications or experience, but these jobs did nothing to enhance the CVs or future prospects of the people who filled them. Instead of making revenue assurance a vital activity which rewards its elite practitioners, he turned it into a zombie profession, shambling from one meal to the next, with no sense of direction or purpose.
The obituary for GRAPA is long overdue. Real professionals need to tell their GRAPA-qualified peers that they have embarrassed themselves. We need to kill their zombie careers. When we do that, we give life back to the people who occupied those zombie careers, by giving them the chance to enjoy real professional growth instead. Not everybody’s career will survive. But those that do, will prosper.
Posted by: Eric in Opinion
The Electronic Privacy Information Center has complained to the US Federal Trade Commission about Facebook’s proposed purchase of Whatsapp. Their concern is that the 450mn Whatsapp users have not agreed to, and would not choose to agree to, Facebook’s exploitation of their personal data.
The FTC’s response will give an important indication of the value of personal data obtained via a corporate takeover. Facebook has already said they would run Whatsapp as a separate business. However, given that Facebook monetizes data by using it to target advertising, there is no doubt that the data possessed by Whatsapp would be valuable to them. The acquisition raises further questions too. Even if the purchase goes ahead with legal constraints over how Facebook can use Whatsapp’s personal data, government authorities have a very poor track record when it comes to detecting violations, and hence enforcing the rules they advocate. In Europe, the push towards forcing businesses to recruit Data Protection Officers indicates the authorities are incapable of enforcing laws if businesses do not police themselves.
Posted by: Eric in News, Opinion
Take a deep breath, as what I am about to write might shock you. Some software developers who talk about risk management are lying to you. Or at least, they do not tell you the whole truth, by refusing to comment on the things they cannot do, or do not understand. New proof comes from a software developer that knows a lot about risk management. Palisade has been making risk management software since 1984. Headquartered in New York, and with offices in Tokyo, Sydney, London and Rio de Janeiro, they sell cost-effective risk management software to all sorts of customers – because many big businesses have more sophisticated risk management than that found in telcos. Palisade’s tools are based on Monte Carlo simulation, and they have just released a new case study about Enterprise Risk Management in MegaFon, the Russian telco.
Monte Carlo? Some readers will not know what Monte Carlo simulation is, including readers who have some risk management responsibilities in their job description. That was why I slipped the phrase into the text. I want to provoke people into thinking about all the risk management tools and techniques they currently know nothing about. Revenue assurance should teach people that our failings stem from the limitations of our knowledge. And yet, whilst we recognize this truth, telecoms risk management suffers from an insular viewpoint. Some narrow people claim to have broad expertise on every subject, including the whole of risk management. In truth they know only a telecoms-specific view of the world, and can only thrive because telcos are so far behind other industries when it comes to implementing risk management. They are like false prophets, giving instruction to a small band of people who live on a remote island. Whilst they claim to have knowledge of the universe and its mysteries, they have no knowledge of the world beyond their island. The quickest evolutionary path is for telcos to learn how other sectors manage risk. Or in this case, we can also learn from MegaFon’s example.
Put simply, Monte Carlo techniques reveal the likelihood of different outcomes by setting up a game, and then rolling the dice repeatedly, to see which outcomes win most often, and which lose most often. In this context, the game is a mathematical model of an organization or a project, and the role of the dice is played by a random number generator. If we estimate probabilities for a variety of factors that will influence the results of an organization or project, we can then use random numbers to run multiple simulations of how the causal factors interact, in order to map the distribution of overall results. As such, we can quantify the range of risk in any decision, and hence alter decisions according to our appetite for risk.
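To show how little machinery is involved, here is a minimal Monte Carlo sketch in Python. All the distributions and figures are invented for illustration; they are not MegaFon’s model, and a real budgeting exercise would estimate them from the branches’ own data.

```python
import random

# Minimal Monte Carlo sketch (invented numbers, not MegaFon's model):
# estimate the distribution of a budget outcome driven by a few
# uncertain factors, by 'rolling the dice' many times.
random.seed(1)

def one_trial():
    revenue = random.gauss(100.0, 8.0)         # assumed mean and spread
    staff_costs = random.gauss(40.0, 3.0)      # assumed
    price_pressure = random.uniform(0.0, 5.0)  # assumed competition effect
    return revenue - staff_costs - price_pressure

results = sorted(one_trial() for _ in range(50_000))

# Percentiles of the simulated distribution quantify the range of risk,
# which is exactly what a single static forecast cannot do.
p5 = results[int(0.05 * len(results))]
p50 = results[len(results) // 2]
p95 = results[int(0.95 * len(results))]
print(f"5th percentile: {p5:.1f}, median: {p50:.1f}, 95th percentile: {p95:.1f}")
```

The spread between the 5th and 95th percentiles is the useful output: it tells you how far reality might plausibly land from the median forecast, which is the information a risk appetite decision actually needs.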
A poker player cannot determine which cards he is dealt, but a good poker player wins more often than a bad poker player, because he makes better decisions. In the same way, we cannot control all the factors that influence our business, but we can make better decisions if we methodically measure the influence of factors outside of our control. The MegaFon case study helps to explain how to do that in practice.
Here are some key extracts from the case study, explaining how MegaFon uses Monte Carlo techniques to manage risk in their budgeting process:
Each branch [of MegaFon] states the risks it faces, such as competition, changes in legislation that will require it to operate differently, price increases and changes to staffing costs. They also calculate how much each budget will be over or under the forecast.
The risk management team at MegaFon’s headquarters amalgamates the information from each of its offices and simulates possible scenarios… allowing the five critical factors most likely to significantly affect the company’s gross revenue to be identified and therefore mitigated.
In addition… minimum, best case and median budget figures and the probability of their occurrence… are compared to the budget plans to determine whether the forecast is too aggressive or not ambitious enough.
As well as budgeting for business as usual, MegaFon uses their Palisade Monte Carlo tools to help them make better decisions for capital investment:
In 2012, MegaFon took the decision to invest in a large construction project with the aim of minimising its operating costs and improving network quality and control over technical operations.
Two potential locations were shortlisted and the management team used Palisade’s software to make an informed decision on the optimal one. It first used Palisade’s TopRank to perform sensitivity analysis to identify the factors in each location that would have the most influence over the total cost of the project.
From here, the team used @RISK to forecast how these critical factors might change. This allowed MegaFon to understand the most likely Net Present Values (NPVs) for each possible location and identify the risks for building or not building (i.e. opportunity cost) each data centre.
@RISK allowed MegaFon to use graphs to show easily how NPV and cash flows could change over time, and the probabilities of those changes occurring, rather than the static number that they would have had to rely on without the risk analysis tool.
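The same idea can be sketched without any commercial tooling. The following toy comparison of two candidate sites is entirely my own invention (the costs, savings, discount rate and uncertainty levels are all assumptions, not figures from the case study), but it shows how simulation turns a single static NPV into a distribution you can compare across options.

```python
import random

# Toy sketch of the location-comparison idea: simulate NPV for two
# candidate sites under uncertain build-cost and cash-flow assumptions.
# All figures are invented for illustration.
random.seed(7)

def npv(cashflows, rate=0.10):
    """Net Present Value of a cashflow series, discounted at an assumed 10%."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def simulate_site(build_cost_mean, annual_saving_mean, trials=20_000):
    samples = []
    for _ in range(trials):
        build = random.gauss(build_cost_mean, build_cost_mean * 0.15)
        saving = random.gauss(annual_saving_mean, annual_saving_mean * 0.10)
        # Upfront build cost, then ten years of operating savings.
        samples.append(npv([-build] + [saving] * 10))
    return samples

site_a = simulate_site(50.0, 9.0)   # cheaper build, lower savings (assumed)
site_b = simulate_site(65.0, 12.0)  # dearer build, higher savings (assumed)

mean_a = sum(site_a) / len(site_a)
mean_b = sum(site_b) / len(site_b)
print(f"mean NPV, site A: {mean_a:.1f}  site B: {mean_b:.1f}")
```

With these made-up inputs, site B has the higher expected NPV but the wider spread of outcomes; whether that trade is attractive depends on the decision-maker’s appetite for risk, which is the whole point of producing a distribution rather than a single number.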
This is a beautiful example of how to manage risk in a telco, which makes it all the more tragic that so few telcos use techniques like these. The tragedy is even greater because some telcos listen to software firms that push ‘risk models’ that do not deserve the name. Meanwhile, MegaFon is using tried and tested techniques which have already been automated, making them accessible to risk managers who do not have the time to build a Monte Carlo model from scratch.
Dmitry Shevchenko, Head of Risk Management at MegaFon, is quoted:
“Palisade’s decision support software is a well-balanced and flexible instrument that can be applied to a wide variety of situations, making it ideally suited to managing risk across the enterprise.”
Compare that to the misnamed ‘risk models’ found elsewhere, and it becomes obvious why they are not genuine models of risk. There are many kinds of risk in a telco, and the right models will differ from one telco to the next. Some of the so-called ‘risk models’ being pushed at telcos only model one or two specific kinds of risk, and the models are inflexible, implying all telcos have a similar risk profile. Why would any risk manager use software to model only one kind of risk, in a way that forces him to use the same generic model as every other telco, when there is software that allows him to model every kind of risk, and to build a model that is specific to his company? I assume there is only one answer to my rhetorical question: the risk manager did not know there were other, better, tools that he could have used.
Mike Willett recently interviewed me for the talkRA podcast, and I fear I may have offended some people when he asked my thoughts about revenue assurance managers seeking to become risk managers. I was blunt. I said the problem was a lack of training, and the danger was that under-trained people may take on responsibilities without an appreciation of the gaps in their skillset, and of how those gaps will distort their perception of risk. Already, I know that under-trained and under-skilled individuals are being given risk management jobs in telcos. This is not a good thing for their business, nor for the individual. Whilst it may feel like a promotion, the under-trained risk manager must push back, and ensure they acquire the skills needed for the job, or their failures will have serious implications for their business, their colleagues, and themselves. They need to find trustworthy advisors, and not just listen to the comforting, convenient nonsense spewed by the false prophets. When speaking to Mike, I drew upon an analogy coined by Abraham Maslow:
If you only have a hammer, you tend to see every problem as a nail.
There is no doubt that RA practitioners have some very useful skills that can be applied to manage risks more generally. They possess some powerful tools. But they do not have as wide a range of tools as they need in their toolbox, if they are genuinely going to manage the range of risks implied by a job title like ‘risk manager’. It is no good to turn around later, and make the excuse: “it’s my job to manage this risk, but not that risk”. Was it clear from the job title which risks were being managed? Was it clear from the job description, and the list of responsibilities? And where the risk manager decides they are not responsible for a certain kind of risk, who is responsible for identifying situations where the company faces a risk, but nobody is managing it? These are big questions. And once again, the telco world is being misled by people who, lacking any answer to the big questions, refuse to acknowledge them. They offer answers, but only to those questions where they already have an answer.
Techniques like Monte Carlo simulation should be in the toolbox of every risk manager, so they can be used when they are the best tool for the job. I hope this brief and excellent case study from MegaFon and Palisade helps to open some eyes to the limitations of the tools being used by some telco risk managers. There is a general rule for risk, which states we cannot manage a risk until we have identified it. Let us be honest with ourselves, and admit to the gaps in our knowledge, skills, and tools. When we do that, we create the possibility of improving our performance, and closing those gaps.