In last weekend’s FT, Terry Smith wrote about a $1.9bn error he had noticed in IBM’s 2009 cash flow statement. It got me thinking: how often is board and shareholder information inaccurate?
Probably a lot more often than many businesses would like to admit.
This is anecdotal, of course, but I’ve seen plenty of reports that purport to show risk and don’t get it right. The errors largely arise from two sources.
First, the sheer volume of data now being collected, reported, copied and transposed leaves room for fat fingers, aka human error.
Second, some risks aren’t picked up at all, leaving the recipient unaware of the real exposures being run.
Is this significant?
It depends on whether we need accuracy or an approximation.
A model error can cost millions. Other things are more subjective, like the volatility estimate in a VaR model or the parameters used in stress testing.
Even so, it’s helpful (where possible) to have your own estimate and to cross-reference it against the reported risk, both to validate the number and to understand how it is behaving. That takes time and experience.
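To make that concrete, here is a minimal sketch of the kind of independent cross-check I have in mind. It is not how any particular firm calculates its numbers: it assumes daily P&L history is to hand, that the reported figure is a one-day 99% parametric VaR, and the figures themselves are purely illustrative.

```python
# Minimal sketch of an independent VaR cross-check.
# Assumptions (illustrative, not from the article): daily P&L history is
# available, returns are roughly normal, and the reported figure is a
# one-day 99% parametric VaR.
import statistics
from math import sqrt

def parametric_var(daily_pnl, z=2.326, horizon_days=1):
    """Parametric VaR: z-score (99% one-tailed ~ 2.326) times volatility,
    scaled by the square root of the horizon in days."""
    sigma = statistics.stdev(daily_pnl)      # volatility of daily P&L
    return z * sigma * sqrt(horizon_days)    # estimated loss at the chosen confidence

# Illustrative numbers only: compare our estimate with what the report shows.
daily_pnl = [-1.2, 0.8, 0.3, -0.5, 1.1, -2.0, 0.6, -0.4, 0.9, -1.5]  # $m
reported_var = 4.0                                                    # $m, from the risk report

our_estimate = parametric_var(daily_pnl)
print(f"Our estimate: {our_estimate:.2f}m vs reported: {reported_var:.2f}m")
if abs(our_estimate - reported_var) / reported_var > 0.25:
    print("Large gap -- worth asking how the reported number was produced.")
```

The point isn’t the precision of the recalculation; it’s that having any independent number forces the question of why the two differ.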
What’s interesting in Smith’s article is that IBM confirmed the error and that no one else had been in touch about it.
Was that because no one asked the question, or because financial reports simply aren’t read closely?
That is surely the real question.
Are we getting lazy when dealing with numbers?