It seems that many major financial institutions are still using a version of the original operational risk technique developed in the late 1990s by Bankers Trust and JP Morgan, the pioneers in operational risk management. What is even more surprising is that these institutions' own auditors have so far accepted the continued use of this antiquated technique, even after the rollout of the Basel II Accord in June 2004, which provided guidelines for a more robust operational risk methodology.
Understandably, soon after the Enron, Parmalat, and Arthur Andersen scandals erupted, financial institutions quickly instituted operational risk management frameworks to appease the regulators. In the rush to put something in place, however, strategic planning of operational risk management was glossed over in favor of a quick tactical fix: throwing more people and money at the problem.
The result of this quick fix is that most financial institutions currently rely on a motley patchwork of probabilistic and statistical models, adapted from their existing credit risk frameworks and mated to static, point-in-time, backward-looking management reports, for their operational risk management process.
While this patchwork approach seems to have been sufficient to meet the initial regulatory requirements, continued use of such an ineffective operational risk management process leaves financial institutions widely exposed to further operational risk losses, as clearly shown by the recent events at Societe Generale (SocGen).