Industry Events take on the Technology Impact of the Libor Replacement
As the 2021 deadline approaches, the financial sector is abuzz with preparations for the end of Libor. Certainly, there are myriad business, operational and IT considerations that firms must tackle to ready themselves for the transition to new risk-free rates (RFRs).
Recently I had the pleasure of participating on panels at two London-based events focused on Libor replacement. The first was hosted by Risk.net; discussions covered how to prepare for the impending Libor transition, including managing the risks and generating new opportunities. The other event, put on by Marcus Evans, addressed how global efforts to reform benchmark rates and the end of Libor will affect derivatives valuations and legacy trades. In today's post, I'd like to share key takeaways from these two events.
Dealing with the Uncertainty
Industry participants now face the reality that we are just one year away from Libor's discontinuation, yet there are no universally agreed-upon instruments and standards for building curves post-Libor. Instrument conventions have not been settled for cross-currency swaps, caps and floors. Worse still, market conventions and curve-building best practices may well continue to shift after Libor has been discontinued, even for the simplest instruments and curves.
With these questions unanswered, how can firms begin upgrading their systems to ready themselves for the transition? One solution discussed is to build a centralized analytics system. Here, flexibility is key: as new market conventions appear and fade away, you need the ability to quickly reconfigure a single solution, rather than launch a whole new (labor-intensive and potentially costly) IT project spanning multiple systems each time.
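As a concrete illustration of that kind of flexibility, instrument conventions can be kept as data rather than hard-coded, so a centralized analytics service can be repointed when market practice shifts. The sketch below is purely illustrative; the class, keys and field values are hypothetical and not any vendor's API.

```python
from dataclasses import dataclass

# Hypothetical sketch: store conventions as data so a centralized analytics
# service can be reconfigured when market practice changes, without starting
# a new IT project. Names and values are illustrative.
@dataclass(frozen=True)
class SwapConvention:
    index: str
    payment_lag_days: int
    day_count: str
    compounding: str

CONVENTIONS = {
    "USD-SOFR-OIS": SwapConvention("SOFR", 2, "ACT/360", "compounded-in-arrears"),
    "GBP-SONIA-OIS": SwapConvention("SONIA", 0, "ACT/365F", "compounded-in-arrears"),
}

def convention_for(curve_key):
    """Look up the active convention; adapting to new market practice
    becomes a data edit rather than a code change."""
    return CONVENTIONS[curve_key]
```

When a convention changes, only the data table is updated; every desk that prices through the central service picks up the new definition consistently.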
Different Approaches to Updating Analytics
When it comes to updating analytics to prepare for the end of Libor, the largest institutions will likely take a different approach than their mid-sized counterparts. Clearly, the biggest firms will be inclined to solve the problem internally, with robust quant teams and economies of scale. Furthermore, to their benefit, very large firms typically already have a good deal of centralization of their analytics in place.
Medium-sized and large institutions face additional challenges, as they usually run a mix of multiple systems with divergent analytics, some built internally and some vendor-based. These institutions will therefore increasingly see the benefit of reorganizing their technology landscape to centralize analytics and improve consistency across the front and middle offices and across their various desks and departments.
Interestingly, the shift toward reshaping technology around a best-of-breed solution mix instead of all-in-one systems is getting a boost from a number of other technology trends in the industry. The move to the cloud, the increasing need for real-time information, the desire for consistent analytics across departments, the emergence of PaaS offerings and the growing uptake of Python over Excel for customizing analytics are just some of the trends that either call for using technology differently or make it much easier to onboard new, more targeted solutions.
Challenges with Data
Once upon a time, data challenges were relatively minor for most industry participants constructing interest rate curves: one could simply hand-pick liquid instruments for curve-building and use them nearly indefinitely. Those days of simplicity are gone.
Now firms are dealing with a constantly changing universe of instruments and no guarantee of good liquidity in most of them. This effectively ties the data and analytics challenges together and calls for new approaches to curve building. Indeed, firms today may need a global curve-building solution capable of handling any instrument thrown at it. The richest and most flexible option is a framework that offers multi-dimensional, whole-curve calibration and uses all instruments with appropriate weights, taking all market information into account. My colleague Bin Hou goes into further detail on the importance of using multiple instruments concurrently in this blog: Connecting the SOFR Curve to SOFR Swaps.
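To make the idea of weighted, whole-curve calibration concrete, here is a minimal sketch: pillar zero rates are fitted to an overdetermined set of quotes in a single weighted least-squares pass, so every instrument contributes in proportion to its liquidity weight. All names and numbers are made up, and a real curve builder would calibrate through full pricing functions rather than a linear interpolation matrix.

```python
import numpy as np

# Illustrative only: three curve pillars, five quoted instruments, one of
# which (the 5y point) is illiquid and therefore downweighted.
pillars = np.array([1.0, 3.0, 10.0])                      # curve nodes (years)
quote_t = np.array([1.0, 2.0, 3.0, 5.0, 10.0])            # quote maturities
quote_r = np.array([0.020, 0.021, 0.023, 0.024, 0.026])   # quoted zero rates
weights = np.array([1.0, 0.8, 1.0, 0.3, 1.0])             # liquidity weights

def interp_matrix(pillars, times):
    """Row i holds the linear-interpolation weights of the pillar rates
    at times[i], mapping pillar unknowns to quoted maturities."""
    A = np.zeros((len(times), len(pillars)))
    for i, t in enumerate(times):
        j = np.searchsorted(pillars, t)
        if j == 0:
            A[i, 0] = 1.0
        elif j >= len(pillars):
            A[i, -1] = 1.0
        elif pillars[j] == t:
            A[i, j] = 1.0
        else:
            w = (t - pillars[j - 1]) / (pillars[j] - pillars[j - 1])
            A[i, j - 1], A[i, j] = 1.0 - w, w
    return A

# One global fit: every quote enters at once, scaled by sqrt of its weight.
A = interp_matrix(pillars, quote_t)
wsqrt = np.sqrt(weights)
fit, *_ = np.linalg.lstsq(A * wsqrt[:, None], quote_r * wsqrt, rcond=None)
```

The point of the sketch is that no single quote dictates a pillar; the illiquid 5y quote still informs the curve, but only in proportion to its weight.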
Furthermore, Algorithmic Differentiation (AD) is essential for extracting the right risks from such calibrations. AD is a mathematical technique that improves accuracy and radically speeds up the calculation of greeks; for typical portfolios, a 1000x improvement in calculation speed over finite-difference methods is not uncommon. For complex calibrations, AD not only greatly improves the speed of the calibration but, crucially, ensures that risks are computed reliably and precisely. This is unlike traditional bumping techniques, where a stable or even convergent calibration cannot be taken for granted under bumped quotes.
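For intuition on how AD delivers exact sensitivities, here is a toy forward-mode AD implementation using dual numbers, compared against central finite differences on a small discounted-cashflow pricer. This is a pedagogical sketch under simplified assumptions (a flat zero rate, fixed cashflows), not a production AD library, and production systems typically use reverse-mode AD for portfolio greeks.

```python
import math

class Dual:
    """Minimal forward-mode AD: each number carries a value and a derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__
    def __neg__(self):
        return Dual(-self.val, -self.dot)

def exp(x):
    """exp that works on plain floats and on Dual numbers (chain rule)."""
    if isinstance(x, Dual):
        e = math.exp(x.val)
        return Dual(e, e * x.dot)
    return math.exp(x)

def pv(rate, cashflows):
    """Present value of fixed cashflows [(time, amount), ...] at a flat rate."""
    return sum(cf * exp(-rate * t) for t, cf in cashflows)

cfs = [(1.0, 5.0), (2.0, 5.0), (3.0, 105.0)]  # toy bond-like cashflows

# AD: seed the derivative with 1.0; one pricing pass yields the exact rate
# sensitivity alongside the PV.
ad = pv(Dual(0.02, 1.0), cfs)

# Finite difference for comparison: two pricing passes, bump-size sensitive.
h = 1e-6
fd = (pv(0.02 + h, cfs) - pv(0.02 - h, cfs)) / (2 * h)
```

Here `ad.dot` is exact to machine precision in a single pass, while the bumped estimate depends on the bump size; in a full calibration, it also depends on the bumped solve converging at all.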
The future is bound to bring many competing best practices for curve-building. Having a future-proof, flexible framework that lets you replicate any curve and easily configure your own is therefore imperative. The best such solutions allow quick configuration of newly traded instruments and easy set-up of complex indices (fallback rates, average rates and the like), while keeping your analytics centralized and flexible. Such a solution will leave you prepared for whatever the future might bring.
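As one example of such a complex index, RFR-linked coupons and Libor fallbacks are built on a compounded-in-arrears average of daily overnight fixings (the fallback then adds a fixed spread adjustment on top). Below is a minimal sketch of the compounding step, with made-up fixings and a simplified ACT/360 day count; real conventions also handle weekends, lockouts and lookbacks.

```python
# Illustrative sketch of a compounded-in-arrears average rate, the building
# block of SOFR/SONIA coupons and of Libor fallback rates. Fixings are
# hypothetical and every business day is assumed to accrue 1/360 of a year.
def compounded_rate(daily_rates, year_frac_per_day=1.0 / 360.0):
    """Compound the daily overnight fixings over the period, then annualize
    the resulting growth factor back into a simple period rate."""
    growth = 1.0
    total_yf = 0.0
    for r in daily_rates:
        growth *= 1.0 + r * year_frac_per_day
        total_yf += year_frac_per_day
    return (growth - 1.0) / total_yf

fixings = [0.0203, 0.0205, 0.0204, 0.0206, 0.0205]  # hypothetical daily fixings
period_rate = compounded_rate(fixings)
```

A flexible framework lets you define such an index once, centrally, and reuse it for coupon projection, fallback handling and curve calibration alike.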
For more on this topic, check out our related blog on simplifying curve construction.