Thesis title: An extended generalized Markov model for the spread risk and its calibration by using filtering techniques in Solvency II framework
The Solvency II regulatory regime requires insurance and reinsurance companies to calculate a capital requirement, the Solvency Capital Requirement (SCR), based on a market-consistent evaluation of the probability distribution forecast of Basic Own Funds over a one-year time horizon.
This work proposes an extended generalized Markov model for rating-based pricing of risky securities, intended for spread risk assessment and management within the Solvency II framework under an internal model or partial internal model. The model builds on Jarrow, Lando and Turnbull (1997), Lando (1998) and Gambaro et al. (2018), describing credit rating transitions and the default process through an extension of a time-homogeneous Markov chain combined with two subordinator processes. This approach allows credit spreads for different rating classes to be modeled simultaneously and lets spreads fluctuate randomly even when the rating does not change.
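The mechanism can be illustrated with a minimal sketch: a rating chain driven by a generator matrix is evaluated at a random "business time" supplied by a subordinator, so transition probabilities (and hence spreads) move even between rating changes. The generator values and the gamma subordinator below are illustrative assumptions, not the specification used in the thesis.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Hypothetical generator matrix for a 3-state rating chain
# (investment grade, speculative grade, default; default is absorbing).
G = np.array([
    [-0.10,  0.08,  0.02],
    [ 0.15, -0.40,  0.25],
    [ 0.00,  0.00,  0.00],
])

def transition_matrix(t, business_time=None):
    """Transition probabilities over [0, t].

    If a subordinator sample is supplied, the chain is evaluated at the
    random business time instead of calendar time t, which makes spreads
    fluctuate even when the rating does not change.
    """
    s = t if business_time is None else business_time
    return expm(G * s)

t = 1.0
# Gamma subordinator sample with mean t (an illustrative time change).
S_t = rng.gamma(shape=t / 0.5, scale=0.5)
P_calendar = transition_matrix(t)
P_random_time = transition_matrix(t, business_time=S_t)
```

Each row of either matrix is a probability distribution over the arrival rating, and the default state stays absorbing under the time change.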
The calibration methodologies used in this work are consistent with its scope and with that of the proposed model, namely the pricing of defaultable bonds and the calculation of the SCR for the spread risk sub-module, as well as with the market-consistency principle required by Solvency II. To this end, time-series calibration techniques known as filtering techniques are used, which allow the model parameters to be estimated jointly under the real-world probability measure (needed for risk assessment) and the risk-neutral probability measure (needed for pricing). Specifically, an appropriate set of time series of credit spread term structures, differentiated by economic sector and rating class, is used.
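As a rough illustration of filtering a latent factor from an observed spread series, the following is a bootstrap particle filter with Sequential Importance Resampling on a toy state-space model. The AR(1) dynamics, Gaussian observation noise, and all parameter values are assumptions chosen for brevity; they are not the thesis model.

```python
import numpy as np

rng = np.random.default_rng(1)

def sir_particle_filter(y, n_particles=500, a=0.9, q=0.1, r=0.2):
    """Bootstrap particle filter with Sequential Importance Resampling.

    Toy model (illustrative assumptions):
      state:       x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)
      observation: y_t = x_t + v_t,          v_t ~ N(0, r)
    Returns the filtered state means.
    """
    x = rng.normal(0.0, 1.0, n_particles)      # initial particle cloud
    means = []
    for obs in y:
        # Propagate particles through the state dynamics.
        x = a * x + rng.normal(0.0, np.sqrt(q), n_particles)
        # Weight by the Gaussian observation likelihood (up to a constant).
        logw = -0.5 * (obs - x) ** 2 / r
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))
        # SIR step: resample particles in proportion to their weights.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        x = x[idx]
    return np.array(means)

# Hypothetical observed spread series.
y = 0.5 + 0.1 * rng.standard_normal(50)
filtered = sir_particle_filter(y)
```

In a calibration loop, the filter's likelihood would be maximized over the model parameters; here only the filtered means are returned to keep the sketch short.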
In its final version, the proposed model achieves an excellent fit to historical data, and the projected data are consistent with both the historical record and the Solvency II framework.
The filtering techniques, in the different configurations used in this work (particle filtering with Gauss-Legendre quadrature, particle filtering with the Sequential Importance Resampling algorithm, and the Kalman filter), proved to be an effective and flexible tool for estimating the proposed models, capable of handling the high computational complexity of the problem addressed.
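For the linear-Gaussian case, the Kalman filter gives the exact likelihood in closed form, which is what a calibration routine maximizes. The scalar filter below is a minimal sketch under an assumed AR(1) state and scalar observation; the thesis works with multivariate spread term structures, so this is only the one-dimensional skeleton of the idea.

```python
import numpy as np

def kalman_filter(y, a, c, q, r, m0=0.0, p0=1.0):
    """Scalar linear-Gaussian Kalman filter.

    Assumed model (illustrative):
      state:       x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)
      observation: y_t = c * x_t + v_t,      v_t ~ N(0, r)
    Returns the filtered means and the log-likelihood of y.
    """
    m, p = m0, p0
    means, loglik = [], 0.0
    for obs in y:
        # Prediction step.
        m_pred = a * m
        p_pred = a * p * a + q
        # Update step.
        s = c * p_pred * c + r           # innovation variance
        k = p_pred * c / s               # Kalman gain
        innov = obs - c * m_pred
        m = m_pred + k * innov
        p = (1.0 - k * c) * p_pred
        loglik += -0.5 * (np.log(2 * np.pi * s) + innov ** 2 / s)
        means.append(m)
    return np.array(means), loglik

# Hypothetical observed series with assumed parameter values.
rng = np.random.default_rng(2)
y = 0.5 + 0.3 * rng.standard_normal(100)
means, ll = kalman_filter(y, a=0.95, c=1.0, q=0.01, r=0.04)
```

Maximizing `ll` over `(a, c, q, r)` with a numerical optimizer is the usual route to parameter estimates in this setting.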