Identifying risk spillovers in financial markets is of great importance for assessing systemic risk and for portfolio management. Granger causality in tail (or in risk) tests whether past extreme events of a time series help predict future extreme events of another time series. The topology and connectedness of networks built with Granger causality in tail can be used to measure systemic risk and to identify risk transmitters. Here we introduce a novel test of Granger causality in tail which adopts the likelihood ratio statistic and is based on the multivariate generalization of a discrete autoregressive process for the binary time series describing the sequence of extreme events of the underlying price dynamics. The proposed test has very good size and power in finite samples, especially for large sample sizes, allows inferring the correct time scale at which the causal interaction takes place, and is flexible enough to be extended to the multivariate case when more than two time series are considered, in order to reduce false detections arising as a spurious effect of neglected variables. An extensive simulation study illustrates the performance of the proposed method for a large variety of data generating processes and compares it with the test of Granger causality in tail by Hong et al. (2009). We report both advantages and drawbacks of the different approaches, pointing out some crucial aspects related to false detections of Granger causality for tail events. An empirical application to high frequency data of a portfolio of US stocks highlights the merits of our novel approach.
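A minimal sketch of the general idea behind such tests, assuming a simple Bernoulli conditioning model rather than the DAR-based likelihood ratio developed in the paper (function names, quantile and lag choices below are illustrative):

```python
import numpy as np
from scipy import stats

def tail_indicator(returns, q=0.05):
    """Mark observations falling below the empirical q-quantile (left-tail events)."""
    return (returns < np.quantile(returns, q)).astype(float)

def lr_tail_causality(x, y, q=0.05, lag=1):
    """Illustrative likelihood-ratio test that past tail events of x help predict tail events of y.

    NOTE: a simple Bernoulli conditioning model, not the paper's DAR-based statistic.
    """
    ix, iy = tail_indicator(x, q), tail_indicator(y, q)
    past_x, past_y, target = ix[:-lag], iy[:-lag], iy[lag:]

    def bernoulli_loglik(target, groups):
        ll = 0.0
        for g in np.unique(groups):
            sub = target[groups == g]
            p = np.clip(sub.mean(), 1e-6, 1 - 1e-6)
            ll += np.sum(sub * np.log(p) + (1 - sub) * np.log(1 - p))
        return ll

    ll0 = bernoulli_loglik(target, past_y)               # restricted: only y's own past
    ll1 = bernoulli_loglik(target, 2 * past_y + past_x)  # unrestricted: joint past of y and x
    lr = 2.0 * (ll1 - ll0)
    return lr, stats.chi2.sf(lr, df=2)                   # two extra conditional probabilities
```

With two return series of equal length, a small p-value would suggest that tail events of `x` carry information about future tail events of `y` one step ahead; repeating the test over several lags mimics the paper's idea of inferring the time scale of the causal interaction.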
P. Mazzarisi, F. Lillo, S. Marmi (2019). When panic makes you blind: A chaotic route to systemic risk, Journal of Economic Dynamics and Control, 100, 176-199
We present an analytical model to study the role of expectation feedbacks and overlapping portfolios on the systemic stability of financial systems. Building on Corsi et al. (2016), we model a set of financial institutions having Value-at-Risk capital requirements and investing in a portfolio of risky assets, whose prices evolve stochastically in time and are endogenously driven by the trading decisions of financial institutions. Assuming that they use adaptive expectations of risk, we show that the evolution of the system is described by a slow-fast random dynamical system, which can be studied analytically in some regimes. The model shows how risk expectations play a central role in determining the systemic stability of the financial system and how wrong risk expectations may create panic-induced reduction or over-optimistic expansion of balance sheets. Specifically, when investors are myopic in estimating the risk, the fixed point equilibrium of the system breaks into leverage cycles and financial variables display a bifurcation cascade eventually leading to chaos. We discuss the role of financial policy and the effects of some market frictions, such as the cost of diversification and financial transaction taxes, in determining the stability of the system in the presence of adaptive expectations of risk.
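A rough sketch of the mechanism, using standard Gaussian VaR algebra rather than the paper's specific equations: a Value-at-Risk constraint at confidence level $a$ caps leverage at a level inversely proportional to the perceived volatility, and adaptive expectations replace the true volatility with an exponentially weighted estimate of past squared returns,

\[
E_t \;\ge\; \mathrm{VaR}_t = z_a\,\hat\sigma_t\,A_t
\quad\Longrightarrow\quad
\lambda_t \equiv \frac{A_t}{E_t} \;\le\; \frac{1}{z_a\,\hat\sigma_t},
\qquad
\hat\sigma_t^2 = (1-\omega)\sum_{k\ge 1}\omega^{\,k-1}\,r_{t-k}^2 .
\]

When realized volatility has been low, $\hat\sigma_t$ falls, the allowed leverage $\lambda_t$ expands, and the resulting rebalancing feeds back into prices; this is the expectation feedback whose destabilizing consequences the paper studies.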
P. Mazzarisi, P. Barucca, F. Lillo, D. Tantari (2020), A dynamic network model with persistent links and node-specific latent variables, with an application to the interbank market, European Journal of Operational Research, 281 (1), 50-65
We propose a dynamic network model where two mechanisms control the probability of a link between two nodes: (i) the existence or absence of this link in the past, and (ii) node-specific latent variables (dynamic fitnesses) describing the propensity of each node to create links. Assuming a Markov dynamics for both mechanisms, we propose an Expectation-Maximization algorithm for model estimation and inference of the latent variables. The estimated parameters and fitnesses can be used to forecast the presence of a link in the future. We apply our methodology to the e-MID interbank network for which the two linkage mechanisms are associated with two different trading behaviors in the process of network formation, namely preferential trading and trading driven by node-specific characteristics. The empirical results allow us to recognize preferential lending in the interbank market and indicate how a method that does not account for time-varying network topologies tends to overestimate preferential linkage.
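A hedged toy version of the two mechanisms (the logistic form and the parameter values below are illustrative, not the paper's specification): each link is either copied from the previous snapshot or redrawn from a fitness-based probability.

```python
import numpy as np

rng = np.random.default_rng(1)
n, T, persist = 30, 50, 0.6            # illustrative size, horizon, link persistence
theta = rng.normal(size=n)             # node fitnesses (kept static here for brevity)

def fitness_prob(ti, tj):
    """Link probability driven only by the two node fitnesses (logistic form, assumed)."""
    return 1.0 / (1.0 + np.exp(-(ti + tj)))

A = (rng.random((n, n)) < 0.05).astype(int)        # initial adjacency matrix
np.fill_diagonal(A, 0)
for t in range(T):
    copy = rng.random((n, n)) < persist            # mechanism (i): copy the link from t-1
    pf = fitness_prob(theta[:, None], theta[None, :])
    fresh = (rng.random((n, n)) < pf).astype(int)  # mechanism (ii): fitness-driven redraw
    A = np.where(copy, A, fresh)
    np.fill_diagonal(A, 0)

print("final link density:", A.mean())
```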
D. Di Gangi, G. Bormetti, F. Lillo (2019), Score-Driven Exponential Random Graphs: A New Class of Time-Varying Parameter Models for Dynamical Networks
Motivated by the evidence that real-world networks evolve in time and may exhibit non-stationary features, we propose an extension of the Exponential Random Graph Models (ERGMs) accommodating the time variation of network parameters. Within the ERGM framework, a network realization is sampled from a static probability distribution defined parametrically in terms of network statistics. Inspired by the fast growing literature on Dynamic Conditional Score-driven models, in our approach each parameter evolves according to an updating rule driven by the score of the conditional distribution. We demonstrate the flexibility of the score-driven ERGMs, both as data generating processes and as filters, and we prove the advantages of the dynamic version with respect to the static one. Our method captures dynamical network dependencies that emerge from the data and allows for a test discriminating between static and time-varying parameters. Finally, we corroborate our findings with an application to networks from real financial and political systems exhibiting non-stationary dynamics.
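A hedged sketch of the score-driven idea in the simplest possible ERGM, whose only statistic is the number of links (so the model reduces to a Bernoulli distribution per node pair); the recursion and parameter values below are illustrative, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pairs, T = 435, 200                    # e.g. 30 undirected nodes -> 435 node pairs
omega, alpha, beta = -0.02, 0.05, 0.97   # illustrative GAS-type coefficients

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

theta, thetas = -1.0, []
for t in range(T):
    p = sigmoid(theta)                       # link probability implied by theta_t
    links = rng.binomial(n_pairs, p)         # observed number of links at time t
    score = links - n_pairs * p              # d log-likelihood / d theta
    theta = omega + beta * theta + alpha * score / n_pairs   # score-driven update (scaled)
    thetas.append(theta)

print("filtered theta, last 5 values:", np.round(thetas[-5:], 3))
```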
D. Di Gangi, F. Lillo, D. Pirino (2018), Assessing systemic risk due to fire sales spillover through maximum entropy network reconstruction, Journal of Economic Dynamics and Control, 94, 117-141
Monitoring and assessing systemic risk in financial markets is of great importance but it often requires data that are unavailable or available at a very low frequency. For this reason, systemic risk assessment with partial information is potentially very useful for regulators and other stakeholders. In this paper we consider systemic risk due to fire sales spillovers and portfolio rebalancing by using the risk metrics defined by Greenwood et al. (2015). Using a method based on the constrained minimization of the Cross Entropy, we show that it is possible to assess aggregated and single bank's systemicness and vulnerability using only the information on the size of each bank and the capitalization of each investment asset. We also compare our approach with an alternative, widespread application of the Maximum Entropy principle, which allows deriving graph probability distributions and generating scenarios, and we use it to propose a statistical test for a change in banks' vulnerability to systemic events.
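A hedged illustration of the reconstruction step under the stated partial information (all numbers are made up): when the only constraints are the size of each bank (row sums) and the capitalization of each asset (column sums), the minimum cross-entropy solution relative to a uniform prior reduces to the outer product of the margins divided by the grand total.

```python
import numpy as np

# Made-up margins: bank sizes (row sums) and asset capitalizations (column sums).
bank_sizes = np.array([120.0, 80.0, 50.0])
asset_caps = np.array([100.0, 90.0, 40.0, 20.0])
total = bank_sizes.sum()
assert np.isclose(total, asset_caps.sum()), "margins must share the same grand total"

# Minimum cross-entropy solution w.r.t. a uniform prior: outer product of the margins.
W = np.outer(bank_sizes, asset_caps) / total
print(W.round(1))
print("row sums:", W.sum(axis=1), "column sums:", W.sum(axis=0))
```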
L.M. Calcagnile, G. Bormetti, M. Treccani, S. Marmi, F. Lillo, Collective synchronization and high frequency systemic instabilities in financial markets, Quantitative Finance 18 (2), 237-247
We present some empirical evidence on the dynamics of price instabilities in financial markets and propose a new Hawkes modelling approach. Specifically, analysing the recent high frequency dynamics of a set of US stocks, we find that since 2001 the level of synchronization of large price movements across assets has significantly increased. We find that only a minor fraction of these systemic events can be connected with the release of pre-announced macroeconomic news. Finally, the larger the multiplicity of an event (i.e. the number of assets that have swung together), the larger the probability of a new event occurring in the near future, as well as its multiplicity. To reproduce these facts, we propose an approach based on Hawkes processes, which naturally capture the self- and cross-exciting nature of the event dynamics. For each event, we directly model the multiplicity as a multivariate point process, neglecting the identity of the specific assets. This allows us to introduce a parsimonious parametrization of the kernel of the process and to achieve a reliable description of the dynamics of large price movements for a high-dimensional portfolio.
https://doi.org/10.1080/14697688.2017.1403141
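A hedged sketch of the self-exciting ingredient: a univariate Hawkes process with exponential kernel, simulated by Ogata's thinning algorithm. The paper's model is multivariate over event multiplicities; the parameter values below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(3)
mu, alpha, beta, horizon = 0.2, 0.6, 1.0, 500.0   # illustrative Hawkes parameters

def intensity(t, events):
    """Intensity mu + alpha * sum of exp(-beta * (t - t_i)) over past events t_i < t."""
    past = np.asarray(events)
    past = past[past < t]
    return mu + alpha * np.sum(np.exp(-beta * (t - past)))

t, events = 0.0, []
while t < horizon:
    lam_bar = intensity(t, events) + alpha        # upper bound for the intensity after t
    t += rng.exponential(1.0 / lam_bar)           # candidate event time
    if t < horizon and rng.random() < intensity(t, events) / lam_bar:
        events.append(t)                          # accepted: intensity jumps by alpha

print(f"simulated {len(events)} events; unconditional expectation ~{mu / (1 - alpha / beta) * horizon:.0f}")
```

Each accepted event raises the intensity, so events cluster in time, mimicking the observed tendency of systemic cojumps to be followed by further events in the near future.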
Letizia E., Barucca P., Lillo F. (2018). Resolution of ranking hierarchies in directed networks
Identifying hierarchies and rankings of nodes in directed graphs is fundamental in many applications such as social network analysis, biology, economics, and finance. A recently proposed method identifies the hierarchy by finding the ordered partition of nodes which minimises a score function, termed agony. This function penalises the links violating the hierarchy in a way depending on the strength of the violation.
To investigate the resolution of ranking hierarchies we introduce an ensemble of random graphs, the Ranked Stochastic Block Model. We find that agony may fail to identify hierarchies when the structure is not strong enough and the size of the classes is small with respect to the whole network. We analytically characterise the resolution threshold and we show that an iterated version of agony can partly overcome this resolution limit.
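A hedged illustration of the score being minimised, using a commonly adopted agony penalty (the ranking and the toy edge list below are made up):

```python
def agony(edges, rank):
    """Sum over directed edges u -> v of max(0, rank[u] - rank[v] + 1):
    edges pointing up the hierarchy cost nothing, violations cost more the
    larger the rank gap they span (a commonly used penalty, assumed here)."""
    return sum(max(0, rank[u] - rank[v] + 1) for u, v in edges)

# Toy example: a three-level hierarchy with a single violating (back) edge.
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 0)]
rank = {0: 0, 1: 1, 2: 1, 3: 2}
print("agony of this ranking:", agony(edges, rank))   # only the edge 3 -> 0 is penalised
```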
P. Barucca, P. Mazzarisi, F. Lillo, D. Tantari (2017), Disentangling group and link persistence in dynamic stochastic block models
We study the inference of a model of dynamic networks in which both communities and links keep memory of previous network states. By considering maximum likelihood inference from single snapshot observations of the network, we show that link persistence makes the inference of communities harder, decreasing the detectability threshold, while community persistence tends to make it easier. We analytically show that communities inferred from a single network snapshot can share a maximum overlap with the underlying communities of a specific previous instant in time. This leads to time-lagged inference: the identification of past communities rather than present ones. Finally, we compute the time lag and propose a corrected algorithm, the Lagged Snapshot Dynamic (LSD) algorithm, for community detection in dynamic networks. We analytically and numerically characterize the detectability transitions of such an algorithm as a function of the memory parameters of the model.
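A hedged toy generator with both memory mechanisms (the parametrisation is illustrative, keeps directed links and self-loops for brevity, and is not the paper's exact model):

```python
import numpy as np

rng = np.random.default_rng(4)
n, K, T = 100, 2, 20
eta, xi = 0.9, 0.5            # group persistence, link persistence (illustrative)
p_in, p_out = 0.3, 0.05       # within/between community link probabilities

groups = rng.integers(K, size=n)
P = np.where(np.equal.outer(groups, groups), p_in, p_out)
snapshots = [(rng.random((n, n)) < P).astype(int)]
for t in range(1, T):
    stay = rng.random(n) < eta                                 # nodes keeping their community
    groups = np.where(stay, groups, rng.integers(K, size=n))
    P = np.where(np.equal.outer(groups, groups), p_in, p_out)
    fresh = (rng.random((n, n)) < P).astype(int)               # redrawn from the current SBM
    keep = rng.random((n, n)) < xi                             # links copied from t-1
    snapshots.append(np.where(keep, snapshots[-1], fresh))

print("mean link density across snapshots:", np.mean([S.mean() for S in snapshots]))
```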
Letizia E., Lillo F. (2017). Corporate payments networks and credit risk rating
This paper provides empirical evidence that the risk assessment of corporate firms could benefit from quantitatively taking into account the network of interactions among firms. Indeed, the structure of interactions between firms is critical to identify risk concentration and the possible pathways of propagation of financial distress. In this work, we consider these interactions by investigating a large proprietary dataset of payments between Italian firms. We first characterise the topological properties of the payment networks, and then we focus our attention on the relation between the networks and the risk of firms. Our main finding is to document the existence of a homophily of risk, i.e. the tendency of firms with a similar risk profile to be statistically more connected among themselves. This effect is observed when considering both pairs of firms and the communities or hierarchies identified in the network. We leverage this knowledge to show that the network properties of a node can be used to predict the missing rating of a firm.
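A hedged sketch of how homophily of risk can be quantified, using attribute assortativity on a synthetic graph with made-up risk classes (the real analysis uses the proprietary payment network and observed ratings):

```python
import random
import networkx as nx

random.seed(5)
G = nx.gnp_random_graph(200, 0.03, seed=5)           # stand-in for the payment network
for node in G.nodes:                                  # made-up categorical risk classes
    G.nodes[node]["risk"] = random.choice(["low", "medium", "high"])

r = nx.attribute_assortativity_coefficient(G, "risk")
print("risk assortativity:", round(r, 3))             # ~0 here because labels are random
```

A positive coefficient on the real network indicates that firms tend to be linked to firms with a similar risk profile, which is the homophily effect documented in the paper.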
P. Barucca, D. Tantari, F. Lillo (2016), Centrality metrics and localization in core-periphery networks
Two concepts of centrality have been defined in complex networks. The first considers the centrality of a node, and many different metrics for it have been defined (e.g. eigenvector centrality, PageRank, non-backtracking centrality, etc.). The second is related to the large scale organization of the network, the core-periphery structure, composed of a dense core plus an outlying and loosely connected periphery. In this paper we investigate the relation between these two concepts. We consider networks generated via the stochastic block model, or its degree-corrected version, with a core-periphery structure, and we investigate the centrality properties of the core nodes and the ability of several centrality metrics to identify them. We find that the three measures with the best performance are the marginals obtained with belief propagation, PageRank, and degree centrality, while non-backtracking and eigenvector centrality (or MINRES, shown to be equivalent to the latter in the large network limit) perform worse in the investigated networks.
http://iopscience.iop.org/article/10.1088/1742-5468/2016/02/023401/meta
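A hedged numerical illustration of the question studied: plant a core-periphery structure with a stochastic block model and check how well two of the centrality scores recover the planted core (block sizes and probabilities are made up):

```python
import networkx as nx

sizes = [20, 180]                                    # planted core and periphery (made up)
probs = [[0.6, 0.1], [0.1, 0.02]]                    # dense core, sparse periphery
G = nx.stochastic_block_model(sizes, probs, seed=7)
core = set(range(sizes[0]))                          # the first block are the core nodes

for name, scores in [("degree", nx.degree_centrality(G)), ("pagerank", nx.pagerank(G))]:
    top = sorted(scores, key=scores.get, reverse=True)[: len(core)]
    print(f"{name}: {len(core & set(top))} of {len(core)} core nodes in the top-{len(core)}")
```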
Taranto, D. E., Bormetti, G., Bouchaud, J.-P., Toth, B., and Lillo, F. (2016). Linear models for the impact of order flow on prices II. The Mixture Transition Distribution model
Abstract
Modeling the impact of the order flow on asset prices is of primary importance to understand the behavior of financial markets. Part I of this paper reported the remarkable improvements in the description of the price dynamics which can be obtained when one incorporates the impact of past returns on the future order flow. However, the impact models presented in Part I consider the order flow as an exogenous process, only characterized by its two-point correlations. This assumption seriously limits the forecasting ability of the model. Here we attempt to model directly the stream of discrete events with a so-called Mixture Transition Distribution (MTD) framework, originally introduced by Raftery (1985). We distinguish between price-changing and non price-changing events and combine them with the order sign in order to reduce the order flow dynamics to the dynamics of a four-state discrete random variable. The MTD represents a parsimonious approximation of a full high-order Markov chain. The new approach captures with adequate realism the conditional correlation functions between signed events, for both small- and large-tick stocks, as well as the signature plots. From a methodological viewpoint, we discuss a novel and flexible way to calibrate a large class of MTD models with a very large number of parameters. In spite of this large number of parameters, an out-of-sample analysis confirms that the model does not overfit the data.
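A hedged sketch of the MTD mechanism with a single shared transition matrix (lag weights, matrix, and state labels below are made up; in the paper the four states encode the order sign combined with whether the event changes the price):

```python
import numpy as np

rng = np.random.default_rng(8)
n_states, L, T = 4, 3, 10_000
lam = np.array([0.6, 0.3, 0.1])                      # lag weights (sum to one, made up)
Q = rng.dirichlet(np.ones(n_states), size=n_states)  # one shared transition matrix (made up)

history = list(rng.integers(n_states, size=L))
for t in range(T):
    # MTD: P(X_t = x | past) = sum over lags g of lam[g] * Q[X_{t-g}, x]
    probs = sum(w * Q[history[-g]] for g, w in enumerate(lam, start=1))
    history.append(rng.choice(n_states, p=probs))

print("empirical state frequencies:", np.bincount(history) / len(history))
```

The mixture replaces the full L-th order transition tensor with one matrix and L weights, which is the parsimony the abstract refers to.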
F. Corsi, S. Marmi, F. Lillo, When micro prudence increases macro risk: The destabilizing effects of financial innovation, leverage, and diversification, Operations Research 64 (5), 1073-1088
By exploiting basic common practice accounting and risk-management rules, we propose a simple analytical dynamical model to investigate the effects of microprudential changes on macroprudential outcomes. Specifically, we study the consequences of the introduction of a financial innovation that allows reducing the cost of portfolio diversification in a financial system populated by financial institutions having capital requirements in the form of a Value at Risk (VaR) constraint and following standard mark-to-market and risk-management rules. We provide a full analytical quantification of the multivariate feedback effects between investment prices and bank behavior induced by portfolio rebalancing in the presence of asset illiquidity, and we show how changes in the constraints of the bank portfolio optimization endogenously drive the dynamics of the balance sheet aggregate of financial institutions and, thereby, the availability of bank liquidity to the economic system and systemic risk. The model shows that when financial innovation reduces the cost of diversification below a given threshold, the strength (because of higher leverage) and coordination (because of similarity of bank portfolios) of feedback effects increase, triggering a transition from a stationary dynamics of price returns to a nonstationary one characterized by steep growths (bubbles) and plunges (bursts) of market prices.
Taranto, D. E., Bormetti, G., Bouchaud, J.-P., Toth, B., and Lillo, F. (2016). Linear models for the impact of order flow on prices I. Propagators: Transient vs. History Dependent Impact
Abstract
Market impact is a key concept in the study of financial markets and several models have been proposed in the literature so far. The Transient Impact Model (TIM) posits that the price at high frequency time scales is a linear combination of the signs of the past executed market orders, weighted by a so-called propagator function. An alternative description, the History Dependent Impact Model (HDIM), assumes that the deviation between the realised order sign and its expected level impacts the price linearly and permanently. The two models, however, should be extended since prices are a priori influenced not only by the past order flow, but also by the past realisation of returns themselves. In this paper, we propose a two-event framework, where price-changing and non price-changing events are considered separately. Two-event propagator models provide a remarkable improvement in the description of market impact, especially for large tick stocks, where the events of price changes are very rare and very informative. Specifically, the extended approach captures the excess anti-correlation between past returns and subsequent order flow which is missing in one-event models. Our results document the superior performance of HDIMs, even though only by a small margin relative to TIMs. This is somewhat surprising, because HDIMs are well grounded theoretically, while TIMs are, strictly speaking, inconsistent.
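A hedged sketch of a one-event Transient Impact Model, with an i.i.d. surrogate sign series and an illustrative power-law propagator (the empirical order flow is long-range correlated, and the paper's models distinguish two event types):

```python
import numpy as np

rng = np.random.default_rng(9)
T, gamma = 5000, 0.4
signs = np.sign(rng.standard_normal(T))                   # surrogate order signs (i.i.d. here)
G = np.arange(1, T + 1, dtype=float) ** (-gamma)          # propagator G(l) ~ l^(-gamma), illustrative

price = np.zeros(T)
for t in range(1, T):
    lags = t - np.arange(t)                               # lags t - t' for t' = 0 .. t-1
    price[t] = np.dot(G[lags - 1], signs[:t]) + 0.1 * rng.standard_normal()

print("std of one-step price changes:", np.round(np.diff(price).std(), 3))
```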
Corsi, F., Lillo, F. and Pirino, D. (2015). Measuring Flight-to-Quality with Granger-Causality Tail Risk Networks.
Abstract
We introduce an econometric method to detect and analyze events of flight-to-quality by financial institutions. Specifically, using the recently proposed test for the detection of Granger causality in risk (Hong et al. 2009), we construct a bipartite network of systemically important banks and sovereign bonds, where the presence of a link between two nodes indicates the existence of a tail causal relation: a tail event in the equity variation of a bank helps in forecasting a tail event in the price variation of a bond. Inspired by a simple theoretical model of flight-to-quality, we interpret the links of the bipartite network as distressed trading of banks directed toward the sovereign debt market and we use them to define indicators of flight-to-quality episodes. Based on the quality of the involved bonds, we distinguish different patterns of flight-to-quality in the 2006-2014 period. In particular, we document that, during the recent Eurozone crisis, banks of considerable systemic importance significantly impacted the sovereign debt market, chasing the top-quality government bonds. Finally, an out-of-sample analysis shows that connectedness and centrality network metrics have a significant cross-sectional forecasting power for bond quality measures.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2576078
G. Bormetti, L. M. Calcagnile, M. Treccani, F. Corsi, S. Marmi, F. Lillo, Modelling systemic price cojumps with Hawkes factor models, Quantitative Finance 15 (7), 1137-1156
Instabilities in the price dynamics of a large number of financial assets are a clear sign of systemic events. By investigating portfolios of highly liquid stocks, we find that there are a large number of high-frequency cojumps. We show that the dynamics of these jumps is described neither by a multivariate Poisson nor by a multivariate Hawkes model. We introduce a Hawkes one-factor model which is able to capture simultaneously the time clustering of jumps and the high synchronization of jumps across assets.
Lillo, F., & Pirino, D. (2015). The impact of systemic and illiquidity risk on financing with risky collateral. Journal of Economic Dynamics and Control, 50, 180–202.
Abstract
Repurchase agreements (repos) are one of the most important sources of funding liquidity for many financial investors and intermediaries. In a repo, some assets are given by a borrower as collateral in exchange for funding. The capital given to the borrower is the market value of the collateral, reduced by an amount termed the haircut (or margin). The haircut protects the capital lender from a loss of value of the collateral contingent on the borrower's default. For this reason, the haircut is typically calculated with a simple Value at Risk estimation of the collateral, in order to cover the risk associated with volatility. However, other risk factors should be included in the haircut, and a severe undervaluation of them could result in a significant loss of value of the collateral if the borrower defaults. In this paper we present a stylized model of the financial system which allows us to compute the haircut incorporating the liquidity risk of the collateral and, most importantly, possible systemic effects. These are mainly due to the similarity of bank portfolios, excessive leverage of financial institutions, and illiquidity of assets. The model is analytically solvable under some simplifying assumptions and robust to the relaxation of these assumptions, as shown through Monte Carlo simulations. We also identify the model parameters that are most critical for the determination of haircuts.
http://www.sciencedirect.com/science/article/pii/S0165188914001675
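A hedged numerical illustration of the baseline haircut rule mentioned above, i.e. a plain Value-at-Risk of the collateral over the margin period under a Gaussian assumption; the liquidity and systemic corrections that are the point of the paper are not included, and all numbers are made up.

```python
import numpy as np

rng = np.random.default_rng(10)
alpha, horizon_days, daily_vol = 0.01, 10, 0.015      # made-up margin period and volatility

# Collateral returns over the margin period, Gaussian purely for illustration.
scenario_returns = rng.normal(0.0, daily_vol * np.sqrt(horizon_days), size=100_000)
haircut = -np.quantile(scenario_returns, alpha)        # Value-at-Risk at level alpha
print(f"haircut covering {1 - alpha:.0%} of scenarios: {haircut:.1%}")
```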
Taranto, D. E., Bormetti, G., and Lillo, F. (2014). The adaptive nature of liquidity taking in limit order books. Journal of Statistical Mechanics: Theory and Experiment 2014 (6), P06002
Abstract
In financial markets, the order flow, defined as the process assuming value one for buy market orders and minus one for sell market orders, displays a very slowly decaying autocorrelation function. Since orders impact prices, reconciling the persistence of the order flow with market efficiency is a subtle issue. A possible solution is provided by asymmetric liquidity, which states that the impact of a buy or sell order is inversely related to the probability of its occurrence. We empirically find that when the order flow predictability increases in one direction, the liquidity on the opposite side decreases, but the probability that a trade moves the price decreases significantly. While the latter mechanism is able to counterbalance the persistence of the order flow and restore efficiency and diffusivity, the former acts in the opposite direction. We introduce a statistical order book model where the persistence of the order flow is mitigated by adjusting the market order volume to the predictability of the order flow. The model reproduces the diffusive behaviour of prices at all time scales without fine-tuning the values of parameters, as well as the behaviour of most order book quantities as a function of the local predictability of the order flow.