Presenters: Petter Kolm (NYU, Courant Institute)

Title: Feature Selection in Jump Models

Abstract: Jump models switch infrequently between states to fit a sequence of data while taking the ordering of the data into account. In this talk, we propose a new framework for joint feature selection, parameter and state-sequence estimation in jump models. Feature selection is necessary in high-dimensional settings where the number of features is large compared to the number of observations and the underlying states differ only with respect to a subset of the features. We develop and implement a coordinate descent algorithm that alternates between selecting the features and estimating the model parameters and state sequence, which scales to large data sets with large numbers of (noisy) features. We demonstrate the usefulness of the proposed framework by comparing it with a number of other methods on both simulated and real data in the form of financial returns, protein sequences, and text. The resulting sparse jump model outperforms all other methods considered and is remarkably robust to noise.

This is joint work with Erik Lindstrom and Peter Nystrup.
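
As background to the estimation procedure, the state-sequence step of a basic (non-sparse) jump model can be solved exactly by dynamic programming, and a coordinate descent alternates it with a mean update. Below is a minimal sketch under squared-error loss with a fixed jump penalty `lam`; feature selection is omitted, and `fit_jump_model` together with its block-based initialization are illustrative choices, not the authors' code.

```python
import numpy as np

def fit_jump_model(X, K, lam, n_iter=10):
    """Sketch of jump-model fitting by coordinate descent: alternate between
    (i) a jump-penalized state sequence found by dynamic programming and
    (ii) re-estimating the state means."""
    X = np.asarray(X, dtype=float)
    T = len(X)
    # initialize means from contiguous blocks of the sample (illustrative)
    mu = np.array([X[i * T // K:(i + 1) * T // K].mean(axis=0) for i in range(K)])
    for _ in range(n_iter):
        cost = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)   # (T, K)
        V = np.zeros((T, K))
        back = np.zeros((T, K), dtype=int)
        V[0] = cost[0]
        for t in range(1, T):
            # trans[k, j] = value of being in state j at t-1, plus lam if j != k
            trans = V[t - 1][None, :] + lam * (1.0 - np.eye(K))
            back[t] = trans.argmin(axis=1)
            V[t] = cost[t] + trans.min(axis=1)
        s = np.empty(T, dtype=int)
        s[-1] = V[-1].argmin()
        for t in range(T - 1, 0, -1):                # backtrack optimal sequence
            s[t - 1] = back[t, s[t]]
        for k in range(K):                           # mean update
            if np.any(s == k):
                mu[k] = X[s == k].mean(axis=0)
    return s, mu
```

On two well-separated simulated regimes, the penalty `lam` suppresses spurious switching and the fitted sequence jumps only at the true change point.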


# May 25 at 4 pm (CEST). Laura Ballotta, Fourier-based methods for the management of complex insurance products

Presenters: Laura Ballotta (Cass Business School, City University of London)

Title: Fourier-based methods for the management of complex insurance products

Abstract: This paper proposes a framework for the valuation and management of complex life insurance contracts whose design can be described by a portfolio of embedded options activated according to one or more triggering events. Due to the contract terms, these events are in general monitored discretely over the life of the policy. Similar designs can also be found in other contexts, such as counterparty credit risk. The framework is based on Fourier transform methods, as they allow one to derive convenient closed analytical formulas for a broad spectrum of underlying dynamics. Multidimensionality issues generated by the discrete monitoring of the triggering events are dealt with via efficiently designed Monte Carlo integration strategies. We illustrate the tractability of the proposed approach by means of a detailed study of ratchet variable annuities, which can be considered a prototypical example of these complex structured products. This is joint work with Ernst Eberlein, Thorsten Schmidt and Raghid Zeineddine.
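
To illustrate the kind of Fourier-based valuation such frameworks build on, here is a minimal sketch that prices a European call from a characteristic function via Gil-Pelaez inversion; the Black-Scholes characteristic function stands in for the richer dynamics the paper allows, and the quadrature parameters are illustrative.

```python
import numpy as np
from math import log, exp

def cf_bs(u, S0, r, sigma, T):
    """Characteristic function of log S_T in the Black-Scholes model."""
    return np.exp(1j * u * (log(S0) + (r - 0.5 * sigma**2) * T)
                  - 0.5 * sigma**2 * u**2 * T)

def call_price_fourier(cf, S0, K, r, T, u_max=200.0, n=20001):
    """European call from a characteristic function via the Gil-Pelaez
    inversion formula: C = S0*P1 - K*exp(-rT)*P2."""
    u = np.linspace(1e-6, u_max, n)
    k = log(K)

    def integral(f):                      # trapezoidal rule over the u grid
        return np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(u))

    phi_mi = cf(-1j)                      # equals S0*exp(rT) under the martingale condition
    I1 = integral(np.real(np.exp(-1j * u * k) * cf(u - 1j) / (1j * u * phi_mi)))
    I2 = integral(np.real(np.exp(-1j * u * k) * cf(u) / (1j * u)))
    return S0 * (0.5 + I1 / np.pi) - K * exp(-r * T) * (0.5 + I2 / np.pi)
```

Swapping `cf_bs` for another model's characteristic function reprices the option without changing the inversion code, which is the main appeal of the Fourier approach.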

# May 18 at 4 pm (CEST). Julien Guyon, Dispersion-Constrained Martingale Schrodinger Problems and the Joint S&P 500/VIX Smile Calibration Puzzle

**Presenters:** Julien Guyon (Bloomberg, Columbia University, Courant Institute)

**Title:** Dispersion-Constrained Martingale Schrodinger Problems and the Joint S&P 500/VIX Smile Calibration Puzzle

Abstract: The very high liquidity of S&P 500 (SPX) and VIX derivatives requires that financial institutions price, hedge, and risk-manage their SPX and VIX options portfolios using models that perfectly fit market prices of both SPX and VIX futures and options, jointly. This is known to be a very difficult problem. Since VIX options started trading in 2006, many practitioners and researchers have tried to build such a model. So far the best attempts, which used parametric continuous-time jump-diffusion models on the SPX, could only produce approximate fits. In this talk we solve this long-standing puzzle for the first time using a completely different approach: a nonparametric discrete-time model. Given a VIX future maturity T1, we build a joint probability measure on the SPX at T1, the VIX at T1, and the SPX at T2 = T1 + 30 days which is perfectly calibrated to the SPX smiles at T1 and T2, and the VIX future and VIX smile at T1. Our model satisfies the martingality constraint on the SPX as well as the requirement that the VIX at T1 is the implied volatility of the 30-day log-contract on the SPX.

The model is cast as the unique solution of what we call a Dispersion-Constrained Martingale Schrodinger Problem which is solved by duality using an extension of the Sinkhorn algorithm, in the spirit of (De March and Henry-Labordere, Building arbitrage-free implied volatility: Sinkhorn’s algorithm and variants, 2019). We prove that the existence of such a model means that the SPX and VIX markets are jointly arbitrage-free. The algorithm identifies joint SPX/VIX arbitrages should they arise. Our numerical experiments show that the algorithm performs very well in both low and high volatility environments. Finally, we discuss how our technique extends to continuous-time stochastic volatility models, via what we dub VIX-Constrained Martingale Schrodinger Bridges, inspired by the classical Schrodinger bridge of statistical mechanics. The resulting stochastic volatility model is numerically implemented and is shown to achieve joint calibration with very high accuracy.

Time permitting, we will also briefly discuss a few related topics:

(i) a remarkable feature of the SPX and VIX markets: the inversion of convex ordering, and how classical stochastic volatility models can reproduce it;

(ii) why, due to this inversion of convex ordering, and contrary to what has often been stated, among the continuous stochastic volatility models calibrated to the market smile, the local volatility model does not maximize the price of VIX futures.
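
The dual solver mentioned above extends the classical Sinkhorn scheme for entropy-regularized optimal transport. As background, here is a minimal sketch of the classical iterations, not the dispersion-constrained extension of the talk.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iter=2000):
    """Classic Sinkhorn iterations for entropy-regularized optimal transport.
    Returns the coupling P = diag(u) K diag(v) with K = exp(-C/eps), whose
    marginals match the target distributions a and b."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = a / (K @ v)        # enforce row marginals
        v = b / (K.T @ u)      # enforce column marginals
    return u[:, None] * K * v[None, :]
```

Each iteration rescales the rows and columns of `K` in turn; for strictly positive `K` the scheme converges to the unique entropic optimal coupling.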

# May 11 at 4 pm (CEST). Loriano Mancini, Portfolio choice when stock returns may disappoint: An empirical analysis based on L-moments

May 11 at 4 pm (CEST).

Presenters: Loriano Mancini (Università della Svizzera Italiana)

Title: Portfolio choice when stock returns may disappoint: An empirical analysis based on L-moments

Abstract: We empirically examine the equity portfolio choices of investors with generalized disappointment aversion (GDA) preferences. The portfolio choice relies on a novel semi-parametric method based on L-moments which permits a large-scale empirical study. GDA investors appear to be very sensitive to higher-order L-moment returns, and to suffer large monetary utility losses from suboptimal portfolio choices such as equally weighted portfolios. These losses increase in the level of disappointment aversion and the number of stocks available for investment.

# May 4 at 4 pm (CEST). Lukasz Szpruch, Gradient Flows for Regularized Stochastic Control Problems

May 4 at 4 pm (CEST).

Presenters: Lukasz Szpruch (University of Edinburgh)

Title: Gradient Flows for Regularized Stochastic Control Problems

Abstract: This talk is on stochastic control problems regularized by relative entropy, where the action space is a space of measures. This setting includes relaxed control problems and problems of finding Markovian controls with the control function replaced by an idealized infinitely wide neural network, and it can be extended to the search for causal optimal transport maps. By exploiting the Pontryagin optimality principle, we identify a suitable metric space on which we construct a gradient flow for the measure-valued control process, along which the cost functional is guaranteed to decrease. We show that, under appropriate conditions, this gradient flow has an invariant measure which is the optimal control for the regularized stochastic control problem. If the problem is sufficiently convex, the gradient flow converges exponentially fast. Furthermore, the optimal measure-valued control admits a Bayesian interpretation, which means that one can incorporate prior knowledge when solving stochastic control problems. This work is motivated by a desire to extend the theoretical underpinning for the convergence of stochastic-gradient-type algorithms widely used in the reinforcement learning community to solve control problems.

Joint work with David Siska (Edinburgh).

# April 20 at 4 pm (CEST). Nino Antulov-Fantulin, Complexity and Machine Learning with Financial applications

April 20 at 4 pm (CEST).

Presenters: Nino Antulov-Fantulin (ETH Zurich)

Title: Complexity and Machine Learning with Financial applications

Abstract: Complexity science studies systems and problems that are composed of many components that may interact with each other in a dynamic and non-linear way. In the first part of the talk, the author will motivate and introduce several research questions and directions at the interface of complexity and machine learning: (i) the ability of neural networks to steer or control trajectories of network dynamical systems, (ii) node embedding of directed graphs and (iii) efficient Monte Carlo sampling of epidemic processes. In the second part of the talk, the author will focus on machine learning modelling for (crypto)financial markets.

# April 15 at 4 pm (CEST). Blanka Horvath and Issa Zacharia, An Optimal Transport Approach to Market Regime Clustering

April 15 at 4 pm (CEST).

Presenters: Blanka Horvath and Issa Zacharia (King’s College London)

Title: An Optimal Transport Approach to Market Regime Clustering

Abstract: The problem of rapid and automated detection of distinct market regimes is a topic of great interest to financial mathematicians and practitioners alike. In this paper, we outline an unsupervised learning algorithm that clusters a given time series – corresponding to an asset or index – into a suitable number of temporal segments (market regimes). This method – the principle of which is inspired by the well-known k-means algorithm – clusters said segments on the space of probability measures with finite p-th moment. On this space, our choice of metric is the p-Wasserstein distance. We compare our Wasserstein-kmeans approach with a more traditional implementation of the kmeans algorithm, generating clusters in Euclidean space via the first N raw moments of each log-return segment instead (moment-kmeans). We compare the two approaches initially on real data and validate the performance of either algorithm by studying the maximum mean discrepancy between, and within, clusters. We show that the Wasserstein-kmeans algorithm vastly outperforms the moment-based approach on both real and synthetic data. In particular, the Wasserstein-kmeans algorithm performs well even when the distribution associated with each regime is non-Gaussian.
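
In one dimension the Wasserstein geometry makes the clustering step concrete: for empirical measures with equally many atoms, the p-Wasserstein distance is the L_p distance between sorted samples, and a cluster's barycenter is the pointwise mean of its sorted samples. A minimal sketch of this idea follows; the farthest-point initialization and plain Lloyd iterations are illustrative choices, not the authors' implementation.

```python
import numpy as np

def wasserstein_kmeans(segments, K, n_iter=20):
    """k-means on 1-D empirical measures under the 2-Wasserstein metric:
    sort each segment (its empirical quantile function), then run Lloyd
    iterations in quantile space, where W_2 is an L_2 distance."""
    Q = np.sort(np.asarray(segments, dtype=float), axis=1)   # quantile functions
    centers = Q[[0]]
    for _ in range(1, K):                                    # farthest-point init
        d = ((Q[:, None, :] - centers[None]) ** 2).mean(axis=2).min(axis=1)
        centers = np.vstack([centers, Q[d.argmax()]])
    for _ in range(n_iter):
        labels = ((Q[:, None, :] - centers[None]) ** 2).mean(axis=2).argmin(axis=1)
        for k in range(K):                                   # Wasserstein barycenter
            if np.any(labels == k):
                centers[k] = Q[labels == k].mean(axis=0)
    return labels, centers
```

Because sorting discards the temporal order inside a segment, two segments with the same return distribution but different dynamics are treated as identical, which is exactly the distributional notion of "regime" used here.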

# April 8 at 6 pm (CEST). Andrea Barbon, Brokers and Order Flow Leakage: Evidence from Fire Sales

April 8 at 6 pm (CEST).

Presenters: Andrea Barbon (University of St. Gallen)

Title: Brokers and Order Flow Leakage: Evidence from Fire Sales

Abstract: Using trade‐level data, we study whether brokers play a role in spreading order flow information in the stock market. We focus on large portfolio liquidations that result in temporary price drops, and identify the brokers who intermediate these trades. These brokers’ clients are more likely to predate on the liquidating funds than to provide liquidity. Predation leads to profits of about 25 basis points over 10 days and increases the liquidation costs of the distressed fund by 40%. This evidence suggests a role of information leakage in exacerbating fire sales.

# Mehdi Tomas and Michael Benzaquen, Cross-Impact modeling on derivative markets

March 16 at 4 pm (CEST).

**Presenters**: Mehdi Tomas and Michael Benzaquen (CFM and Ecole Polytechnique)

**Title**: Cross-Impact modeling on derivative markets

**Abstract**: Impact modeling on derivatives is challenging on two grounds. First, liquidity in some markets (e.g., options) can be fragmented across correlated, illiquid instruments. Second, their prices are locked by no-arbitrage. Univariate impact models cannot account for these problems. Instead, we need to rely on cross-impact, its cross-sectional generalization. We introduce the Kyle cross-impact model which aggregates liquidity and is consistent with no-arbitrage. We illustrate our framework using data from E-Mini futures, options and VIX futures. The resulting model is useful for optimal execution and estimation of hedging costs.

# Michele Vodret and Iacopo Mastromatteo, Understanding the relation between trades and price changes is of paramount importance for practitioners, yet challenging for theoreticians.

March 2 at 5 pm (CEST).

**Presenters**: Michele Vodret and Iacopo Mastromatteo (CFM and Ecole Polytechnique)

**Title**: Understanding the relation between trades and price changes is of paramount importance for practitioners, yet challenging for theoreticians.

**Abstract**: On the one hand, models from the econophysics/quantitative-finance literature predict a price impact function that decays slowly in order to compensate for the long-range correlation of the order flow. On the other hand, these models lack a proper micro-foundation, customary in models from the theoretical economics literature, which build on rational expectations and asymmetric information and typically prescribe linear impact. Recently, we extended the classic Kyle model, partially bridging the gap between these two classes of models. We will show our findings and suggest future lines of research.

# Prof. Matteo Marsili, Relevance

Prof. Matteo Marsili (Abdus Salam ICTP)

Title:

Relevance

Abstract:

The mass is a relevant variable in experiments on free-falling bodies; their colour is not. The mass enters the laws that govern how objects fall; their colour does not. How can one identify relevant variables when data is scarce and high dimensional and the laws that govern the phenomena under study are unknown? In order to address this question, I will first argue that relevance can be quantified unambiguously in information-theoretic terms, on the basis of the data alone. Samples with maximal relevance, i.e. those that are most informative about the generative process, exhibit power-law distributions, suggesting a possible origin for the ubiquitous observation of such distributions. In addition, this opens the way to model-free approaches for extracting relevant information from high-dimensional datasets. This will be illustrated in the cases of protein sequences and multi-electrode array recordings of neural activity.

# Aurélien Decelle, Spectral learning of Restricted Boltzmann Machines

Aurélien Decelle (Université Paris-Sud XI)

Title:

Spectral learning of Restricted Boltzmann Machines

Abstract:

In this presentation I will present our recent results on the Restricted Boltzmann Machine (RBM). The RBM is a generative model very similar to the Ising model: it is composed of both visible and hidden binary variables, and it is traditionally used in the context of machine learning. In this context, the goal is to infer the parameters of the RBM such that it correctly reproduces a dataset's distribution. Although RBMs have been widely used in computer science, the phase diagram of this model is not known precisely in the context of learning. In particular, it is not known how the parameters influence the learning, and what exactly is learned within the parameters of the model. In our work, we show how the SVD of the data governs the first phase of the learning and how this decomposition helps to understand the dynamics and the equilibrium properties of the model.

# Jacopo Rocchi, Self-sustained clusters in spin glass models

Jacopo Rocchi (LPTMS, Université Paris-Sud)

Title:

Self-sustained clusters in spin glass models

Abstract:

While macroscopic properties of spin glasses have been thoroughly investigated, their manifestation in the corresponding microscopic configurations is much less understood. To identify the emerging microscopic structures with macroscopic phases at different temperatures, we introduce the concept of self-sustained clusters (SSC). SSC are regions of the space where in-cluster induced fields dominate over the field induced by out-cluster spins. We study their properties in the Ising p-spin model with p=3 using replicas. The intuition gained using fully connected models is then used in the study of models defined on random graphs. A message-passing algorithm is developed to determine the probability of individual spins to belong to SSC. Results for specific instances, which compare the predicted SSC associations with the dynamical properties of the spins, are obtained from numerical simulations. This insight gives rise to a way to predict individual spin dynamics from a single snapshot of spin configurations.

# Prof. Jean Jacod, Modeling asset prices: small scale versus large scale

Jean Jacod (Université Paris 6, Pierre et Marie Curie)

Title:

Modeling asset prices: small scale versus large scale

Abstract:

A typical model for the price of a financial asset, allowing for explicit or numerical computation of option prices, hedging, calibration, etc., describes the price over a horizon of months or years. In contrast, a very active topic now is concerned with models for tick prices or order books. The structure of the price at the microscopic level is very different from the structure of the usual (often continuous) semimartingales used at a macroscopic level. In particular, the microscopic price evolves on the tick grid, usually going up or down by one tick only. Our aim is to see how it is possible to reconcile the two viewpoints, using a scaling limit of tick-level price models. We will see that this question (going back to the thesis of Bachelier, in a sense) raises a number of nontrivial questions if we want a reasonably simple microscopic model, together with a macroscopic model exhibiting stochastic volatility or jumps or a drift.

(Joint work with Yacine Ait-Sahalia).

# Hye-Jin Cho, On Overconfidence, Bubbles and the Stochastic Discount Factor

Wednesday, 09/05/2018

14:00

Scuola Normale Superiore

Aula Bianchi Scienze

**Hye-Jin Cho**

University of Paris 1 – Panthéon Sorbonne

**Abstract**

This study is intended to provide a continuous-time equilibrium model in which overconfidence generates disagreements between two groups regarding asset fundamentals. Every agent in trading wants to sell above the average stock price in the market. However, the overconfident agent drives a speculative bubble with a false belief that the stock price will tend to move to the average price over time. I represent the difference between a false belief and a stationary stochastic process, which does not change when shifted in time. The gap between beliefs shows how to accommodate dynamic fluctuations as parameters change, such as the degree of overconfidence or the information content of signals. By showing how changes in an expectation operator affect the stochastic variance of economic fundamentals, speculative bubbles are revealed at the burst independently from the market.

# Tomasz Gubiec, Continuous Time Random Walk in finance: the story of symbiosis

Wednesday, 02/05/2018

14:00

Scuola Normale Superiore

Aula Marie Curie

**Tomasz Gubiec**

University of Warsaw

**Abstract**

Over 50 years ago, the physicists Montroll and Weiss introduced, in the physical context of dispersive transport and diffusion, a stochastic process named the Continuous-Time Random Walk (CTRW). The trajectory of such a process is created by elementary events: 'spatial' jumps of the stochastic process, each preceded by a waiting (or interevent, or pausing) time. Since its introduction, the CTRW has found innumerable applications in different fields [1]. In this seminar, I will focus on the application of the CTRW to finance [2] and I will tell the story of how this application turned out to be fruitful for both sides and motivated new directions of research [3,4].

[1] Kutner, R., & Masoliver, J. (2017). The continuous time random walk, still trendy: fifty-year history, state of art and outlook. The European Physical Journal B, 90(3), 50.

[2] Scalas, E. (2006). Five years of continuous-time random walks in econophysics. In The Complex Networks of Economic Interactions (pp. 3-16). Springer, Berlin, Heidelberg.

[3] Gubiec, T., & Kutner, R. (2010). Backward jump continuous-time random walk: An application to market trading. Physical Review E, 82(4), 046119.

[4] Gubiec, T., & Kutner, R. (2017). Continuous-Time Random Walk with multi-step memory: an application to market dynamics. The European Physical Journal B, 90(11), 228.

# Frontiers in High-Frequency Financial Econometrics 2018

# Elisa Alos, The implied volatility surface in modeling problems

Tuesday 27-03-2018

11:00

Scuola Normale Superiore

Aula Fermi

**Elisa Alos**

Universitat Pompeu Fabra, Barcelona

**Abstract**

In the Black-Scholes model, the volatility parameter is constant. But it is well known that, if we compute this volatility parameter by inverting market option prices, the result (the implied volatility) depends on the strike price (a variation described graphically as a smile or skew) and on the time to maturity. Classical stochastic volatility models, where the volatility is allowed to be a diffusion process, can capture the observed smiles and skews, but they cannot easily explain the term structure. For instance, recent numerical analyses indicate that the skew slope is approximately $O(T^{-k})$ for some positive $k$, where $T$ denotes the time to maturity, while the rate for these stochastic volatility models is $O(1)$. In this talk, we will see how to construct new stochastic volatility models that can describe these phenomena. Towards this end, we will present short-time approximations for the implied volatility skew and smile. The obtained formulas will give us a useful tool to identify the volatilities that can explain this term structure. Based on this approach, some new models have been proposed recently (for example, rough volatilities). In this talk we will discuss the state of the art of this modeling research, together with the main advantages and disadvantages of these new models.
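
The inversion that defines the implied volatility can be sketched directly: the Black-Scholes call price is strictly increasing in the volatility parameter, so bisection recovers it from a market price. This is a minimal background sketch, not tied to the talk's short-time approximations.

```python
from math import log, sqrt, exp, erf

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    N = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))   # standard normal CDF
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

def implied_vol(price, S, K, T, r, lo=1e-6, hi=5.0, tol=1e-10):
    """Invert the Black-Scholes formula by bisection: the call price is
    strictly increasing in sigma, so the bracket [lo, hi] shrinks to the
    implied volatility."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Repeating this inversion across strikes and maturities produces exactly the smile/skew surface whose term structure the talk studies.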

# A. Barra, G. Genovese, P. Sollich, D. Tantari, *Phase diagram of restricted Boltzmann machines and generalized Hopfield networks with arbitrary priors*, Physical Review E 97 (2), 022310, 2018

Restricted Boltzmann machines are described by the Gibbs measure of a bipartite spin glass, which in turn can be seen as a generalized Hopfield network. This equivalence allows us to characterize the state of these systems in terms of their retrieval capabilities, both at low and high load, of pure states. We study the paramagnetic-spin glass and the spin glass-retrieval phase transitions, as the pattern (i.e., weight) distribution and spin (i.e., unit) priors vary smoothly from Gaussian real variables to Boolean discrete variables. Our analysis shows that the presence of a retrieval phase is robust and not peculiar to the standard Hopfield model with Boolean patterns. The retrieval region becomes larger when the pattern entries and retrieval units get more peaked and, conversely, when the hidden units acquire a broader prior and therefore have a stronger response to high fields. Moreover, at low load retrieval always exists below some critical temperature, for every pattern distribution ranging from the Boolean to the Gaussian case.

# L.M. Calcagnile, F. Corsi, S. Marmi, *Entropy and efficiency of the ETF market*

We investigate the relative information efficiency of financial markets by measuring the entropy of the time series of high frequency data. Our tool to measure efficiency is the Shannon entropy, applied to 2-symbol and 3-symbol discretisations of the data. Analysing 1-minute and 5-minute price time series of 55 Exchange Traded Funds traded at the New York Stock Exchange, we develop a methodology to isolate true inefficiencies from other sources of regularities, such as the intraday pattern, the volatility clustering and the microstructure effects. The first two are modelled as multiplicative factors, while the microstructure is modelled as an ARMA noise process. Following an analytical and empirical combined approach, we find a strong relationship between low entropy and high relative tick size and that volatility is responsible for the largest amount of regularity, averaging 62% of the total regularity against 18% of the intraday pattern regularity and 20% of the microstructure.

arXiv preprint arXiv:1609.04199
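
The entropy estimator at the core of this methodology can be sketched as follows: discretise the return series into 2 symbols and compute the Shannon entropy per symbol of overlapping blocks. The block length and the use of overlapping blocks are illustrative choices, not the paper's exact settings.

```python
import numpy as np
from collections import Counter

def block_entropy(symbols, block=3):
    """Shannon entropy in bits per symbol, estimated from the empirical
    distribution of overlapping blocks of the symbol sequence."""
    blocks = [tuple(symbols[i:i + block]) for i in range(len(symbols) - block + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum() / block

# 2-symbol discretisation of returns: 1 if the price moved up, 0 otherwise.
# For a fully unpredictable (efficient) series the entropy approaches the
# maximum of 1 bit per symbol; regularities push it below that.
rng = np.random.default_rng(0)
returns = rng.normal(size=100_000)
symbols = (returns > 0).astype(int)
H = block_entropy(symbols)
```

Deviations of `H` from its maximum flag predictability, which the paper then attributes to intraday patterns, volatility clustering, microstructure noise, or genuine inefficiency.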

# L.M. Calcagnile, G. Bormetti, M. Treccani, S. Marmi, F. Lillo

# N. Angelini, G. Bormetti, S. Marmi, F. Nardini

# F. Corsi, S. Marmi, F. Lillo

# G. Bormetti, L. M. Calcagnile, M. Treccani, F. Corsi, S. Marmi, F. Lillo

# D.H. Kim, S. Marmi

# S. Marmi, C. Pacati, R. Renò, W.A. Risso

# G. Buccheri, S. Marmi, R.N. Mantegna

# L.M. Calcagnile, G. Bormetti, M. Treccani, S. Marmi, F. Lillo, *Collective synchronization and high frequency systemic instabilities in financial markets*, Quantitative Finance 18 (2), 237-247

We present some empirical evidence on the dynamics of price instabilities in financial markets and propose a new Hawkes modelling approach. Specifically, analysing the recent high frequency dynamics of a set of US stocks, we find that since 2001 the level of synchronization of large price movements across assets has significantly increased. We find that only a minor fraction of these systemic events can be connected with the release of pre-announced macroeconomic news. Finally, the larger is the multiplicity of the event—i.e. how many assets have swung together—the larger is the probability of a new event occurring in the near future, as well as its multiplicity. To reproduce these facts, due to the self- and cross-exciting nature of the event dynamics, we propose an approach based on Hawkes processes. For each event, we directly model the multiplicity as a multivariate point process, neglecting the identity of the specific assets. This allows us to introduce a parsimonious parametrization of the kernel of the process and to achieve a reliable description of the dynamics of large price movements for a high-dimensional portfolio.

https://doi.org/10.1080/14697688.2017.1403141
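
The self- and cross-exciting mechanism described above can be illustrated with a univariate Hawkes process simulated by Ogata's thinning method. This is a background sketch with an exponential kernel, not the multivariate multiplicity model of the paper.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, seed=0):
    """Simulate a univariate Hawkes process on [0, T] by Ogata's thinning.
    Intensity: lam(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)).
    For alpha < beta the process is stationary with mean event rate
    mu / (1 - alpha/beta)."""
    rng = np.random.default_rng(seed)
    t, events = 0.0, []
    lam_bar = mu                                   # current intensity upper bound
    while True:
        t += rng.exponential(1.0 / lam_bar)        # candidate event time
        if t > T:
            break
        lam_t = mu + alpha * np.sum(np.exp(-beta * (t - np.array(events)))) if events else mu
        if rng.uniform() <= lam_t / lam_bar:       # accept with prob lam(t)/lam_bar
            events.append(t)
            lam_bar = lam_t + alpha                # intensity jumps by alpha at an event
        else:
            lam_bar = lam_t                        # intensity only decays until next event
    return np.array(events)
```

Each accepted event raises the intensity, so events cluster in time, which is the qualitative feature the paper exploits to reproduce bursts of synchronized price instabilities.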

# Topics in portfolio choice III – Prof. Paolo Guasoni

# Topics in portfolio choice II – Prof. Paolo Guasoni

# Paolo Guasoni, “Stochastics and market frictions: An overview”

Wednesday October 19 2016

15:00

Scuola Normale Superiore

Aula Mancini

**Paolo Guasoni**

Dublin City University

**Abstract**

In recent decades, finance theory has benefited from its affinity with the theory of stochastic processes, in particular martingales and stochastic control. These theories have subtly influenced the development of finance along the paths in which they were most successful, thereby disregarding market frictions such as trading costs, incomplete information, and incentives. This talk outlines recent developments in stochastic methods for models with frictions, how they undermine several tenets of portfolio theory, and how they stimulate new approaches to stochastic control. The talk concludes with a mention of frictions arising from mortality and operational risk.

# Franco Flandoli, “From Clinical Oncology to scaling limits”

Thursday October 13 2016

16:30

Scuola Normale Superiore

Aula Bianchi

**Franco Flandoli**

University of Pisa

**Abstract**

A problem of clinical oncology will be briefly introduced, and its modelling based on differential equations and statistical elements will be illustrated. This modelling is the simplest possible, suitable for a first investigation. In order to make it more realistic, two natural mathematical ingredients are particle systems and partial differential equations. It is here that scaling limit questions arise. As an example, two problems will be described: a first, partially solved one, connecting proliferating particles with the so-called Fisher-KPP equations; and a second one, widely open, about the features, potentially of KPZ type, of the proliferating boundary.

# Damiano Brigo, “Intrinsic stochastic differential equations as jets: theory and applications”

Monday October 10 2016

16:00

Scuola Normale Superiore

Aula Mancini

**Damiano Brigo**

Imperial College, London

**Abstract**

We quickly introduce Stochastic Differential Equations (SDEs) and their two main calculi: Ito and Stratonovich. Briefly recalling the definition of jets, we show how Ito SDEs on manifolds may be defined intuitively as 2-jets of curves driven by Brownian motion and show how this relationship can be interpreted in terms of a convergent numerical scheme. We show how jets can lead to intuitive and intrinsic representations of Ito SDEs, presenting several plots and numerical examples. We give a new geometric interpretation of the Ito-Stratonovich transformation in terms of the 2-jets of curves induced by consecutive vector flows. We interpret classic quantities and operators in stochastic analysis geometrically. We hint at applications of the jet representation to i) dimensionality reduction by projection of infinite dimensional stochastic partial differential equations (SPDEs) onto finite dimensional submanifolds for the filtering problem in signal processing, and ii) consistency between dynamics of interest rate factors and parametric form of term structures in mathematical finance. We explain that the mainstream choice of Stratonovich calculus for stochastic differential geometry is not optimal when combining geometry and probability, using the mean square optimality of projection on submanifolds as a fundamental application.

# Gabriele La Spada, “Competition, reach for yield, and money market funds”

Tuesday September 13 2016

11:00

Scuola Normale Superiore

Aula Bianchi

**Gabriele La Spada**

Federal Reserve Bank of New York

**Abstract**

Do asset managers reach for yield because of competitive pressures in a low-rate environment? I propose a tournament model of money market funds (MMFs) to study this issue. When funds care about relative performance, an increase in the risk premium leads funds with lower default costs to increase risk-taking, while funds with higher default costs decrease risk-taking. Without changes in the premium, lower risk-free rates reduce the risk-taking of all funds. I show that these predictions are consistent with MMF risk-taking during the 2002-08 period and that rank-based performance is indeed a key determinant of money flows to MMFs.

# Anirban Chakraborti, “Sectoral co-movements and volatilities of Indian stock market: An analysis of daily returns data”

Wednesday July 6 2016

11:30

Scuola Normale Superiore

Aula Fermi

**Anirban Chakraborti**

Jawaharlal Nehru University, New Delhi, India

**Abstract**

First, we review techniques for decomposing aggregate correlation matrices to study co-movements in financial data, and we apply them to daily return time series from the Indian stock market. Second, we use multi-dimensional scaling methods to visualise the dynamic evolution of the stock market. This method helps to differentiate sectors in the market in the form of clusters. Another objective is to detect periods of instability in the market. Finally, we aim to decompose the aggregate volatility into sectoral components. Such a mapping allows us to study the impact of different sectors on the market behaviour and vice versa.
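
The multi-dimensional scaling step can be sketched with classical MDS applied to the standard correlation distance d_ij = sqrt(2(1 - rho_ij)); the two-sector toy correlation matrix in the test below is illustrative, not the Indian-market data of the talk.

```python
import numpy as np

def classical_mds(corr, dim=2):
    """Classical (Torgerson) MDS on the correlation-based distance
    d_ij = sqrt(2 * (1 - rho_ij)). Returns one low-dimensional coordinate
    row per asset; assets from the same sector land close together."""
    D2 = 2.0 * (1.0 - corr)                      # squared distances
    n = len(corr)
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ D2 @ J                        # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]              # largest eigenvalues first
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
```

Plotting the rows of the returned coordinates over rolling estimation windows gives the dynamic sector map the abstract describes.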

# Marc Mezard, “Boole, Shannon, and the challenge of data science: A statistical physics perspective”

Wednesday June 29 2016

15:00

Scuola Normale Superiore

Aula Azzurra

**Marc Mezard**

École Normale Supérieure, Paris

**Abstract**

In 1854, in his treatise "An Investigation of the Laws of Thought", George Boole stated a clear goal: "to investigate the fundamental laws of those operations of the mind by which reasoning is performed". This led him to study the foundations of logic and of probabilities. A century later, Claude Shannon opened the way to a mathematical understanding of information and of its communication. The fields of research initiated by these two giants play a major role in contemporary science, in particular in the handling of large amounts of data and in the extraction of information from these data. However, in large-size problems, collective phenomena of the type studied in statistical physics, like phase transitions, start to play a major role. This talk will study the importance of phase transitions in some core problems of Boolean logic and of information theory, with a special focus on the importance of glassy phases.

# Jiro Akahori, “Ito atlas and around”

Thursday March 17 2016

13:00

Scuola Normale Superiore

Aula Mancini

**Jiro Akahori**

Ritsumeikan University, Shiga, Japan

**Abstract**

I will discuss Malliavin’s canonic diffusion on the circle and related topics including a link to the Fourier method.

# Christian Brownlees, “Community Detection in Partial Correlation Networks”

Friday March 11 2016

10:30

Scuola Normale Superiore

Aula Bianchi

**Christian Brownlees**

Universitat Pompeu Fabra, Barcelona

**Abstract**

In this work we propose a community detection algorithm for partial correlation networks. We assume that the variables in the network are partitioned into communities. The presence of nonzero partial correlation between two variables is determined by a Bernoulli trial whose probability depends on whether the variables belong to the same community or not. The community partition is assumed to be unobserved and the goal is to recover it from a sample of observations. To tackle this problem we introduce a community detection algorithm called Blockbuster. The algorithm detects communities by applying k-means clustering to the eigenvectors corresponding to the largest eigenvalues of the sample covariance matrix. We study the properties of the procedure and show that Blockbuster consistently detects communities when the network dimension and the sample size are large. The methodology is used to study real activity clustering in the United States.
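The two steps of Blockbuster described above (eigendecomposition of the sample covariance, then k-means on the leading eigenvectors) can be sketched as follows. The toy covariance and the plain Lloyd iteration with deterministic seeding are illustrative stand-ins, not the authors' implementation:

```python
import numpy as np

def blockbuster(S, k, iters=20):
    """Sketch: k-means on the eigenvectors of the k largest eigenvalues of S."""
    w, V = np.linalg.eigh(S)   # eigenvalues in ascending order
    X = V[:, -k:]              # coordinates in the top-k eigenspace
    # deterministic initialization: k rows spread across the matrix
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):     # plain Lloyd iteration
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# toy covariance with two planted communities of three variables each
S = np.full((6, 6), 0.1)
S[:3, :3] = 0.8
S[3:, 3:] = 0.8
np.fill_diagonal(S, 1.0)
labels = blockbuster(S, 2)  # recovers the two blocks
```

On this block-structured covariance the top two eigenvectors are block-constant, so the clustering step separates the communities exactly, which is the intuition behind the consistency result in the abstract.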

# N. Angelini, G. Bormetti, S. Marmi, F. Nardini, *A Stylized Model for Long-Run Index Return Dynamics*, Essays in Economic Dynamics, 111-122

We introduce a discrete-time model of stock index return dynamics grounded on the ability of Shiller’s Cyclically Adjusted Price-to-Earnings ratio to predict long-horizon market performance. Specifically, we discuss a model in which returns are driven by a fundamental term and an autoregressive component perturbed by external random disturbances. The autoregressive component arises from the agents’ belief that expected returns are higher in bullish markets than in bearish markets. The fundamental term, driven by the value towards which fundamentalists expect the current price to revert, varies in time and depends on the initial averaged price-to-earnings ratio. The actual stock price may deviate from the perceived reference level as a combined effect of an idiosyncratic noise component and local trends due to trading strategies. We demonstrate both analytically and by means of numerical experiments that the long-run behavior of our stylized dynamics agrees with the empirical evidence reported in the literature.
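A toy simulation in the spirit of the model: the log-price reverts to a fundamental anchor (CAPE-based in the paper) while returns carry an autoregressive trend component. The functional form and all parameter values below are hypothetical illustrations, not the paper's specification or estimates:

```python
import random

def simulate_index(v, kappa, phi, sigma, steps, seed=11):
    """Toy dynamics: r_t = kappa*(v - p_{t-1}) + phi*r_{t-1} + noise,
    p_t = p_{t-1} + r_t. kappa: reversion to fundamental v; phi: trend."""
    random.seed(seed)
    p, r, path = 0.0, 0.0, []
    for _ in range(steps):
        r = kappa * (v - p) + phi * r + sigma * random.gauss(0.0, 1.0)
        p += r
        path.append(p)
    return path

path = simulate_index(v=1.0, kappa=0.05, phi=0.3, sigma=0.02, steps=5000)
long_run = sum(path[-1000:]) / 1000  # hovers around the fundamental level v
```

With these (stable) parameters the log-price fluctuates around the fundamental, while the `phi` term produces the short-run momentum the abstract attributes to agents' beliefs.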

# F. Corsi, S. Marmi, F. Lillo, *When micro prudence increases macro risk: The destabilizing effects of financial innovation, leverage, and diversification*, Operations Research 64 (5), 1073-1088

By exploiting basic common-practice accounting and risk-management rules, we propose a simple analytical dynamical model to investigate the effects of microprudential changes on macroprudential outcomes. Specifically, we study the consequence of the introduction of a financial innovation that allows reducing the cost of portfolio diversification in a financial system populated by financial institutions having capital requirements in the form of a Value at Risk (VaR) constraint and following standard mark-to-market and risk-management rules. We provide a full analytical quantification of the multivariate feedback effects between investment prices and bank behavior induced by portfolio rebalancing in the presence of asset illiquidity, and show how changes in the constraints of the bank portfolio optimization endogenously drive the dynamics of the balance-sheet aggregate of financial institutions and, thereby, the availability of bank liquidity to the economic system and systemic risk. The model shows that when financial innovation reduces the cost of diversification below a given threshold, the strength (because of higher leverage) and coordination (because of similarity of bank portfolios) of feedback effects increase, triggering a transition from a stationary dynamics of price returns to a nonstationary one characterized by steep growths (bubbles) and plunges (bursts) of market prices.

# Maria Rita Iacò, “Copulas in uniform distribution, optimal transport and finance”

Tuesday January 26 2016

13:00

Scuola Normale Superiore

Aula Bianchi

**Maria Rita Iacò**

Technische Universität Graz

**Abstract**

The aim of this talk is to give an overview of some results obtained in the framework of copulas. Copulas have received considerable attention in recent years, especially in finance, where they are used to perform stress tests and robustness checks in situations of extreme downside events. However, soon after the global financial crisis of 2007–2008, several commentators argued that copulas, together with other mathematical instruments, were among the main causes of the market crashes.

During this talk I will describe some problems in uniform distribution theory and finance that can be described using a copula approach. Since we will be dealing with an optimisation problem, we will finally show how it can be embedded in the general theory of optimal transport.

# Nicola Fusari, “Pricing Short-Term Market Risk: Evidence from Weekly Options”

Thursday December 17 2015

13:00

Scuola Normale Superiore

Aula Bianchi

**Nicola Fusari**

Johns Hopkins Carey Business School

**Abstract**

We study short-term market risks implied by weekly S&P 500 index options. The introduction of weekly options has dramatically shifted the maturity profile of traded options over the last five years, with a substantial proportion now having expiry within one week. Economically, this reflects a desire among investors to actively manage their exposure to very short-term risks. Such short-dated options provide an easy and direct way to study market volatility and jump risks. Unlike longer-dated options, they are largely insensitive to the risk of intertemporal shifts in the economic environment, i.e., changes in the investment opportunity set. Adopting a novel general semi-nonparametric approach, we uncover variation in the shape of the negative market jump tail risk which is not spanned by market volatility. Incidents of such tail shape shifts coincide with serious mispricing of standard parametric models for longer-dated options. As such, our approach allows for easy identification of periods of heightened concerns about negative tail events on the market that are not always “signaled” by the level of market volatility and elude standard asset pricing models.

Joint work with Torben G. Andersen AND Viktor Todorov.

# Domenico Di Gangi, “Assessing Systemic Risk Due to Fire Sales Spillover Through Maximum Entropy Network Reconstruction”

Wednesday November 11 2015

13:00

Scuola Normale Superiore

Aula Bianchi

**Domenico Di Gangi**

Department of Physics, University of Pisa,

**Abstract**

Assessing systemic risk in financial markets is of great importance, but it often requires data that are unavailable or available only at very low frequency. For this reason, systemic risk assessment with partial information is potentially very useful for regulators and other stakeholders. In this paper we consider systemic risk due to fire sales spillover and portfolio rebalancing, using the risk metrics defined by Greenwood et al. (2015). Using the Maximum Entropy principle, we propose a method to assess aggregated and single banks’ systemicness and vulnerability, and to statistically test for a change in these variables when only the size of each bank and the capitalization of the investment assets are available. We prove the effectiveness of our method on 2001-2013 quarterly data of US banks for which portfolio composition is available.
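One round of the Greenwood et al. (2015) fire-sale mechanism referenced above can be sketched as follows: a price shock causes losses, leveraged banks sell proportionally to their portfolio weights to restore target leverage, sales move prices through an illiquidity parameter, and the price moves impose second-round losses on everyone. All matrices and numbers below are illustrative, not the paper's data or exact metric:

```python
import numpy as np

def fire_sale_losses(H, E, shock, illiq):
    """One round of a stylized fire-sale spillover.
    H[b, k]: bank b's holdings of asset k; E[b]: equity;
    shock[k]: fractional price shock; illiq[k]: price impact per unit sold."""
    A = H.sum(axis=1)                              # total assets per bank
    lev = A / E                                    # leverage
    direct = H @ shock                             # direct losses per bank
    sales = lev * direct                           # sell to restore target leverage
    weights = H / A[:, None]                       # sell pro rata to weights
    sold = (weights * sales[:, None]).sum(axis=0)  # total sales per asset
    dp = illiq * sold                              # second-round price drop
    return H @ dp                                  # spillover loss per bank

H = np.array([[100.0, 50.0], [80.0, 120.0]])
E = np.array([15.0, 20.0])
shock = np.array([0.05, 0.0])      # 5% shock to asset 0 only
illiq = np.array([1e-4, 1e-4])
spill = fire_sale_losses(H, E, shock, illiq)
agg_vulnerability = spill.sum() / E.sum()  # spillover loss per unit of equity
```

The Maximum Entropy step in the paper reconstructs `H` itself when only its row sums (bank sizes) and column sums (asset capitalizations) are observed; the metric above is then evaluated on the reconstructed matrix.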

The preprint is available at:

http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2639178

# Frederic Abergel, “Limit order books driven by Hawkes processes”

Tuesday September 8 2015

11:30

Scuola Normale Superiore

Aula Bianchi

**Frederic Abergel**

CMAP, Ecole Centrale-Supelec, Paris

**Abstract**

Hawkes processes offer an interesting toolbox to model the interplay between different agents in financial markets. This talk will present some recent results on Hawkes process-driven limit order books, focusing on questions of ergodicity and asymptotic behaviour. Some numerical simulations will also be discussed.

# Roberto Casarin, “Bayesian Nonparametric Calibration and Combination of Predictive Distributions”

Thursday July 2 2015

13:00

Scuola Normale Superiore

Aula Bianchi

**Roberto Casarin**

Department of Economics – Università Ca’ Foscari di Venezia

**Abstract**

We introduce a Bayesian approach to predictive density calibration and combination that accounts for parameter uncertainty and model set incompleteness through the use of random calibration functionals and random combination weights. Building on the work of Ranjan and Gneiting (2010) and Gneiting and Ranjan (2013), we use infinite beta mixtures for the calibration. The proposed Bayesian nonparametric approach takes advantage of the flexibility of Dirichlet process mixtures to achieve any continuous deformation of linearly combined predictive distributions. The inference procedure is based on Gibbs sampling and allows accounting for uncertainty in the number of mixture components, mixture weights, and calibration parameters. The weak posterior consistency of the Bayesian nonparametric calibration is established under suitable conditions on the unknown true density. We study the methodology in simulation examples with fat tails and multimodal densities and apply it to density forecasts of daily S&P returns and daily maximum wind speed at the Frankfurt airport. Joint work with Federico Bassetti and Francesco Ravazzolo.

# XVII Workshop on Quantitative Finance

The quantitative finance group of the Scuola Normale Superiore will host the XVII Workshop on Quantitative Finance.

The workshop will take place on January, 28-29, 2016 in Pisa at Scuola Normale Superiore.

# Andrea Pallavicini, “Arbitrage-Free Pricing with Funding Costs and Collateralization”

Friday May 29 2015

9.30 – 13.00

Scuola Normale Superiore

Aula Fermi

**Andrea Pallavicini**
Banca IMI, Milano and Imperial College, London

**Arbitrage-Free Pricing with Funding Costs and Collateralization**

**Abstract**

The financial crisis that started in 2007 has shown that any pricing framework must include from the very beginning the possibility of default of any market player. As a consequence, derivative valuation and risk analysis have moved from exotic derivatives managed on simple single-asset classes to simple derivatives embedding credit risk and new, or previously neglected, types of complex and interconnected non-linear effects. Derivative valuation is adjusted to include counterparty credit risk and contagion effects along with funding costs due to collateral posting, treasury policies, and regulatory constraints. A second level of complexity arises when moving from a single trade to the whole bank portfolio. Aggregation-dependent valuation processes, and their operational challenges, arising from non-linearities are discussed both from a mathematical and a practical point of view.

Download slides here.

All interested people are kindly invited.

# Massimiliano Caporin, “The impact of network connectivity on factor exposures, asset pricing and portfolio diversification”

Wednesday May 6 2015

13:00

Scuola Normale Superiore

Aula Bianchi

**Massimiliano Caporin**
Department of Economics and Management “Marco Fanno” – Università di Padova

**Abstract**

The need to understand the propagation mechanisms behind the recent financial crises has led to increased interest in work on asset interconnections. In this framework, network-based methods have been used to infer from data the linkages between institutions. In this paper, we elaborate on this and take a step forward by introducing network linkages into linear factor models. Networks are used to infer the exogenous and contemporaneous links across assets, with impacts on several dimensions: network exposures act as an inflating factor for systematic exposure to common factors, with implications for pricing; the power of diversification is reduced by the presence of network connections; and in the presence of network links a misspecified traditional linear factor model yields residuals that are correlated and heteroskedastic. We support our claims with an extensive simulation experiment. Joint work with Monica Billio, Roberto Panzica, and Loriana Pelizzon.
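The "inflating factor" effect can be made concrete in a minimal sketch (the exact specification is an assumption; the paper's model is richer): with contemporaneous spillovers r_t = rho·W·r_t + beta·f_t + eps_t, the reduced form is r_t = (I - rho·W)^(-1)(beta·f_t + eps_t), so the effective factor exposure is (I - rho·W)^(-1)·beta:

```python
import numpy as np

def network_exposures(W, beta, rho):
    """Effective exposures (I - rho*W)^{-1} beta in a factor model with
    contemporaneous network spillovers of strength rho."""
    n = len(beta)
    return np.linalg.solve(np.eye(n) - rho * W, beta)

# illustrative row-normalized adjacency: 0 - 1 - 2 chain
W = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 1.0, 0.0]])
beta = np.array([1.0, 1.0, 1.0])
beta_eff = network_exposures(W, beta, rho=0.4)  # every exposure inflated
```

With a row-stochastic `W` and uniform betas the inflation factor is exactly 1/(1 - rho), which also illustrates why ignoring the network leaves correlated residuals: the omitted term (I - rho·W)^(-1)·eps_t mixes shocks across connected assets.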

# Andrea Sillari, “Risk Models – Operation, Maintenance, Upgrade”

Thursday April 21 2015

13:00

Scuola Normale Superiore

Aula Mancini

**Andrea Sillari**
Unicredit Bank – Milano

# Vladimir Filimonov, “Exogenous versus endogenous dynamics in the price discovery process”

Thursday April 14 2015

13.00

Scuola Normale Superiore

Aula Russo

**Vladimir Filimonov**
ETH Zurich, Switzerland

**Abstract**

The talk discusses feedback mechanisms in the price discovery process: from high-frequency market-making and algorithmic trading to long-term behavioral mechanisms. In order to quantify short-term endogeneity, we propose an index derived by calibrating the self-excited Hawkes model on empirical time series of trades. The Hawkes model accounts simultaneously for the co-existence and interplay between the exogenous impact and the feedback loop by which past trading activity may influence future trading activity. Technically known in the mathematical literature on branching processes as the branching ratio, the reflexivity index is defined as the average ratio of the number of price moves that are due to endogenous interactions to the total number of price changes, which also includes exogenous events. This index quantifies at the same time both the “criticality” of the system (stability and susceptibility to large shocks) and its “efficiency” (in the sense of the Efficient Market Hypothesis). We calibrate our measure on several financial and commodity futures markets and document the presence of “micro” regime shifts that coincide with “macro” changes in trading methods or in the sentiment of investors. Finally, we relate our analysis to recent evidence of an intrinsic “criticality” of price discovery and build a bridge between short- and long-memory models.
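The branching-ratio interpretation can be checked in a toy simulation (Ogata thinning with an exponential kernel; all parameters are illustrative and the talk's calibration procedure is different — it fits the model to trade data rather than simulating): the kernel integral n = alpha/beta equals the expected fraction of endogenous events.

```python
import math, random

def simulate_hawkes(mu, alpha, beta, T, seed=42):
    """Ogata thinning for a Hawkes process with intensity
    lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))."""
    random.seed(seed)
    t, events = 0.0, []
    while True:
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - s)) for s in events)
        t += random.expovariate(lam_bar)   # candidate arrival
        if t > T:
            return events
        lam = mu + sum(alpha * math.exp(-beta * (t - s)) for s in events)
        if random.random() * lam_bar <= lam:  # accept with prob lam/lam_bar
            events.append(t)

mu, alpha, beta, T = 0.5, 1.0, 2.0, 1000.0
events = simulate_hawkes(mu, alpha, beta, T)
n = alpha / beta                           # branching ratio ("reflexivity index")
endo_frac = 1.0 - mu * T / len(events)     # empirical endogenous share, approx n
```

Here n = 0.5, so roughly half of all events are triggered by past events rather than by the exogenous baseline, which is exactly what the reflexivity index measures on real trade data.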

# Joao da Gama Batista, “Sudden trust collapse in networked societies”

Friday April 10 2015

13.00

Scuola Normale Superiore

Aula Bianchi

**Joao da Gama Batista**
Laboratoire de mathématiques appliquées aux systèmes – École Centrale de Paris

**Abstract**

Trust is a collective, self-fulfilling phenomenon that suggests analogies with phase transitions. We introduce a stylized model for the build-up and collapse of trust in networks, which generically displays a first order transition. The basic assumption of our model is that whereas trustworthiness begets trustworthiness, panic also begets panic, in the sense that a small decrease in trustworthiness may be amplified and ultimately lead to a sudden and catastrophic drop of collective trust. We show, using both numerical simulations and mean-field analytic arguments, that there are extended regions of the parameter space where two equilibrium states coexist: a well-connected network where global confidence is high, and a poorly connected network where global confidence is low. In these coexistence regions, spontaneous jumps from the well-connected state to the poorly connected state can occur, corresponding to a sudden collapse of trust that is not caused by any major external catastrophe. In large systems, spontaneous crises are replaced by history dependence: whether the system is found in one state or in the other essentially depends on initial conditions.

# Tommaso Colozza, “Supply of public debt and demand for risk premia: a Minskian approach to credit risk”

Wednesday April 1 2015

13.00

Scuola Normale Superiore

Aula Mancini

**Tommaso Colozza**
Dipartimento di Statistica e Matematica Applicata all’Economia – Università di Pisa

**Abstract**

The financial stability of EMU countries is managed by policy makers through several key macroeconomic indicators; the market instead monitors creditworthiness through the credit risk premia embedded in sovereign yields. A demand-supply approach resolves this duality: in a Minskian framework, positive inelastic shifts in the debt-to-GDP ratio due to widespread macro-financial distress may lower the risk appetite of lenders and increase risk premia, up to default. Time-varying risk appetites justify the statistical relevance of debt-to-GDP variation for yield levels; if conveniently decomposed, debt velocity also allows one to imply a default probability measure comparable to standard CDS-implied measures.

# Roberto Renò, “Multi-jumps”

Friday March 20 2015

13.00

Scuola Normale Superiore

Aula Bianchi

**Roberto Renò**
Dipartimento di Economia Politica e Statistica – Università di Siena

**Abstract**

The simultaneous occurrence of jumps in several stocks (multi-jumps) can be associated with major financial news, is correlated with sudden spikes of the variance risk premium, and determines an increase in stock variances and correlations which significantly deteriorates the diversification potential of asset allocation. The latter evidence implies a reduction in the demand for stocks by an aware risk-averse investor. These facts can easily be overlooked when using standard univariate jump statistics, which simply lack sufficient power. They are instead revealed in a clear-cut way by a novel test based on smoothed estimators of the integrated variance of individual stocks.

Joint work with Massimiliano Caporin and Aleksey Kolokolov.

# G. Bormetti, L. M. Calcagnile, M. Treccani, F. Corsi, S. Marmi, F. Lillo, *Modelling systemic price cojumps with Hawkes factor models*, Quantitative Finance 15 (7), 1137-1156

Instabilities in the price dynamics of a large number of financial assets are a clear sign of systemic events. By investigating portfolios of highly liquid stocks, we find that there are a large number of high-frequency cojumps. We show that the dynamics of these jumps is described neither by a multivariate Poisson nor by a multivariate Hawkes model. We introduce a Hawkes one-factor model which is able to capture simultaneously the time clustering of jumps and the high synchronization of jumps across assets.

# D.H. Kim, S. Marmi, *Distribution of asset price movement and market potential*, Journal of Statistical Mechanics: Theory and Experiment 2015 (7), P07001

In this article we discuss the distribution of asset price movements by introducing a market potential function. From the principle of free energy minimization we analyze two different kinds of market potentials. We obtain a U-shaped potential when market reversion (i.e. contrarian investors) is dominant. On the other hand, if there are more trend followers, flat and logarithmic potentials appear. By using the cyclically adjusted price-to-earnings ratio, which is a common valuation tool, we empirically investigate the market data. By studying long-term data we observe the historical change of the market potential of the US stock market. Recent US data show that the market potential looks more like a trend-following potential. Next, we compare the market potentials for 12 different countries. Though some countries have similar market potentials, there are specific examples like Japan, which exhibits a very flat potential.
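A minimal sketch of the free-energy reading used above: a potential V(x) = -log p(x) estimated from observed data, where mean-reverting (contrarian-dominated) dynamics yield a U-shaped V. The binning and the toy Ornstein-Uhlenbeck dynamics below are ad hoc illustrations, not the paper's estimation procedure:

```python
import math, random

def empirical_potential(xs, bins, lo, hi):
    """Estimate V(x) = -log p(x) from samples via a histogram density."""
    counts = [0] * bins
    width = (hi - lo) / bins
    for x in xs:
        if lo <= x < hi:
            counts[int((x - lo) / width)] += 1
    total = sum(counts)
    return [(-math.log(c / total / width) if c else float("inf"))
            for c in counts]

# mean-reverting toy dynamics (contrarian regime) -> U-shaped potential
random.seed(3)
theta, sigma, dt = 1.0, 1.0, 0.1
x, xs = 0.0, []
for _ in range(50000):
    x += -theta * x * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
    xs.append(x)
V = empirical_potential(xs, 12, -1.5, 1.5)  # low in the middle, high at edges
```

Replacing the simulated series with a valuation variable such as CAPE, and inspecting whether V is U-shaped or flat, mirrors the diagnostic described in the abstract.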

# Roberto Baviera, “A thermometer for financial instability in the euro area economy and the role of carry trade”

Thursday February 19 2015

13.00

Scuola Normale Superiore

Aula Bianchi

**Roberto Baviera**
Financial Engineering, Dipartimento di Matematica – Politecnico di Milano

**Abstract**

This study suggests a simple financial instability indicator for the euro area economy.

It works as a discrete thermometer with three possible outcomes depending on the severity of the crisis. This indicator is based on the specific shape of the credit term structure for the two main peripheral countries in the area. The paper discusses how some key features of term structure are linked to government debt carry trade.

In order to assess the performance of the proposed market-based indicator, the paper shows how the identified episodes of financial turmoil are related with the timing and the intensity of unconventional measures in the euro area.

# Fabio Caccioli, “Instabilities in portfolio optimization and regularization”

Tuesday November 4 2014

13.00

Scuola Normale Superiore

Aula Bianchi

**Fabio Caccioli**
University College London

**Abstract**

We consider the problem of portfolio selection in the presence of market impact, and we show that including a term which accounts for finite liquidity in portfolio optimization naturally mitigates the instabilities that arise in the estimation of coherent risk measures. This is because taking into account the impact of trading in the market is mathematically equivalent to introducing a regularization on the risk measure. We show that the impact function determines which regularizer is to be used, and we characterize the typical behavior of the optimal portfolio in the limit of large portfolio sizes for the case of Expected Shortfall.
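The impact-as-regularizer equivalence is easiest to see in a quadratic toy setting (an assumption for illustration; the talk works with Expected Shortfall, where the analysis is more delicate): a quadratic impact cost c·||w||² enters the optimality condition exactly like a ridge penalty on the covariance matrix.

```python
import numpy as np

def optimal_weights(Sigma, mu, impact=0.0):
    """Mean-variance-style weights; a quadratic impact cost c*||w||^2
    shifts Sigma to Sigma + 2c*I, i.e. an L2 (ridge) regularization."""
    n = len(mu)
    return np.linalg.solve(Sigma + 2.0 * impact * np.eye(n), mu)

Sigma = np.array([[0.04, 0.03],
                  [0.03, 0.04]])          # nearly collinear assets
mu = np.array([0.05, 0.01])
w_raw = optimal_weights(Sigma, mu)              # large offsetting positions
w_reg = optimal_weights(Sigma, mu, impact=0.01) # impact term shrinks them
```

The unregularized solution takes large long-short bets on the near-collinear pair; adding the impact term shrinks the portfolio, which is the stabilization mechanism the abstract describes (with the impact function's shape selecting the regularizer).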

# Xuezhong He, “Optimality of momentum and reversal”

Wednesday September 10 2014

13.00

Scuola Normale Superiore

Aula Bianchi

**Xuezhong He**
University of Technology – Sydney (Australia)

**Abstract**

We develop a continuous-time asset price model to capture short-run momentum and long-run reversal. By studying a dynamic asset allocation problem, we derive the optimal investment strategy in closed form and show that combined momentum and reversal strategies are optimal. We then estimate the model on the S&P 500 and demonstrate that, by exploiting timing opportunities with respect to the trend in returns and market volatility, the optimal strategies outperform not only pure momentum and pure mean-reversion strategies, but also the market index and the time-series momentum strategy. Furthermore, we show that the optimality also holds in out-of-sample tests and under short-sale constraints, and that the outperformance is immune to market states, investor sentiment and market volatility.

# Luca Capriotti, “Real Time Risk Management with Adjoint Algorithmic Differentiation”

Friday June 20 2014

11.00 – 12.30, 14.30 – 16.00

Scuola Normale Superiore

Aula Bianchi

**Luca Capriotti**
Credit Suisse London

**Real Time Risk Management with Adjoint Algorithmic Differentiation**

**Abstract**

Adjoint Algorithmic Differentiation (AAD) is one of the principal innovations in risk management of recent times. In this minicourse I will introduce AAD and show how it can be used to implement the calculation of price sensitivities in complete generality and with minimal analytical effort. The focus will be on the application to Monte Carlo methods, generally the most challenging from the computational point of view. With several examples I will illustrate the workings of AAD and demonstrate how it can be straightforwardly implemented to reduce the computation time of the risk of any portfolio by orders of magnitude.
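The core idea of the adjoint (reverse) mode can be shown on the smallest possible Monte Carlo example, a hand-written adjoint sweep for a European call under geometric Brownian motion. This is a minimal illustration of the technique, not the course's implementation; real AAD tools generate the reverse sweep automatically:

```python
import math, random

def call_price_and_delta_aad(S0, K, r, sigma, T, n_paths=20000, seed=7):
    """Forward sweep: Z -> S_T = S0*exp((r - sigma^2/2)T + sigma*sqrt(T)Z)
                        -> payoff = exp(-rT) * max(S_T - K, 0).
    Reverse sweep: propagate payoff_bar = 1 back to S0 to get the delta."""
    random.seed(seed)
    drift = (r - 0.5 * sigma * sigma) * T
    vol = sigma * math.sqrt(T)
    disc = math.exp(-r * T)
    price = delta = 0.0
    for _ in range(n_paths):
        Z = random.gauss(0.0, 1.0)
        ST = S0 * math.exp(drift + vol * Z)       # forward sweep
        price += disc * max(ST - K, 0.0)
        # reverse sweep (pathwise adjoints, one pass regardless of # of Greeks)
        ST_bar = disc * (1.0 if ST > K else 0.0)  # d payoff / d S_T
        delta += ST_bar * ST / S0                 # d S_T / d S0 = S_T / S0
    return price / n_paths, delta / n_paths

price, delta = call_price_and_delta_aad(100.0, 100.0, 0.01, 0.2, 1.0)
```

The point of AAD is that the reverse sweep costs a small constant multiple of the forward simulation however many sensitivities are propagated, which is where the "orders of magnitude" speed-up over bump-and-revalue comes from.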

Download flyer here and slides here.

All interested people are kindly invited.

# Symposium on return predictability in stock and real estate markets

# Paolo Porcedda, “Asimmetrie informative e selezione avversa nella gestione del portafoglio creditizio nell’esperienza di un primario istituto creditizio italiano”

Monday April 7 2014

13.00

Scuola Normale Superiore

Aula Bianchi

**Paolo Porcedda**
UniCredit Bank

**Abstract**

Driven by pressure from banking supervisors following the introduction of the new regulatory framework (Basel II), in the early 2000s the main credit institutions of the advanced economies sought to equip themselves with centralized systems for measuring the default probability of their counterparties, based on statistical models that were in some cases highly complex. The “cultural” leap entailed by this organizational overhaul, with its implications for decision-making practices, performance measurement and the related incentive systems, may be among the causes of the loan collectability problems observed in more recent years.

Our thesis is that the profound changes these restructurings brought to the processes of credit evaluation and lending failed, on the one hand, to take full account of the information asymmetry created by the new organizational model (centralized risk management that imposes its assessments on the distribution network that actually grants the loans), leading to a classic principal-agent paradox not easily resolved by the solutions already proposed in the existing literature. On the other hand, the adverse-selection problems that quantitative creditworthiness models entail, when applied without adequate corrections, were underestimated.

# Andrea Pallavicini, “Credit Risk Modelling Before and After the Crisis”

# Vittorio Carlini, “Mercati finanziari e l’illusione dei numeri”

Thursday March 13 2014

13:00

Scuola Normale Superiore

Aula Bianchi

**Vittorio Carlini**

Il Sole24Ore

**Abstract**

Network hardware, mathematical models and product software dominate the stock exchanges: a dynamic that loses sight of the real economy and of companies.

# Emmanuel Bacry, “Hawkes process and applications”

Thursday February 27 2014

13.00

Scuola Normale Superiore

Aula Bianchi

**Emmanuel Bacry**
CMAP, UMR 7641 CNRS, Ecole Polytechnique, France

**Abstract**

Hawkes processes are self-exciting point processes particularly well suited for applications. Introduced in the 1970s, they have been used in a wide variety of domains, such as high-frequency financial time-series modeling and viral diffusion in social networks.

After describing how they are defined and their main properties, we shall discuss some problems linked to parametric estimation (in high dimensions) as well as nonparametric estimation. We will present several applications.

# Youngna Choi, “Financial Instability Contagion: a Dynamical Systems Approach”

Thursday January 9 2014

13.00

Scuola Normale Superiore

Aula Bianchi

**Youngna Choi**
Montclair State University

**Abstract**

We build a multi-agent dynamical system for the global economy to investigate and analyze financial crises. The agents are large aggregates of a subeconomy, and the global economy is a collection of subeconomies. We use well-known theories of dynamical systems to represent a financial crisis as the propagation of a negative shock on wealth due to the breakdown of a financial equilibrium. We first extend the framework of the market instability indicator, an early warning signal defined for a single economy as the spectral radius of the Jacobian matrix of the wealth dynamical system. Then, we formulate a quantitative definition of instability contagion in terms thereof. Finally, we analyze the mechanism of instability contagion for both single and multiple economies. Our contribution is to provide a methodology to quantify and monitor the level of instability in the sectors and stages of a structured global economic model, and how it may propagate between its components.
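The indicator itself, the spectral radius of the Jacobian at equilibrium, can be computed with a few lines of power iteration. The 2x2 Jacobian below is a purely illustrative stand-in for the wealth dynamical system's linearization:

```python
def spectral_radius(J, iters=200):
    """Power-iteration estimate of the spectral radius of a (nonnegative)
    Jacobian matrix J, given as a list of rows."""
    n = len(J)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(J[i][k] * v[k] for k in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)   # dominant-eigenvalue estimate
        if lam == 0.0:
            return 0.0
        v = [x / lam for x in w]       # renormalize
    return lam

# toy 2-agent Jacobian; eigenvalues are 0.6 and 0.3
J = [[0.5, 0.2],
     [0.1, 0.4]]
rho = spectral_radius(J)
```

A value below 1 signals a locally stable wealth equilibrium; the early-warning reading is that `rho` drifting toward 1 flags growing instability before the shock propagates.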

# Emilio Barucci, “Does a countercyclical buffer affect bank management?”

Monday October 7 2013

13.00

Scuola Normale Superiore

Aula Bianchi

**Emilio Barucci**
Politecnico di Milano

**Abstract**

We analyze the effect of the countercyclical capital buffer, as provided by the Basel III regulation, on bank management.

The goal of the regulation is to reduce bank leverage and excess risk-taking.

We show that the countercyclical buffer increases the incentive for equity holders to take excess risk by investing in assets with a high drift rate independently of their volatility. The buffer thus induces risk-shifting incentives; it is nevertheless effective in reducing the bank's size and leverage, although the magnitude of the effect is rather small.

Joint work with Luca Del Viva, ESADE Business School

# Ying Chen, “Filtering Asynchronous High Frequency Data”

Tuesday June 4 2013

13.00

Scuola Normale Superiore

Aula Bianchi

**Ying Chen**
Department of Statistics & Applied Probability – National University of Singapore

**Abstract**

We develop a synchronizing technique for irregularly spaced and asynchronous high frequency data. The technique learns from the dependence structure of raw data and iteratively recovers the unobserved values of the synchronous series at high sampling frequency.

The numerical results illustrate the performance of the proposed technique compared to the conventional alternatives, the Previous Tick and Refresh Time techniques. The proposed technique performs well in terms of both accuracy and feature preservation.

Moreover, a realized covariance estimator is constructed by incorporating the synchronizing technique. We compare the features of this estimator with several alternative estimators.
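For reference, the Previous Tick baseline mentioned above is simple to state: sample each asynchronous series on a common grid by taking the last observation at or before each grid point (a sketch; tie-breaking and the initial gap handling are conventions):

```python
def previous_tick(times, values, grid):
    """Previous-tick synchronization of one irregularly observed series.
    times must be sorted ascending; returns one value per grid point,
    or None before the first observation."""
    out, j = [], -1
    for t in grid:
        while j + 1 < len(times) and times[j + 1] <= t:
            j += 1                      # advance to last obs at or before t
        out.append(values[j] if j >= 0 else None)
    return out

# observations at irregular times, sampled on an integer grid
synced = previous_tick([0.5, 1.2, 2.7], [10, 11, 12], [1.0, 2.0, 3.0])
```

The talk's contribution is precisely to improve on this kind of hold-last-value scheme by exploiting the dependence structure of the raw data.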

# Matthieu Cristelli, “A New Metrics for Country Fitness and Product Complexity”

Wednesday May 22 2013

13.00

Scuola Normale Superiore

Aula Bianchi

**Matthieu Cristelli**
ISC-CNR, Institute for Complex Systems – Department of Physics, “Sapienza” University

**Abstract**

Classical economic theories prescribe the specialization of countries’ industrial production. Inspection of country databases of exported products shows that this is not the case: successful countries are extremely diversified, in analogy with biosystems evolving in a competitive dynamic environment. The challenge is to assess quantitatively the non-monetary and non-income-based competitive advantage of diversification, which represents the hidden potential for development and growth. In a series of recent works [1,2] we develop a new statistical approach based on coupled non-linear maps, whose fixed point defines a new metrics for country Fitness and product Complexity. The idea underlying this approach is that the intangible features determining the competitiveness of a country can be quantified by properly measuring what a country exports. We show that a non-linear iteration is necessary to bound the complexity of products by the fitness of the less competitive countries exporting them. Given the paradigm of economic complexity, the correct and simplest approach to measuring the competitiveness of countries is the one presented in this work. The two metrics allow us to define a new kind of fundamental analysis of the hidden growth potential of countries. It is possible to compare non-monetary factors of fitness and complexity with measures of economic intensity such as countries’ GDP per capita. We argue that this comparison is informative about the growth potential of countries. As an example, countries that show both a high fitness and a high complexity, but a low GDP per capita, are very likely to strongly boost their income in the coming years. Preliminary analysis of growth data from 1995 to 2010 shows that the results reflect well what occurred in the real world over that period.
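The coupled non-linear maps can be sketched as follows: fitness sums the complexity of a country's exports, while complexity is penalized harmonically by the fitness of every exporter, which is the non-linear bounding step the abstract emphasizes. The normalization scheme and the tiny export matrix are illustrative assumptions:

```python
def fitness_complexity(M, iters=50):
    """Fixed-point iteration of the fitness-complexity map on a binary
    country-by-product export matrix M (list of lists of 0/1):
        F_c <- sum_p M[c][p] * Q_p
        Q_p <- 1 / sum_c M[c][p] / F_c
    with both vectors renormalized to mean one at every step."""
    nc, npr = len(M), len(M[0])
    F = [1.0] * nc
    Q = [1.0] * npr
    for _ in range(iters):
        Ft = [sum(M[c][p] * Q[p] for p in range(npr)) for c in range(nc)]
        Qt = [1.0 / sum(M[c][p] / F[c] for c in range(nc)) for p in range(npr)]
        mF, mQ = sum(Ft) / nc, sum(Qt) / npr
        F = [f / mF for f in Ft]
        Q = [q / mQ for q in Qt]
    return F, Q

# nested toy matrix: country 0 exports everything, country 1 only product 0
M = [[1, 1, 1],
     [1, 0, 0],
     [1, 1, 0]]
F, Q = fitness_complexity(M)
```

The diversified country ends up fittest, and the product exported only by the fittest country ends up most complex, matching the intuition that ubiquitous products carry little information about competitiveness.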

# Paolo Guasoni, “Frictions and Fees in Portfolio Choice”

Monday May 13 2013, 16.00 – 18.00 Aula 2

Tuesday May 14 2013, 11.00 – 13.00 Aula 2

Wednesday May 15 2013, 14.00 – 16.00 Aula Bianchi

Thursday May 16 2013, 11.00 – 13.00 Aula 2

Scuola Normale Superiore

**Paolo Guasoni**
Boston University and Dublin City University

**Frictions and Fees in Portfolio Choice**

All interested people are kindly invited.

# Jean-Philippe Bouchaud, “Instabilities in Financial Markets”

Thursday May 9 2013, 16.00 – 18.00 Aula Mancini

Friday May 10 2013, 14.00 – 16.00 Aula Mancini

Monday May 13 2013, 9.00 – 11.00 Aula Bianchi

Scuola Normale Superiore

**Jean-Philippe Bouchaud
**Capital Fund Management and Ecole Polytechnique

**Instabilities in Financial Markets**

May 9 “Stylized facts: old pieces and new results”

May 10 “Price formation, market impact and HFT”

May 13 “Instabilities: some (dangerous) feedback loops”

All interested people are kindly invited.

# Rama Cont, “Functional Ito calculus and Functional Kolmogorov equations”

Monday April 8 2013, 10.00 – 13.00 Aula Tonelli

Monday April 15 2013, 10.00 – 13.00 Aula Tonelli

Monday April 22 2013, 10.00 – 13.00 Aula Tonelli

Tuesday April 23 2013, 9.00 – 12.00 Aula 2

Wednesday April 24 2013, 9.00 – 12.00 Aula Dini

Monday May 27 2013, 10.00 – 13.00 Aula Tonelli

Scuola Normale Superiore

**Rama Cont
**Imperial College London and CNRS

Laboratoire de Probabilités et Modèles Aléatoires, Université Paris VI-VII

**Functional Ito calculus and Functional Kolmogorov equations**

All interested people are kindly invited.

# Fulvio Corsi, “When Micro Prudence increases Macro Risk: The Destabilizing Effects of Financial Innovation, Leverage, and Diversification”

Wednesday March 6 2013

13:00

Scuola Normale Superiore

Aula Mancini

**Fulvio Corsi
**Scuola Normale Superiore

**Abstract
**By exploiting common accounting and risk management practices, we propose a simple analytical dynamical framework to investigate the effects of micro-prudential changes on macro-prudential outcomes. Specifically, we study the consequences of introducing a financial innovation that reduces the cost of portfolio diversification in a financial system populated by financial institutions that have capital requirements in the form of a VaR constraint and follow standard mark-to-market and risk management rules. We provide a full analytical quantification of the multivariate feedback effects between investment prices and bank behavior induced by portfolio rebalancing in the presence of asset illiquidity, and show how changes in the constraints of the bank portfolio optimization endogenously drive the dynamics of the balance-sheet aggregates of financial institutions and, thereby, the availability of bank liquidity to the economic system and systemic risk. The model shows that when financial innovation reduces the cost of diversification below a given threshold, the strength (due to higher leverage) and coordination (due to similarity of bank portfolios) of feedback effects increase, triggering a transition from a stationary dynamics of price returns to a non-stationary one characterized by steep growths (bubbles) and plunges (bursts) of market prices.

# Federico Poloni and Giacomo Sbrana, “Estimating Econometric Models through Matrix Equations”

Tuesday March 5 2013

13.00

Scuola Normale Superiore

Aula Bianchi

**Federico Poloni
**Dipartimento di Informatica – Università di Pisa

**Giacomo Sbrana**

Rouen Business School

**Abstract
**We present an algorithm to estimate the parameters of multivariate ARMA, GARCH and stochastic volatility models. The approach is based on a moment estimator; a similar approach has already been suggested in the literature for univariate GARCH, but its generalization to multivariate models requires some more linear algebra machinery, especially in the field of matrix equations.

The resulting estimator is extremely fast to compute, in comparison to maximum-likelihood approaches. We also discuss methods to regularize and improve this estimate.

# S. Marmi, C. Pacati, R. Renò, W.A. Risso, *A quantitative approach to Faber’s tactical asset allocation*, International Journal of Computational Economics and Econometrics 3 (1-2), 91-101

Routinely, practitioners and academics alike propose the use of trading strategies with an alleged improvement on the risk–return relation, typically entailing a considerably higher return for the given level of risk. A very popular example is “A quantitative approach to tactical asset allocation” by the fund manager M. Faber, a real hit in the SSRN online library. Is this paper a counterexample to market efficiency? We reject this conclusion, showing that a lot of caution should be used in this field, and we indicate a series of bootstrapping experiments which can be easily implemented to evaluate the performance of trading strategies.
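A bootstrapping experiment of the kind the paper advocates can be sketched as follows: run a Faber-style moving-average timing rule on resampled return paths and count how often chance alone matches the realized performance. The simulated returns, window length, and resampling scheme are illustrative assumptions, not the paper's actual data or protocol:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated daily log-returns as a stand-in price series; illustrative only.
r = rng.normal(0.0002, 0.01, 2500)
price = np.exp(np.cumsum(r))

def trailing_ma(x, w):
    """Trailing moving average, with an expanding window at the start."""
    cs = np.cumsum(x)
    ma = np.empty_like(x)
    ma[:w] = cs[:w] / np.arange(1, w + 1)
    ma[w:] = (cs[w:] - cs[:-w]) / w
    return ma

def timing_return(price, r, window=200):
    """Total return of a timing rule: hold the asset only when the price
    is above its trailing moving average, otherwise stay in cash."""
    signal = price > trailing_ma(price, window)
    return r[1:][signal[:-1]].sum()   # yesterday's signal, today's return

strat = timing_return(price, r)

# Bootstrap: re-run the rule on i.i.d.-resampled return paths.
n_boot, count = 500, 0
for _ in range(n_boot):
    rb = rng.choice(r, size=len(r), replace=True)
    pb = np.exp(np.cumsum(rb))
    if timing_return(pb, rb) >= strat:
        count += 1
p_value = count / n_boot   # fraction of resamples beating the strategy
print(p_value)
```

A large p-value here would mean the rule's performance is indistinguishable from luck, which is the kind of caution the abstract calls for.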

# G. Buccheri, S. Marmi, R.N. Mantegna, *Evolution of correlation structure of industrial indices of US equity markets*, Physical Review E 88 (1), 012806

We investigate the dynamics of correlations present between pairs of industry indices of U.S. stocks traded in U.S. markets by studying correlation-based networks and spectral properties of the correlation matrix. The study is performed by using 49 industry index time series computed by K. French and E. Fama during the time period from July 1969 to December 2011, which spans more than 40 years. We show that the correlation between industry indices presents both a fast and a slow dynamics. The slow dynamics has a time scale longer than 5 years, showing that a different degree of diversification of the investment is possible in different periods of time. Moreover, we also detect a fast dynamics associated with exogenous or endogenous events. The fast time scale we use is a monthly time scale and the evaluation time period is a 3-month time period. By investigating the correlation dynamics monthly, we are able to detect two examples of fast variations in the first and second eigenvalue of the correlation matrix. The first occurs during the dot-com bubble (from March 1999 to April 2001) and the second occurs during the period of highest impact of the subprime crisis (from August 2008 to August 2009).
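The eigenvalue dynamics described above can be sketched by tracking the largest eigenvalue of the correlation matrix over rolling windows; a spike signals a period when one common mode dominates and diversification potential shrinks. The simulated one-factor returns below stand in for the 49 Fama-French industry series and are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated monthly returns for N "industry indices": a common market
# factor plus idiosyncratic noise (illustrative stand-in for real data).
T, N = 120, 10
market = rng.normal(0, 0.04, T)
returns = 0.8 * market[:, None] + rng.normal(0, 0.03, (T, N))

def rolling_top_eigenvalue(returns, window=36):
    """Largest eigenvalue of the correlation matrix on a rolling window."""
    T = len(returns)
    out = []
    for t in range(window, T + 1):
        C = np.corrcoef(returns[t - window:t].T)   # N x N correlation matrix
        out.append(np.linalg.eigvalsh(C)[-1])      # eigvalsh is ascending
    return np.array(out)

lam1 = rolling_top_eigenvalue(returns)
# For an N x N correlation matrix the top eigenvalue lies in [1, N].
print(lam1.min(), lam1.max())
```

The paper's fast/slow decomposition would use a short window (monthly updates over 3-month periods) on top of a long one; the same function serves both by changing `window`.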

# Marco Bianchetti, “Consistent No-Arbitrage Derivatives’ Pricing Including Funding And Collateral”

Tuesday January 29 2013

13:00

Scuola Normale Superiore

Aula Bianchi

**Marco Bianchetti**

Intesa Sanpaolo

**Abstract
**We revisit the problem of general no-arbitrage pricing of derivatives including the funding component. We show that, by adopting the no-arbitrage approach based on replication, PDE, and Feynman-Kac theorem, an appropriate treatment of the self-financing conditions allows straightforward and consistent proofs of the relevant pricing formulas covering a broad range of cases. In particular, we firstly recover the basic results for single currency funding of derivatives including, step by step, perfect or partial collateral, repo, and dividends. Next, we generalize the analysis to the case of multiple currency funding, and we examine the special case of interest rate derivatives. These results are useful to provide a simple and consistent framework of modern pricing formulas, fostering a broader understanding of the current market practice of CSA discounting, and to set solid theoretical grounds supporting further generalizations to include other risk factors, counterparty risk (CVA and DVA) in particular.

**Download Flyer**

# Marko Weber, “Dynamic Trading Volume”

Tuesday November 6 2012

13:00

Scuola Normale Superiore

Aula Bianchi

**Marko Weber
**Dublin City University and Scuola Normale Superiore Pisa

**Abstract
**We derive the process followed by trading volume, in a market with finite depth and constant investment opportunities, where a representative investor, with a long horizon and constant relative risk aversion, trades a safe and a risky asset. Trading volume approximately follows a Gaussian, mean-reverting diffusion, and increases with depth, volatility, and risk aversion. The model generates an endogenous ban on leverage and short-selling. Joint work with Paolo Guasoni.
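The Gaussian, mean-reverting diffusion that the talk derives for trading volume can be simulated with a plain Euler scheme for an Ornstein-Uhlenbeck process; the parameter values below are illustrative, not calibrated to the model:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_ou(v0=1.0, mean=1.0, speed=5.0, sigma=0.3, dt=1 / 252, n=2520):
    """Euler scheme for dV = speed * (mean - V) dt + sigma dW:
    a Gaussian, mean-reverting diffusion of the kind trading volume
    approximately follows in the model."""
    v = np.empty(n)
    v[0] = v0
    for t in range(1, n):
        dW = rng.normal(0.0, np.sqrt(dt))
        v[t] = v[t - 1] + speed * (mean - v[t - 1]) * dt + sigma * dW
    return v

vol = simulate_ou()
# The path fluctuates around, and its average reverts towards, the mean level.
print(vol.mean())
```

In the model's comparative statics, higher depth, volatility, or risk aversion would shift the `mean` level of this process upward.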

**Download Flyer**

# Bence Toth, “Anomalous Price Impact and the Critical Nature of Liquidity in Financial Markets”

Tuesday July 24 2012

12:00

Scuola Normale Superiore

Aula Bianchi

**Bence Toth**

Capital Fund Management, Paris, France

**Abstract
**We propose a dynamical theory of market liquidity that predicts that the average supply/demand profile is V-shaped and vanishes around the current price. This result is generic, and only relies on mild assumptions about the order flow and on the fact that prices are (to a first approximation) diffusive. This naturally accounts for two striking stylized facts: first, large metaorders have to be fragmented in order to be digested by the liquidity funnel, leading to long-memory in the sign of the order flow. Second, the anomalously small local liquidity induces a breakdown of linear response and a diverging impact of small orders, explaining the “square-root” impact law, for which we provide additional empirical support. Finally, we test our arguments quantitatively using a numerical model of order flow based on the same minimal ingredients.

# James G. M. Gatheral, “Arbitrage-free SVI volatility surfaces”

Wednesday July 18 2012

13:00

Scuola Normale Superiore

Aula 2

**James G. M. Gatheral**

Baruch College, The City University of New York

**Abstract
**In this talk we motivate the widely-used SVI (“stochastic volatility inspired”) parameterization of the implied volatility surface and show how to calibrate it in such a way as to guarantee the absence of static arbitrage. In particular, we exhibit a large class of arbitrage-free SVI volatility surfaces with a simple closed-form representation. We demonstrate the high quality of typical SVI fits with a numerical example using recent SPX options data. We conclude by suggesting that SVI might one day replace SABR as the implied volatility parameterization of choice.
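For reference, the raw SVI parameterization gives total implied variance as a function of log-moneyness k via w(k) = a + b(ρ(k − m) + √((k − m)² + σ²)). The parameter values in this sketch are illustrative; the arbitrage-free surfaces of the talk impose further constraints on (a, b, ρ, m, σ):

```python
import numpy as np

def svi_total_variance(k, a=0.04, b=0.1, rho=-0.4, m=0.0, sigma=0.2):
    """Raw SVI parameterization of total implied variance w(k) = vol^2 * T
    as a function of log-moneyness k. Parameters are illustrative only."""
    return a + b * (rho * (k - m) + np.sqrt((k - m) ** 2 + sigma ** 2))

k = np.linspace(-1.0, 1.0, 201)
w = svi_total_variance(k)
# Total variance stays positive, and with rho < 0 the left (put) wing
# of the smile is steeper than the right wing, as in equity markets.
print(w.min(), w[0], w[-1])
```

The five parameters map directly onto smile features (level, angle between wings, skew, translation, and at-the-money curvature), which is what makes SVI fits so interpretable.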

**Download Slides**

# James G. M. Gatheral, “Optimal Order Execution”

Friday July 13 2012

11:30

Scuola Normale Superiore

Aula Bianchi

**James G. M. Gatheral**

Baruch College, The City University of New York

**Abstract
**We review various models of market impact. We use variational calculus to derive optimal execution strategies, noting that in many conventional models, static strategies are dynamically optimal. We then present a model in which the optimal strategy does depend on the stock price and derive an explicit closed-form solution for this strategy by solving the HJB equation. We discuss price manipulation, indicating modeling choices for which this is unlikely to be a problem. We present empirical evidence and some heuristic arguments justifying the well-known square-root formula for market impact. Assuming price dynamics that are consistent with the square-root formula, we suggest likely properties of optimal execution strategies.

**Download Slides**

# Enrico Scalas, “Intraday Option Pricing”

Tuesday June 19 2012

11:00

Scuola Normale Superiore

Aula Bianchi

**Enrico Scalas**

DISIT, Università del Piemonte Orientale and Basque Center for Applied Mathematics, Bilbao

**Abstract
**A stochastic model for pure-jump diffusion (the compound renewal process) can be used as a zero-order approximation and as a phenomenological description of tick-by-tick price fluctuations. This leads to an exact and explicit general formula for the martingale price of a European call option. A complete derivation of this result is presented by means of elementary probabilistic tools.

Reference: Scalas E. and Politi M. (2012). A parsimonious model for intraday European option pricing. Economics Discussion Papers, No 2012-14, Kiel Institute for the World Economy.

# Cecilia Mancini, “Spot Volatility Estimation Using Delta Sequences”

Monday May 7 2012

13:00

Scuola Normale Superiore

Aula Bianchi

**Cecilia Mancini**

Università di Firenze

**Abstract
**Work done in collaboration with V.Mattiussi and R.Renò.

**Download Flyer**

# Robert Almgren, “Quantitative Problems in Optimal Execution”

Friday April 13 2012

13:00

Scuola Normale Superiore

Aula Bianchi

**Robert Almgren**

New York University and Quantitative Brokers

**Abstract**

Execution of large transactions so as to minimize market impact and trading costs is a very important aspect of modern financial markets. We will give an overview of the quantitative tools that are used to approach this problem and how they are implemented in practice. These include balance of risk and reward, design of optimal trajectories, and the mathematical issues that arise in optimal response to time-varying liquidity and volatility.

# Maria Elvira Mancino, “Fourier Volatility Estimation Method: Theory and Applications with High Frequency Data”

Wednesday March 14 2012

13.00 – 14.00

Scuola Normale Superiore

Aula Bianchi

**Maria Elvira Mancino
**Università di Firenze

**Download Flyer**

**Download Slides**

# Roberto Renò, “Price and Volatility Co-Jumps”

Wednesday February 15 2012

13:00

Scuola Normale Superiore

Aula Bianchi

**Roberto Renò**

Università degli Studi di Siena

**Abstract
**A sizeable proportion of large, discontinuous, changes in asset prices are found to be associated with contemporaneous large, discontinuous, changes in volatility (i.e., co-jumps), negative price jumps usually occurring along with positive volatility jumps. We document that the co-jumps yield an economically-meaningful portion of leverage, return skewness, and the implied volatility smirk. These, and other, effects are uncovered in the context of a flexible modeling approach (allowing, among other features, for independent as well as common jumps, volatility-dependent jump arrivals, and time-varying leverage) and a novel identification strategy relying on infinitesimal cross-moments and high-frequency price data.

**Download Flyer**

**Download Slides**

# Emmanuel Bacry, “Modelling Microstructure using Multivariate Hawkes Processes”

Wednesday December 14 2011

13:00

Scuola Normale Superiore

Aula Bianchi

**Emmanuel Bacry**

Ecole Polytechnique

**Abstract
**Hawkes processes are used for modelling tick-by-tick variations of a single asset price or of a pair of asset prices. For each asset, two counting processes (with stochastic intensities) are associated with the positive and negative jumps of the price, respectively. We show that, by coupling these intensities through a kernel function, one can reproduce the high-frequency mean-reversion structure as well as the Epps effect, both of which are characteristic of market microstructure. We define a numerical method that provides a non-parametric estimation of the kernel shape; the kernel is found to be slowly decaying (power-law), suggesting a long-memory nature of self-excitation phenomena at the microstructure level of price dynamics. The consequences for market impact are discussed using a very simple model for exogenous trades.
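A univariate Hawkes process of the kind underlying these models can be simulated with Ogata's thinning algorithm. The talk estimates a slowly decaying (power-law) kernel non-parametrically; the exponential kernel and parameter values below are simplifying assumptions used only to keep the sketch short:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_hawkes(mu=1.0, alpha=0.5, beta=2.0, horizon=50.0):
    """Ogata's thinning for a Hawkes process with intensity
    lambda(t) = mu + sum_{s < t} alpha * exp(-beta * (t - s)).
    Stationarity requires alpha / beta < 1."""
    events, t = [], 0.0
    while t < horizon:
        # Between events the intensity only decays, so its current value
        # is a valid upper bound for the thinning step.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
        t += rng.exponential(1.0 / lam_bar)
        if t >= horizon:
            break
        lam_t = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
        if rng.uniform() <= lam_t / lam_bar:
            events.append(t)   # accepted event: self-excitation kicks in
    return np.array(events)

ts = simulate_hawkes()
# Expected count is about mu * horizon / (1 - alpha/beta) = 50 / 0.75.
print(len(ts))
```

The microstructure models of the talk couple four such intensities (up/down jumps of two assets) through a kernel matrix; the simulation logic is the same with a vector-valued intensity.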

**Download Flyer**

# Domenico Mignacca, “Risk Attribution, Risk Budgeting and Portfolio’s Implied Views: A Factor Approach”

# Quantitative Approaches to Risk Assessment and Investment Transparency

Monday December 12 2011

10.45 – 17.00

Sala Stemmi

Scuola Normale Superiore – Pisa

10:45 Registration

11:00 Opening address

11:10 *Book Presentation* – **A Quantitative Framework to Assess the Risk-Reward Profile of Non-Equity Products**

*Keynote Address* – Hélyette Geman

11:50 *Special Address* – Marcello Minenna, Giovanna Maria Boi, Paolo Verzella

13:00 Lunch

*Technical Contributions*

14:00 Rita D’Ecclesia

*Risk Assessment of debt liabilities: overview and case studies*

14:30 Dongning Qu

*Understanding Risks in Structured Products*

15:00 Paolo Sironi

*Enhancing competitiveness through transparent investment decision-making*

15:30 Coffee break

16:00 *Round Table:* Risk Transparency through Quantitative Methods

*Moderator* – Stefano Marmi

*Panelists* – Rita D’Ecclesia, Hélyette Geman, Domenico Mignacca, Marcello Minenna, Dongning Qu, Paolo Sironi

For more information please contact: +39 050 509111, s.marmi@sns.it, or fabrizio.lillo@sns.it.

Download event flyer here, and brochure here.

# Umberto Cherubini, “CDS and the City”

Wednesday November 23 2011

13:00

Scuola Normale Superiore

Aula Bianchi

**Umberto Cherubini**

Università di Bologna

**Abstract**

We discuss the peculiarity of the CDS tool in the realm of credit derivatives. We show how to use such products for the hedging of credit risk and for synthetic creation of it. We show how to use CDS quotes to bootstrap the term structure of the implied probability of default of an obligor. We finally discuss the relevance of the market in which CDS are traded, which is Over-The-Counter. As an illustration we provide a case of CDS used in the funding of a municipal entity.
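The bootstrap of an implied default-probability term structure from CDS quotes can be sketched with the "credit triangle" approximation, spread ≈ hazard × (1 − recovery). This ignores premium accrual and discounting, so it illustrates the idea rather than a full ISDA-standard bootstrap; the quotes and recovery rate are illustrative:

```python
import math

# Illustrative CDS quotes: (maturity in years, par spread as a decimal).
quotes = [(1.0, 0.0100), (3.0, 0.0150), (5.0, 0.0200)]
recovery = 0.40

def bootstrap_default_probs(quotes, recovery):
    """Piecewise-constant hazard bootstrap under the credit triangle:
    the average hazard to maturity T is spread / (1 - recovery)."""
    curve = []
    prev_T, prev_H = 0.0, 0.0   # cumulative hazard bootstrapped so far
    for T, s in quotes:
        avg_hazard = s / (1.0 - recovery)
        cum_hazard = avg_hazard * T
        # Forward hazard on (prev_T, T] keeping earlier segments fixed.
        fwd_hazard = (cum_hazard - prev_H) / (T - prev_T)
        survival = math.exp(-cum_hazard)
        curve.append((T, fwd_hazard, 1.0 - survival))
        prev_T, prev_H = T, cum_hazard
    return curve   # (maturity, forward hazard, cumulative default prob.)

for T, lam, pd in bootstrap_default_probs(quotes, recovery):
    print(T, round(lam, 4), round(pd, 4))
```

An upward-sloping spread curve maps, as here, into rising forward hazards and an increasing term structure of default probabilities.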

# Rama Cont, “Systemic risk: a challenge for modelling”

Wednesday November 9 2011

15:00

Scuola Normale Superiore

Sala degli Stemmi

**Rama Cont**

Univ. Paris VI-VII, France and Columbia Univ. New York, USA

# Rosario Nunzio Mantegna, “Econophysics Investigation of Financial Markets: Correlation, Heterogeneity and Agent Based Models”

Tuesday October 4 2011 15.00 – 17.00 Aula Bianchi

Wednesday October 5 2011 11.00 – 13.00 Aula Bianchi

Thursday October 6 2011 11.00 – 13.00 Aula Mancini

Scuola Normale Superiore

**Rosario Nunzio Mantegna
**Università di Palermo

**Lecture 1. Correlation, hierarchical clustering and correlation based networks in financial markets**

**Download slides here.**

**Lecture 2. Heterogeneity and specialization of investors in financial markets**

**Download slides here.**

**Lecture 3. Agent based modeling of financial markets**

# Aldo Nassigh, “Default and Credit Migration Risk in Trading Portfolios”

Tuesday September 20 2011

16:00

Scuola Normale Superiore

Aula Bianchi

**Aldo Nassigh**

Unicredit, Milano

**Abstract
**Default and credit migration risk was long treated as negligible in banks’ trading portfolios, which are characterized by an investment horizon of a few days. This was consistent with the ‘Constant Level of Risk’ assumption, according to which, in case of deterioration of the creditworthiness of the obligor, exposures would have been replaced with high-credit-quality ones, with the goal of moving the asset allocation back to the original risk profile. If perfect market liquidity and continuous Brownian motion for asset prices are granted, losses induced by the frequent rebalancing of the portfolio can indeed be neglected. The rise and blow-up of the credit trading bubble (also known as the sub-prime, Lehman and sovereign crises) showed the shortcomings of such an approach. In 2004, the Basel Committee on Banking Supervision asked banks to set aside capital for credit risk in trading portfolios, in response to the rising credit exposures and the improvements in risk management best practice observed in the banking system. This capital add-on (named ‘Incremental Risk Charge’) will enter into force in December 2011. The proper evaluation of default and credit migration risk under the constant level of risk assumption translates into the need to model portfolio credit risk in the framework of short-term, multi-step simulations. The aim of the seminar is to give an update on recent developments in modeling the Incremental Risk Charge and to raise some critical and unresolved issues, such as: the difficulty of adapting to this problem the mainstream treatment of portfolio credit risk by continuous-time Markov chains applied to the rating migration process; and the lack of an unambiguous approach to the estimation of asset correlations, leading to large discrepancies in the capital level required by the various models developed so far.

# Financial market instability: the Flash Crash one year later

Wednesday May 11 2011

13:00

Scuola Normale Superiore

Aula Bianchi

**Stefano Marmi
**Scuola Normale Superiore, Pisa

**May 6, 2010:**

**the butterfly effect strikes the ping-pong of high-frequency trading**

**Fulvio Corsi
**Università della Svizzera Italiana, Lugano

**Toxic orders and liquidity crises:**

**an academic viewpoint on the Flash Crash**

**Fabrizio Lillo
**Scuola Normale Superiore, Pisa, Università di Palermo and Santa Fe Institute, Santa Fe

**High-frequency trading and systemic instability:**

**data mining helps explain the Flash Crash**

**Giacomo Bormetti
**Scuola Normale Superiore, Pisa and INFN, Pavia

**Quantitative models of market panic**

**Abstract
**The meeting will present a chronicle of the events of May 6, 2010 on the US stock market and a review of the literature that appeared in the specialized press in the following months.

**Download Flyer**

The presentations can be downloaded via the following links:

**Slides_Marmi**,

**Slides_Corsi**,

**Slides_Lillo** and

**Slides_Bormetti**.

# Daniel Nehren and Lorenzo Balducci, “Un incontro con JP Morgan”

Tuesday May 10 2011

15:00

Scuola Normale Superiore

Aula Mancini

**Daniel Nehren**

Head of Linear Quantitative Research, JP Morgan New York

**Introduction to Algorithmic Trading**

**Lorenzo Balducci**

Quantitative Research, JP Morgan London

**Quantitative Research at JP Morgan**

Following this link you can download the slides of the talk by Lorenzo Balducci: **Slides_Balducci**.

# Franco Nardini, “Innovation, specialization and growth in a model of structural change”

Tuesday April 19 2011

13:00

Scuola Normale Superiore

Aula Bianchi

**Franco Nardini**

Università di Bologna

**Abstract
**The aim of this talk is to investigate the nexus between demand patterns and innovation as it stems from research efforts and the extent of specialization. In the proposed model an innovation race conducted by entrants investing in research and development against established incumbents raises productivity at the industry level and leads to a shift in the aggregate demand pattern and consequently to a redistribution of the profit fund among industries and a restructuring of the production process in each industry. The talk argues that the degree of development as reflected in a demand share distribution is characterized by a corresponding distribution of specialized sectors that becomes more even across industries as the development process proceeds and investigates the consequences in terms of economic growth.

**Download Flyer**

**Download Slides**

# Jean-Philippe Bouchaud, “An introduction to Statistical Phy-nance”

Wednesday April 13 2011, 14.00 – 16.00 Aula Dini

Thursday April 14 2011, 14.00 – 16.00 Aula Dini

Friday April 15 2011, 11.00 – 13.00 Aula Fermi

Scuola Normale Superiore

**Jean-Philippe Bouchaud
**Science & Finance, Capital Fund Management, Paris

**Lecture 1. (Wednesday April 13) The dynamics of price in financial markets **

- Fat tails and intermittent dynamics: empirical facts and models
- High frequency and seasonality effects
- Cross sectional and multivariate effects
- Correlations and copulas

**Lecture 2. (Thursday April 14) Price formation and microstructure**

- Order book dynamics
- Order flow dynamics
- Impact and liquidity
- Is price dynamics exogenous or endogenous?

**Lecture 3. (Friday April 15) The world of financial derivatives**

- The worrying ways of financial engineering
- Option pricing: welcome to a non Black-Scholes world
- Volatility modelling: GARCH and multiscale/multifractal frameworks
- The hedge Monte-Carlo method
- The need for “second generation” models

# Marko Hans Weber, “On portfolio optimization in markets with frictions”

Tuesday April 5 2011

13:00

Scuola Normale Superiore

Aula Bianchi

**Marko Hans Weber**

Scuola Normale Superiore

**Abstract
**The classic portfolio optimization problem was solved by Robert Merton in 1969 in his paper “Lifetime portfolio selection under uncertainty: the continuous time case”. In an economy formed by two assets, a risk-free bond and a stock with standard Black-Scholes dynamics, he explicitly finds the optimal trading strategy for an agent with constant relative risk aversion. The mainstream literature assumes a frictionless market, but ignoring transaction costs and liquidity may seriously affect the reliability of a financial model. The objective of the talk is to review the effects of introducing proportional transaction costs; indeed, Merton's results are no longer valid in this framework. We also approach the issue of liquidity, which has been studied only marginally in the literature, and compare the impact of both kinds of frictions.

**Download Flyer**

**Download Slides**

# Monica Billio, “Econometric measures of systemic risk in the finance and insurance sectors”

Tuesday March 29 2011

13:00

Scuola Normale Superiore

Aula Bianchi

**Monica Billio**

Department of Economics – Università Ca’ Foscari di Venezia

**Abstract
**We propose several econometric measures of systemic risk to capture the interconnectedness among the monthly returns of hedge funds, banks, brokers, and insurance companies based on principal components analysis and Granger-causality tests. We find that all four sectors have become highly interrelated over the past decade, increasing the level of systemic risk in the finance and insurance industries. These measures can also identify and quantify financial crisis periods, and seem to contain predictive power for the current financial crisis. Our results suggest that hedge funds can provide early indications of market dislocation, and systemic risk arises from a complex and dynamic network of relationships among hedge funds, banks, insurance companies, and brokers.

Joint work with Mila Getmansky, Andrew W. Lo, and Loriana Pellizzon.

# Salvatore Federico, “Infinite-dimensional representation for control problems with delay”

Tuesday March 15 2011

13:00

Scuola Normale Superiore

Aula Bianchi

**Salvatore Federico**

Università di Bologna

**Abstract
**Control problems with delay in the state and/or control variable arise naturally in many applied contexts (physics, biology, economics, etc.). The first part of the seminar will be devoted to a series of examples of interest in economics and finance. The second part will describe the mathematical difficulties that must be faced in tackling such problems, and the infinite-dimensional representation approach to them. Finally, the last part will be devoted to the study of some specific problems and to some regularity results for the associated Hamilton-Jacobi-Bellman equations.

**Download Flyer**

# Maurizio Pratelli, “Computing the Greeks and Malliavin integration by parts”

Tuesday March 8 2011

13:00

Scuola Normale Superiore

Aula Bianchi

**Maurizio Pratelli**

Università degli Studi di Pisa

**Abstract
**The “Greeks” are the derivatives of prices (for instance, of options) with respect to suitable parameters: when explicit formulas for the prices are not available, they are usually computed by Monte Carlo methods. However, computing (or at least approximating) these derivatives raises serious numerical instability problems. To overcome this difficulty, P.L. Lions and collaborators adapted the integration-by-parts method originally introduced by P. Malliavin to attack, with probabilistic techniques, Hörmander's theorem on the characterization of hypoelliptic operators; Lions' work has opened the way to numerous applications. In this seminar I will briefly present the essential ideas of “Malliavin calculus” and some of its applications to problems arising in mathematical finance.

**Download Flyer**

# Andrea Pallavicini, “Surviving the Credit Crunch: new features for post-crisis pricing models”

Tuesday May 3 2011

13:00

Scuola Normale Superiore

Aula Bianchi

**Andrea Pallavicini**

Mediobanca, Milano

**Abstract
**Since the beginning of the credit crunch, many pricing models have failed to incorporate market movements, as they were designed to discard features that are now crucial in such turmoil. In particular, credit risk can no longer be neglected while modelling other asset classes: single counterparties can default, and extreme events may happen too, such as the default of sectors of the economy, or even the break-down of the whole system. A new generation of pricing models able to naturally include counterparty and systemic risk along with more exotic features, such as funding and liquidity effects, is still under construction, but it is now possible to discern the first steps. In this presentation we focus on the credit and interest-rate asset classes. In particular, we describe some old approaches which survived the crisis (loss models with default clustering or self-excitement), and some new proposals driven by recent market trends or by changes in the regulatory framework (the CVA evaluation framework, credit contagion models, multiple yield-curve models).

**Download Flyer**

**Download Slides**

# Giacomo Bormetti, “Minimal model of financial stylized facts”

Tuesday February 15 2011

13:00

Scuola Normale Superiore

Aula Bianchi

**Giacomo Bormetti**

Scuola Normale Superiore – Pisa

**Abstract**

In this seminar I will present joint work with D. Delpini from the University of Pavia. We address the statistical characterization of a linear stochastic volatility model featuring an Inverse Gamma stationary distribution for the instantaneous volatility of financial returns. We detail the derivation of the moments of the return distribution, revealing the role of the Inverse Gamma law in the emergence of fat tails, and of the relevant correlation functions. We also propose a systematic methodology for estimating the model parameters, and we describe an empirical analysis of the Standard & Poor's 500 index daily returns, confirming the ability of the model to capture many of the established stylized facts as well as the scaling properties of empirical distributions over different time horizons.

# Sylvain Arlot, “Advanced Course on Statistics”

Monday February 14 2011, 14.00 – 16.00 Aula Bianchi

Tuesday February 15 2011, 9.00 – 11.00 Aula Fermi

Thursday February 17 2011, 14.00 – 16.00 Aula Fermi

Tuesday February 22 2011, 14.00 – 16.00 Aula Dini

Wednesday February 23 2011, 9.00 – 11.00 Aula Bianchi

Scuola Normale Superiore

**Sylvain Arlot
**CNRS-INRIA and ENS, Paris

**Advanced Course on Statistics**

**Lecture 1. (Monday February 14) Statistical learning **

- the statistical learning problem
- examples: prediction, regression, classification, density estimation
- estimators: definition, consistency, examples
- universal learning rates and No Free Lunch Theorems [1]
- the estimator selection paradigm, bias-variance decomposition of the risk
- data-driven selection procedures and the unbiased risk estimation principle

**Lecture 2. (Tuesday February 15) Model selection for least-squares regression **

- ideal penalty, Mallows’ Cp
- oracle inequality for Cp (i.e., non-asymptotic optimality of the corresponding model selection procedure), corresponding learning rates [2]
- the variance estimation problem
- minimal penalties and data-driven calibration of penalties: the slope heuristics [3,4]
- algorithmic and other practical issues [5]

**Lecture 3. (Thursday February 17) Linear estimator selection for least-squares regression [6] **

- linear estimators: (kernel) ridge regression, smoothing splines, k-nearest neighbours, Nadaraya-Watson estimators
- bias-variance decomposition of the risk
- the linear estimator selection problem: CL penalty
- oracle inequality for CL
- data-driven calibration of penalties: a new light on the slope heuristics

**Lecture 4. (Tuesday February 22) Resampling and model selection **

- regressograms in heteroscedastic regression: the penalty cannot be a function of the dimensionality of the models [7]
- resampling in statistics: general heuristics, the bootstrap, exchangeable weighted bootstrap [8]
- study of a case example: estimating the variance by resampling
- resampling penalties: why do they work for heteroscedastic regression? oracle-inequality. comparison of the resampling weights [9]

**Lecture 5. (Wednesday February 23) Cross-validation and model/estimator selection [10] **

- cross-validation: principle, main examples
- cross-validation for estimating of the prediction risk: bias, variance
- cross-validation for selecting among a family of estimators: main properties, how should the splits be chosen?
- illustration of the robustness of cross-validation: detecting changes in the mean of a signal with unknown and non-constant variance [11]
- correcting the bias of cross-validation: V-fold penalization. Oracle inequality. [12]

# Lucio M. Calcagnile, “Measuring the informational efficiency of financial markets”

Wednesday February 2 2011

13:00

Scuola Normale Superiore

Aula Bianchi

**Lucio M. Calcagnile**

Scuola Normale Superiore

**Abstract
**A financial market is said to be “efficient” if it processes the available information efficiently, that is, if the agents composing it immediately assimilate and incorporate all relevant information into prices. Is it possible to establish whether a market is efficient in an absolute sense, or to quantify the degree of relative efficiency of one market with respect to another? I will discuss some recent works in the literature that attempt, with different methods, to measure relative efficiency. I will present experiments and analyses carried out on high-frequency financial time series using the framework of information theory, with Shannon entropy as the tool for measuring efficiency. Finally, I will illustrate some advantages and disadvantages that arise in the analysis of high-frequency series.