Title: Feature Selection in Jump Models

Abstract: Jump models switch infrequently between states to fit a sequence of data while taking the ordering of the data into account. In this talk, we propose a new framework for joint feature selection, parameter and state-sequence estimation in jump models. Feature selection is necessary in high-dimensional settings where the number of features is large compared to the number of observations and the underlying states differ only with respect to a subset of the features. We develop and implement a coordinate descent algorithm that alternates between selecting the features and estimating the model parameters and state sequence, which scales to large data sets with large numbers of (noisy) features. We demonstrate the usefulness of the proposed framework by comparing it with a number of other methods on both simulated and real data in the form of financial returns, protein sequences, and text. The resulting sparse jump model outperforms all other methods considered and is remarkably robust to noise.

This is joint work with Erik Lindstrom and Peter Nystrup.
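As a rough illustration of the alternating scheme described in the abstract, the following sketch (hypothetical code, not the authors' implementation; it omits the feature-selection step and uses a squared-error loss with a fixed number of states and a fixed jump penalty) alternates a dynamic-programming update of the state sequence with a mean update for each state:

```python
import numpy as np

def fit_jump_model(X, K=2, jump_penalty=0.1, n_iter=10):
    """Hypothetical sketch of a jump model fit: alternate between
    (a) a dynamic-programming assignment of the state sequence with a
    penalty on each state switch, and (b) re-estimating state means.
    The sparse jump model of the talk adds a feature-selection step."""
    X = np.asarray(X, dtype=float)
    T, p = X.shape
    mu = X[np.linspace(0, T - 1, K).astype(int)]  # spread initial means over time
    s = np.zeros(T, dtype=int)
    for _ in range(n_iter):
        # per-observation, per-state squared-error loss
        loss = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
        # forward pass: V[t, k] = min cost of a sequence ending in state k at t
        V = loss.copy()
        for t in range(1, T):
            V[t] += np.minimum(V[t - 1], V[t - 1].min() + jump_penalty)
        # backward pass: recover the optimal state sequence
        s[-1] = V[-1].argmin()
        for t in range(T - 2, -1, -1):
            if V[t, s[t + 1]] <= V[t].min() + jump_penalty:
                s[t] = s[t + 1]
            else:
                s[t] = V[t].argmin()
        # update state means given the current sequence
        for k in range(K):
            if (s == k).any():
                mu[k] = X[s == k].mean(axis=0)
    return mu, s
```

On two well-separated synthetic regimes this recovers a piecewise-constant state sequence with a single switch.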

Title: Fourier-based methods for the management of complex insurance products

Abstract: This paper proposes a framework for the valuation and management of complex life insurance contracts whose design can be described by a portfolio of embedded options that are activated according to one or more triggering events. These events are in general monitored discretely over the life of the policy, due to the contract terms. Similar designs can also be found in other contexts, such as counterparty credit risk. The framework is based on Fourier transform methods, as they yield convenient closed-form analytical formulas for a broad spectrum of underlying dynamics. Multidimensionality issues generated by the discrete monitoring of the triggering events are dealt with via efficiently designed Monte Carlo integration strategies. We illustrate the tractability of the proposed approach by means of a detailed study of ratchet variable annuities, which can be considered a prototypical example of these complex structured products. This is joint work with Ernst Eberlein, Thorsten Schmidt and Raghid Zeineddine.
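To give a flavor of the Fourier machinery the framework rests on, here is a textbook Carr-Madan valuation of a plain European call under Black-Scholes dynamics. It is a generic building block, not the paper's ratchet-annuity method, and all parameter values are illustrative:

```python
import numpy as np
from math import log, exp, pi

def bs_cf(u, S0, r, sigma, T):
    """Characteristic function of log S_T under Black-Scholes (illustrative)."""
    m = log(S0) + (r - 0.5 * sigma ** 2) * T
    return np.exp(1j * u * m - 0.5 * sigma ** 2 * u ** 2 * T)

def carr_madan_call(S0, K, r, sigma, T, alpha=1.5, N=4096, vmax=200.0):
    """Carr-Madan Fourier pricing of a European call by direct quadrature
    of the damped transform; a standard example of the closed-form
    Fourier approach, not the paper's valuation code."""
    v = np.linspace(1e-8, vmax, N)
    k = log(K)
    num = exp(-r * T) * bs_cf(v - (alpha + 1) * 1j, S0, r, sigma, T)
    den = alpha ** 2 + alpha - v ** 2 + 1j * (2 * alpha + 1) * v
    f = np.real(np.exp(-1j * v * k) * num / den)
    dv = v[1] - v[0]  # trapezoidal rule over the damped integrand
    return exp(-alpha * k) / pi * dv * (f.sum() - 0.5 * (f[0] + f[-1]))
```

For S0 = K = 100, r = 5%, sigma = 20%, T = 1 this reproduces the closed-form Black-Scholes call price of about 10.45.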

Abstract: The very high liquidity of S&P 500 (SPX) and VIX derivatives requires that financial institutions price, hedge, and risk-manage their SPX and VIX options portfolios using models that perfectly fit market prices of both SPX and VIX futures and options, jointly. This is known to be a very difficult problem. Since VIX options started trading in 2006, many practitioners and researchers have tried to build such a model. So far the best attempts, which used parametric continuous-time jump-diffusion models on the SPX, could only produce approximate fits. In this talk we solve this long-standing puzzle for the first time using a completely different approach: a nonparametric discrete-time model. Given a VIX future maturity T1, we build a joint probability measure on the SPX at T1, the VIX at T1, and the SPX at T2 = T1 + 30 days which is perfectly calibrated to the SPX smiles at T1 and T2, and the VIX future and VIX smile at T1. Our model satisfies the martingality constraint on the SPX as well as the requirement that the VIX at T1 is the implied volatility of the 30-day log-contract on the SPX.

The model is cast as the unique solution of what we call a Dispersion-Constrained Martingale Schrödinger Problem which is solved by duality using an extension of the Sinkhorn algorithm, in the spirit of (De March and Henry-Labordère, Building arbitrage-free implied volatility: Sinkhorn's algorithm and variants, 2019). We prove that the existence of such a model means that the SPX and VIX markets are jointly arbitrage-free. The algorithm identifies joint SPX/VIX arbitrages should they arise. Our numerical experiments show that the algorithm performs very well in both low and high volatility environments. Finally, we discuss how our technique extends to continuous-time stochastic volatility models, via what we dub VIX-Constrained Martingale Schrödinger Bridges, inspired by the classical Schrödinger bridge of statistical mechanics. The resulting stochastic volatility model is numerically implemented and is shown to achieve joint calibration with very high accuracy.

Time permitting, we will also briefly discuss a few related topics:

(i) a remarkable feature of the SPX and VIX markets: the inversion of convex ordering, and how classical stochastic volatility models can reproduce it;

(ii) why, due to this inversion of convex ordering, and contrary to what has often been stated, among the continuous stochastic volatility models calibrated to the market smile, the local volatility model does not maximize the price of VIX futures.
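The classical Sinkhorn iteration that the dual solver above extends can be sketched for discrete entropy-regularized optimal transport as follows. This is the generic textbook scheme, enforcing only the two marginal constraints; the martingale and VIX constraints of the talk are not reproduced here:

```python
import numpy as np

def sinkhorn(C, mu, nu, eps=0.1, n_iter=500):
    """Textbook Sinkhorn iteration for entropy-regularized optimal
    transport between discrete marginals mu and nu with cost matrix C.
    The talk's algorithm extends this scheme with martingale and VIX
    constraints, none of which appear in this sketch."""
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)               # scale columns to match nu
        u = mu / (K @ v)                 # scale rows to match mu
    return u[:, None] * K * v[None, :]   # transport plan
```

At convergence the returned plan has row sums mu and column sums nu, and its entropy-regularized cost approximates the optimal transport cost as eps shrinks.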

Presenters: Loriano Mancini (Università della Svizzera Italiana)

Title: Portfolio choice when stock returns may disappoint: An empirical analysis based on L-moments

Abstract: We empirically examine the equity portfolio choices of investors with generalized disappointment aversion (GDA) preferences. The portfolio choice relies on a novel semi-parametric method based on L-moments which permits a large-scale empirical study. GDA investors appear to be very sensitive to higher-order L-moment returns, and to suffer large monetary utility losses from suboptimal portfolio choices such as equally weighted portfolios. These losses increase in the level of disappointment aversion and the number of stocks available for investment.
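Since the method is built on sample L-moments, a minimal sketch of the standard estimator via probability-weighted moments of the order statistics may be useful. This is the textbook estimator only; the paper's semi-parametric portfolio procedure is not reproduced:

```python
import numpy as np
from math import comb

def l_moments(x, r_max=4):
    """First r_max sample L-moments via probability-weighted moments of
    the order statistics (standard estimator; illustrative, not the
    paper's code). Requires len(x) >= r_max."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    # probability-weighted moments: b_r = n^-1 sum_i C(i, r)/C(n-1, r) * x_(i)
    b = [sum(comb(i, r) * x[i] for i in range(r, n)) / (n * comb(n - 1, r))
         for r in range(r_max)]
    # l1..l4 as the standard linear combinations of b_0..b_3
    coefs = {1: [1], 2: [-1, 2], 3: [1, -6, 6], 4: [-1, 12, -30, 20]}
    return [sum(c * b[j] for j, c in enumerate(coefs[r]))
            for r in range(1, r_max + 1)]
```

For the sample (1, 2, 3, 4) this gives l1 = 2.5 and l2 = 5/6, with l3 = l4 = 0 by the symmetry of the sample.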

Presenters: Lukasz Szpruch (University of Edinburgh)

Title: Gradient Flows for Regularized Stochastic Control Problems

Abstract: This talk is on stochastic control problems regularized by the relative entropy, where the action space is the space of measures. This setting includes relaxed control problems and problems of finding Markovian controls with the control function replaced by an idealized infinitely wide neural network, and can be extended to the search for causal optimal transport maps. By exploiting the Pontryagin optimality principle, we identify a suitable metric space on which we construct a gradient flow for the measure-valued control process along which the cost functional is guaranteed to decrease. It is shown that, under appropriate conditions, this gradient flow has an invariant measure which is the optimal control for the regularized stochastic control problem. If the problem is sufficiently convex, the gradient flow converges exponentially fast. Furthermore, the optimal measure-valued control admits a Bayesian interpretation, which means that one can incorporate prior knowledge when solving stochastic control problems. This work is motivated by a desire to extend the theoretical underpinning for the convergence of stochastic gradient type algorithms widely used in the reinforcement learning community to solve control problems.

Joint work with David Siska (Edinburgh).

Presenters: Nino Antulov-Fantulin (ETH Zurich)

Title: Complexity and Machine Learning with Financial applications

Abstract: Complexity science studies systems and problems that are composed of many components that may interact with each other in a dynamic and non-linear way. In this first part of the talk, the author will motivate and introduce several research questions and directions at the interface of complexity and machine learning: (i) the ability of neural networks to steer or control trajectories of network dynamical systems, (ii) node embedding of directed graphs and (iii) efficient Monte Carlo sampling of epidemic processes. In the second part of the talk, the author will focus on machine learning modelling for (crypto)financial markets.

Presenters: Blanka Horvath and Issa Zacharia (King’s College London)

Title: An Optimal Transport Approach to Market Regime Clustering

Abstract: The problem of rapid and automated detection of distinct market regimes is a topic of great interest to financial mathematicians and practitioners alike. In this paper, we outline an unsupervised learning algorithm that clusters a given time series – corresponding to an asset or index – into a suitable number of temporal segments (market regimes). This method – the principle of which is inspired by the well-known k-means algorithm – clusters said segments on the space of probability measures with finite p-th moment. On this space, our choice of metric is the p-Wasserstein distance. We compare our Wasserstein-kmeans approach with a more traditional implementation of the kmeans algorithm that generates clusters in Euclidean space from the first N raw moments of each log-return segment (moment-kmeans). We compare the two approaches initially on real data and validate the performance of either algorithm by studying the maximum mean discrepancy between, and within, clusters. We show that the Wasserstein-kmeans algorithm vastly outperforms the moment-based approach on both real and synthetic data. In particular, the Wasserstein-kmeans algorithm performs well even when the distribution associated with each regime is non-Gaussian.
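The Wasserstein-kmeans idea can be sketched in one dimension, where the p-Wasserstein distance between empirical measures reduces to a distance between sorted samples (empirical quantile functions). The code below is a hypothetical simplification with p = 2 and equal-length segments, not the paper's implementation:

```python
import numpy as np

def wasserstein_kmeans(segments, K=2, n_iter=20):
    """k-means on 1-D empirical distributions under the 2-Wasserstein
    metric: each segment is represented by its sorted sample (an
    empirical quantile function), distances are Euclidean between these,
    and a cluster's barycenter is the average of its quantile functions.
    Hypothetical simplification; equal-length segments are assumed."""
    Q = np.sort(np.asarray(segments, dtype=float), axis=1)
    centers = Q[np.linspace(0, len(Q) - 1, K).astype(int)]  # spread initial centers
    for _ in range(n_iter):
        # squared W2 distance between sorted samples of equal length
        d = ((Q[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for k in range(K):
            if (labels == k).any():
                centers[k] = Q[labels == k].mean(axis=0)    # W2 barycenter
    return labels, centers
```

Averaging quantile functions is exactly the 2-Wasserstein barycenter in one dimension, which is what makes the centroid update of k-means well defined on this space.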

Presenters: Andrea Barbon (University of St. Gallen)

Title: Brokers and Order Flow Leakage: Evidence from Fire Sales

Abstract: Using trade‐level data, we study whether brokers play a role in spreading order flow information in the stock market. We focus on large portfolio liquidations that result in temporary price drops, and identify the brokers who intermediate these trades. These brokers’ clients are more likely to predate on the liquidating funds than to provide liquidity. Predation leads to profits of about 25 basis points over 10 days and increases the liquidation costs of the distressed fund by 40%. This evidence suggests a role of information leakage in exacerbating fire sales.