On Consistency of Signature Using Lasso

arXiv stat.ML / 3/24/2026


Key Points

  • The paper analyzes when learning time-series signatures via Lasso regression is statistically consistent, proving results for both asymptotic and finite-sample regimes.
  • It shows that Lasso-based estimation aligns better with one signature notion or the other (Itô vs Stratonovich) depending on whether the underlying process resembles Brownian motion or is mean-reverting.
  • The authors study the role of process properties, finding that weaker inter-dimensional correlations and Brownian closeness improve consistency for the Itô signature.
  • They validate the theory numerically and show that signature methods with Lasso can achieve high-accuracy learning of nonlinear functions and option prices, with performance sensitive to both process characteristics and signature choices.
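For discrete paths, the Itô-vs-Stratonovich distinction in the key points above comes down to how the level-two iterated integrals are discretized: Itô-type terms use left-endpoint sums, while Stratonovich-type terms use midpoint (trapezoidal) sums. A minimal numpy sketch of this difference (the function name and toy path are our own, not from the paper):

```python
import numpy as np

def level2_signature(path, stratonovich=False):
    """Level-two signature terms S^(i,j) of a discrete d-dimensional path.

    Ito convention:          sum_k X^i_k * (X^j_{k+1} - X^j_k)   (left endpoint)
    Stratonovich convention: trapezoidal sums (midpoint values).
    """
    increments = np.diff(path, axis=0)        # shape (n-1, d)
    if stratonovich:
        base = 0.5 * (path[:-1] + path[1:])   # midpoint values
    else:
        base = path[:-1]                      # left endpoints
    # S[i, j] = sum_k base^i_k * dX^j_k
    return base.T @ increments

# A toy 2-d path; the two conventions differ by half the quadratic covariation.
path = np.array([[0.0, 0.0], [1.0, 0.5], [1.5, 2.0], [2.0, 2.5]])
ito = level2_signature(path)
strat = level2_signature(path, stratonovich=True)
```

Note the diagonal Stratonovich terms satisfy the classical chain rule, e.g. `strat[0, 0]` equals half the squared displacement of the first coordinate, whereas the Itô terms carry the extra quadratic-variation correction.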

Abstract

Signatures are iterated path integrals of continuous and discrete-time processes, and their universal nonlinearity linearizes the problem of feature selection in time series data analysis. This paper studies the consistency of signature using Lasso regression, both theoretically and numerically. We establish conditions under which the Lasso regression is consistent both asymptotically and in finite sample. Furthermore, we show that the Lasso regression is more consistent with the Itô signature for time series and processes that are closer to the Brownian motion and with weaker inter-dimensional correlations, while it is more consistent with the Stratonovich signature for mean-reverting time series and processes. We demonstrate that signature can be applied to learn nonlinear functions and option prices with high accuracy, and the performance depends on properties of the underlying process and the choice of the signature.
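The setting the abstract describes, regressing a path functional on truncated signature terms and letting the L1 penalty select a sparse set of them, can be sketched end-to-end with numpy alone. The coordinate-descent Lasso below is our own minimal implementation (not the authors' code), and the target, the Lévy area of a simulated 2-d Brownian path, is a toy choice that happens to be exactly sparse in level-two Itô features:

```python
import numpy as np

def soft_threshold(rho, lam):
    return np.sign(rho) * max(abs(rho) - lam, 0.0)

def lasso_cd(X, y, lam, n_iter=300):
    """Minimal coordinate-descent Lasso for (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed
            resid = y - X @ beta + X[:, j] * beta[j]
            beta[j] = soft_threshold(X[:, j] @ resid / n, lam) / col_sq[j]
    return beta

def ito_features(path):
    """Level-1 terms plus level-2 Ito iterated sums of a 2-d discrete path."""
    inc = np.diff(path, axis=0)
    s2 = path[:-1].T @ inc                       # left-point (Ito) sums
    return np.concatenate([path[-1], s2.ravel()])

rng = np.random.default_rng(0)
# 300 random-walk paths approximating 2-d Brownian motion on [0, 1]
paths = np.cumsum(rng.standard_normal((300, 50, 2)) / np.sqrt(50), axis=1)
X = np.array([ito_features(p) for p in paths])   # columns: X1, X2, S11, S12, S21, S22
y = X[:, 3] - X[:, 4]                            # target: the Levy area S12 - S21

beta = lasso_cd(X, y, lam=0.01)
```

With a small penalty and a noiseless sparse target, the fit concentrates on the two Lévy-area columns and shrinks the rest toward zero, which is the sparse-selection behavior whose consistency the paper analyzes.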