AI Navigate

The Temporal Markov Transition Field

arXiv cs.LG / 11 March 2026


Key Points

  • The Temporal Markov Transition Field (TMTF) extends the Markov Transition Field (MTF) by partitioning a time series into contiguous temporal chunks, with the aim of better capturing changes in the series' dynamics.
  • Rather than relying on a single global matrix, the TMTF estimates a separate transition matrix for each chunk, so temporal regime changes, and their timing, are reflected in the representation.
  • The resulting TMTF image encodes distinct local transition dynamics as horizontal bands; it is order-preserving and amplitude-agnostic, making it well suited as an input to convolutional neural networks for time-series tasks.
  • The paper gives a formal definition, analyses the structural properties of the representation, provides a numerical example highlighting the differences from the global MTF, and discusses a geometric interpretation of the local transition matrices in terms of process properties such as persistence and mean reversion.
  • The method addresses the bias-variance trade-off introduced by chunking and improves the interpretability and usefulness of Markov Transition Fields for characterising time series under non-stationary conditions.
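As background for the bullets above, the global MTF of Wang and Oates that the TMTF extends can be sketched as follows. This is a minimal illustration, not the paper's code; the function name `mtf` and the binning details (quantile edges fed to `np.digitize`) are our own assumptions.

```python
import numpy as np

def mtf(series, n_bins=8):
    """Global Markov Transition Field (after Wang and Oates, 2015).

    Bins the series into quantile states, estimates ONE transition
    matrix from the whole series, and maps each pair of time steps
    (i, j) to the transition probability from state(i) to state(j).
    """
    series = np.asarray(series, dtype=float)
    # Quantile binning: n_bins - 1 internal edges -> states 0..n_bins-1.
    edges = np.quantile(series, np.linspace(0, 1, n_bins + 1)[1:-1])
    states = np.digitize(series, edges)

    # Count adjacent-pair transitions, then row-normalise.
    W = np.zeros((n_bins, n_bins))
    for a, b in zip(states[:-1], states[1:]):
        W[a, b] += 1
    row_sums = W.sum(axis=1, keepdims=True)
    W = np.divide(W, row_sums, out=np.zeros_like(W), where=row_sums > 0)

    # Image: M[i, j] = P(state_j | state_i) under the single global matrix.
    return W[states[:, None], states[None, :]]
```

Because one matrix is shared by every row, the image cannot show *when* a regime was active — the limitation the TMTF is designed to fix.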

Computer Science > Machine Learning

arXiv:2603.08803 (cs)
[Submitted on 9 Mar 2026]

Title: The Temporal Markov Transition Field

Abstract: The Markov Transition Field (MTF), introduced by Wang and Oates (2015), encodes a time series as a two-dimensional image by mapping each pair of time steps to the transition probability between their quantile states, estimated from a single global transition matrix. This construction is efficient when the transition dynamics are stationary, but produces a misleading representation when the process changes regime over time: the global matrix averages across regimes and the resulting image loses all information about when each dynamical regime was active. In this paper we introduce the Temporal Markov Transition Field (TMTF), an extension that partitions the series into $K$ contiguous temporal chunks, estimates a separate local transition matrix for each chunk, and assembles the image so that each row reflects the dynamics local to its chunk rather than the global average. The resulting $T \times T$ image has $K$ horizontal bands of distinct texture, each encoding the transition dynamics of one temporal segment. We develop the formal definition, establish the key structural properties of the representation, work through a complete numerical example that makes the distinction from the global MTF concrete, analyse the bias--variance trade-off introduced by temporal chunking, and discuss the geometric interpretation of the local transition matrices in terms of process properties such as persistence, mean reversion, and trending behaviour. The TMTF is amplitude-agnostic and order-preserving, making it suitable as an input channel for convolutional neural networks applied to time series characterisation tasks.
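The construction described in the abstract can be sketched in code. This is a minimal illustration under our own assumptions (equal-length chunks, transition counts taken only within a chunk, no smoothing of sparse rows); the names `tmtf` and `chunk_of` are ours, not from the paper.

```python
import numpy as np

def tmtf(series, n_bins=8, n_chunks=4):
    """Temporal Markov Transition Field: a sketch following the abstract.

    Partitions the series into n_chunks contiguous chunks, estimates a
    local transition matrix per chunk, and builds a T x T image in which
    row i uses the matrix of the chunk containing time step i. The
    result shows n_chunks horizontal bands, one per local regime.
    """
    series = np.asarray(series, dtype=float)
    T = len(series)
    edges = np.quantile(series, np.linspace(0, 1, n_bins + 1)[1:-1])
    states = np.digitize(series, edges)              # quantile states 0..n_bins-1
    chunk_of = np.minimum(np.arange(T) * n_chunks // T, n_chunks - 1)

    # One local transition matrix per chunk, counted within the chunk only.
    Ws = np.zeros((n_chunks, n_bins, n_bins))
    for t in range(T - 1):
        if chunk_of[t] == chunk_of[t + 1]:
            Ws[chunk_of[t], states[t], states[t + 1]] += 1
    sums = Ws.sum(axis=2, keepdims=True)
    Ws = np.divide(Ws, sums, out=np.zeros_like(Ws), where=sums > 0)

    # Row i is drawn from its own chunk's matrix: image[i, j] = W_k(i)[s_i, s_j],
    # which produces the K horizontal bands described in the abstract.
    return Ws[chunk_of[:, None], states[:, None], states[None, :]]
```

Note the trade-off the paper analyses: larger `n_chunks` localises the dynamics better (lower bias) but each local matrix is estimated from fewer transitions (higher variance).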
Subjects: Machine Learning (cs.LG); Machine Learning (stat.ML)
Cite as: arXiv:2603.08803 [cs.LG]
  (or arXiv:2603.08803v1 [cs.LG] for this version)
  https://doi.org/10.48550/arXiv.2603.08803

Submission history

From: Dr. Michael Leznik
[v1] Mon, 9 Mar 2026 18:04:40 UTC (110 KB)