Research Framework

Multi-Model Feature Engineering Framework

Regime · Volatility · Filter · Frequency · Memory · Decomposition

Six classical and modern quantitative models, each producing a complementary feature stream that feeds the AlphaMind Prediction Engine. We don't replace statistics with deep learning — we stack them.

01 — Premise

Why six models, not one big model

No single lens

Markets are heterogeneous — regime, volatility, microstructure noise, and frequency all live on different time scales. One model can't see all of them well.

Diversification

Six independent models with non-overlapping inductive biases produce less correlated errors than any single model family. The engine sees a richer, more decorrelated feature space.

Interpretability

Each module's output is mathematically traceable. When the engine's forecast moves, we can attribute it back to which feature stream changed.

02 — The Stack

Six complementary models

Each model below produces a feature stream that gets fed into the AlphaMind Prediction Engine — together they form the engine's input embedding.

Regime

HMM Regime

Hidden Markov Model

A probabilistic state-space model that infers latent market regimes (bull / bear / ranging) from observable price dynamics. We condition every downstream prediction on the current regime — a forecast that ignores regime is, by definition, mis-specified.

P(sₜ | x₁:ₜ) ∝ P(xₜ | sₜ) · Σ_{sₜ₋₁} P(sₜ | sₜ₋₁) · P(sₜ₋₁ | x₁:ₜ₋₁)
Role in pipeline

Provides the regime label that gates which prediction head the engine routes to.

Rabiner (1989) · Hamilton (1989)
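The filtering recursion above can be sketched in a few lines of numpy. This is an illustrative stand-in, not AlphaMind's production code: the transition matrix and per-regime observation likelihoods are assumed given (in practice they would come from Baum-Welch fitting).

```python
import numpy as np

def hmm_filter(log_obs, trans, prior):
    """Forward filtering: P(s_t | x_1:t) for each bar t.

    log_obs: (T, K) log-likelihoods log P(x_t | s_t) per regime,
    trans:   (K, K) transition matrix P(s_t = j | s_{t-1} = i),
    prior:   (K,)   initial regime distribution.
    """
    T, K = log_obs.shape
    post = np.empty((T, K))
    belief = prior
    for t in range(T):
        pred = belief @ trans               # Σ P(s_t|s_{t-1}) P(s_{t-1}|x_1:t-1)
        unnorm = np.exp(log_obs[t]) * pred  # × P(x_t | s_t)
        post[t] = unnorm / unnorm.sum()     # normalise the ∝
        belief = post[t]
    return post
```

Each row of the returned posterior is the regime distribution the engine can condition on at that bar.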
Volatility

HAR-RV

Heterogeneous Autoregressive Realized Volatility

A long-memory volatility forecaster that aggregates realized variance at daily, weekly, and monthly horizons. It captures the fact that volatility today depends on volatility across multiple time scales — a property that GARCH alone misses.

RVₜ₊₁ = β₀ + β_d·RVₜ⁽ᵈ⁾ + β_w·RVₜ⁽ʷ⁾ + β_m·RVₜ⁽ᵐ⁾ + εₜ₊₁
Role in pipeline

Calibrates the engine's confidence intervals — the prediction band widens or tightens with HAR-RV's vol forecast.

Corsi (2009)
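A minimal sketch of the HAR-RV regression, assuming daily realized-variance inputs and the standard 1/5/22-day horizons; the coefficients are fit by plain OLS as in Corsi (2009). Function names are ours, for illustration.

```python
import numpy as np

def har_rv_fit(rv):
    """Fit HAR-RV by OLS: RV_{t+1} = b0 + b_d*RV_d + b_w*RV_w + b_m*RV_m.

    rv: 1-D array of daily realized variances. Returns the four betas.
    """
    rv = np.asarray(rv, float)
    d = rv[21:]                                                      # daily RV
    w = np.array([rv[t - 4:t + 1].mean() for t in range(21, len(rv))])   # 5-day avg
    m = np.array([rv[t - 21:t + 1].mean() for t in range(21, len(rv))])  # 22-day avg
    X = np.column_stack([np.ones_like(d[:-1]), d[:-1], w[:-1], m[:-1]])
    y = d[1:]                                                        # next-day target
    betas, *_ = np.linalg.lstsq(X, y, rcond=None)
    return betas

def har_rv_forecast(rv, betas):
    """One-step-ahead RV forecast from the fitted coefficients."""
    rv = np.asarray(rv, float)
    x = np.array([1.0, rv[-1], rv[-5:].mean(), rv[-22:].mean()])
    return x @ betas
```

The forecast feeds the engine's confidence-interval calibration: a larger predicted RV widens the prediction band.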
Filter

Kalman Filter

Recursive State-Space Estimator

An optimal recursive filter that fuses noisy observations with a dynamic model of the underlying state. We use it to extract the unobserved 'fair-price' component from raw quotes — feeding the engine a denoised signal instead of raw market noise.

x̂ₜ|ₜ = x̂ₜ|ₜ₋₁ + Kₜ(yₜ − Hₜ x̂ₜ|ₜ₋₁)
Role in pipeline

Pre-processes the OHLCV stream into a noise-reduced state vector before tokenization.

Kalman (1960)
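A scalar local-level Kalman filter illustrates the fair-price extraction; the state-noise and observation-noise variances q and r below are illustrative placeholders, not calibrated values, and the full pipeline would use a multivariate state.

```python
import numpy as np

def kalman_smooth_price(y, q=1e-4, r=1e-2):
    """Scalar Kalman filter with a local-level model x_t = x_{t-1} + w_t.

    y: observed (noisy) prices; q: state noise variance; r: observation
    noise variance. Returns the filtered 'fair price' estimates x̂_t|t.
    """
    x = y[0]              # initialise at the first observation
    p = 1.0               # initial state variance
    out = np.empty(len(y))
    for t, obs in enumerate(y):
        p = p + q                 # predict: x̂_t|t-1 = x̂_t-1|t-1, variance grows
        k = p / (p + r)           # Kalman gain K_t
        x = x + k * (obs - x)     # update with the innovation y_t − H x̂_t|t-1
        p = (1 - k) * p           # posterior variance
        out[t] = x
    return out
```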
Frequency

ATFNet

Adaptive Time-Frequency Network

A neural architecture that learns features jointly in the time and frequency domains via a shared adaptive backbone. It captures periodic structures (intraday seasonality, session effects) that pure time-domain models systematically miss.

h = Adaptive[ Encoder_t(x) ⊕ Encoder_f(F{x}) ]
Role in pipeline

Provides frequency-aware feature embeddings that get concatenated to the token stream.

Liu et al. (2024)
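ATFNet itself is a trained neural architecture; as a toy illustration of why frequency-domain features help, the sketch below concatenates simple time-domain statistics with the dominant FFT amplitudes. The feature choices and function name are ours, not the paper's.

```python
import numpy as np

def time_freq_features(x, k=4):
    """Toy time/frequency embedding in the spirit of ATFNet (not the paper's
    architecture): time-domain stats plus the k strongest FFT amplitudes,
    giving the engine periodicity-aware inputs.
    """
    x = np.asarray(x, float)
    t_feats = np.array([x[-1], x.mean(), x.std()])   # time-domain encoder stand-in
    spec = np.abs(np.fft.rfft(x - x.mean()))         # one-sided amplitude spectrum
    top = np.sort(spec)[::-1][:k]                    # k strongest periodic components
    return np.concatenate([t_feats, top / (len(x) / 2)])
```

A pure unit-amplitude sinusoid maps to a dominant normalised amplitude of 1 in the frequency half of the vector, which a time-domain-only encoder would have to infer indirectly.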
Memory

Hurst Exponent

R/S Long-Memory Statistic
[log-log R/S plot · slope H ≈ 0.62]

A scaling exponent estimated from the rescaled-range (R/S) statistic of a price series. H = 0.5 ⇒ random walk; H > 0.5 ⇒ persistent / trending; H < 0.5 ⇒ mean-reverting. It tells the engine which prediction prior to use.

E[R(n)/S(n)] ∼ C·n^H,  H = lim_{n→∞} (log E[R/S]) / (log n)
Role in pipeline

Selects between trend-following and mean-reverting prediction heads inside the engine.

Hurst (1951) · Mandelbrot & Wallis (1969)
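The R/S estimator can be sketched directly from the scaling law above. The window grid and bias handling are simplified relative to production use (the small-sample Anis-Lloyd correction is omitted), so estimates on short series skew slightly above 0.5.

```python
import numpy as np

def hurst_rs(x, min_n=8):
    """Estimate the Hurst exponent from the rescaled-range statistic.

    Computes the mean R(n)/S(n) over a grid of window sizes n and fits
    log(R/S) = H * log(n) + c by least squares.
    """
    x = np.asarray(x, float)
    N = len(x)
    sizes = np.unique(np.logspace(np.log10(min_n), np.log10(N // 2), 10).astype(int))
    rs = []
    for n in sizes:
        vals = []
        for start in range(0, N - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())   # cumulative deviation from the mean
            r = dev.max() - dev.min()       # range R(n)
            s = w.std()                     # scale S(n)
            if s > 0:
                vals.append(r / s)
        rs.append(np.mean(vals))
    H, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return H
```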
Decomposition

VMD

Variational Mode Decomposition

A non-recursive signal decomposition that splits a price series into a small set of band-limited oscillating modes around adaptively estimated central frequencies. We model each mode separately, then re-aggregate — a decomposition-then-forecasting pipeline.

min_{uₖ, ωₖ} Σₖ ‖∂ₜ[(δ(t) + j/(πt)) ∗ uₖ(t)] e^{−jωₖt}‖₂²  s.t.  Σₖ uₖ = f
Role in pipeline

Multi-band feature stream — high / mid / low frequency components are predicted independently.

Dragomiretskiy & Zosso (2014)
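The frequency-domain updates of Dragomiretskiy & Zosso can be sketched compactly. This minimal version fixes the Lagrangian multiplier at zero, skips boundary mirroring, and assumes an even-length input, so it is a sketch of the algorithm rather than a reference implementation.

```python
import numpy as np

def vmd(signal, K=3, alpha=2000.0, n_iter=50):
    """Minimal Variational Mode Decomposition (Dragomiretskiy & Zosso, 2014).

    Returns (modes, omegas): K band-limited modes, ordered by their centre
    frequencies. Assumes an even-length real-valued input.
    """
    N = len(signal)
    f_hat = np.fft.fft(signal)
    half = N // 2 + 1                    # positive-frequency half spectrum
    freqs = np.arange(half) / N          # normalised frequencies in [0, 0.5]
    fpos = f_hat[:half]
    u_hat = np.zeros((K, half), dtype=complex)
    omega = np.linspace(0.05, 0.45, K)   # initial centre frequencies

    for _ in range(n_iter):
        for k in range(K):
            others = u_hat.sum(axis=0) - u_hat[k]
            # Wiener-filter update around the current centre frequency
            u_hat[k] = (fpos - others) / (1.0 + 2.0 * alpha * (freqs - omega[k]) ** 2)
            power = np.abs(u_hat[k]) ** 2
            omega[k] = (freqs * power).sum() / (power.sum() + 1e-12)

    # reconstruct real modes via Hermitian-symmetric inverse FFT
    modes = np.empty((K, N))
    for k in range(K):
        full = np.zeros(N, dtype=complex)
        full[:half] = u_hat[k]
        full[-(half - 2):] = np.conj(u_hat[k][1:half - 1][::-1])
        modes[k] = np.fft.ifft(full).real
    order = np.argsort(omega)
    return modes[order], omega[order]
```

Each returned mode can then be forecast independently and the forecasts re-aggregated, which is the decomposition-then-forecasting pattern described above.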
03 — Integration

How the six streams enter the engine

Regime (HMM) · Volatility (HAR-RV) · Filter (Kalman) · Frequency (ATFNet) · Memory (Hurst) · Decomposition (VMD)
↓
Concatenate · Embed · Project
↓
AlphaMind AI Prediction Engine
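A schematic of the concatenate-embed-project step. The per-stream widths are made up for illustration, and a random untrained linear map stands in for the engine's learned projection.

```python
import numpy as np

# Illustrative only: stack the six per-bar feature streams into one input
# vector, then project to the engine's embedding width. Stream widths and
# the projection are placeholders, not AlphaMind's actual dimensions.
rng = np.random.default_rng(0)
streams = {
    "regime": rng.random(3),   # HMM state posteriors
    "vol":    rng.random(1),   # HAR-RV forecast
    "filter": rng.random(4),   # Kalman state vector
    "freq":   rng.random(8),   # ATFNet embedding
    "memory": rng.random(1),   # Hurst exponent
    "decomp": rng.random(3),   # VMD band features
}
x = np.concatenate(list(streams.values()))             # concatenate: (20,)
W = rng.standard_normal((32, x.size)) / np.sqrt(x.size)
embedding = W @ x                                      # embed & project: (32,)
```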

References

  • Rabiner, L. R. (1989). A tutorial on hidden Markov models and selected applications in speech recognition. Proceedings of the IEEE.
  • Hamilton, J. D. (1989). A new approach to the economic analysis of nonstationary time series and the business cycle. Econometrica.
  • Corsi, F. (2009). A simple approximate long-memory model of realized volatility. Journal of Financial Econometrics.
  • Kalman, R. E. (1960). A new approach to linear filtering and prediction problems. Journal of Basic Engineering.
  • Liu, H. et al. (2024). ATFNet: Adaptive Time-Frequency Ensembled Network for Long-term Time Series Forecasting.
  • Hurst, H. E. (1951). Long-term storage capacity of reservoirs. Transactions of the American Society of Civil Engineers.
  • Dragomiretskiy, K., & Zosso, D. (2014). Variational mode decomposition. IEEE Transactions on Signal Processing.
