Regression is an important tool in trading (witness the number of traders that rely on moving averages of various sorts).    I don’t directly use regressors to generate trading signals, but I do find them useful in denoising signal output.

Aside from the obvious caveat about the past predicting the future, there are other issues with regressors:

1. lag: denoising necessarily involves averaging of some sort, resulting in lag relative to the underlier
2. parameterization:  what parameter settings bring out the features of interest

The simplest regressors are ARMA-based FIR or IIR filters.   Lag is easy to quantify as phase delay in those systems, and harder in others.   Rather than focusing on lag, I want to consider the parameterization.
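As a quick sketch of the lag claim (my illustration, not code from the post): an exponential MA is a one-pole IIR filter, and on a linear trend its steady-state delay works out to (1 − α)/α samples, so the lag is easy to quantify directly.

```python
# Illustrative only: exponential MA as a one-pole IIR filter,
#   y[t] = alpha * x[t] + (1 - alpha) * y[t-1]
# On a ramp input, the steady-state lag is (1 - alpha)/alpha samples.

def ema(xs, alpha):
    """Exponential moving average; alpha in (0, 1]."""
    out = [xs[0]]
    for x in xs[1:]:
        out.append(alpha * x + (1.0 - alpha) * out[-1])
    return out

ramp = list(range(300))                       # a pure linear trend
for alpha in (0.5, 0.05):
    lag = ramp[-1] - ema(ramp, alpha)[-1]     # observed delay at the end point
    print(f"alpha={alpha}: lag ~ {lag:.3f} vs theory {(1 - alpha) / alpha:.3f}")
```

A short window (large α) lags by ~1 sample; a long window (α = 0.05) lags by ~19 samples, which is exactly the trend-masking problem discussed next.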

Parameterization
To illustrate the problem of parameterization, consider a simple exponential MA in two market scenarios:

1. market with strong trends: long windows mask tradeable market movements; a shorter window (or “tau”) is needed to capture the market movements of interest
2. market dominated by noise: a short-windowed MA oscillates on small movements; a longer window is needed to reduce or eliminate noise that is not tradeable
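The tradeoff in the two scenarios above can be seen on a synthetic trending-but-noisy series (a sketch with arbitrary seed and parameters, not the post's code):

```python
# Sketch of the window tradeoff on a synthetic trending-but-noisy series
# (the seed, drift, and alphas are arbitrary illustrative choices).
import random

def ema(xs, alpha):
    out = [xs[0]]
    for x in xs[1:]:
        out.append(alpha * x + (1.0 - alpha) * out[-1])
    return out

def direction_changes(xs):
    """Count sign flips in the first difference -- a crude 'oscillation' measure."""
    diffs = [b - a for a, b in zip(xs, xs[1:])]
    return sum(1 for a, b in zip(diffs, diffs[1:]) if a * b < 0)

random.seed(7)
series = [0.02 * t + random.gauss(0.0, 1.0) for t in range(1000)]

short_ma = ema(series, 0.5)    # short window: tracks fast, oscillates on noise
long_ma  = ema(series, 0.01)   # long window: smooth, but lags the trend
print(direction_changes(short_ma), direction_changes(long_ma))
```

The short-windowed MA flips direction far more often than the long one, which is precisely the noise that makes it untradeable despite its lower lag.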

While I don’t use MAs for trade entry, the general problem of adapting a regressor to features of interest is important.

Penalized Least Squares
The penalized least-squares spline is known to be  the “best linear unbiased predictor”  for series that can be modeled by:

y_i = f(x_i) + ε_i,   ε_i ~ N(0, σ²)

where f(x) is typically a polynomial-based function (typically a high-dimensional basis function).   Characteristic of the penalized family of splines is the balance between the least-squares fit and a curvature penalty:

minimize   Σ_i [y_i − f(x_i)]² + λ ∫ f″(x)² dx

This minimization can be cast as a matrix-based system using the basis design matrix.  I’m not going to go into this here, but you can find many papers on it.  The formulation is straightforward, but it is very easy to run into numerical instabilities with naive solutions (trust me, I’ve tried), so your best bet is to use one of the tried and tested implementations (such as de Boor’s).
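To make this concrete without hand-rolling the (numerically touchy) design-matrix solution, one vetted implementation is SciPy's `make_smoothing_spline` (SciPy ≥ 1.10), which minimizes exactly this penalized least-squares objective with an explicit `lam` parameter. This is my sketch on synthetic data, not the post's implementation:

```python
# Hedged sketch: fit the penalized least-squares spline at several penalty
# levels using SciPy's make_smoothing_spline (SciPy >= 1.10), which takes
# the curvature penalty weight as `lam`.
import numpy as np
from scipy.interpolate import make_smoothing_spline

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
y = np.sin(x) + rng.normal(0.0, 0.3, x.size)   # synthetic noisy underlier

for lam in (1e-4, 1e-2, 1.0, 100.0):           # four levels of the penalty
    spline = make_smoothing_spline(x, y, lam=lam)
    sse = float(np.sum((spline(x) - y) ** 2))
    print(f"lam={lam:g}: SSE={sse:.3f}")       # SSE is nondecreasing in lam
```

Small `lam` chases the noise (near-interpolation); large `lam` stiffens the fit toward a line, raising the residual sum of squares.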

Ok, the problem with the above is that the parameter λ is a free variable (i.e. an input into the minimization).   λ allows us to control the degree of curvature or oscillating behavior.   Here is the same series with 4 different levels of λ (underlier in black):

Flexibility is great.  Now how do I choose λ appropriately?   And how do I define appropriate?

Criteria
As mentioned above, an incorrect choice of regression parameters results in a regressor that is either too noisy or misses features.

Now before I explain the criteria (heuristics really) that I came up with, let me point to some literature tackling the general concept.   Tatyana Krivobokova, Ciprian M. Crainiceanu, and Goran Kauermann, “Fast Adaptive Penalized Splines” (2007).   Their approach produces an evolving λ, one for each of the truncated basis functions through time, chosen so as to reduce the local error while keeping enough error to be optimally cross-validated.

Though the above is interesting, and indeed produces some amazing results for certain data sets, the “smoothness criteria” are fundamentally different from what I am looking for.

I decided that my criteria are as follows:

1. the amplitudes between min/maxima in the spline must meet some minimum amplitude-time
2. the energy of the spline must be “close” to the energy of the original series

The rationale for the 1st point is that we do not want small oscillations in the spline (signifying that we need to tune for less noise).   The second point tunes in the other direction, that is, if the spline is too stiff, missing many features, the energy of the spline will be too low relative to the original series.

Algorithm
The two above criteria break down into:

1. the integral between a maximum and minimum must be ≥ some threshold
2. the integral of f(x)², where f(x) is the spline, must be close to that of the original series
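The two checks can be sketched numerically on a sampled curve (the helper names here are mine, not the post's):

```python
# Hedged sketch of the two criteria for a curve f sampled on grid x.
import numpy as np

def energy(x, f):
    """Integral of f(x)^2 via the trapezoid rule."""
    g = f * f
    return float(np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(x)))

def extremum_amplitudes(x, f):
    """Absolute swings between successive local extrema of the sampled curve."""
    d = np.diff(f)
    idx = np.where(d[:-1] * d[1:] < 0)[0] + 1   # slope sign changes -> extrema
    pts = np.concatenate(([f[0]], f[idx], [f[-1]]))
    return np.abs(np.diff(pts))

x = np.linspace(0.0, 2.0 * np.pi, 2001)
f = np.sin(x)
print(energy(x, f))                    # ~ pi: integral of sin^2 over one period
print(len(extremum_amplitudes(x, f)))  # 3 swings: 0 -> 1 -> -1 -> 0
```

Criterion 1 would reject a fit whose smallest swing (times its duration) falls below the threshold; criterion 2 compares `energy` of the spline against `energy` of the raw series.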

As I did not see an easy way of building these criteria into a system of equations, I took the “poor man’s algorithm” approach, namely:

1. binary-style search between low and high values of λ
2. if the amplitude/area < threshold, choose a higher λ, else a lower one
3. repeat until some granularity is reached
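The steps above can be sketched as a log-space bisection (my names and thresholds, all illustrative). It assumes the amplitude criterion is monotone in λ: stiffer fits wipe out small oscillations, so the minimum swing between extrema grows with λ.

```python
# Hedged sketch of the "poor man's" bisection over lambda.

def tune_lambda(min_swing, threshold, lam_lo, lam_hi, iters=40):
    """Binary-style search for the smallest lam whose minimum extremum
    swing meets `threshold`.  `min_swing(lam)` evaluates criterion #1
    for a spline fit at that lam."""
    for _ in range(iters):
        mid = (lam_lo * lam_hi) ** 0.5   # bisect in log-space
        if min_swing(mid) < threshold:
            lam_lo = mid                 # still too noisy -> stiffen
        else:
            lam_hi = mid                 # smooth enough -> relax
    return lam_hi

# Sanity check with a toy monotone criterion standing in for the spline fit:
lam = tune_lambda(lambda l: l, threshold=5.0, lam_lo=1e-3, lam_hi=1e3)
print(lam)   # converges to ~5.0
```

In practice `min_swing` would refit the penalized spline at each trial λ and return the smallest amplitude-time between adjacent extrema, with the energy criterion bounding the search from the stiff side.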

Works well!

Filed under strategies

### 3 responses to “Adaptive Regressor”

1. Bob

I like a pretty spline as much as the next guy but… how can they possibly be useful in on-line processing (e.g. real-time trading)? Spline algos use future data to smooth the past. That is, they are ‘non-causal’. Great for smoothing data after the fact, but not for filtering it in real-time. You can see this in your figure. Please correct me if I’m wrong…

nice blog btw.

• tr8dr

The only useful part of the spline is what the end point tells you in the context of trading. You have correctly observed that past points on the spline will evolve as new points become available. Consider that most technical indicators are based on regression. I don’t use technical indicators, but it does point to widespread use of regressors in trading.

Because the spline represents a maximum likelihood estimate of Y|X, it is no better or worse than an online stochastic state system of similar equations. To do better would require a state system with other variables, something that I didn’t need for the simple task at hand.

• tr8dr

Amending the above. A state system will do a bit better than an iteratively applied regressor. The derivatives of the regressor end-point will tend to jump around a bit as new data is added.

In this regard, a state system will tend to do a bit better (from an end-point noise point of view); however, it is much more difficult to control and parameterize for the features I want …