Estimation Pipelines¶
Contrax supports both recursive and optimization-based estimation workflows.
That matters because the right question is usually not “which filter do I know?”, but “what estimation pipeline fits this system, data stream, and JAX workflow?”
Recursive Filters¶
The recursive side of the library includes:
- kalman() for linear Gaussian systems
- ekf() for nonlinear models with local Jacobian linearization
- ukf() for nonlinear models with sigma-point propagation
Each of these filters also has a one-step helper for use in runtime loops.
Use the one-step helpers when measurements arrive online or when the estimator must live inside a service loop. Use the batch filter when you want an entire offline pass over a fixed sequence.
Contrax uses an update-first batch convention. The forward pass starts from a
prior on x_0, updates with y_0, then predicts the prior on x_1.
Linear Kalman Filter¶
For the linear discrete model

$$x_{k+1} = A x_k + B u_k + w_k, \qquad y_k = C x_k + v_k, \qquad w_k \sim \mathcal{N}(0, Q), \quad v_k \sim \mathcal{N}(0, R),$$

the filter recursion is

$$\begin{aligned}
K_k &= P_{k|k-1} C^\top \left(C P_{k|k-1} C^\top + R\right)^{-1} \\
\hat{x}_{k|k} &= \hat{x}_{k|k-1} + K_k \left(y_k - C \hat{x}_{k|k-1}\right) \\
P_{k|k} &= \left(I - K_k C\right) P_{k|k-1} \\
\hat{x}_{k+1|k} &= A \hat{x}_{k|k} + B u_k \\
P_{k+1|k} &= A P_{k|k} A^\top + Q.
\end{aligned}$$
This is the right starting point when the model is already linear and the Gaussian assumptions are a reasonable first approximation.
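As a concrete sketch, the update-first pass can be written in plain JAX with `jax.lax.scan`. This is not the `kalman()` signature, and the model matrices here are illustrative stand-ins:

```python
import jax
import jax.numpy as jnp

# Illustrative linear Gaussian model (not taken from the library).
A = jnp.array([[1.0, 0.1], [0.0, 1.0]])   # state transition
C = jnp.array([[1.0, 0.0]])               # observation map
Q = 0.01 * jnp.eye(2)                     # process noise covariance
R = jnp.array([[0.1]])                    # measurement noise covariance

def step(carry, y):
    x, P = carry
    # Update with y_k first (update-first convention)...
    S = C @ P @ C.T + R                   # innovation covariance
    K = P @ C.T @ jnp.linalg.inv(S)       # Kalman gain
    x = x + K @ (y - C @ x)
    P = (jnp.eye(2) - K @ C) @ P
    filtered = (x, P)
    # ...then predict the prior on x_{k+1}.
    x, P = A @ x, A @ P @ A.T + Q
    return (x, P), filtered

x0, P0 = jnp.zeros(2), jnp.eye(2)         # prior on x_0
ys = jnp.ones((5, 1))                     # five scalar measurements
_, (xs, Ps) = jax.lax.scan(step, (x0, P0), ys)
print(xs.shape)  # (5, 2): one filtered mean per measurement
```

Because the whole pass is a single scan over fixed-shape arrays, it jit-compiles and differentiates like any other JAX function.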
Extended Kalman Filter¶
For nonlinear transition and observation maps,

$$x_{k+1} = f(x_k, u_k) + w_k, \qquad y_k = h(x_k) + v_k,$$

the EKF replaces the linear matrices with local Jacobians:

$$F_k = \left.\frac{\partial f}{\partial x}\right|_{\hat{x}_{k|k}}, \qquad H_k = \left.\frac{\partial h}{\partial x}\right|_{\hat{x}_{k|k-1}},$$

and runs the same update and predict recursion with $F_k$ in place of $A$ and $H_k$ in place of $C$.
This is the lightest nonlinear recursive path in the library. It works best when the local linearization is a good approximation over one update step.
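A minimal one-step sketch of this idea in plain JAX, with `jax.jacfwd` supplying the local Jacobians. The maps `f` and `h` and the noise covariances are illustrative stand-ins, not the `ekf()` API:

```python
import jax
import jax.numpy as jnp

def f(x):                                  # illustrative nonlinear transition
    return jnp.array([x[0] + 0.1 * x[1], 0.99 * x[1] + 0.1 * jnp.sin(x[0])])

def h(x):                                  # illustrative nonlinear observation
    return jnp.array([x[0] ** 2])

Q = 0.01 * jnp.eye(2)
R = jnp.array([[0.1]])

def ekf_step(x, P, y):
    # Update: linearize h at the predicted mean.
    H = jax.jacfwd(h)(x)                   # local observation Jacobian
    S = H @ P @ H.T + R
    K = P @ H.T @ jnp.linalg.inv(S)
    x = x + K @ (y - h(x))
    P = (jnp.eye(2) - K @ H) @ P
    # Predict: linearize f at the updated mean.
    F = jax.jacfwd(f)(x)                   # local transition Jacobian
    return f(x), F @ P @ F.T + Q

x, P = ekf_step(jnp.array([0.5, 0.0]), jnp.eye(2), jnp.array([0.3]))
print(x.shape, P.shape)  # (2,) (2, 2)
```

Autodiff makes the linearization essentially free to write: no hand-derived Jacobians are needed.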
Unscented Kalman Filter¶
The UKF keeps the same state-estimation goal but avoids local Jacobians. It starts from sigma points

$$\mathcal{X}^{(0)} = \hat{x}, \qquad \mathcal{X}^{(i)} = \hat{x} \pm \left(\sqrt{(n + \lambda) P}\right)_i, \quad i = 1, \dots, n,$$

and pushes them through the observation and transition maps:

$$\mathcal{Y}^{(i)} = h\!\left(\mathcal{X}^{(i)}\right), \qquad \mathcal{X}_+^{(i)} = f\!\left(\mathcal{X}^{(i)}\right).$$

The predicted moments are reconstructed with sigma-point weights:

$$\hat{y} = \sum_i w_i^{m}\, \mathcal{Y}^{(i)}, \qquad S = \sum_i w_i^{c} \left(\mathcal{Y}^{(i)} - \hat{y}\right)\left(\mathcal{Y}^{(i)} - \hat{y}\right)^\top + R,$$

and the update uses the cross-covariance

$$P_{xy} = \sum_i w_i^{c} \left(\mathcal{X}^{(i)} - \hat{x}\right)\left(\mathcal{Y}^{(i)} - \hat{y}\right)^\top$$

to form

$$K = P_{xy} S^{-1}, \qquad \hat{x}_{k|k} = \hat{x}_{k|k-1} + K\left(y_k - \hat{y}\right), \qquad P_{k|k} = P_{k|k-1} - K S K^\top.$$
This is the better fit when the local Jacobian story is too crude but you still want a recursive estimator rather than a full horizon solve.
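A plain-JAX sketch of the sigma-point machinery, assuming the standard scaled unscented transform with illustrative parameters (this is not the `ukf()` implementation):

```python
import jax.numpy as jnp

def sigma_points(x, P, alpha=1.0, beta=2.0, kappa=0.0):
    """Generate 2n+1 sigma points and their weights (scaled form)."""
    n = x.shape[0]
    lam = alpha ** 2 * (n + kappa) - n
    L = jnp.linalg.cholesky((n + lam) * P)         # matrix square root
    pts = jnp.vstack([x[None, :], x + L.T, x - L.T])
    wm = jnp.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wm = wm.at[0].set(lam / (n + lam))             # mean weights
    wc = wm.at[0].add(1.0 - alpha ** 2 + beta)     # covariance weights
    return pts, wm, wc

def unscented_moments(pts_out, wm, wc, noise_cov):
    """Reconstruct mean and covariance from propagated sigma points."""
    mean = wm @ pts_out
    d = pts_out - mean
    return mean, (wc[:, None] * d).T @ d + noise_cov

h = lambda x: jnp.array([x[0] ** 2])               # illustrative observation map
x, P = jnp.array([0.5, 0.0]), jnp.eye(2)
pts, wm, wc = sigma_points(x, P)
ys = jnp.stack([h(p) for p in pts])                # push points through h
y_hat, S = unscented_moments(ys, wm, wc, jnp.array([[0.1]]))
# Cross-covariance for the gain K = P_xy S^{-1}:
P_xy = (wc[:, None] * (pts - x)).T @ (ys - y_hat)
print(P_xy.shape)  # (2, 1)
```

The same moment-matching helper serves both the predict step (through the transition map) and the update step (through the observation map).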
Smoothers¶
Recursive filters consume information causally. Smoothers revisit the sequence with future information available.
Contrax provides:
- rts() for filtered linear Kalman results
- uks() for filtered unscented Kalman results
These are offline tools. They are not replacements for runtime loops, but they are valuable when you want a retrospective state estimate or a comparison target for an optimization-based method.
RTS Smoother¶
For the linear filter, the Rauch-Tung-Striebel backward pass is

$$G_k = P_{k|k} A^\top P_{k+1|k}^{-1}, \qquad \hat{x}_{k|T} = \hat{x}_{k|k} + G_k\left(\hat{x}_{k+1|T} - \hat{x}_{k+1|k}\right), \qquad P_{k|T} = P_{k|k} + G_k\left(P_{k+1|T} - P_{k+1|k}\right) G_k^\top.$$
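The backward pass can be sketched as a reversed `jax.lax.scan` over the filtered and one-step-ahead predicted moments. The transition matrix and the dummy inputs here are illustrative, and this is not the `rts()` signature:

```python
import jax
import jax.numpy as jnp

A = jnp.array([[1.0, 0.1], [0.0, 1.0]])   # same transition as the forward pass

def rts_backward(xf, Pf, xp, Pp):
    """Backward pass over filtered (xf, Pf) and one-step-ahead predicted
    (xp, Pp) moments; xp[k], Pp[k] are the priors on x_{k+1}."""
    def step(carry, inputs):
        xs_next, Ps_next = carry
        xf_k, Pf_k, xp_k, Pp_k = inputs
        G = Pf_k @ A.T @ jnp.linalg.inv(Pp_k)      # smoother gain
        xs = xf_k + G @ (xs_next - xp_k)
        Ps = Pf_k + G @ (Ps_next - Pp_k) @ G.T
        return (xs, Ps), (xs, Ps)
    init = (xf[-1], Pf[-1])                        # smoothed = filtered at T
    _, (xs, Ps) = jax.lax.scan(
        step, init, (xf[:-1], Pf[:-1], xp[:-1], Pp[:-1]), reverse=True)
    return jnp.concatenate([xs, xf[-1:]]), jnp.concatenate([Ps, Pf[-1:]])

# Dummy filtered / predicted moments just to exercise the shapes.
T = 4
xf = jnp.arange(T * 2, dtype=jnp.float32).reshape(T, 2)
Pf = jnp.tile(jnp.eye(2), (T, 1, 1))
xp = jax.vmap(lambda x: A @ x)(xf)
Pp = jax.vmap(lambda P: A @ P @ A.T + 0.01 * jnp.eye(2))(Pf)
xs, Ps = rts_backward(xf, Pf, xp, Pp)
print(xs.shape)  # (4, 2)
```

`reverse=True` keeps the pass as a single fixed-shape scan, so the smoother composes with jit and grad just like the forward filter.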
Unscented RTS-Style Smoother¶
For uks(), the same backward structure is used, but the smoother gain is built from the unscented cross-covariance:

$$P_{k,k+1} = \sum_i w_i^{c} \left(\mathcal{X}^{(i)} - \hat{x}_{k|k}\right)\left(f\!\left(\mathcal{X}^{(i)}\right) - \hat{x}_{k+1|k}\right)^\top, \qquad G_k = P_{k,k+1}\, P_{k+1|k}^{-1}.$$
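A sketch of that gain, assuming sigma points regenerated from the filtered moments and an illustrative transition map (not the uks() internals):

```python
import jax.numpy as jnp

def f(x):                                           # illustrative transition map
    return jnp.array([x[0] + 0.1 * x[1], 0.99 * x[1]])

def unscented_smoother_gain(x_filt, wm, wc, pts):
    """Smoother gain G_k = P_{k,k+1} P_{k+1|k}^{-1}, with both moments
    rebuilt from the same filtered-state sigma points."""
    pts_next = jnp.stack([f(p) for p in pts])       # propagate sigma points
    x_pred = wm @ pts_next                          # predicted mean
    d_next = pts_next - x_pred
    P_pred = (wc[:, None] * d_next).T @ d_next + 0.01 * jnp.eye(2)
    P_cross = (wc[:, None] * (pts - x_filt)).T @ d_next
    return P_cross @ jnp.linalg.inv(P_pred)

# Sigma points from illustrative filtered moments (lambda = 0 weights).
x_filt, P_filt = jnp.array([0.5, 0.0]), jnp.eye(2)
n = 2
L = jnp.linalg.cholesky(n * P_filt)
pts = jnp.vstack([x_filt[None, :], x_filt + L.T, x_filt - L.T])
wm = jnp.full(2 * n + 1, 1.0 / (2 * n)).at[0].set(0.0)
wc = wm.at[0].add(2.0)
G = unscented_smoother_gain(x_filt, wm, wc, pts)
print(G.shape)  # (2, 2)
```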
MHE As A Sibling Workflow¶
Moving-horizon estimation is not “just another Kalman filter.” In Contrax it is treated as a sibling fixed-horizon optimization workflow:
- an arrival cost anchors the start of the window
- process costs enforce model consistency across transitions
- measurement costs enforce agreement with the observed outputs
- optional extra costs let the user express soft constraints or domain terms
That is why mhe_objective() exists as a first-class pure cost function. It
lets you keep the model, horizon, and objective explicit rather than burying
them inside a solver wrapper.
The fixed-horizon objective is the optimization sibling of the recursive filters:

$$J(x_{0:N}) = \left\|x_0 - \bar{x}_0\right\|_{P_0^{-1}}^{2} + \sum_{k=0}^{N-1} \left\|x_{k+1} - f(x_k, u_k)\right\|_{Q^{-1}}^{2} + \sum_{k=0}^{N} \left\|y_k - h(x_k)\right\|_{R^{-1}}^{2}.$$
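A minimal pure-JAX sketch of such a cost, with illustrative `f`, `h`, and weight matrices; the actual mhe_objective() signature is not spelled out on this page, so this only shows the arrival / process / measurement structure:

```python
import jax
import jax.numpy as jnp

def f(x):                                  # illustrative transition map
    return jnp.array([x[0] + 0.1 * x[1], 0.99 * x[1]])

def h(x):                                  # illustrative observation map
    return x[:1]

def mhe_cost(xs, ys, x_prior, P0inv, Qinv, Rinv):
    """Fixed-horizon objective: arrival + process + measurement terms.
    xs has shape (N + 1, n) and is the decision variable."""
    arrival = (xs[0] - x_prior) @ P0inv @ (xs[0] - x_prior)
    w = xs[1:] - jax.vmap(f)(xs[:-1])       # transition residuals
    process = jnp.sum(jnp.einsum("ki,ij,kj->k", w, Qinv, w))
    v = ys - jax.vmap(h)(xs)                # measurement residuals
    meas = jnp.sum(jnp.einsum("ki,ij,kj->k", v, Rinv, v))
    return arrival + process + meas

xs = jnp.zeros((6, 2))                      # horizon of N = 5 transitions
ys = jnp.ones((6, 1))
args = (ys, jnp.zeros(2), jnp.eye(2), 10.0 * jnp.eye(2), jnp.eye(1))
cost = mhe_cost(xs, *args)
grad = jax.grad(mhe_cost)(xs, *args)        # differentiable in the window states
print(cost, grad.shape)
```

Because the cost is a pure function of explicit arrays, any JAX-compatible optimizer can minimize it over the window, and the model, horizon, and weights stay visible at the call site.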
Choosing Between The Current Paths¶
Prefer:
- kalman() when the model is linear and Gaussian assumptions are a good first approximation
- ekf() when a local linearization is a reasonable approximation and you want the lightest nonlinear recursive path
- ukf() when sigma-point propagation is a better fit than local Jacobians
- rts() or uks() when you need an offline smoothed trajectory
- mhe_objective() or mhe() when you want a fixed-window optimization-based estimate with explicit costs
Transform Contracts¶
The estimation surface is designed around fixed-shape JAX workflows:
- batch filters are scans
- one-step helpers can live inside larger scans
- missing-measurement handling avoids Python branching on traced values
- mhe_objective() is a pure JAX cost over explicit arrays
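One common branch-free pattern for missing measurements, sketched here as an assumption about the general technique rather than the library's exact mechanism, is to compute the update unconditionally and select the result with `jnp.where`:

```python
import jax.numpy as jnp

C = jnp.array([[1.0, 0.0]])
R = jnp.array([[0.1]])

def masked_update(x, P, y, valid):
    """Apply the Kalman update only where `valid` is nonzero; no Python
    branching on traced values, so this stays scan- and jit-friendly."""
    S = C @ P @ C.T + R
    K = P @ C.T @ jnp.linalg.inv(S)
    x_new = x + K @ (y - C @ x)
    P_new = (jnp.eye(2) - K @ C) @ P
    x = jnp.where(valid, x_new, x)          # fixed shapes either way
    P = jnp.where(valid, P_new, P)
    return x, P

x, P = masked_update(jnp.zeros(2), jnp.eye(2), jnp.array([1.0]), 0.0)
print(x)  # unchanged: the measurement was marked missing
```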
That makes estimation pipelines compatible with the broader Contrax story: compiled, differentiable, and composable.