Sparse Estimation Backend Crosswalk

Visual: sparse backend decision map from Jacobian structure to factorization, Schur, marginalization, covariance, and PCG.

Backend decision matrix

Sparse Estimation Backend Crosswalk is the routing page for rank and nullspaces; sparsity, ordering, and fill-in; factorization; the Schur complement for solving; marginalization prior construction; covariance recovery; and PCG stagnation diagnostics.

Each backend below is summarized along six axes: SPD assumptions, rank robustness, sparsity/fill behavior, covariance recovery, runtime/memory, and diagnostic value.

Cholesky
SPD assumptions: Requires symmetric positive definite normal equations or damped approximation.
Rank robustness: Poor when rank deficient; failure is useful.
Sparsity/fill behavior: Very fast when ordering controls fill.
Covariance recovery: Selected covariance possible from factors if rank valid.
Runtime/memory: Excellent for well-posed sparse systems.
Diagnostic value: Factor failure exposes SPD, gauge, and conditioning issues.

LDLT
SPD assumptions: Handles symmetric indefinite systems better than Cholesky depending on pivoting.
Rank robustness: More diagnostic for indefiniteness, still not full rank-revealing.
Sparsity/fill behavior: Fill depends on ordering and pivoting.
Covariance recovery: Useful for debugging signs and pivots.
Runtime/memory: Slightly higher overhead than Cholesky.
Diagnostic value: Pivots indicate indefiniteness or weak modes.

QR
SPD assumptions: Does not require normal-equation SPD.
Rank robustness: More robust than normal equations.
Sparsity/fill behavior: Can create more fill than Cholesky but avoids squaring the condition number.
Covariance recovery: Square-root covariance paths are natural.
Runtime/memory: More expensive but safer for debugging and some production problems.
Diagnostic value: Reveals rank and conditioning better than normal equations.

SVD
SPD assumptions: No SPD assumption.
Rank robustness: Strongest rank and nullspace diagnostic.
Sparsity/fill behavior: Usually a dense or small-case tool.
Covariance recovery: Direct pseudoinverse and covariance diagnostics under a rank threshold.
Runtime/memory: Expensive; often used on reduced snapshots.
Diagnostic value: Best for explaining weak modes and threshold sensitivity.

Schur
SPD assumptions: Reduced system should be SPD after valid elimination and damping.
Rank robustness: Depends on invertible eliminated blocks.
Sparsity/fill behavior: Can reduce problem size but create dense reduced blocks.
Covariance recovery: Camera/pose covariance possible, but fill and rank matter.
Runtime/memory: Excellent for bundle adjustment structure; risky for dense separators.
Diagnostic value: Separates eliminated-variable issues from the kept-variable solve.

PCG
SPD assumptions: Requires a symmetric positive definite matrix-vector operator.
Rank robustness: Weak for rank diagnosis unless monitored carefully.
Sparsity/fill behavior: Avoids explicit factor fill; the preconditioner controls convergence.
Covariance recovery: Not a covariance recovery method by itself.
Runtime/memory: Low memory; scalable if preconditioned well.
Diagnostic value: Residual norms and stagnation expose conditioning or operator bugs.

Rank/nullspaces/conditioning

Rank and nullspaces answer whether the local linear system has enough independent information. Gauge freedom is an expected nullspace caused by arbitrary global coordinates such as pose-graph translation and yaw. Rank deficiency outside the expected gauge usually means missing constraints, bad geometry, or invalid elimination.

Conditioning is different. A weak mode can be technically observable while still producing fragile steps and misleading covariance. Rank thresholds, damping, and priors can make a system appear full rank while covariance remains nonsensical. The diagnostic sequence is: expected gauge dimension, singular spectrum, weak-mode vectors, condition estimate, then physical interpretation.
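
A minimal sketch of that sequence on a small dense Jacobian snapshot; `rank_report` and its threshold rule are illustrative assumptions, not a library API, and production sparse paths need rank-revealing factorizations instead:

```python
import numpy as np

def rank_report(J, expected_gauge_dim):
    """Rank/nullspace/conditioning report for a small dense Jacobian snapshot."""
    U, s, Vt = np.linalg.svd(J)
    tol = s[0] * max(J.shape) * np.finfo(J.dtype).eps   # numpy-style rank threshold
    rank = int(np.sum(s > tol))
    null_dim = J.shape[1] - rank
    return {
        "singular_values": s,
        "numeric_rank": rank,
        "nullspace_basis": Vt[rank:].T,              # columns span the local nullspace
        "matches_expected_gauge": null_dim == expected_gauge_dim,
        "condition_estimate": s[0] / s[rank - 1],    # over constrained directions only
        "weakest_mode": Vt[rank - 1],                # interpret physically before acting
    }
```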

Sparsity/ordering/fill-in

Sparsity is the reason estimation backends are tractable. Each residual touches a small number of variables, so J, H, and square-root factors have exploitable block structure. Ordering decides which variables are eliminated first. Fill-in is the new nonzero structure created by elimination. A poor ordering or a dense marginalization prior can turn a sparse SLAM graph into a memory-heavy dense solve.

Inspect symbolic nonzeros, numeric nonzeros, elimination tree depth, separator size, and block structure. If runtime/memory grows suddenly after graph growth, open the fill report before changing nonlinear methods.
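
A sketch of such a fill comparison using SuperLU's built-in column orderings as stand-ins, assuming H is available as a SciPy sparse matrix; real backends use their own AMD/COLAMD paths, so treat the ratios as indicative only:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def fill_report(H):
    """Compare factor nonzeros under SuperLU's built-in orderings."""
    for perm in ("NATURAL", "COLAMD", "MMD_AT_PLUS_A"):
        lu = spla.splu(H.tocsc(), permc_spec=perm)
        factor_nnz = lu.L.nnz + lu.U.nnz
        print(f"{perm:>14}: H nnz={H.nnz}, factor nnz={factor_nnz}, "
              f"fill ratio={factor_nnz / H.nnz:.2f}")

# Chain graph (tridiagonal information): fill should stay near the input nnz.
n = 200
H = sp.diags([np.full(n - 1, -1.0), np.full(n, 4.0), np.full(n - 1, -1.0)],
             [-1, 0, 1], format="csc")
fill_report(H)
```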

Schur/marginalization/covariance/PCG

Schur complement for solving eliminates variables inside one linear solve and then back-substitutes their updates. Marginalization prior construction removes variables from the active estimator and stores their information as a new prior over remaining separator variables. The algebra can look similar, but the lifecycle is different: solving is temporary, marginalization changes the future problem.
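
A dense sketch of the temporary Schur solve on the block system [[B, E], [E^T, C]] used in the Schur card below; a real backend exploits a block-diagonal C instead of one dense solve:

```python
import numpy as np

def schur_solve(B, E, C, v, w):
    """Solve [[B, E], [E^T, C]] [x_b; x_c] = [v; w] by eliminating the
    C block (e.g. landmarks) and back-substituting its update."""
    C_inv_Et = np.linalg.solve(C, E.T)            # C^-1 E^T
    C_inv_w = np.linalg.solve(C, w)               # C^-1 w
    S = B - E @ C_inv_Et                          # Schur complement S = B - E C^-1 E^T
    x_b = np.linalg.solve(S, v - E @ C_inv_w)     # reduced solve for kept variables
    x_c = C_inv_w - C_inv_Et @ x_b                # back-substitute eliminated variables
    return x_b, x_c
```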

Square-root information stores normalized residual and factor structure without forming a dense covariance inverse. Covariance recovery extracts selected marginal covariance blocks from the local information system or square-root factor. Marginal covariance integrates out other variables; conditional covariance holds them fixed. The inverse of an information diagonal block is not generally a marginal covariance.
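
A two-variable check of that last sentence, as a minimal sketch: inverting the diagonal information entry gives the conditional covariance, while the marginal needs the Schur complement (or the corresponding block of the full inverse):

```python
import numpy as np

H = np.array([[4.0, 1.0],      # information over (a, b) with cross-coupling
              [1.0, 2.0]])

cond_cov_a = 1.0 / H[0, 0]                       # conditional: b held fixed -> 0.25
marg_cov_a = np.linalg.inv(H)[0, 0]              # marginal: b integrated out -> ~0.286
marg_via_schur = 1.0 / (H[0, 0] - H[0, 1] * H[1, 0] / H[1, 1])   # same, no full inverse

assert np.isclose(marg_cov_a, marg_via_schur)
assert marg_cov_a > cond_cov_a   # ignoring the coupling understates uncertainty
```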

PCG requires an SPD operator, a preconditioner, and telemetry: unpreconditioned residual norm, preconditioned residual norm, iteration count, stopping tolerance, symmetry test for matrix-vector products, and nonlinear progress. Stagnation symptoms include residual norms flattening, iteration limits on many nonlinear steps, and accepted nonlinear progress disappearing even though linear work is high.
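
A telemetry sketch around SciPy's CG, assuming A is an SPD sparse matrix or LinearOperator; `pcg_with_telemetry` is an illustrative helper, and note the tolerance keyword is `rtol` in recent SciPy releases (`tol` in older ones):

```python
import numpy as np
import scipy.sparse.linalg as spla

def pcg_with_telemetry(A, b, M=None, rtol=1e-8, maxiter=200):
    """Run (P)CG while logging the true residual norm per iteration."""
    history = []
    def log_residual(xk):                     # SciPy passes the current iterate
        history.append(np.linalg.norm(b - A @ xk))   # unpreconditioned residual norm
    x, info = spla.cg(A, b, M=M, rtol=rtol, maxiter=maxiter, callback=log_residual)
    # Cheap symmetry probe: u^T(Av) should match v^T(Au) for a symmetric operator.
    rng = np.random.default_rng(0)
    u, v = rng.standard_normal((2, b.size))
    sym_err = abs(u @ (A @ v) - v @ (A @ u))
    return x, info, history, sym_err          # info != 0 means no convergence
```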

Concept cards

Rank deficiency

What it means here: The linearized system lacks independent constraints in one or more directions.
Math object: Rank of J, R, or H; zero singular values.
Effect on the solve: Makes updates non-unique and covariance undefined without gauge handling.
What it solves: Diagnoses missing information and singular backends.
What it does not solve: It does not choose the physical anchor policy.
Minimal example: Relative pose graph with no fixed world pose.
Failure symptoms: Cholesky fails, covariance huge, solution drifts in global frame.
Diagnostic artifact: Singular values and nullspace basis.
Normal vs abnormal artifact: Normal nullspace matches expected gauge; abnormal nullspace contains constrained states.
First debugging move: Count expected gauge freedoms and compare with numeric rank.
Do not confuse with: Poor conditioning.
Read next: Eigenvalues, Hessian Conditioning, and Observability.

Nullspace

What it means here: Directions in state perturbation space that do not change residuals locally.
Math object: Vectors n such that J n = 0 or H n = 0.
Effect on the solve: Allows arbitrary motion unless constrained or projected out.
What it solves: Explains gauge freedoms and unobservable directions.
What it does not solve: It does not imply every weak mode is exactly unobservable.
Minimal example: Global yaw in a planar relative pose graph.
Failure symptoms: State changes with no cost change, singular covariance, anchor sensitivity.
Diagnostic artifact: Nullspace vectors visualized as state perturbations.
Normal vs abnormal artifact: Normal vectors match known symmetries; abnormal vectors reveal missing factors.
First debugging move: Apply a small nullspace perturbation and verify residual change.
Do not confuse with: Eigenvector of a small but nonzero eigenvalue.
Read next: QR, SVD, and Rank-Revealing Solvers.
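
A sketch of this card's first debugging move, assuming a problem-specific `residual_fn` and an additive state update; for a true local nullspace direction the residual change is second order in the step size:

```python
import numpy as np

def check_nullspace_mode(residual_fn, x, n, eps=1e-6):
    """Perturb the state along candidate nullspace direction n and
    return the relative residual change (expected ~O(eps^2))."""
    step = eps * n / np.linalg.norm(n)
    r0 = residual_fn(x)
    r1 = residual_fn(x + step)
    return np.linalg.norm(r1 - r0) / (np.linalg.norm(r0) + 1e-15)
```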

Gauge freedom

What it means here: Arbitrary coordinates not determined by relative measurements.
Math object: Known nullspace basis or anchor constraint.
Effect on the solve: Requires fixing, projecting, or respecting gauge in covariance interpretation.
What it solves: Separates coordinate arbitrariness from physical uncertainty.
What it does not solve: It does not add real sensor information.
Minimal example: Fixing the first pose in SLAM.
Failure symptoms: Different absolute maps with the same relative residual cost.
Diagnostic artifact: Gauge dimension and anchor sensitivity test.
Normal vs abnormal artifact: Normal gauge affects only the arbitrary frame; abnormal anchor changes relative geometry.
First debugging move: Remove the anchor and inspect expected singular modes.
Do not confuse with: Prior knowledge.
Read next: SLAM/VIO Observability, FEJ, and Nullspace Consistency.

Condition number

What it means here: Ratio between strongest and weakest constrained directions.
Math object: kappa(A) = sigma_max / sigma_min.
Effect on the solve: Controls numerical sensitivity and PCG convergence.
What it solves: Quantifies weak-mode fragility.
What it does not solve: It does not identify wrong residual semantics by itself.
Minimal example: Low-parallax triangulation with tiny depth information.
Failure symptoms: Slow PCG, noisy covariance, unstable small pivots.
Diagnostic artifact: Spectrum, condition estimate, weak eigenvectors.
Normal vs abnormal artifact: Normal spectrum has expected weak directions; abnormal spectrum spans many orders unexpectedly.
First debugging move: Check scaling and whitening before changing factorization.
Do not confuse with: Exact rank deficiency.
Read next: Eigenvalues, Hessian Conditioning, and Observability.

Sparsity

What it means here: Most Jacobian or Hessian blocks are zero because factors touch few variables.
Math object: Block sparse J, H, or factor graph adjacency.
Effect on the solve: Enables real-time memory and runtime scaling.
What it solves: Avoids dense linear algebra on large estimation problems.
What it does not solve: It does not guarantee low fill after elimination.
Minimal example: Odometry factor touches only two neighboring poses.
Failure symptoms: Unexpected dense rows, memory growth, slow symbolic phase.
Diagnostic artifact: Sparsity plot and block adjacency graph.
Normal vs abnormal artifact: Normal pattern follows factor topology; abnormal pattern has broad accidental coupling.
First debugging move: Print each residual block's parameter dependencies.
Do not confuse with: Fill-in.
Read next: Sparse Matrices, Fill-In, and Ordering.

Fill-in

What it means here: New nonzeros created during elimination or factorization.
Math object: Nonzero pattern of factor L, R, or the reduced system.
Effect on the solve: Increases memory, runtime, and covariance recovery cost.
What it solves: It is not a solver; it explains factorization growth.
What it does not solve: It does not indicate model accuracy.
Minimal example: Eliminating a pose connects all neighboring landmarks or poses.
Failure symptoms: Runtime/memory explodes after graph growth.
Diagnostic artifact: Symbolic fill report and elimination tree.
Normal vs abnormal artifact: Normal fill grows predictably; abnormal fill jumps after ordering or marginalization.
First debugging move: Compare fill under alternative orderings on the same graph.
Do not confuse with: Original sparsity.
Read next: Sparse Matrices, Fill-In, and Ordering.

Ordering

What it means here: Variable elimination sequence used by the sparse backend.
Math object: Permutation matrix or ordered variable list.
Effect on the solve: Controls fill, separator size, and Schur structure.
What it solves: Reduces runtime and memory by exploiting graph topology.
What it does not solve: It does not repair rank deficiency.
Minimal example: Eliminating landmarks before cameras in bundle adjustment.
Failure symptoms: Direct solve becomes dense, Schur reduced matrix too large.
Diagnostic artifact: Ordering report and fill comparison.
Normal vs abnormal artifact: Normal ordering respects block structure; abnormal ordering creates large cliques.
First debugging move: Run AMD/COLAMD or a domain-specific ordering and compare nonzeros.
Do not confuse with: Solver method choice.
Read next: Sparse Matrices, Fill-In, and Ordering.

Cholesky

What it means here: Factorization for SPD normal equations or information matrices.
Math object: H = L L^T or R^T R.
Effect on the solve: Fast direct solve when assumptions hold.
What it solves: Efficient sparse SPD systems.
What it does not solve: It does not handle rank deficiency robustly.
Minimal example: Damped pose-graph normal equations after a gauge anchor.
Failure symptoms: Non-positive pivot, factorization abort, huge condition warning.
Diagnostic artifact: Pivot log and factor nonzeros.
Normal vs abnormal artifact: Normal pivots positive and stable; abnormal pivots vanish or turn negative.
First debugging move: Check SPD assumptions, gauge, and whitening.
Do not confuse with: QR on the original least-squares system.
Read next: Cholesky, LDLT, and Normal Equations.
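
A dense sketch of treating factor failure as a diagnostic rather than something to damp away silently; `try_cholesky` is an illustrative helper:

```python
import numpy as np

def try_cholesky(H):
    """Attempt Cholesky; on failure, report both ends of the spectrum."""
    try:
        return np.linalg.cholesky(H), None
    except np.linalg.LinAlgError:
        eigs = np.linalg.eigvalsh(H)          # symmetric eigenvalues, ascending
        return None, f"not SPD: min eig {eigs[0]:.3e}, max eig {eigs[-1]:.3e}"
```

A minimum eigenvalue near zero points at gauge or conditioning; a clearly negative one points at a sign or linearization bug.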

LDLT

What it means here: Symmetric factorization that exposes diagonal or block pivots.
Math object: P^T H P = L D L^T.
Effect on the solve: Can diagnose indefiniteness and handle some symmetric systems better than Cholesky.
What it solves: Symmetric systems where pivot information matters.
What it does not solve: It is not a full rank-revealing replacement for SVD.
Minimal example: Debugging an indefinite Hessian approximation.
Failure symptoms: Negative or tiny pivots, pivoting instability.
Diagnostic artifact: D pivots and permutation.
Normal vs abnormal artifact: Normal pivots match expected definiteness; abnormal pivots reveal sign or rank problems.
First debugging move: Inspect the pivot sequence and compare with damped Cholesky.
Do not confuse with: LM damping.
Read next: Cholesky, LDLT, and Normal Equations.
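
A minimal pivot inspection with scipy.linalg.ldl on a small symmetric indefinite matrix where Cholesky would abort; note that D may contain 2x2 blocks for indefinite inputs:

```python
import numpy as np
from scipy.linalg import ldl

H = np.array([[2.0,  1.0],      # symmetric indefinite: det < 0
              [1.0, -1.0]])
L, D, perm = ldl(H)             # P^T H P = L D L^T
print(np.diag(D))               # a negative or tiny pivot flags the weak/indefinite mode
```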

QR

What it means here: Least-squares factorization that avoids forming normal equations.
Math object: J = Q R.
Effect on the solve: Improves numerical behavior for ill-conditioned least squares.
What it solves: More robust linearized least-squares solves.
What it does not solve: It does not make a wrong residual correct.
Minimal example: Debugging calibration Jacobians without squaring the condition number.
Failure symptoms: Higher runtime but stable answer compared with Cholesky failure.
Diagnostic artifact: R diagonal and rank estimate.
Normal vs abnormal artifact: Normal R has clear rank; abnormal R has tiny diagonals in expected constrained modes.
First debugging move: Compare the QR step with the normal-equation step on a small case.
Do not confuse with: SVD rank thresholding.
Read next: QR, SVD, and Rank-Revealing Solvers.
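
A sketch of the rank estimate from the R diagonal, using column-pivoted QR (plain QR is not reliably rank-revealing); the tolerance rule is an assumption to tune per problem:

```python
import numpy as np
from scipy.linalg import qr

def qr_rank_estimate(J, rel_tol=1e-10):
    """Estimate numeric rank from the pivoted-QR R diagonal."""
    Q, R, piv = qr(J, mode="economic", pivoting=True)
    d = np.abs(np.diag(R))                # nonincreasing under column pivoting
    return int(np.sum(d > rel_tol * d[0])), d
```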

SVD

What it means here: Factorization that exposes singular values and vectors directly.
Math object: J = U Sigma V^T.
Effect on the solve: Gives the best local rank and nullspace diagnosis.
What it solves: Small or reduced rank-revealing debug problems.
What it does not solve: It is usually too expensive for full production sparse graphs.
Minimal example: Explaining a weak yaw/scale mode in a calibration snapshot.
Failure symptoms: Singular values collapse, threshold changes covariance dramatically.
Diagnostic artifact: Singular spectrum and right singular vectors.
Normal vs abnormal artifact: Normal tiny singular vectors match known gauge; abnormal vectors reveal missing excitation.
First debugging move: Visualize the weakest singular vector as a state perturbation.
Do not confuse with: Eigenanalysis of a damped Hessian without interpreting the damping.
Read next: QR, SVD, and Rank-Revealing Solvers.

Schur complement

What it means here: Algebraic elimination of a variable block to solve a reduced system.
Math object: S = B - E C^-1 E^T.
Effect on the solve: Reduces system size and exploits conditional independence.
What it solves: Landmark-heavy bundle adjustment and related block systems.
What it does not solve: It does not mean permanent marginalization by itself.
Minimal example: Eliminate points, solve the camera system, back-substitute point updates.
Failure symptoms: Singular eliminated block, dense reduced matrix, wrong back-substitution.
Diagnostic artifact: Block ranks, Schur nonzeros, reduced residual.
Normal vs abnormal artifact: Normal eliminated blocks are invertible; abnormal blocks need damping or removal.
First debugging move: Validate C^-1 block solves on representative landmarks.
Do not confuse with: Marginalization prior.
Read next: Schur Complement, Marginalization, and PCG.

Marginalization prior

What it means here: Prior factor created after removing variables from the active estimator.
Math object: Schur-reduced information and right-hand side over separator variables.
Effect on the solve: Preserves old information but can introduce dense couplings and stale linearization.
What it solves: Keeps fixed-lag smoothing bounded.
What it does not solve: It does not remain exact after large future relinearization changes.
Minimal example: Marginalize old poses and keep a prior on the newest boundary poses.
Failure symptoms: Dense prior explosion, overconfident estimator, loop closure fights the prior.
Diagnostic artifact: Prior residual, prior matrix sparsity, linearization point.
Normal vs abnormal artifact: Normal prior separator is controlled; abnormal prior grows dense and stale.
First debugging move: Track separator size and prior nonzeros at every marginalization.
Do not confuse with: Schur complement for a temporary solve.
Read next: Out-of-Sequence Measurements and Fixed-Lag Smoothing.

Square-root information

What it means here: Factor form of information used to whiten residuals or represent solved factors.
Math object: L where L^T L = Sigma^-1, or QR factor R.
Effect on the solve: Avoids explicit dense inverses and keeps normalized residuals.
What it solves: Stable weighting and selected covariance workflows.
What it does not solve: It does not avoid rank analysis.
Minimal example: Premultiply a pose residual by the inverse square-root covariance.
Failure symptoms: Double whitening, bad scale, covariance mismatch.
Diagnostic artifact: L, R, and whitened residual checks.
Normal vs abnormal artifact: Normal factor reconstructs the information; abnormal factor has wrong units or order.
First debugging move: Verify L^T L against the intended information.
Do not confuse with: Covariance square root.
Read next: Square-Root Information and Covariance Recovery.
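
A sketch of this card's first debugging move: build the whitening factor from a measurement covariance and verify that L^T L reconstructs the intended information before trusting whitened residuals:

```python
import numpy as np

Sigma = np.array([[0.04, 0.01],           # measurement covariance
                  [0.01, 0.09]])
info = np.linalg.inv(Sigma)
L = np.linalg.cholesky(info).T            # upper factor with L^T L = Sigma^-1
assert np.allclose(L.T @ L, info)         # the card's verification step

r = np.array([0.2, -0.1])
r_w = L @ r                               # whitened residual; ||r_w||^2 is a chi-square term
```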

Covariance recovery

What it means here: Extracting selected local uncertainty from information or square-root factors.
Math object: Blocks of H^-1, marginal covariance, or pseudoinverse under rank handling.
Effect on the solve: Used after solving for diagnostics, gating, and integrity checks.
What it solves: Provides local uncertainty when gauge and rank assumptions are explicit.
What it does not solve: It does not validate global consistency.
Minimal example: Recover the camera pose marginal covariance after sparse BA.
Failure symptoms: Negative variance, overconfident weak mode, dense inverse memory blowup.
Diagnostic artifact: Selected covariance block, rank threshold, factor backsolve trace.
Normal vs abnormal artifact: Normal block is positive and gauge-aware; abnormal block changes wildly with the anchor.
First debugging move: Compare selected covariance against SVD on a small snapshot.
Do not confuse with: Measurement covariance.
Read next: Square-Root Information and Covariance Recovery.

PCG

What it means here: Iterative method for large SPD systems using matrix-vector products.
Math object: Preconditioned conjugate gradient iterations for A x = b.
Effect on the solve: Reduces memory but makes convergence depend on conditioning and preconditioning.
What it solves: Large sparse or implicit Schur systems where direct factorization is too expensive.
What it does not solve: It does not work on indefinite or nonsymmetric operators.
Minimal example: Iterative Schur solve for large bundle adjustment.
Failure symptoms: PCG stagnates, reaches the iteration limit, nonlinear cost stops improving.
Diagnostic artifact: Unpreconditioned and preconditioned residual norms, iteration count, tolerance.
Normal vs abnormal artifact: Normal residuals decrease with accepted nonlinear progress; abnormal residuals flatten or oscillate.
First debugging move: Test symmetry and compare to a direct solve on a small case.
Do not confuse with: Schur complement itself.
Read next: Schur Complement, Marginalization, and PCG.

Preconditioner

What it means here: Approximation that improves PCG convergence by reshaping the linear system.
Math object: Matrix or operator M approximating A or A^-1.
Effect on the solve: Changes iteration count and residual decay.
What it solves: Reduces the effective condition number for iterative solves.
What it does not solve: It does not fix an invalid SPD assumption.
Minimal example: Block-Jacobi preconditioner for pose blocks.
Failure symptoms: Many PCG iterations, residual plateaus, nonlinear budget wasted.
Diagnostic artifact: Preconditioned residual norm and iteration comparison.
Normal vs abnormal artifact: Normal preconditioner improves decay; abnormal preconditioner has little effect or breaks symmetry.
First debugging move: Compare identity, diagonal, and block preconditioners on the same snapshot.
Do not confuse with: Whitening or LM damping.
Read next: Schur Complement, Marginalization, and PCG.
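
A sketch of this card's first debugging move, comparing identity and Jacobi (diagonal) preconditioning on one SPD snapshot; `compare_preconditioners` is an illustrative helper, plain Jacobi stands in for the block-Jacobi next step, and the tolerance keyword is `rtol` in recent SciPy (`tol` in older releases):

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def compare_preconditioners(A, b, rtol=1e-8, maxiter=500):
    """Report CG convergence with and without a diagonal preconditioner."""
    results = {}
    for name, M in (("identity", None),
                    ("jacobi", sp.diags(1.0 / A.diagonal()))):   # M ~ diag(A)^-1
        iters = 0
        def count(xk):
            nonlocal iters
            iters += 1
        x, info = spla.cg(A, b, M=M, rtol=rtol, maxiter=maxiter, callback=count)
        results[name] = {"converged": info == 0, "iterations": iters}
    return results
```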
