Pro Feature. Requires a Pro or Ultra subscription. Get started at api.mathematicalcompany.com
HRP & Denoising
Horizon implements Hierarchical Risk Parity (HRP) and Marcenko-Pastur covariance denoising from Lopez de Prado's Machine Learning for Asset Managers. All matrix operations, eigendecomposition (Jacobi method), and clustering are implemented from scratch in Rust. No external linear algebra dependencies.
HRP Allocation
Correlation-distance clustering with inverse-variance recursive bisection. No matrix inversion required.
Covariance Denoising
Marcenko-Pastur eigenvalue clipping to separate signal from noise in sample covariance matrices.
Detoning
Remove the dominant market factor (first principal component) to reveal idiosyncratic structure.
Full Pipeline
Combine denoising with HRP for robust allocation from noisy return data.
Hierarchical Risk Parity
HRP avoids the pitfalls of traditional mean-variance optimization (matrix inversion instability, concentrated portfolios) by using a hierarchical clustering approach:
- Correlation to distance: convert the correlation matrix to a distance matrix
- Hierarchical clustering: single-linkage agglomerative clustering on the distance matrix
- Quasi-diagonalization: reorder assets so that correlated assets are adjacent (seriation)
- Recursive bisection: allocate weights by splitting the sorted list in half and weighting inversely proportional to cluster variance
hz.hrp_weights
| Parameter | Type | Description |
|---|---|---|
| covariance | list[list[float]] | N x N covariance matrix (must be square, positive diagonal) |
Returns an HRPResult object.
HRPResult Type
| Field | Type | Description |
|---|---|---|
| weights | list[float] | Portfolio weights summing to 1.0, one per asset |
| sorted_indices | list[int] | Asset indices in quasi-diagonalized (seriation) order |
| linkage | list[(int, int, float)] | Clustering linkage: each entry is (cluster_a, cluster_b, distance) |
In the linkage list, cluster indices below N are original assets. Indices >= N represent merged clusters formed in earlier steps. There are always N-1 merges for N assets.
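The linkage convention can be illustrated with a minimal single-linkage sketch in pure Python. This is an illustration of the labeling scheme, not Horizon's Rust implementation, and the distance matrix here is made up:

```python
def single_linkage(dist):
    """Agglomerative single-linkage clustering.

    dist: symmetric N x N distance matrix (list of lists).
    Returns a linkage list of (cluster_a, cluster_b, distance) tuples,
    where labels < N are original assets and labels >= N are merges.
    """
    n = len(dist)
    # Each active cluster maps its label to the set of original members.
    clusters = {i: {i} for i in range(n)}
    next_label = n
    linkage = []
    while len(clusters) > 1:
        # Find the pair of clusters with the smallest minimum member distance.
        best = None
        labels = sorted(clusters)
        for i, a in enumerate(labels):
            for b in labels[i + 1:]:
                d = min(dist[p][q] for p in clusters[a] for q in clusters[b])
                if best is None or d < best[2]:
                    best = (a, b, d)
        a, b, d = best
        linkage.append((a, b, d))
        clusters[next_label] = clusters.pop(a) | clusters.pop(b)
        next_label += 1
    return linkage

# Three assets: 0 and 1 are close; 2 is far from both.
dist = [[0.0, 0.1, 0.9],
        [0.1, 0.0, 0.8],
        [0.9, 0.8, 0.0]]
linkage = single_linkage(dist)
print(linkage)  # [(0, 1, 0.1), (2, 3, 0.8)]
```

Note the second merge references label 3, the cluster formed in the first merge (3 >= N since N = 3), and there are exactly N - 1 = 2 merges.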
How Weights Are Determined
Lower-variance assets receive more weight. When assets form correlated blocks, the algorithm allocates between blocks inversely proportional to their cluster variance, then recurses within each block.
Covariance Denoising (Marcenko-Pastur)
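As a toy illustration of the inverse-variance principle, here is a pure-Python sketch (independent of the hz API) showing that the lower-variance asset receives the larger weight:

```python
def inverse_variance_weights(cov):
    """Weight each asset proportionally to 1 / variance (diagonal of cov)."""
    inv = [1.0 / cov[i][i] for i in range(len(cov))]
    total = sum(inv)
    return [x / total for x in inv]

# Asset 0 has variance 0.01, asset 1 has 0.04: inverse variances are
# 100 and 25, so the weights split 100/125 and 25/125.
cov = [[0.01, 0.0],
       [0.0, 0.04]]
w = inverse_variance_weights(cov)
print(w)  # approximately [0.8, 0.2]
```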
Sample covariance matrices estimated from finite data contain noise. The Marcenko-Pastur distribution provides a theoretical upper bound for the eigenvalues of a random matrix. Eigenvalues below this bound are noise; eigenvalues above it carry signal. Denoising replaces noise eigenvalues with their average, shrinking the noise while preserving the signal structure.
hz.denoise_covariance
| Parameter | Type | Description |
|---|---|---|
| covariance | list[list[float]] | N x N covariance matrix (square, positive diagonal) |
| n_observations | int | Number of observations T used to estimate the covariance (at least 2) |
DenoisedCov Type
| Field | Type | Description |
|---|---|---|
| covariance | list[list[float]] | Denoised N x N covariance matrix |
| eigenvalues | list[float] | Original eigenvalues sorted descending |
| n_signals | int | Number of signal eigenvalues (above Marcenko-Pastur threshold) |
| n_noise | int | Number of noise eigenvalues (below threshold) |
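The clipping rule itself can be sketched in a few lines of Python. For simplicity this works directly on a list of eigenvalues (as if the covariance were diagonal) and assumes unit noise variance sigma^2 = 1; the actual routine eigendecomposes a full matrix:

```python
import math

def mp_denoise_eigenvalues(eigenvalues, n_observations, sigma2=1.0):
    """Clip eigenvalues at the Marcenko-Pastur upper bound.

    Eigenvalues above lambda_+ are kept as signal; the rest are
    replaced by their average, preserving the total trace.
    """
    n = len(eigenvalues)
    q = n / n_observations
    lam_plus = sigma2 * (1.0 + math.sqrt(q)) ** 2
    noise = [lam for lam in eigenvalues if lam <= lam_plus]
    avg = sum(noise) / len(noise) if noise else 0.0
    denoised = [lam if lam > lam_plus else avg for lam in eigenvalues]
    return denoised, n - len(noise), len(noise)

# Four eigenvalues estimated from T = 100 observations: q = 0.04,
# so lambda_+ = (1 + 0.2)^2 = 1.44. One signal eigenvalue, three noise.
eigs = [8.0, 1.2, 1.0, 0.8]
denoised, n_signals, n_noise = mp_denoise_eigenvalues(eigs, 100)
print(denoised, n_signals, n_noise)  # [8.0, 1.0, 1.0, 1.0] 1 3
```

Note the trace (sum of eigenvalues) is preserved: the three noise eigenvalues 1.2, 1.0, 0.8 are all replaced by their average 1.0.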
Covariance Detoning
Detoning removes the market factor (first principal component) from a covariance matrix. This reveals the idiosyncratic correlation structure by zeroing out the largest eigenvalue's contribution.
hz.detone_covariance
| Parameter | Type | Description |
|---|---|---|
| covariance | list[list[float]] | N x N covariance matrix (square, positive diagonal) |
Returns list[list[float]]: the detoned covariance matrix. The trace of the detoned matrix will be smaller than the original's, since the dominant eigenvalue has been removed.
Detoning is useful for correlation-based clustering (e.g., the clustering step of HRP). When all assets are driven by a common market factor, their correlations are inflated. Removing the market mode makes the idiosyncratic structure more visible, potentially improving cluster quality.
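For a 2 x 2 equicorrelation matrix the effect can be worked out by hand, since the eigenpairs are known analytically. A pure-Python sketch (not the hz implementation; assumes rho > 0 so 1 + rho is the largest eigenvalue):

```python
import math

def detone_2x2_equicorrelation(rho):
    """Detone [[1, rho], [rho, 1]] by removing its largest eigenpair.

    Eigenvalues are 1 + rho and 1 - rho, with eigenvectors
    (1, 1)/sqrt(2) and (1, -1)/sqrt(2). Zeroing the largest eigenvalue
    and reconstructing leaves (1 - rho) * v v' with v = (1, -1)/sqrt(2).
    """
    lam = 1.0 - rho               # the remaining (smaller) eigenvalue
    v = (1.0 / math.sqrt(2.0), -1.0 / math.sqrt(2.0))
    return [[lam * v[i] * v[j] for j in range(2)] for i in range(2)]

detoned = detone_2x2_equicorrelation(0.6)
print(detoned)  # approximately [[0.2, -0.2], [-0.2, 0.2]]
```

The trace drops from 2.0 to 0.4, consistent with the dominant eigenvalue (1.6) being removed.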
Full Pipeline
Combine denoising with HRP for robust portfolio allocation from noisy return data.
Pipeline with Detoning
Combining with Fractional Differentiation
Use fractional differentiation to create stationary return features, then compute covariance for HRP.
Mathematical Background
HRP Algorithm
HRP proceeds in four steps:
- Correlation distance: d(i,j) = sqrt(0.5 * (1 - corr(i,j))). Perfectly correlated assets have distance 0; uncorrelated assets have distance sqrt(0.5).
- Single-linkage clustering: agglomerative clustering where the distance between two clusters is the minimum distance between any pair of their members. This builds a dendrogram (tree) of N-1 merges.
- Quasi-diagonalization: traverse the dendrogram to produce a leaf ordering where correlated assets are adjacent. This is the seriation step.
- Recursive bisection: split the sorted asset list in half. For each half, compute the cluster variance (w’ * Sigma * w using inverse-variance weights within the cluster). Allocate between halves inversely proportional to their cluster variances. Recurse until single assets remain.
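The recursive bisection step can be sketched in pure Python. This is an illustration of the allocation rule over a pre-sorted order, not Horizon's Rust implementation:

```python
def hrp_bisection(cov, order):
    """Recursive bisection over a quasi-diagonalized asset order.

    cov: N x N covariance matrix; order: seriation indices.
    Returns weights indexed by original asset position.
    """
    n = len(cov)
    weights = [1.0] * n

    def cluster_variance(items):
        # Inverse-variance weights within the cluster, then w' Sigma w.
        inv = [1.0 / cov[i][i] for i in items]
        total = sum(inv)
        w = [x / total for x in inv]
        return sum(w[a] * cov[items[a]][items[b]] * w[b]
                   for a in range(len(items)) for b in range(len(items)))

    def bisect(items):
        if len(items) <= 1:
            return
        half = len(items) // 2
        left, right = items[:half], items[half:]
        var_l, var_r = cluster_variance(left), cluster_variance(right)
        alpha = 1.0 - var_l / (var_l + var_r)  # left half's share
        for i in left:
            weights[i] *= alpha
        for i in right:
            weights[i] *= 1.0 - alpha
        bisect(left)
        bisect(right)

    bisect(list(order))
    return weights

# Two uncorrelated assets with variances 0.01 and 0.04: the split is
# inversely proportional to variance, 0.04/0.05 vs 0.01/0.05.
cov = [[0.01, 0.0],
       [0.0, 0.04]]
w = hrp_bisection(cov, [0, 1])
print(w)  # approximately [0.8, 0.2]
```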
Marcenko-Pastur Distribution
For a T x N random matrix with i.i.d. entries, the eigenvalues of the sample covariance matrix follow the Marcenko-Pastur distribution as T, N -> infinity with q = N/T fixed. The support of this distribution is [lambda_-, lambda_+] where:
- lambda_+ = sigma^2 * (1 + sqrt(q))^2
- lambda_- = sigma^2 * (1 - sqrt(q))^2
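The bounds follow directly from q and sigma^2; a quick Python check (assuming sigma^2 = 1):

```python
import math

def mp_bounds(n_assets, n_observations, sigma2=1.0):
    """Marcenko-Pastur support [lambda_-, lambda_+] for q = N/T."""
    q = n_assets / n_observations
    lam_minus = sigma2 * (1.0 - math.sqrt(q)) ** 2
    lam_plus = sigma2 * (1.0 + math.sqrt(q)) ** 2
    return lam_minus, lam_plus

# 100 assets, 400 observations: q = 0.25, so the support is
# [(1 - 0.5)^2, (1 + 0.5)^2] = [0.25, 2.25].
lo, hi = mp_bounds(100, 400)
print(lo, hi)  # 0.25 2.25
```

Any sample eigenvalue above lambda_+ is unlikely to be pure noise, which is exactly the threshold the denoising step applies.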
Detoning
The first principal component (largest eigenvalue) typically captures the market factor: the common movement that drives all assets together. Removing it by zeroing out the largest eigenvalue and reconstructing the matrix reveals the residual (idiosyncratic) correlation structure. This is useful when the market factor inflates correlations and obscures the true clustering of assets. After detoning, assets that move together for idiosyncratic reasons (sector, geography) become more distinguishable.
Jacobi Eigendecomposition
Horizon implements the classical cyclic Jacobi eigenvalue algorithm for real symmetric matrices. For each off-diagonal element above a tolerance, a Givens rotation is applied to zero it out. The algorithm converges for any symmetric matrix and is numerically stable. This avoids external LAPACK/BLAS dependencies, keeping the Rust binary self-contained and portable across platforms.
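The core iteration is compact enough to sketch. Here is a pure-Python version of the cyclic Jacobi sweep for intuition (Horizon's actual implementation is in Rust):

```python
import math

def jacobi_eigenvalues(a, tol=1e-12, max_sweeps=100):
    """Cyclic Jacobi eigenvalue iteration for a real symmetric matrix.

    Sweeps over each off-diagonal element, applying a Givens rotation
    J' A J chosen to zero it; returns the eigenvalues (the final
    diagonal), unsorted.
    """
    n = len(a)
    a = [row[:] for row in a]  # work on a copy
    for _ in range(max_sweeps):
        off = math.sqrt(sum(a[i][j] ** 2
                            for i in range(n) for j in range(n) if i != j))
        if off < tol:
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(a[p][q]) < tol:
                    continue
                # Rotation angle that zeros a[p][q]:
                # tan(2 theta) = 2 a[p][q] / (a[q][q] - a[p][p]).
                theta = 0.5 * math.atan2(2.0 * a[p][q], a[q][q] - a[p][p])
                c, s = math.cos(theta), math.sin(theta)
                for k in range(n):  # update columns p and q (A <- A J)
                    akp, akq = a[k][p], a[k][q]
                    a[k][p] = c * akp - s * akq
                    a[k][q] = s * akp + c * akq
                for k in range(n):  # update rows p and q (A <- J' A)
                    apk, aqk = a[p][k], a[q][k]
                    a[p][k] = c * apk - s * aqk
                    a[q][k] = s * apk + c * aqk
    return [a[i][i] for i in range(n)]

# [[2, 1], [1, 2]] has eigenvalues 3 and 1.
eigs = sorted(jacobi_eigenvalues([[2.0, 1.0], [1.0, 2.0]]), reverse=True)
print(eigs)  # approximately [3.0, 1.0]
```

Each rotation can reintroduce small off-diagonal values elsewhere, which is why the sweeps repeat until the off-diagonal norm falls below the tolerance.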