Add Linear Regression Rereferencing (LRR) #6

Merged
cboulay merged 4 commits into dev from lrr on Feb 3, 2026
Conversation


@cboulay cboulay commented Feb 3, 2026

Summary

  • Add a self-supervised regression framework (SelfSupervisedRegressionTransformer) and a concrete LRR implementation (LRRTransformer) in ezmsg/learn/process/ssr.py
  • LRR predicts each channel from the others in its cluster via ridge regression and subtracts the prediction: y = X @ (I - W)
  • The effective weight matrix is delegated to AffineTransformTransformer, which automatically exploits block-diagonal structure when channel_clusters are provided
  • Bump ezmsg-sigproc dependency to >=2.13.1 to use AffineTransformTransformer.set_weights
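To make the `y = X @ (I - W)` formulation concrete, here is a minimal NumPy sketch (not the package's actual API) where each column of `W` holds ridge coefficients predicting one channel from the others, with the diagonal forced to zero so a channel never predicts itself:

```python
import numpy as np

def lrr_weights(X: np.ndarray, lam: float = 1e-3) -> np.ndarray:
    """Ridge-regress each channel on the remaining channels.

    Returns W (n_ch x n_ch) with a zero diagonal, so that
    X @ (I - W) subtracts each channel's prediction from itself.
    Illustrative sketch only; function name and signature are hypothetical.
    """
    n_ch = X.shape[1]
    C = X.T @ X  # channel covariance
    W = np.zeros((n_ch, n_ch))
    for j in range(n_ch):
        idx = [k for k in range(n_ch) if k != j]
        # Solve (C[-j,-j] + lam*I) w = C[-j, j] for channel j
        A = C[np.ix_(idx, idx)] + lam * np.eye(n_ch - 1)
        b = C[np.ix_(idx, [j])]
        W[idx, j] = np.linalg.solve(A, b).ravel()
    return W

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 4))
W = lrr_weights(X)
Y = X @ (np.eye(4) - W)  # rereferenced data
```

Note that `X @ (I - W)` is just `X - X @ W`: the raw data minus each channel's cross-channel prediction.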

Details

Framework (SelfSupervisedRegressionTransformer):

  • Accumulates channel covariance C = X^T X and solves per-cluster ridge regressions via the block-inverse identity (one matrix inverse per cluster instead of a per-channel Cholesky loop)
  • Supports incremental and batch fitting modes
  • All linear algebra stays in the source array namespace (NumPy, CuPy, etc.)
  • Subclasses implement _on_weights_updated and _process
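The block-inverse identity mentioned above can be sketched as follows (illustrative only; the real implementation lives in `SelfSupervisedRegressionTransformer`). If `M = inv(C + lam*I)`, block inversion gives the per-channel ridge solution for channel `j` as `-M[:, j] / M[j, j]` with the `j`-th entry zeroed, so one inverse per cluster replaces a per-channel solve loop:

```python
import numpy as np

def lrr_weights_blockinv(C: np.ndarray, lam: float = 1e-3) -> np.ndarray:
    """All per-channel ridge regressions from ONE matrix inverse.

    Block-inverse identity: with M = inv(C + lam*I), the coefficients
    predicting channel j from the rest are -M[:, j] / M[j, j], j-th
    entry set to zero. Hypothetical helper, not the package API.
    """
    n = C.shape[0]
    M = np.linalg.inv(C + lam * np.eye(n))
    W = -M / np.diag(M)  # divide each column j by M[j, j]
    np.fill_diagonal(W, 0.0)
    return W
```

Because everything here is plain array algebra, the same code runs unchanged under any array namespace with `linalg` support (e.g. CuPy), consistent with the namespace-agnostic design noted above.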

LRR (LRRTransformer / LRRUnit):

  • _on_weights_updated computes I - W and passes it to an internal AffineTransformTransformer
  • Subsequent fits use set_weights for fast in-place updates without a full state reset
  • Supports pre-calculated weights from a numpy array or CSV file path
  • LRRUnit provides the ezmsg Unit wrapper with INPUT_SAMPLE subscriber
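The block-diagonal shortcut that `AffineTransformTransformer` exploits can be illustrated with a small sketch (names and shapes here are assumptions for illustration, not the ezmsg-sigproc internals): when channel clusters are disjoint, `I - W` is block-diagonal, so each cluster's block can be applied to its own channel slice instead of multiplying by the full matrix:

```python
import numpy as np

def apply_clustered(X, blocks, clusters):
    """Apply per-cluster (I - W) blocks without forming the full matrix.

    X:        (n_samples, n_channels) data
    clusters: list of channel-index lists, disjoint and covering X's channels
    blocks:   matching list of (len(c), len(c)) matrices, one per cluster
    Hypothetical sketch of the block-diagonal fast path.
    """
    Y = np.empty_like(X)
    for c, B in zip(clusters, blocks):
        Y[:, c] = X[:, c] @ B  # each cluster touches only its own channels
    return Y
```

For k clusters of size m this costs O(k·m²) per sample instead of O((k·m)²) for the dense product, which is why providing `channel_clusters` pays off.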

@cboulay cboulay merged commit 44ad802 into dev Feb 3, 2026
8 checks passed
@cboulay cboulay deleted the lrr branch February 3, 2026 16:09