Self-supervised contrastive learning performs non-linear system identification
tl;dr: Dynamics Contrastive Learning (DCL) is a contrastive learning framework to uncover linear, switching linear, and non-linear dynamics under a non-linear observation model.
News
May '25: The camera-ready version of the paper is available on arXiv & OpenReview.
January '25: Our paper was accepted at ICLR 2025. See you in Singapore!
October '24: Our preprint is now available on arXiv!
Abstract
Self-supervised learning (SSL) approaches have brought tremendous success across many tasks and domains. It has been argued that these successes can be attributed to a link between SSL and identifiable representation learning: temporal structure and auxiliary variables ensure that latent representations are related to the true underlying generative factors of the data. Here, we deepen this connection and show that SSL can perform system identification in latent space. We propose Dynamics Contrastive Learning (DCL), a framework to uncover linear, switching linear, and non-linear dynamics under a non-linear observation model; we provide theoretical guarantees and validate them empirically.
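For intuition, the sketch below shows one way a contrastive objective can couple an encoder with a latent dynamics model: temporally adjacent observations form positive pairs, and the dynamics model must map the current latent onto the encoding of the next observation, with other samples in the batch serving as negatives. This is a minimal, hypothetical illustration; the `Encoder`, `LinearDynamics`, and `contrastive_dynamics_loss` names, architectures, and hyperparameters are assumptions for exposition and not the paper's implementation.

```python
# Illustrative sketch: contrastive learning of latent dynamics.
# All module names and hyperparameters below are placeholder assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    """Maps observations x_t to latent states z_t (inverse of a non-linear observation model)."""
    def __init__(self, obs_dim: int, latent_dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.GELU(),
            nn.Linear(hidden, latent_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class LinearDynamics(nn.Module):
    """Learnable linear latent dynamics, z_{t+1} ~ A z_t (one possible dynamics class)."""
    def __init__(self, latent_dim: int):
        super().__init__()
        self.A = nn.Linear(latent_dim, latent_dim, bias=False)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.A(z)


def contrastive_dynamics_loss(encoder, dynamics, x_t, x_tp1, temperature: float = 0.1):
    """InfoNCE-style loss: the dynamics-predicted next latent should match the
    encoding of the true next observation (positive) and not the other batch
    elements (negatives)."""
    z_pred = dynamics(encoder(x_t))            # predict next latent from current observation
    z_next = encoder(x_tp1)                    # encode the true next observation
    z_pred = F.normalize(z_pred, dim=-1)
    z_next = F.normalize(z_next, dim=-1)
    logits = z_pred @ z_next.T / temperature   # (batch, batch) similarity matrix
    labels = torch.arange(x_t.shape[0])        # positives lie on the diagonal
    return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    obs_dim, latent_dim, batch = 10, 3, 64
    enc, dyn = Encoder(obs_dim, latent_dim), LinearDynamics(latent_dim)
    opt = torch.optim.Adam(list(enc.parameters()) + list(dyn.parameters()), lr=1e-3)
    # Random stand-in data; in practice x_t and x_tp1 are consecutive time steps.
    x_t, x_tp1 = torch.randn(batch, obs_dim), torch.randn(batch, obs_dim)
    opt.zero_grad()
    loss = contrastive_dynamics_loss(enc, dyn, x_t, x_tp1)
    loss.backward()
    opt.step()
    print(f"loss: {loss.item():.3f}")
```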
Reference
@inproceedings{laiz2025selfsupervised,
  title     = {Self-supervised contrastive learning performs non-linear system identification},
  author    = {Rodrigo Gonz{\'a}lez Laiz and Tobias Schmidt and Steffen Schneider},
  booktitle = {The Thirteenth International Conference on Learning Representations},
  year      = {2025},
  url       = {https://openreview.net/forum?id=ONfWFluZBI}
}