Matthias Bethge

Latest

Equivariance by Contrast: Identifiable Equivariant Embeddings from Unlabeled Finite Group Actions

EbC learns equivariant embeddings from observation pairs without relying on group-specific inductive biases, with theoretical …

RDumb: A Simple Approach that Questions Our Progress in Continual Test-Time Adaptation

RDumb is a simple baseline that periodically resets the model to its pretrained state, yet outperforms state-of-the-art continual …
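The reset idea can be sketched in a few lines. This is a minimal illustration of periodic resetting, not the paper's implementation; the wrapper name, `reset_every` default, and the `adapt_fn` hook are all hypothetical.

```python
import copy

class ResetWrapper:
    """Continual test-time adaptation with periodic resets (illustrative sketch)."""

    def __init__(self, model, reset_every=1000):
        self.model = model
        self.initial_state = copy.deepcopy(model)  # snapshot of the pretrained state
        self.reset_every = reset_every             # hypothetical reset interval
        self.steps = 0

    def step(self, adapt_fn):
        adapt_fn(self.model)  # apply any test-time adaptation update
        self.steps += 1
        if self.steps % self.reset_every == 0:
            # Discard accumulated adaptation and return to the pretrained state
            self.model = copy.deepcopy(self.initial_state)
```

Here `model` stands in for any adaptable parameter container; the point is only that accumulated adaptation is periodically thrown away.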

Unsupervised Object Learning via Common Fate

Unsupervised object learning from videos using the Common Fate Principle, decomposing the problem into motion segmentation, generative …

Contrastive Learning Inverts the Data Generating Process

Contrastive learning with the InfoNCE objective can recover the ground truth latent factors underlying the data.
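For reference, the InfoNCE objective treats each anchor's matching positive as the correct class among all positives in the batch. A minimal NumPy sketch, assuming cosine similarity on normalized embeddings and a hypothetical temperature of 0.1:

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    # Normalize embeddings to the unit hypersphere
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    # Pairwise cosine similarities, scaled by the temperature
    logits = a @ p.T / temperature
    # Row-wise log-softmax: the matching positive competes against
    # every other positive in the batch acting as a negative
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Cross-entropy with the diagonal (matched pairs) as targets
    return -np.mean(np.diag(log_probs))
```

Aligned anchor/positive pairs yield a lower loss than mismatched ones, which is the signal the identifiability result builds on.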

If Your Data Distribution Shifts, Use Self-Learning

Self-learning techniques like entropy minimization and pseudo-labeling are simple and effective at improving model performance under …

Pretraining Boosts Out-of-Domain Robustness for Pose Estimation

ImageNet pretraining significantly improves out-of-domain robustness for pose estimation across architectures and species.

Improving Robustness against Common Corruptions by Covariate Shift Adaptation

Adapting the batch norm statistics of trained computer vision models considerably increases robustness at test time.

Multi-Task Generalization and Adaptation between Noisy Digit Datasets: An Empirical Study

We show that good target performance can be achieved on domain adaptation tasks by adapting only the normalization statistics and …

Salad: A Toolbox for Semi-supervised Adaptive Learning Across Domains

An open source toolbox providing a unified implementation of state-of-the-art methods for transfer learning, semi-supervised learning …