We propose a novel online adaptation method that improves model performance under missing-modality settings.
We propose a benchmark for domain shifts in medical imaging.
We propose a novel online evaluation protocol for Test-Time Adaptation (TTA) methods, which penalizes slower methods by providing them with fewer samples for adaptation.
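The mechanism behind such a protocol can be sketched as a streaming evaluation in which a method that is still busy adapting must predict incoming samples without adaptation. This is an illustrative sketch only; the class and function names (`evaluate_online`, `adapt_and_predict`) are hypothetical, not the protocol's actual API.

```python
import time

class _OracleTTA:
    """Toy TTA method for illustration: always predicts correctly."""
    def predict(self, x):
        return x  # prediction without adaptation (method was busy)
    def adapt_and_predict(self, x):
        return x  # adapt on x, then predict

def evaluate_online(method, stream, stream_rate_hz=30.0):
    """Latency-aware evaluation sketch: while the method adapts on one
    sample, samples that arrive in the meantime accumulate as a backlog
    and are predicted without adaptation, so slower methods end up
    adapting on fewer samples."""
    correct, total, backlog = 0, 0, 0.0
    for x, y in stream:
        total += 1
        if backlog >= 1.0:
            backlog -= 1.0
            pred = method.predict(x)            # still busy: no adaptation
        else:
            t0 = time.perf_counter()
            pred = method.adapt_and_predict(x)  # free: adapt on this sample
            # samples that arrived while the method was adapting
            backlog += (time.perf_counter() - t0) * stream_rate_hz
        correct += int(pred == y)
    return correct / total

accuracy = evaluate_online(_OracleTTA(), [(i, i) for i in range(20)])
```

A fast method keeps `backlog` near zero and adapts on nearly every sample, whereas a method whose adaptation step is slower than the stream rate sees most samples only through the non-adaptive path.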
We leverage simulated data to mitigate forgetting in domain incremental continual segmentation.
We leverage online distillation for continual semantic segmentation with cyclic domain shifts.
We assess the robustness of face recognition models against semantic variations.
We leverage learnable tokens and large-scale pretrained models to mitigate forgetting in video class incremental learning.
We assess the adversarial robustness of the Inception Score and the Fréchet Inception Distance (FID) and propose a robustified version of FID.
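For context, standard FID fits a Gaussian to real and generated feature sets and measures the Fréchet distance between them: FID = ||μ₁ − μ₂||² + Tr(Σ₁ + Σ₂ − 2(Σ₁Σ₂)^½). Below is a minimal sketch of this standard metric (not the robustified variant), operating on precomputed feature arrays; the matrix square root uses the identity Tr((Σ₁Σ₂)^½) = Tr((Σ₁^½ Σ₂ Σ₁^½)^½) so only symmetric PSD roots are needed.

```python
import numpy as np

def _sqrtm_psd(A):
    """Square root of a symmetric PSD matrix via eigendecomposition."""
    w, v = np.linalg.eigh(A)
    w = np.clip(w, 0.0, None)  # clamp tiny negative eigenvalues
    return (v * np.sqrt(w)) @ v.T

def fid(feats_real, feats_gen):
    """Fréchet distance between Gaussian fits of two feature sets
    (rows = samples, columns = feature dimensions)."""
    mu1, mu2 = feats_real.mean(axis=0), feats_gen.mean(axis=0)
    s1 = np.cov(feats_real, rowvar=False)
    s2 = np.cov(feats_gen, rowvar=False)
    s1_half = _sqrtm_psd(s1)
    covmean = _sqrtm_psd(s1_half @ s2 @ s1_half)  # (Σ1 Σ2)^½ up to trace
    return float(np.sum((mu1 - mu2) ** 2)
                 + np.trace(s1 + s2 - 2.0 * covmean))

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
fid_self = fid(X, X)          # identical sets: distance ~ 0
fid_shift = fid(X, X + 1.0)   # mean shifted by 1 in each of 4 dims: ~ 4
```

Sensitivity to small feature perturbations in both the mean and covariance terms is what makes adversarial attacks on this metric possible.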
We assess the certified robustness of models trained in a federated fashion.