Efficient Test-Time Adaptation with Cache-based Dynamic Adapter

Course project for "Trends and Applications in Computer Vision" @ UniTN

📄 Official Paper:
Efficient Test-Time Adaptation of Vision-Language Models
🧠 Authors: Adilbek Karmanov, Dayan Guan, Shijian Lu, Abdulmotaleb El Saddik, Eric Xing

🎓 This project was developed as part of the "Trends and Applications in Computer Vision" course taught by Prof. M. Mancini and G. Boato.


🔍 Overview

This repository extends the official implementation of TDA (Training-free Dynamic Adapter) by evaluating its efficiency, robustness, and flexibility in different real-world scenarios.
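At a high level, TDA adapts at test time without gradients: confident test features are stored in a small cache keyed by pseudo-label, and cache affinities are added to CLIP's zero-shot logits at inference. The NumPy sketch below illustrates that idea only; the class name, capacity, and the `beta` sharpness parameter are our illustrative choices, not the official implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class PositiveCache:
    """Illustrative training-free cache: for each pseudo-class, keep the
    few lowest-entropy (most confident) features seen so far."""

    def __init__(self, num_classes, capacity=3):
        self.capacity = capacity
        self.cache = {c: [] for c in range(num_classes)}  # (entropy, feature)

    def update(self, feature, clip_logits):
        """Insert a test feature under its pseudo-label, evicting the
        least confident entry when the per-class capacity is exceeded."""
        probs = softmax(clip_logits)
        entropy = -(probs * np.log(probs + 1e-12)).sum()
        c = int(probs.argmax())  # pseudo-label from zero-shot prediction
        self.cache[c].append((entropy, feature))
        self.cache[c].sort(key=lambda t: t[0])          # lowest entropy first
        self.cache[c] = self.cache[c][: self.capacity]  # evict the rest

    def logits(self, feature, num_classes, beta=5.0):
        """Cache logits: sharpened similarity of the query to cached
        features, accumulated per pseudo-class."""
        out = np.zeros(num_classes)
        for c, items in self.cache.items():
            for _, feat in items:
                affinity = float(feature @ feat)  # cosine sim for unit vectors
                out[c] += np.exp(-beta * (1.0 - affinity))
        return out
```

The final prediction would then blend both sources, e.g. `clip_logits + alpha * cache.logits(feature, num_classes)` for some mixing weight `alpha`.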


👥 Project by:
Juan Camacho Mohedano •
Andrea De Carlo •
Samuele Bolotta

For setup and base implementation details, refer to the original README_official.md


🚀 Our Contributions

📌 1. Benchmarking on Diverse Datasets

We evaluated TDA on:

  • Class Distribution Shifts (CD)
  • Out-of-Distribution (OOD) test cases

Example: CIFAR-10-C evaluated as a non-i.i.d. stream (result plots are in the Final Presentation).

📄 Code: tda_cd_benchmark.ipynb
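One common way to turn a labeled test set such as CIFAR-10-C into a class-correlated (non-i.i.d.) stream is to draw each incoming sample's class from a Dirichlet prior with a small concentration parameter, so the stream dwells on a few classes at a time. The helper below is our own sketch of that recipe, not code taken from the benchmark notebook.

```python
import numpy as np

def make_non_iid_stream(labels, num_classes, concentration=0.1, seed=0):
    """Return a sample ordering whose class sequence is temporally
    correlated: small `concentration` -> long runs of few classes."""
    rng = np.random.default_rng(seed)
    # Bucket sample indices by class, shuffled within each class.
    by_class = {c: list(np.where(labels == c)[0]) for c in range(num_classes)}
    for c in by_class:
        rng.shuffle(by_class[c])
    order = []
    while any(by_class.values()):
        avail = [c for c in by_class if by_class[c]]
        # Skewed class distribution drawn fresh at each step.
        probs = rng.dirichlet([concentration] * len(avail))
        c = int(rng.choice(avail, p=probs))
        order.append(by_class[c].pop())
    return order
```

Setting `concentration` large (e.g. 100) recovers an approximately i.i.d. stream, which makes the same helper usable for both orderings.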


βš™οΈ 2. Hyperparameter Sensitivity under Budget Constraints

Performance analysis across:

  • Adaptation budgets
  • Stream orderings

πŸ“„ Code: tda_runner_experiments.py
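An adaptation budget can be enforced by capping how many cache writes the runner may perform; once the budget is spent, the model falls back to pure zero-shot inference. The wrapper below is a hypothetical illustration of that mechanism, not the actual interface of `tda_runner_experiments.py`.

```python
class BudgetedAdapter:
    """Gate an arbitrary cache-update callable behind a fixed budget
    of allowed adaptation steps (illustrative sketch)."""

    def __init__(self, budget):
        self.budget = budget
        self.used = 0

    def maybe_update(self, update_fn, *args):
        """Run `update_fn(*args)` only while budget remains.
        Returns True if the update was applied, False otherwise."""
        if self.used < self.budget:
            update_fn(*args)
            self.used += 1
            return True
        return False
```

Sweeping `budget` over, say, {64, 256, 1024, unlimited} for each stream ordering gives the sensitivity curves this section refers to.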


🧩 3. Waiting List Strategy

We added a waiting-list strategy to improve robustness:

  • Helped on ImageNet
  • Less effective on CIFAR-10-C

📄 Code: tda_runner_with_waiting.py
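The waiting-list idea can be sketched as a bounded queue per pseudo-class: a candidate feature is only promoted to the cache once enough samples of the same pseudo-class have accumulated, filtering out one-off noisy predictions. The class below is a hypothetical illustration of that strategy, not the code in `tda_runner_with_waiting.py`.

```python
from collections import defaultdict, deque

class WaitingList:
    """Hold candidate (pseudo-label, feature) pairs in a bounded per-class
    queue; promote them to the cache only once `threshold` candidates of
    the same pseudo-class have accumulated (illustrative sketch)."""

    def __init__(self, threshold=2, maxlen=16):
        self.threshold = threshold
        self.pending = defaultdict(lambda: deque(maxlen=maxlen))

    def offer(self, pseudo_label, feature):
        """Queue a candidate; return the list of features to commit to
        the cache (empty until the threshold is reached)."""
        q = self.pending[pseudo_label]
        q.append(feature)
        if len(q) >= self.threshold:
            promoted = list(q)
            q.clear()
            return promoted
        return []
```

Intuitively, this trades a short delay before adaptation for fewer cache entries built on spurious pseudo-labels, which matches the mixed results above: it pays off when pseudo-labels are noisy and costs adaptation speed when they are already reliable.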


πŸ“ Files Added

File Purpose
tda_cd_benchmark.ipynb Benchmarking on class-distribution shifts
tda_runner_experiments.py Hyperparameter tuning
tda_runner_with_waiting.py Waiting list enhancement

📌 Notes

  • Experiments are reproducible and tested under realistic compute constraints.
  • Visualizations and explanations are in the Final Presentation.

📬 Contact

For inquiries, feel free to reach out to any of us on LinkedIn.


⭐ If you found this useful, consider checking out the official paper and citing it!