Revisiting Hidden Representations in Transfer Learning for Medical Imaging

Research output: Journal article, peer-reviewed

Abstract

While a key component of the success of deep learning is the availability of massive amounts of training data, medical image datasets are often limited in diversity and size. Transfer learning has the potential to bridge the gap between related yet different domains. For medical applications, however, it remains unclear whether it is more beneficial to pre-train on natural or medical images. We aim to shed light on this problem by comparing initialization on ImageNet and RadImageNet on seven medical classification tasks. Our work includes a replication study, which yields results contrary to previously published findings. In our experiments, ResNet50 models pre-trained on ImageNet tend to outperform those trained on RadImageNet. To gain further insights, we investigate the learned representations using Canonical Correlation Analysis (CCA) and compare the predictions of the different models. Our results indicate that, contrary to intuition, models pre-trained on ImageNet and RadImageNet may converge to distinct intermediate representations, which appear to diverge further during fine-tuning. Despite these distinct representations, the predictions of the models remain similar. Our findings show that the similarity between networks before and after fine-tuning does not correlate with performance gains, suggesting that the advantages of transfer learning might not solely originate from the reuse of features in the early layers of a convolutional neural network.
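
The abstract describes comparing hidden representations of differently pre-trained backbones with Canonical Correlation Analysis (CCA). The sketch below is a minimal, rough illustration of that kind of analysis, not the authors' code: it compares spatially pooled activations from one intermediate layer of two ResNet50 backbones using linear CCA. The probe batch, layer name, checkpoint path, PCA dimension, and number of canonical components are all illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of comparing hidden representations
# of two ResNet50 backbones with linear CCA. The RadImageNet checkpoint path,
# probe batch, layer name, and component counts below are hypothetical.
import numpy as np
import torch
import torchvision.models as models
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import CCA

def pooled_activations(model, x, layer_name="layer3"):
    """Spatially averaged activations of one intermediate layer, shape (N, C)."""
    feats = {}
    def hook(_module, _inputs, output):
        feats["a"] = output.mean(dim=(2, 3)).detach()  # global average pool
    handle = dict(model.named_modules())[layer_name].register_forward_hook(hook)
    with torch.no_grad():
        model(x)
    handle.remove()
    return feats["a"].cpu().numpy()

imagenet_net = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1).eval()
radimagenet_net = models.resnet50(weights=None).eval()
# Hypothetical checkpoint; RadImageNet weights must be obtained separately.
# radimagenet_net.load_state_dict(torch.load("radimagenet_resnet50.pt"))

x = torch.randn(512, 3, 224, 224)  # stand-in for a batch of probe images
a = pooled_activations(imagenet_net, x)
b = pooled_activations(radimagenet_net, x)

# Reduce dimensionality before CCA (as in SVCCA-style analyses) so the
# correlation problem stays well conditioned with few probe samples.
a_red = PCA(n_components=64).fit_transform(a)
b_red = PCA(n_components=64).fit_transform(b)

k = 10  # number of canonical directions to compare (illustrative)
a_c, b_c = CCA(n_components=k, max_iter=1000).fit_transform(a_red, b_red)
corrs = [np.corrcoef(a_c[:, i], b_c[:, i])[0, 1] for i in range(k)]
print(f"mean canonical correlation: {np.mean(corrs):.3f}")
```

A mean canonical correlation near 1 would indicate highly aligned representation subspaces; the paper's finding is that such similarity, measured before and after fine-tuning, does not track downstream performance gains.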
Original language: English
Journal: Transactions on Machine Learning Research
ISSN: 2835-8856
Publication status: Published - 15 Sept 2023

Keywords

  • transfer learning
  • hidden representations
  • representation learning
  • medical imaging
