IMPACT OF DATASET SIZE AND TRANSFER LEARNING ON TRUNCATED LIGHTWEIGHT ARCHITECTURE
Rajesh Godasu, David Zeng, Kruttika Sutrave
Lightweight CNN architectures, known for their efficiency and speed without compromising accuracy, play a crucial role in addressing the challenges posed by limited training data and high computational resource demands. These architectures integrate naturally with transfer learning, a pivotal deep learning technique that adapts pre-trained models to new tasks. This study evaluates a multi-stage transfer learning approach across varying dataset sizes and truncated versions of the MobileNetV2 architecture for medical image classification. The results show that simpler models can perform competitively, although more complex models generally deliver higher accuracy. Furthermore, the effect of model complexity on target-task performance tends to diminish as the dataset becomes smaller.
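The sketch below illustrates, in broad strokes, what truncating MobileNetV2 and attaching a new classification head for transfer learning can look like in Keras. It is not the authors' implementation; the cut-point layer name, input size, number of classes, and training hyperparameters are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's code): truncate MobileNetV2
# at an intermediate block and fine-tune a small head for classification.
import tensorflow as tf
from tensorflow.keras import layers, Model

NUM_CLASSES = 2            # assumed binary medical classification task
CUT_LAYER = "block_8_add"  # assumed truncation point inside MobileNetV2

# Load MobileNetV2 pre-trained on ImageNet, without its classifier head.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")

# Truncate the backbone at an intermediate layer.
truncated = Model(inputs=base.input,
                  outputs=base.get_layer(CUT_LAYER).output)
truncated.trainable = False  # freeze pre-trained weights in the first stage

# Attach a small classification head on top of the truncated backbone.
x = layers.GlobalAveragePooling2D()(truncated.output)
x = layers.Dropout(0.2)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = Model(truncated.input, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# In a multi-stage setup, a later stage would unfreeze part of the backbone
# (truncated.trainable = True) and fine-tune with a lower learning rate.
```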