News Release

DEEPSCAN: integrating vision transformers for advanced skin lesion diagnostics

This article by Dr. Tahani Jaser Alahmadi and colleagues is published in The Open Dermatology Journal.

Peer-Reviewed Publication

Bentham Science Publishers

The rising incidence of skin conditions, especially skin cancers, underscores the need for accurate diagnostics. Traditional imaging methods struggle to capture complex skin lesion patterns, leading to potential misdiagnoses. While classical convolutional neural networks (CNNs) are effective, they often miss fine-grained patterns and important global context.

Our research explores the use of Vision Transformers (ViTs) for diagnosing skin lesions, taking advantage of their attention mechanisms and capacity to model global context. We use the fictional Dermatological Vision Dataset (DermVisD), which contains over 15,000 annotated images, to compare ViTs with traditional CNNs. The study aims to evaluate the potential advantages of ViTs in dermatology.
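At a high level, a ViT splits an image into fixed-size patches and lets every patch attend to every other patch, which is how it captures the global context mentioned above. The sketch below is purely illustrative and is not taken from the study: it uses NumPy with randomly initialized weights (all array sizes, such as the 224x224 image and 16-pixel patches, are common ViT defaults assumed here, not details reported by the authors).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def patchify(image, patch):
    # Split an (H, W, C) image into flattened, non-overlapping patches.
    H, W, C = image.shape
    p = image.reshape(H // patch, patch, W // patch, patch, C)
    return p.transpose(0, 2, 1, 3, 4).reshape(-1, patch * patch * C)

def self_attention(tokens, Wq, Wk, Wv):
    # Single-head scaled dot-product attention: every patch (token)
    # attends to every other patch, giving a global receptive field.
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    weights = softmax(Q @ K.T / np.sqrt(K.shape[-1]))
    return weights @ V

rng = np.random.default_rng(0)
image = rng.random((224, 224, 3))        # stand-in for a dermoscopy image
tokens = patchify(image, 16)             # 14*14 = 196 patches, 768 dims each
d = tokens.shape[1]
Wq, Wk, Wv = (rng.standard_normal((d, 64)) * 0.02 for _ in range(3))
out = self_attention(tokens, Wq, Wk, Wv)
print(out.shape)                         # one 64-dim vector per patch
```

A real ViT stacks many such attention layers (with learned weights, positional embeddings, and a classification head), but this single step already shows the structural difference from a CNN: attention mixes information across the whole image in one operation, rather than through a small local kernel.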

Initial experiments show that ViTs improve diagnostic accuracy by 18% over CNNs, reaching 97.8% accuracy on the validation set. These results indicate that ViTs are substantially better at recognizing complex lesion patterns.

Integrating Vision Transformers into dermatological imaging represents a promising step toward more accurate diagnostics. By using global contextual understanding and attention mechanisms, ViTs provide a more detailed approach that could outperform traditional methods. This advancement suggests the potential for setting new accuracy standards in diagnosing skin lesions.

ViTs represent a major advancement in dermatological imaging, with the potential to redefine accuracy and reliability standards. This study highlights the significant impact of ViTs on detecting and diagnosing skin conditions, advocating for their wider use in clinical settings.

Read this article here:

For publishing scholarly articles in Bentham Science journals, please visit:

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.