Classification of Carotid Plaque with Jellyfish Sign Through Convolutional and Recurrent Neural Networks Utilizing Plaque Surface Edges
Takeshi Yoshidomi, Shinji Kume, Hiroaki Aizawa, Akira Furui
TL;DR
The paper tackles automated detection of the Jellyfish sign in carotid plaques from ultrasound videos, a dynamic marker of rupture risk. It introduces a two-stage approach: preprocessing that separates plaque motion from vascular wall motion, followed by a CNN-BiLSTM classifier that consumes two-channel inputs combining the plaque video with surface edge information. Ablation studies show that both the plaque surface channel and bidirectional temporal modeling substantially improve performance, reaching about 80% accuracy on 200 cases, with Grad-CAM++ visualizations indicating that the model focuses on localized plaque features. The method offers an objective, video-driven tool for early identification of high-risk Jellyfish plaques, with potential to support clinical decision making and treatment planning.
Abstract
In carotid arteries, plaque can develop as localized elevated lesions. The Jellyfish sign, marked by a plaque surface that fluctuates with blood-flow pulsation, is a dynamic characteristic of these plaques that has recently attracted attention. Detecting this sign is vital, as it is often associated with cerebral infarction. This paper proposes an ultrasound video-based classification method for the Jellyfish sign using deep neural networks. The proposed method first preprocesses carotid ultrasound videos to separate plaque movement from that of the vascular wall. These preprocessed videos are then combined with plaque surface information and fed into a deep learning model comprising convolutional and recurrent neural networks, enabling efficient classification of the Jellyfish sign. The proposed method was verified using ultrasound videos from 200 patients. Ablation studies demonstrated the effectiveness of each component of the proposed method.
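The abstract describes combining preprocessed plaque video with plaque surface information before feeding it to the CNN-BiLSTM model. A minimal sketch of how such a two-channel input could be assembled is shown below; the gradient-magnitude edge map and all function names are illustrative assumptions, not the authors' actual preprocessing or edge-extraction procedure.

```python
import numpy as np

def edge_map(frame: np.ndarray) -> np.ndarray:
    """Gradient-magnitude edge map, a simple stand-in for the
    plaque-surface edge information described in the paper."""
    gy, gx = np.gradient(frame.astype(np.float32))
    mag = np.hypot(gx, gy)
    return mag / (mag.max() + 1e-8)  # normalize to [0, 1]

def two_channel_input(frames: np.ndarray) -> np.ndarray:
    """Stack each grayscale frame with its edge map, producing a
    (T, 2, H, W) array of the kind a CNN-BiLSTM classifier would
    consume frame by frame."""
    edges = np.stack([edge_map(f) for f in frames])
    return np.stack([frames, edges], axis=1)

# Toy example: 16 frames of 64x64 ultrasound-like noise.
video = np.random.rand(16, 64, 64).astype(np.float32)
x = two_channel_input(video)
print(x.shape)  # (16, 2, 64, 64)
```

In this sketch, a per-frame CNN would encode each (2, H, W) slice into a feature vector, and the BiLSTM would model the sequence of T such vectors in both temporal directions before a final classification layer.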
