Classification of Carotid Plaque with Jellyfish Sign Through Convolutional and Recurrent Neural Networks Utilizing Plaque Surface Edges

Takeshi Yoshidomi, Shinji Kume, Hiroaki Aizawa, Akira Furui

TL;DR

The paper tackles automated detection of the Jellyfish sign in carotid plaques from ultrasound videos, a dynamic marker of rupture risk. It introduces a two‑stage approach: preprocessing to separate plaque motion from wall motion, and a CNN‑BiLSTM classifier that consumes two‑channel inputs combining plaque video with surface edge information. Ablation studies show the plaque surface input and bidirectional temporal modeling substantially improve performance, achieving about 80% accuracy on 200 cases, with Grad‑CAM++ visualizations supporting localized feature focus. This method provides an objective, video‑driven tool for early identification of high‑risk Jellyfish plaques, with potential to aid clinical decision making and treatment planning.

Abstract

In carotid arteries, plaque can develop as localized elevated lesions. The Jellyfish sign, marked by fluctuating plaque surfaces with blood flow pulsation, is a dynamic characteristic of these plaques that has recently attracted attention. Detecting this sign is vital, as it is often associated with cerebral infarction. This paper proposes an ultrasound video-based classification method for the Jellyfish sign, using deep neural networks. The proposed method first preprocesses carotid ultrasound videos to separate the movement of the vascular wall from plaque movements. These preprocessed videos are then combined with plaque surface information and fed into a deep learning model comprising convolutional and recurrent neural networks, enabling the efficient classification of the Jellyfish sign. The proposed method was verified using ultrasound video images from 200 patients. Ablation studies demonstrated the effectiveness of each component of the proposed method.
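The two-channel input described above can be pictured as stacking each preprocessed frame with an edge map of the plaque surface. The following is a minimal, stdlib-only sketch of that idea; the paper does not specify its edge-extraction method here, so the finite-difference gradient used below is purely an illustrative assumption, not the authors' procedure.

```python
# Sketch: pairing each grayscale frame with a surface-edge map to form a
# two-channel input. The gradient-magnitude edge detector is a hypothetical
# stand-in for the paper's plaque-surface edge information.

def gradient_edge_map(frame):
    """Approximate an edge map via finite-difference gradient magnitude."""
    h, w = len(frame), len(frame[0])
    edges = [[0.0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = frame[y][x + 1] - frame[y][x]  # horizontal difference
            gy = frame[y + 1][x] - frame[y][x]  # vertical difference
            edges[y][x] = (gx * gx + gy * gy) ** 0.5
    return edges

def two_channel_input(video):
    """Stack each frame with its edge map: nested lists shaped (T, 2, H, W)."""
    return [[frame, gradient_edge_map(frame)] for frame in video]
```

In the actual model, each (frame, edge-map) pair would then be fed per time step into the CNN feature extractor, whose outputs the bidirectional LSTM aggregates over the video.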

Paper Structure

This paper contains 18 sections, 2 equations, 3 figures, 2 tables.

Figures (3)

  • Figure 1: Overview of the proposed method
  • Figure 2: Schematic diagram of the template matching process
  • Figure 3: Visualization of activation maps using Grad-CAM++. The top row displays the initial frame of the ultrasound video images used as input. The semi-transparent red area in the image marks the region annotated by a sonographer as the lesion area of the Jellyfish sign. In the activation maps, more intense red indicates a greater contribution to the classification.
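Figure 2 refers to a template matching step, which the preprocessing stage uses to separate vascular-wall motion from plaque motion. As a rough illustration only, the sketch below tracks a template across a frame by exhaustive sum-of-squared-differences search; the paper's actual matching criterion and search strategy are not reproduced here, so treat this as an assumption-laden toy version.

```python
# Sketch: exhaustive template matching by minimal sum of squared differences
# (SSD). A wall-region template tracked frame to frame would give the wall's
# displacement, which preprocessing can subtract from apparent plaque motion.

def match_template(frame, template):
    """Return the (y, x) offset in `frame` where `template` fits best (min SSD)."""
    th, tw = len(template), len(template[0])
    best_ssd, best_pos = float("inf"), (0, 0)
    for y in range(len(frame) - th + 1):
        for x in range(len(frame[0]) - tw + 1):
            ssd = sum(
                (frame[y + dy][x + dx] - template[dy][dx]) ** 2
                for dy in range(th)
                for dx in range(tw)
            )
            if ssd < best_ssd:
                best_ssd, best_pos = ssd, (y, x)
    return best_pos
```

Tracking the best-match position of a fixed wall template over successive frames yields a per-frame wall displacement; aligning frames by that displacement would leave only the residual plaque-surface fluctuation of interest.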