In current medical practice, minimally invasive procedures have become the prevailing norm; when they involve vascular structures, they are known as endovascular procedures. However, endovascular procedures such as angioplasty and stenting still face significant challenges due to a lack of high-resolution sensory feedback. These procedures can treat atherosclerotic stenosis, a condition in which plaque build-up in the blood vessels occludes blood flow, potentially leading to severe medical conditions such as ischemic stroke from hypoperfusion or emboli.
During these endovascular procedures, a metal wire mesh called a stent is delivered through the blood vessels to correct the stenosis. When fully deployed, the stent widens the blocked blood vessel and restores blood flow. However, without a direct line of sight or robust sensory feedback, physicians may find it difficult to confirm that the stent has deployed fully and properly. An improperly deployed stent, or one that has been crimped in the center during deployment, could lead to a higher rate of thrombosis, restenosis, or clot embolism.
Sensory feedback that objectively tells surgeons whether the stent has fully opened during endovascular surgery would allow them to remedy a partially deployed stent. The medical imaging techniques currently used to determine the deployed stent's state, such as fluoroscopy and X-ray imaging, have their limitations. Both expose patients to radiation. Fluoroscopy can fail to detect compressions or crimps in the center of the stent at certain angles, and because intraoperative X-ray imaging can only be performed at certain angles, operators may be unable to directly visualize whether the stent is fully open. Although other techniques can compensate for this, they increase the patient's exposure to contrast and radiation.
To overcome these limitations, our team proposes the use of a 3D radio-frequency (RF)-based imaging sensor, which does not require a direct line of sight, together with a new deep neural network (DNN) called StentNet to obtain sensory feedback on the deployed stent's state. The 3D RF-based sensor, operating within the 6.3-8.3 GHz band, uses antennas to transmit and capture modulated signals and constructs a 3D mapping of the radiated space. The DNN then classifies the state of the stent based on changes in the reflected signal obtained from the sensor. During the experiment, the sensor's direct line of sight to the stent was occluded by a cardboard box; to further simulate the occlusion caused by human tissue during surgery, the cardboard box was replaced with a slice of chicken or pork. The proposed solution achieved an overall accuracy of 90% in detecting whether the stent was fully deployed, partially deployed, or deployed with compression in the center, even when the sensor's direct line of sight to the stent was occluded.
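To illustrate the classification step, the sketch below mimics the pipeline in miniature: a 3D reflection map is reduced to a per-slice intensity profile and assigned one of the three deployment states. This is not the authors' StentNet; the synthetic volume generator, the feature, and the nearest-centroid classifier (a simple stand-in for the DNN) are all hypothetical and exist only to show the shape of the problem.

```python
import numpy as np

# The three deployment states the system must distinguish (from the paper).
STATES = ["fully_deployed", "partially_deployed", "center_compressed"]

def synthetic_rf_volume(state, shape=(8, 8, 8), seed=0):
    """Hypothetical stand-in for the sensor's 3D reflection map.
    A fully deployed stent reflects strongly along its whole length,
    a partial deployment reflects weakly everywhere, and a center
    crimp leaves a dip in the middle axial slices."""
    rng = np.random.default_rng(seed)
    vol = rng.normal(0.0, 0.05, shape)   # background noise
    z = shape[2]
    profile = np.ones(z)                  # fully deployed: strong everywhere
    if state == "partially_deployed":
        profile *= 0.4                    # uniformly weak reflection
    elif state == "center_compressed":
        profile[z // 3 : 2 * z // 3] = 0.2  # dip in the middle slices
    return vol + profile[None, None, :]

def axial_profile(vol):
    """Feature vector: mean reflected intensity per axial slice."""
    return vol.mean(axis=(0, 1))

def classify(vol, centroids):
    """Nearest-centroid stand-in for the DNN's 3-way classification."""
    feat = axial_profile(vol)
    dists = {s: np.linalg.norm(feat - c) for s, c in centroids.items()}
    return min(dists, key=dists.get)

# "Train": average the profiles of a few noisy examples per state.
centroids = {
    s: np.mean([axial_profile(synthetic_rf_volume(s, seed=k))
                for k in range(5)], axis=0)
    for s in STATES
}

# Classify one held-out example of each state.
for s in STATES:
    print(s, "->", classify(synthetic_rf_volume(s, seed=99), centroids))
```

A real system would feed the full 3D map into a convolutional network rather than a hand-crafted 1D profile, but the sketch shows why the task is feasible: the three states leave distinguishable signatures in the reflected signal even under additive noise.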
These results serve as a proof of concept for the use of RF-based sensors in detecting the deployed stent's state. They also highlighted two constraints of the proposed method: the effects of different occluders and the limited size of the sensor's output data, both of which restrict the sensing range. Our future work will focus on processing raw antenna signals to increase the sensing range. A cadaver study will also be conducted to simulate a real surgical scene and validate the results.
Written by: Lalithkumar Seenivasan, Mengya Xu, Leonard Yeo, and Hongliang Ren
Reference: Mengya Xu, et al., 'Stent Deployment Detection Using Radio Frequency-Based Sensor and Convolutional Neural Networks' (2020). DOI: 10.1002/aisy.202000092