Published in Nature Scientific Reports
As an analytic pipeline for quantitative imaging feature extraction and analysis, radiomics has grown rapidly in the past decade. Meanwhile, recent advances in deep learning and transfer learning have shown significant potential in quantitative medical imaging, raising the question of whether deep transfer learning features carry predictive information beyond that of radiomics features. In this study, using CT images from Pancreatic Ductal Adenocarcinoma (PDAC) patients recruited at two independent hospitals, we found that most transfer learning features have weak linear relationships with radiomics features, suggesting a potential complementary relationship between the two feature sets. We also evaluated prognostic performance for overall survival using four feature fusion and reduction methods that combine radiomics and transfer learning features, and compared the results with our proposed risk score-based feature fusion method. The risk score-based feature fusion method significantly improved prognostic performance for predicting overall survival in PDAC patients compared to the traditional feature reduction methods used in previous radiomics studies (a 40% increase in area under the ROC curve (AUC), yielding an AUC of 0.84).
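The weak-linear-relationship finding above can be illustrated with a simple cross-correlation check between the two feature sets. The sketch below is illustrative only, not the paper's actual analysis: the function name, the synthetic data, and the 0.3 threshold for "weak" are assumptions for demonstration.

```python
import numpy as np

def cross_correlation_matrix(radiomics, transfer):
    """Pearson correlation between every radiomics feature and every
    transfer learning feature. Both arrays are (n_patients, n_features);
    entry (i, j) is the correlation of radiomics feature i with
    transfer feature j."""
    r = (radiomics - radiomics.mean(axis=0)) / radiomics.std(axis=0)
    t = (transfer - transfer.mean(axis=0)) / transfer.std(axis=0)
    return (r.T @ t) / radiomics.shape[0]

# Synthetic illustration: 50 "patients", 5 radiomics and 8 transfer features.
rng = np.random.default_rng(0)
rad = rng.normal(size=(50, 5))
tl = rng.normal(size=(50, 8))

corr = cross_correlation_matrix(rad, tl)
# Feature pairs with |r| below a chosen threshold (0.3 here, an assumed
# cutoff) would be considered weakly linearly related, and hence the two
# feature sets potentially complementary.
weak_fraction = np.mean(np.abs(corr) < 0.3)
```

A mostly "weak" correlation matrix of this kind is what motivates fusing the two feature sets rather than discarding one in favor of the other.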
At The Hospital for Sick Children, we have an opening for a fully funded MSc student (domestic applicants only) in Machine Learning for Medical Imaging and Medicine, for January 2021 admission to the Institute of Medical Science (IMS) at the University of Toronto. The research project is on AI in Medicine, with an emphasis on radiomics and deep learning for the diagnosis and prognosis of brain tumours, and requires a strong background in statistical analysis and machine learning. The successful candidate may have the option to start as a Research Assistant at SickKids in September 2020 before transitioning to MSc student status in January 2021. If interested, please send your CV and transcripts, along with a list of references, to farzad dot khalvati at utoronto.ca before Aug 23, 2020. The successful candidate will be invited to apply to the School of Graduate Studies at the University of Toronto.
Sensors Special Issue: Deep Learning-Based Imaging and Sensing Technologies for Biomedical Applications (Impact Factor: 3.27)
With the advent of deep learning, Artificial Intelligence (AI) models, including convolutional neural networks (CNNs), have delivered promising results for health monitoring and for the detection and prediction of disease using biomedical imaging and sensing technologies. These technologies help improve overall patient outcomes by providing personalized diagnostics, prognostics, and treatment, improving patients' quality of life. Developing AI models for health monitoring and disease diagnosis and prognosis using imaging and sensing technologies poses unique challenges, including achieving high accuracy, reliability, and explainability of AI results for biomedical applications, and therefore requires customized models that go beyond off-the-shelf, generic AI solutions. To bring together state-of-the-art research, papers reporting novel AI-driven imaging and/or sensing technologies with clinical applications are invited for submission to this Special Issue. The scope of this Special Issue includes, but is not limited to:
- AI-driven advances in biomedical optical imaging/sensing technologies (e.g., optical imaging, optical coherence tomography, near-infrared spectroscopy, diffuse optical spectroscopy) for biomedical applications;
- AI-driven advances in medical image analysis using deep learning for different imaging modalities, including X-ray, CT, MRI, PET, ultrasound, etc.;
- Advances in AI-based solutions for disease diagnosis and prognosis using imaging and/or sensing technologies;
- Advances in AI explainability solutions for imaging and/or sensing technologies that address different aspects of AI explainability, including novel attention map generators as well as ways to interpret the results and integrate them into clinical settings.
Dr. Farzad Khalvati