Two common obstacles limiting the performance of data-driven algorithms in digital histopathology classification tasks are the lack of expert annotations and the narrow diversity of datasets. Multi-instance learning (MIL) can address the former challenge for the analysis of whole slide images (WSI), but its performance is often inferior to full supervision. We show that the inclusion of weak annotations can significantly enhance the effectiveness of MIL while keeping the approach scalable. An analysis framework was developed to process periodic acid-Schiff (PAS) and Sirius Red (SR) slides of renal biopsies. The workflow segments tissues into coarse tissue classes. Handcrafted and deep features were extracted from these tissues and combined using a soft attention model to predict several slide-level labels: delayed graft function (DGF), acute tubular injury (ATI), and Remuzzi grade components. A tissue segmentation quality metric was also developed to reduce the adverse impact of poorly segmented instances. The soft attention model was trained using 5-fold cross-validation on a mixed dataset and tested on the QUOD dataset containing n = 373 PAS and n = 195 SR biopsies. The average ROC-AUC over different prediction tasks was found to be 0.598 ± 0.011, significantly higher than using only ResNet50 (0.545 ± 0.012), only handcrafted features (0.542 ± 0.011), and the state-of-the-art baseline (0.532 ± 0.012). In conjunction with soft attention, weighting tissues by segmentation quality led to a further improvement (AUC = 0.618 ± 0.010). Using an intuitive visualisation scheme, we show that our approach may also be used to support clinical decision making, as it allows pinpointing individual tissues relevant to the predictions.
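To make the aggregation step concrete, the sketch below shows one common way to implement attention-based MIL pooling over per-tissue feature vectors, with per-instance segmentation-quality scores used to down-weight poorly segmented tissues. This is a minimal illustrative example in the style of gated/soft attention MIL (Ilse et al., 2018), not the authors' published code: the class name, feature dimensions (2048 ResNet50 features plus 64 handcrafted features), and the way quality scores enter the attention are all assumptions.

# Hypothetical sketch: soft-attention MIL pooling with segmentation-quality
# weighting. Dimensions and names are illustrative assumptions.
import torch
import torch.nn as nn


class QualityWeightedAttentionMIL(nn.Module):
    """Aggregates per-tissue features into a single slide-level prediction."""

    def __init__(self, in_dim: int = 2048 + 64, attn_dim: int = 128, n_classes: int = 1):
        super().__init__()
        # Soft attention produces one unnormalised score per instance (tissue).
        self.attention = nn.Sequential(
            nn.Linear(in_dim, attn_dim),
            nn.Tanh(),
            nn.Linear(attn_dim, 1),
        )
        self.classifier = nn.Linear(in_dim, n_classes)

    def forward(self, feats: torch.Tensor, quality: torch.Tensor) -> torch.Tensor:
        # feats:   (n_instances, in_dim) concatenated deep + handcrafted features
        # quality: (n_instances,) segmentation-quality scores in [0, 1]
        scores = self.attention(feats).squeeze(-1)            # (n_instances,)
        # Down-weight poorly segmented tissues before normalising the attention.
        scores = scores + torch.log(quality.clamp_min(1e-6))
        weights = torch.softmax(scores, dim=0)                # sums to 1 over the bag
        slide_embedding = (weights.unsqueeze(-1) * feats).sum(dim=0)
        return self.classifier(slide_embedding)               # slide-level logit(s)


# Toy usage: one slide ("bag") containing 12 tissue instances.
model = QualityWeightedAttentionMIL()
feats = torch.randn(12, 2048 + 64)
quality = torch.rand(12)
slide_logit = model(feats, quality)

Because the attention weights are normalised per slide, they can also be read back out to highlight which individual tissues drove a given prediction, which is the mechanism behind the visualisation scheme mentioned in the abstract.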

Original publication

DOI

10.3389/frtra.2024.1305468

Type

Journal article

Journal

Front Transplant

Publication Date

2024

Volume

3

Keywords

Bayesian Neural Network (BNN), computer vision, digital histopathology, kidney transplant, multi-instance learning