Plastic Surgery Research Council



Neural Networks For Nerve Analysis: Streamlining Axon Histomorphometry Using Machine Learning
Alison L. Wong, MD, MSE1, Harsha Malapati, BS1, Michael J. Wong, MD2, Mathieu Boudreau, PhD3, Nicholas von Guionneau, MBBS1, Julien Cohen-Adad, PhD3, Sami Tuffaha, MD1.
1Johns Hopkins Hospital, Baltimore, MD, USA, 2Dalhousie University, Halifax, NS, Canada, 3University of Montreal, Montreal, QC, Canada.

PURPOSE: Historically, axon histomorphometry has required manual thresholding, segmentation, counting, and measurement of axons in representative micrographs. This is a time-intensive and subjective process limited by poor inter- and intra-rater reliability, with reported variability between observers as high as 9%. Deep learning is a subset of machine learning that uses convolutional neural networks, which learn hierarchical image features directly and do not require structured numerical input data. AxonDeepSeg (ADS) is a deep learning program trained on transmission electron micrographs that recognizes axons and performs histomorphometry automatically. We tested whether ADS could be applied to light micrographs to perform axon histomorphometry comparable in reliability to manual analysis.
METHODS: Two representative slices of adult Lewis rat median nerve were prepared with osmium staining and imaged at 100x magnification. At this magnification, a single nerve produced 12-13 micrographs. For manual analysis, micrographs were imported into ImageJ (FIJI package, version 2.0), and a 25 × 25 µm box (625 µm²) at the center of each image was sampled. Thresholding and segmentation were performed manually. Axon diameter and myelin thickness were measured individually; g-ratio was calculated from these values. The same micrographs were imported into ADS (version 0.2.6, open source), which performed thresholding and segmentation as a single function (fig. 1). Axon measurements were generated for the entire micrograph, but only those within the same coordinates as the manual analysis were used. Descriptive statistics were calculated for axon count, g-ratio, and myelin thickness. Agreement in axon count was evaluated using a Bland-Altman plot and the intraclass correlation coefficient (ICC). G-ratio, myelin thickness, and axon diameter were compared using two-sample t-tests, with Cohen's d for effect size.
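The following is a minimal Python sketch of the per-axon g-ratio calculation and the group comparison described above. The file names and column names (e.g. axon_diameter_um, myelin_thickness_um) are illustrative assumptions, not the actual ADS or ImageJ export format.

import numpy as np
import pandas as pd
from scipy import stats

def g_ratio(axon_diameter, myelin_thickness):
    # g-ratio = axon diameter / fiber diameter, where fiber diameter is
    # the axon diameter plus twice the myelin thickness.
    return axon_diameter / (axon_diameter + 2 * myelin_thickness)

def cohens_d(a, b):
    # Cohen's d using the pooled standard deviation.
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)

# Hypothetical per-axon tables: one from manual ImageJ measurement, one from ADS.
manual = pd.read_csv("manual_axons.csv")
ads = pd.read_csv("ads_axons.csv")

for df in (manual, ads):
    df["g_ratio"] = g_ratio(df["axon_diameter_um"], df["myelin_thickness_um"])

# Two-sample t-test and effect size for g-ratio, mirroring the comparison above.
t, p = stats.ttest_ind(ads["g_ratio"], manual["g_ratio"])
d = cohens_d(ads["g_ratio"], manual["g_ratio"])
print(f"g-ratio: t={t:.2f}, P={p:.3g}, Cohen's d={d:.2f}")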

RESULTS: The average time to segment an image in ADS was 20 seconds. There was a total of 25 micrographs for the two nerves. Axon count was 13.72±5.19 (standard deviation) with manual analysis and 13.72±5.78 with ADS. Bland-Altman comparison showed good agreement between the counts, with limits of agreement of -6 to 6 (fig. 2). The ICC was 0.84 (P<0.01), indicating good to excellent agreement. G-ratio was smaller with ADS (0.425±0.10 vs. 0.474±0.10; t=-6.28, P<0.01; d=0.49). Myelin was measured as thicker by ADS than by manual analysis (1.54±0.45 vs. 1.35±0.54; t=4.86, P<0.01; d=0.38). There was no difference in axon diameter (P=0.78).
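For reference, the Bland-Altman limits of agreement reported above are the mean of the paired count differences plus or minus 1.96 standard deviations of those differences. A minimal sketch follows, using made-up example counts rather than the study data.

import numpy as np

def bland_altman_limits(counts_manual, counts_ads):
    # Bias is the mean paired difference; the 95% limits of agreement are
    # bias +/- 1.96 standard deviations of the differences.
    diffs = np.asarray(counts_ads, dtype=float) - np.asarray(counts_manual, dtype=float)
    bias = diffs.mean()
    sd = diffs.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Example with illustrative paired per-image counts (not the study data):
bias, lo, hi = bland_altman_limits([12, 15, 9, 20], [13, 14, 10, 19])
print(f"bias={bias:.2f}, limits of agreement [{lo:.2f}, {hi:.2f}]")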

CONCLUSION: ADS provided axon counts comparable to manual analysis, with thicker myelin measurements leading to a smaller calculated g-ratio, while requiring substantially less time and fewer resources. Though not directly investigated here, ADS is likely more consistent than manual analysis, which has reported misidentification rates as high as 11%. ADS is a promising tool for axon histomorphometry, and future work should focus on further training of the neural network to include additional disease states.

