Machine Learning
October 31, 2023

An Introduction to Machine Learning in Medicine

There is no question that machine learning has revolutionised our world and the way we interact with it. Most of us use it every day and struggle to remember a time before we could ask Alexa to “set an alarm for 6 AM tomorrow morning”. From GPS traffic predictions and spam filtering to security surveillance and fraud detection, machine learning has made our lives faster and more convenient than ever before.

Big Data in the Healthcare Sector

A wide variety of industries have adapted to utilise machine learning, and the healthcare sector is no exception. Healthcare has always possessed a large amount of information, estimated at around 150 exabytes of data (1 exabyte = 10^18 bytes) in 2011. This information takes many forms, including written medical records, medical images, laboratory test results, family histories and genetic databases. Given this volume of data, it is no surprise that machine learning and artificial intelligence have found a place in the healthcare sector.

Applications

Medical Imaging and Diagnosis

The analysis of medical images was one of the first areas targeted when machine learning was introduced into the healthcare sector. Currently, medical images such as ultrasound, CT and MRI scans are evaluated by a trained individual such as a physician, pathologist or radiologist, who draws on years of training and experience to evaluate them consistently and make diagnoses. Even so, the process is prone to human error and can become expensive if multiple professionals are consulted.

Machine learning is being applied to this facet of medicine with the intention of helping medical professionals improve the accuracy and speed of diagnoses.

A 2017 survey of 308 papers on the application of deep learning in medical image analysis found it being applied to images of the brain, the eye, the chest, digital pathology/microscopy, the breast, the heart, the abdomen and the skeleton. In each instance, deep learning was used to produce models capable of making diagnoses, and the survey found that end-to-end trained convolutional neural networks (CNNs) have become the preferred approach for these analyses.
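
To make the idea concrete, the sketch below shows the general shape of an end-to-end trained CNN for a binary imaging task. The architecture, input size and class labels are illustrative assumptions, not the models used in the surveyed papers.

```python
# A minimal sketch of an end-to-end trained CNN for binary image
# classification (e.g. "lesion" vs "no lesion"). Architecture and
# image size are illustrative assumptions only.
import torch
import torch.nn as nn

class SmallMedicalCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # greyscale input, e.g. one CT slice
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 128x128 -> 64x64
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 64x64 -> 32x32
        )
        self.classifier = nn.Linear(32 * 32 * 32, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = SmallMedicalCNN()
dummy_batch = torch.randn(4, 1, 128, 128)   # 4 fake 128x128 greyscale scans
logits = model(dummy_batch)                  # shape: (4, 2)
```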

One of the main problems identified in the survey was the lack of large labelled datasets on which to fit the models. As previously mentioned, raw data is plentiful; the bottleneck is labelling, which requires consultation with specialists who are not always immediately available.

Natural Language Processing of Medical Documents and Literature

Natural language processing (NLP) is a branch of machine learning concerned with the interaction between humans and computers through natural language. The objective of NLP is to read and make sense of human language in a manner that produces value.

Since the healthcare sector is in the business of people, it is no surprise that NLP is readily applied to its problems. Two specific applications of NLP are the identification of cirrhosis patients from electronic health records and radiological scans, and the identification of reportable cases of cancer from pathology reports. The former produced models with 95.7% sensitivity and 93.9% specificity, and the latter produced models with 87.2% accuracy and 84.3% precision.
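
The sketch below shows, under heavy simplification, the kind of text-classification pipeline behind such results: TF-IDF features feeding a linear classifier, scored by sensitivity and specificity. The example notes and labels are invented for illustration; the cited studies used far larger, de-identified clinical corpora and their own model architectures.

```python
# Toy NLP pipeline: TF-IDF features + logistic regression,
# evaluated with sensitivity and specificity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.metrics import confusion_matrix

notes = [
    "liver surface nodular, findings consistent with cirrhosis",
    "no evidence of chronic liver disease",
    "ascites and portal hypertension noted, cirrhotic liver",
    "normal abdominal ultrasound",
]
labels = [1, 0, 1, 0]  # 1 = cirrhosis documented, 0 = not (invented labels)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(notes, labels)

pred = model.predict(notes)
tn, fp, fn, tp = confusion_matrix(labels, pred).ravel()
sensitivity = tp / (tp + fn)   # true positive rate
specificity = tn / (tn + fp)   # true negative rate
print(sensitivity, specificity)
```

Sensitivity and specificity are the natural metrics here: in a screening setting, missed patients (false negatives) and false alarms (false positives) carry very different clinical costs.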

Machine Learning in Clinical Genomics

Clinical genomics is the study of clinical outcomes with genomic data. Specifically, it is the field of study that utilises the entire genome of a patient in order to diagnose a disease or adjust medications. The human genome refers to the entire set of nucleic acid sequences encoded as DNA, estimated to be roughly 1.5 gigabytes of data. Machine learning, with its capacity to process large amounts of data, has been adapted to address various steps involved in clinical genomic analysis, as explained in this article.

These steps include variant calling, genome annotation, variant classification and phenotype-genotype mapping. Each of these processes is discussed below.

Variant Calling

Variant calling is the process of identifying individual genetic variants among the millions populating each genome, and it requires extreme accuracy. DeepVariant, a CNN-based variant caller, recently outperformed standard tools on a variety of variant-calling tasks.
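
As a rough illustration of the idea, the toy sketch below encodes a window of aligned reads around a candidate site as an image-like tensor and classifies it into one of three genotypes. The tensor shape, channels and tiny network are assumptions made purely for illustration, not DeepVariant itself.

```python
# Toy illustration of CNN-based variant calling: classify a pileup
# window around a candidate site into a genotype.
import torch
import torch.nn as nn

GENOTYPES = ["hom_ref", "het", "hom_alt"]

# (batch, channels, reads, window width) - made-up pileup encoding
pileup = torch.randn(1, 6, 100, 221)

caller = nn.Sequential(
    nn.Conv2d(6, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),        # pool to one value per channel
    nn.Flatten(),
    nn.Linear(16, len(GENOTYPES)),  # one score per genotype
)

probs = caller(pileup).softmax(dim=1)
print(GENOTYPES[int(probs.argmax())])
```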

Genome Annotation

Genome annotation is the process of identifying the locations and functions of genes and coding regions in the genome. A recurrent neural network (RNN) with long short-term memory (LSTM) called DeepAnnotator was trained for this task and achieved an F-score of 94%, outperforming existing methods.
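
A minimal sketch of the underlying technique, per-base sequence labelling with an LSTM, is shown below. The one-hot encoding, short window and two-label scheme (coding vs non-coding) are simplifying assumptions, not the DeepAnnotator architecture.

```python
# Toy per-position DNA labelling with a bidirectional LSTM.
import torch
import torch.nn as nn

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

def one_hot(seq: str) -> torch.Tensor:
    x = torch.zeros(len(seq), 4)
    for i, b in enumerate(seq):
        x[i, BASES[b]] = 1.0
    return x

class Annotator(nn.Module):
    def __init__(self, hidden: int = 32, num_labels: int = 2):
        super().__init__()
        self.lstm = nn.LSTM(4, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, num_labels)  # per-position label scores

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out)

seq = "ATGGCGTACGATTAG"
scores = Annotator()(one_hot(seq).unsqueeze(0))  # shape: (1, len(seq), 2)
```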

Variant Classification

Variant classification is the process of inferring the impact of genetic variants on functional genomic elements. LEAP, or Learning from Evidence to Assess Pathogenicity, is a machine learning model built for this exact task. The model is a random forest classifier that achieved an area under the receiver operating characteristic curve (AUROC) of 98%.
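
Because a random forest scored by AUROC is a standard scikit-learn workflow, a hedged sketch of a LEAP-style setup is easy to show; the feature names and synthetic data below are invented purely for illustration.

```python
# Random forest over per-variant evidence features, scored with AUROC.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# invented columns: allele frequency, conservation score, splice impact
X = rng.random((500, 3))
y = (X[:, 1] + 0.3 * rng.standard_normal(500) > 0.5).astype(int)  # 1 = pathogenic

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

auroc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"AUROC: {auroc:.2f}")
```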

Phenotype-Genotype Mapping

Finally, there is phenotype-genotype mapping: the process of linking a patient's observable characteristics (phenotype) to the genome (genotype) obtained from the preceding analyses, so that a diagnosis can be justified by the genetic findings. This is the most complex step, owing to the significant genetic and symptomatic variation that is possible. DeepGestalt, a CNN-based facial image analysis algorithm, has been used in parallel with PEDIA, a genome interpretation system. This combination was able to outperform human dysmorphologists in the clinical diagnosis of Noonan syndrome, a testament to the consistency of machine learning.

Challenges and Limitations

Machine learning is clearly becoming integrated into the healthcare sector, with impressive results. It is important, however, that this integration be done responsibly and conservatively.

Regulatory Issues  

Due to the nature of healthcare and healthcare data, these algorithms raise a number of ethical challenges around the sourcing and privacy of the data used for training. Furthermore, they are met with scepticism over their lack of transparency, the regulatory process for retraining them, and the liability associated with prediction errors.

AI Interpretability

These machine learning systems are often criticised for their ‘black-box’ nature: they generate an output with no justification. Because clinical diagnosis is typically a high-risk environment, it would be ideal for such systems to provide a human-interpretable explanation for each prediction.
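
One simple way to provide such an explanation, shown below for a linear model, is to report each feature's contribution (coefficient times feature value) alongside the prediction. The features and data are invented for illustration; deep models require dedicated tools such as saliency maps or SHAP values.

```python
# Per-prediction explanation for a linear model: coefficient * value.
import numpy as np
from sklearn.linear_model import LogisticRegression

features = ["age", "blood_pressure", "biomarker_x"]  # invented features
X = np.array([[54, 130, 2.1], [61, 160, 3.4], [45, 118, 1.2], [70, 150, 2.9]])
y = np.array([0, 1, 0, 1])

clf = LogisticRegression(max_iter=1000).fit(X, y)

patient = X[1]
contributions = clf.coef_[0] * patient   # per-feature contribution to the log-odds
for name, value in zip(features, contributions):
    print(f"{name}: {value:+.3f}")
print("predicted risk:", clf.predict_proba([patient])[0, 1])
```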

Data and Machine Bias

Health data contains a fair amount of substructure, or bias, associated with risk factors and outcomes. These biases can result from socioeconomic status, cultural practices, unequal representation and other non-causal factors, and they can adversely affect model performance and generalisability. It is therefore important to ensure that machine learning models in this sector are specifically evaluated for forms of non-causal bias.
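
One basic form of such an evaluation is a subgroup check: compare the model's performance across groups defined by a non-clinical attribute and flag any gaps before deployment. The sketch below uses an invented "site" column and toy predictions.

```python
# Compare model accuracy across subgroups to surface performance gaps.
import pandas as pd
from sklearn.metrics import accuracy_score

results = pd.DataFrame({
    "site":  ["urban", "urban", "rural", "rural", "rural", "urban"],
    "label": [1, 0, 1, 0, 1, 1],
    "pred":  [1, 0, 0, 0, 0, 1],
})

for site, group in results.groupby("site"):
    acc = accuracy_score(group["label"], group["pred"])
    print(f"{site}: accuracy = {acc:.2f} (n = {len(group)})")
```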

Conclusion

It is clear that machine learning has thrived on the significant amount of data generated by the healthcare sector. A large number of machine learning models have been built and applied to different facets of medicine with impressive results; these areas include medical imaging and diagnosis, natural language processing of medical documents, and clinical genomics. These algorithms have mostly outperformed current state-of-the-art methods but remain subject to scrutiny over regulation, interpretability and possible bias. Finally, although the technical development of machine learning has taken root in medicine, it will become increasingly important to develop the legislation needed to allow for full integration.