Artificial Intelligence: From Science Fiction to an Everyday Tool

Written by Nascent Transcript writer Jennifer Zieba

The concept of Artificial Intelligence (AI) emerged in the mid-1950s, when John McCarthy and Marvin Minsky used the term to describe machines that could do more than carry out simple input calculations. Now, AI and advanced machine learning techniques have not only become ubiquitous in our daily lives, e.g., traffic prediction and personalized news feeds, but have also become an important tool in the advancement of science and medicine. Knowing how and when to use these tools is a skill that will be in ever greater demand as the technology advances.

What is AI?

Although AI still sounds futuristic, it is a relatively general term that refers to any machine that can perform human-like cognitive tasks. Scientists have been building on the concept of AI for decades, designing advancements like machine learning, neural networks, and deep learning.

While a basic AI system simply carries out the rules it was programmed with, machine learning refers to an AI system that can learn to make predictions from a given data set. As the field advanced, neural networks were designed as machine learning algorithms built from multiple interconnected nodes, similar to the neurons in a human brain. Finally, deep learning refers to an advancement of both machine learning and neural networks that stacks many layers of these nodes so the system can analyze input data with far greater power. Unlike simpler machine learning approaches, deep learning can extract the relevant features and make intelligent decisions on its own based on that data.
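To make the idea of "learning to make predictions from a given data set" concrete, here is a minimal sketch of one of the simplest machine learning methods, a nearest-neighbor classifier. The data and labels are invented for illustration; real biological applications use far larger data sets and more sophisticated models.

```python
# Toy illustration of learning from data: a 1-nearest-neighbor classifier.
# It predicts the label of a new sample by finding the most similar
# example in a labeled training set.

def predict_1nn(training_data, query):
    """Return the label of the training point closest to `query`."""
    def distance(a, b):
        # Squared Euclidean distance between two feature vectors
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = min(training_data, key=lambda item: distance(item[0], query))
    return nearest[1]

# Hypothetical two-feature measurements, each labeled "A" or "B"
training = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
            ((4.0, 4.2), "B"), ((3.8, 4.0), "B")]

print(predict_1nn(training, (1.1, 0.9)))  # predicts "A"
print(predict_1nn(training, (4.1, 4.1)))  # predicts "B"
```

The "learning" here is simply memorizing labeled examples; deep learning replaces this lookup with many layers of tunable computation, but the core idea of generalizing from a training set is the same.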

Where is AI used?

One of the first modern uses of machine learning in medicine was to process histopathological images derived from cancer patients. In the last decade, these algorithms have been used to consistently detect disease progression and survivability in numerous cancers, including non-small cell lung cancer, pulmonary adenocarcinomas, and cervical cancers. More recently, several groups have combined histopathology and transcriptome data to predict patient survival, some with more than 80% accuracy.

As AI has advanced, so has its use in image processing. Last year, for example, researchers utilized deep learning to predict organelle distribution in cells. After being trained on hundreds of fluorescently tagged images, the program can now identify cell types using simple brightfield microscopy, an advancement that saves both time and resources.

The field of genetics has also begun to utilize the power of deep learning. A recent paper by Allen et al. outlines deep learning methods to accurately predict the mutations generated using CRISPR-Cas9. These methods can also predict which gRNA sequence would be needed to repair a specific human mutation, something that could eventually be used in a clinical setting.

Although single-cell RNA sequencing has become quite common, separating the signal from the noise continues to present a challenge. In another example from this year, two groups developed deep learning algorithms that can extract more information from these data sets by eliminating artifacts and unwanted biological variation while also combining and comparing data sets from different sources. This will make single-cell sequencing a more accurate option going forward.

How can I use AI?

AI still has a ways to go before it is fully adopted by the scientific community. However, there are resources currently available for researchers working with large biological data sets, which can be found in the citations below and in places like GitHub. Online classes through Coursera, for example, can help scientists introduce themselves to the concepts of machine and deep learning. Advancements in AI are moving quickly, and within the next decade, AI will be utilized in some way by most researchers. So, it's time to get on board!
