New AI research method predicts metastatic risk in skin cancers using a generative neural network in combination with supervised machine learning


Researchers have developed an artificial intelligence program that predicts which skin cancers are likely to be highly metastatic.

The study, published in Cell Systems, explores how AI systems can transform the pathology of cancer and other diseases. Researchers at UT Southwestern Medical Center have developed a method that uses artificial intelligence to predict which skin cancers are likely to be highly metastatic. They reverse-engineered a conventional autoencoder neural network to recognize the cellular properties that distinguish aggressive metastatic melanoma from less aggressive melanoma in unlabeled images of living cells. This was possible only by amplifying the cell images in silico along the features that define metastatic efficiency, differences too subtle to detect in the raw images. The approach was validated by comparing its predictions of metastatic efficiency against the actual spread of newly xenografted melanoma cell lines in mice.

Study leader Gaudenz Danuser, PhD, Professor and Chair of the Lyda Hill Department of Bioinformatics at UTSW, said that the team has now developed a general framework that allows them to take tissue samples and probe the mechanisms inside cells that cause disease. He added that these insights are currently inaccessible by any other means.

AI technology has advanced remarkably in recent years. Danuser said that tools based on deep learning and AI can distinguish differences in images that are invisible to the human eye.

Researchers have suggested that AI could probe various properties of a disease to inform diagnoses or guide treatment plans. However, Danuser noted that the differences AI picks up on generally cannot be traced back to specific cellular traits, which makes AI difficult to use in the clinic.

To address this problem, Danuser and his research team used AI to find differences between images of melanoma cells with high and low metastatic potential. The researchers then reverse-engineered the AI results to identify the image features responsible for those differences.

The researchers filmed approximately 12,000 randomly selected cells in petri dishes, taken from tumor samples of seven patients whose disease progression, including metastasis, was known. This generated approximately 1,700,000 raw images. The researchers then used an AI algorithm to extract 56 distinct abstract digital features from the images.
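The pipeline described above, encoding unlabeled images into a small set of abstract latent features and then amplifying a discriminative feature to make it visible, can be sketched in code. The following is a hypothetical illustration only: it uses a linear autoencoder (PCA via SVD) as a stand-in for the paper's neural network, and random vectors as stand-ins for cell images; none of the names or numbers come from the study itself.

```python
import numpy as np

# Hypothetical sketch: a linear autoencoder (PCA via SVD) stands in for the
# paper's conventional autoencoder; "images" are random flattened vectors.
rng = np.random.default_rng(0)
images = rng.normal(size=(200, 64))   # 200 fake cell images, 64 "pixels" each

# "Encoder": the top-k principal directions learned from the data
mean = images.mean(axis=0)
_, _, vt = np.linalg.svd(images - mean, full_matrices=False)
components = vt[:8]                   # 8 latent features (the study found 56)

def encode(x):
    """Map an image to its latent feature vector."""
    return (x - mean) @ components.T

def decode(z):
    """Reconstruct an image from a latent feature vector."""
    return z @ components + mean

# In-silico amplification: exaggerate one latent feature (here feature 0,
# standing in for whichever feature separates high- and low-metastatic cells)
z = encode(images[0])
z_amplified = z.copy()
z_amplified[0] *= 3.0                 # amplify the discriminative feature
exaggerated_image = decode(z_amplified)
```

Amplifying a single latent coordinate before decoding exaggerates only the visual trait that coordinate encodes, which is the core idea behind making an otherwise invisible difference visible.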

One of these features reliably distinguished cells with high metastatic potential from those with low potential. Using it, the researchers generated artificial images that amplified this innate visual hallmark of metastasis, which is invisible to the human eye.

The highly metastatic cells formed slightly more pseudopodial extensions and exhibited increased light scattering, an effect that could be due to subtle rearrangements of the cell membrane.

To further demonstrate the tool's utility, the researchers predicted the metastatic potential of cells from human melanomas that had been frozen and grown in petri dishes for 30 years, and then implanted them into mice. Cells predicted to be highly metastatic spread readily in all of the animals, while those predicted to have low metastatic potential spread little or not at all.

Danuser cautioned that this approach needs further validation before it can be implemented in clinical care. However, he added that it may be possible to use AI to uncover the distinguishing characteristics of other cancers and diseases.

Research Paper: