Researchers at UT Southwestern Medical Center have developed a novel artificial intelligence (AI) model that analyzes the spatial arrangement of cells in tissue samples. This new approach, detailed in Nature Communications, accurately predicted outcomes for cancer patients, marking a significant advancement in utilizing AI for cancer prognosis and personalized treatment strategies.

“Cell spatial organization is like a complex jigsaw puzzle where each cell serves as a unique piece, fitting together meticulously to form a cohesive tissue or organ structure. This research showcases the remarkable ability of AI to grasp these intricate spatial relationships among cells within tissues, extracting subtle information previously beyond human comprehension while predicting patient outcomes,” says study leader Guanghua Xiao, PhD, professor in the Peter O’Donnell Jr. School of Public Health, Biomedical Engineering, and the Lyda Hill Department of Bioinformatics at UT Southwestern. Xiao is a member of the Harold C. Simmons Comprehensive Cancer Center at UTSW.

Tissue samples are routinely collected from patients and placed on slides for interpretation by pathologists, who analyze them to make diagnoses. However, Xiao explained, this process is time-consuming, and interpretations can vary among pathologists. In addition, the human brain can miss subtle features present in pathology images that might provide important clues to a patient’s condition.


Various AI models built in the past several years can perform some aspects of a pathologist’s job, Xiao added—for example, identifying cell types or using cell proximity as a proxy for interactions between cells. However, these models don’t successfully recapitulate more complex aspects of how pathologists interpret tissue images, such as discerning patterns in cell spatial organization and excluding extraneous “noise” in images that can muddle interpretations.

The new AI model, which Xiao and his colleagues named Ceograph, mimics how pathologists read tissue slides, starting with detecting cells in images and their positions. From there, it identifies cell types as well as their morphology and spatial distribution, creating a map in which the arrangement, distribution, and interactions of cells can be analyzed.
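The graph-building step described above can be illustrated with a minimal sketch. This is not Ceograph's actual implementation (the paper details its graph construction and learned features); it simply shows the general idea of linking detected cells to their nearest neighbors and summarizing which cell types sit next to each other. All function names and the toy data are hypothetical.

```python
import math
from collections import defaultdict

def build_cell_graph(cells, k=2):
    """Hypothetical sketch: connect each detected cell to its k nearest
    neighbors by centroid distance, producing an undirected spatial graph.
    cells: list of (x, y, cell_type) tuples from a detection step.
    Returns adjacency as {cell index: set of neighbor indices}."""
    graph = defaultdict(set)
    for i, (xi, yi, _) in enumerate(cells):
        # Rank all other cells by Euclidean distance to cell i
        dists = sorted(
            (math.hypot(xi - xj, yi - yj), j)
            for j, (xj, yj, _) in enumerate(cells) if j != i
        )
        for _, j in dists[:k]:
            graph[i].add(j)
            graph[j].add(i)  # undirected edge
    return dict(graph)

def edge_type_counts(cells, graph):
    """Count edges by the pair of cell types they connect, e.g.
    ('lymphocyte', 'tumor') -- a simple spatial-interaction feature
    of the kind a model could learn from."""
    counts = defaultdict(int)
    for i, nbrs in graph.items():
        for j in nbrs:
            if i < j:  # count each undirected edge once
                pair = tuple(sorted((cells[i][2], cells[j][2])))
                counts[pair] += 1
    return dict(counts)

# Toy example: three cells clustered together, one farther away
cells = [(0, 0, "tumor"), (1, 0, "tumor"),
         (0, 1, "lymphocyte"), (5, 5, "stroma")]
g = build_cell_graph(cells, k=2)
features = edge_type_counts(cells, g)
```

In Ceograph's actual pipeline, a graph like this feeds a neural network that learns outcome-predictive patterns; the sketch stops at the hand-crafted edge-count summary for clarity.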

The researchers successfully applied this tool to three clinical scenarios using pathology slides. In one, they used Ceograph to distinguish between two subtypes of lung cancer, adenocarcinoma and squamous cell carcinoma. In another, they predicted the likelihood of potentially malignant oral disorders—precancerous lesions of the mouth—progressing to cancer. In the third, they identified which lung cancer patients were most likely to respond to a class of medications called epidermal growth factor receptor inhibitors.

In each scenario, the Ceograph model significantly outperformed traditional methods in predicting patient outcomes. Importantly, the cell spatial organization features identified by Ceograph are interpretable and yield biological insights into how changes in individual cell-cell spatial interactions could produce diverse functional consequences, says Xiao.

These findings highlight a growing role for AI in medical care, he added, offering a way to improve the efficiency and accuracy of pathology analyses. “This method has the potential to streamline targeted preventive measures for high-risk populations and optimize treatment selection for individual patients,” says Xiao, a member of the Quantitative Biomedical Research Center at UT Southwestern.

Shidan Wang, PhD, a former faculty member in the O’Donnell School of Public Health, was the study’s first author. Other UTSW researchers who contributed to the research are Yang Xie, PhD, professor in the O’Donnell School of Public Health and the Lyda Hill Department of Bioinformatics; John Minna, MD, professor of Internal Medicine and Pharmacology and director of the Hamon Center for Therapeutic Oncology Research; Justin Bishop, MD, professor of Pathology; Xiaowei Zhan, PhD, associate professor in the O’Donnell School of Public Health and the Center for the Genetics of Host Defense; Siyuan Zhang, MD, PhD, associate professor of Pathology; Ruichen Rong, PhD, and Donghan M. Yang, PhD, assistant professors in the O’Donnell School of Public Health; Zhikai Chi, MD, PhD, assistant professor of Pathology; Qin Zhou, PhD, postdoctoral researcher; and Xinyi Zhang, PhD, graduate student researcher.

Xiao holds the Mary Dees McDermott Hicks Chair in Medical Science.

This study was funded by grants from the National Institutes of Health (P50CA070907, R01GM140012, R01DE030656, R01GM115473, 1U01CA249245, R35GM136375, P30CA008748, and P30CA142543) and the Cancer Prevention and Research Institute of Texas (CPRIT RP180805 and CPRIT RP230330).

Featured image: This illustration shows how a new AI model named Ceograph, created by Dr. Guanghua Xiao and his team, analyzes raw tissue images (left panel) and uses advanced algorithms to distinguish various cell types (middle panel). It then constructs a complex graph (right panel) that maps the shapes and spatial relationships among cells. This graph becomes the foundation for a neural network (blue spheres) that learns patterns and predicts patient outcomes (red sphere). Ceograph shows promise in predicting the likely course of a patient’s cancer, which could help determine the best treatment plan. Photo: UT Southwestern