AI Tool FaceAge Predicts Biological Age, Cancer Survival from Selfies

WASHINGTON, United States — A new deep learning algorithm called FaceAge estimates a person’s biological age from a simple facial photograph, a measure that may reflect health more closely than chronological age and shows potential for medical decision-making and cancer care.

Doctors often begin patient examinations with an intuitive assessment known as the “eyeball test,” a quick judgment about whether a patient appears older or younger than their stated age. That snap judgment can influence key medical decisions, and it may soon get an artificial intelligence upgrade. FaceAge, a deep learning algorithm described Thursday in The Lancet Digital Health, converts a simple headshot photograph into a numerical estimate of biological age, one that may reflect a patient’s health more accurately than the birthday recorded on their chart.
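At its core, a tool of this kind is a regression model that maps a face image to a single number. The sketch below illustrates that general idea only; the backbone, preprocessing, and file name are assumptions chosen for illustration, not the published FaceAge architecture, and such a model would have to be trained on age-labeled portraits before its output meant anything.

```python
# Minimal sketch of a face-to-age regression pipeline (illustrative only;
# NOT the published FaceAge model or its training setup).
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

class AgeRegressor(nn.Module):
    """CNN backbone with a single-output head predicting a continuous age."""
    def __init__(self):
        super().__init__()
        backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
        backbone.fc = nn.Linear(backbone.fc.in_features, 1)  # regress one number
        self.net = backbone

    def forward(self, x):
        return self.net(x).squeeze(-1)  # predicted biological age in years

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),          # assumes the face is roughly centered
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = AgeRegressor().eval()            # untrained here; output is illustrative
img = preprocess(Image.open("headshot.jpg").convert("RGB")).unsqueeze(0)  # hypothetical file
with torch.no_grad():
    face_age = model(img).item()
print(f"Estimated biological age: {face_age:.1f} years")
```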

Predicting Biological Age and Survival

FaceAge was trained on tens of thousands of portrait photographs. When applied to cancer patients, the tool pegged them on average as biologically five years older than their healthy peers. The study’s authors suggest this capability could help doctors decide which patients can safely tolerate punishing treatments and which might fare better with a gentler approach.

Raymond Mak, an oncologist at Mass General Brigham, a Harvard-affiliated health system in Boston, and a co-senior author of the study, described the potential application in cancer care: “We hypothesize that FaceAge could be used as a biomarker in cancer care to quantify a patient’s biological age and help a doctor make these tough decisions.” The researchers offer hypothetical examples to illustrate the concept: a spry 75-year-old whose biological age is assessed at 65 by FaceAge versus a frail 60-year-old whose biology reads 70.

Aggressive radiation might be appropriate for the former but risky for the latter. The same logic, the researchers suggest, could help guide decisions about major medical procedures such as heart surgery or hip replacements, or inform discussions about end-of-life care.

Sharper Lens on Frailty

Growing evidence indicates that humans age at different rates, with the pace influenced by various factors, including genes, stress levels, exercise habits, and lifestyle choices like smoking or drinking. While expensive genetic tests can provide insights into how DNA ages over time, FaceAge reportedly promises similar insights into a person’s biological age using only a selfie photograph. The model was trained on 58,851 portraits of presumed-healthy adults over the age of 60, with the images culled from publicly available datasets.

It was then tested on a group of 6,196 cancer patients treated in the United States and the Netherlands, using photos taken just before they underwent radiotherapy. The study found that patients with malignancies on average looked 4.79 years older biologically than their chronological age, according to FaceAge. Among cancer patients, a higher FaceAge score strongly predicted worse survival outcomes, even after accounting for the patient’s actual age, sex, and tumor type.

The hazard of worse survival rose steeply for anyone whose biological reading from FaceAge tipped past 85. Intriguingly, FaceAge appears to weigh the signs of aging differently than humans do, giving less importance to visible markers like gray hair or balding than to subtle changes in facial muscle tone.
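For readers curious how such an adjustment is typically done, the sketch below shows one conventional approach: a Cox proportional-hazards model with the FaceAge estimate entered alongside chronological age, sex, and tumor type. The column names and input file are hypothetical, and this is not the authors’ analysis code.

```python
# Illustrative sketch (not the study's analysis code): testing whether a
# FaceAge estimate predicts survival after adjusting for chronological age,
# sex, and tumor type, using a Cox proportional-hazards model.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical columns: follow-up time in months, death indicator (1 = died),
# face_age, chronological_age, sex, and tumor_type.
df = pd.read_csv("cohort.csv")
df = pd.get_dummies(df, columns=["sex", "tumor_type"], drop_first=True)

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_months", event_col="died")

# The hazard ratio for 'face_age' indicates how much risk rises per additional
# predicted year of biological age, holding the other covariates fixed.
cph.print_summary()
```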

Boosting Doctor Accuracy

The study also suggests that FaceAge could boost doctors’ accuracy in making certain predictions. Eight physicians were asked to examine headshots of terminal cancer patients and guess who would die within six months. Their success rate in this task reportedly barely beat chance. However, when provided with FaceAge data, their predictions improved sharply, indicating the AI tool’s potential to enhance clinical assessment. The model even reportedly affirmed a favorite internet meme, estimating actor Paul Rudd’s biological age as 43 in a photo taken when he was 50.

Bias and Ethics Guardrails

AI tools have faced scrutiny for potential bias, particularly in underserving non-white people. Mak said preliminary checks revealed no significant racial bias in FaceAge’s predictions, but the group is training a second-generation model on 20,000 patients. They are also probing factors like makeup, cosmetic surgery, or room lighting variations that could potentially fool the system. Ethics debates loom large. An AI that reads biological age from a selfie is a boon for clinicians but tempting for life insurers or employers seeking to gauge risk. Hugo Aerts, the study’s co-lead, stated, “It is for sure something that needs attention, to ensure that these technologies are used only for the benefit of the patient.”

Future Plans

Researchers are planning to open a public-facing FaceAge portal. This will allow people to upload pictures and enroll in a research study to further validate the algorithm. Commercial versions aimed at clinicians may follow, but only after more validation.
