A.I. Can Already See You in Ways You Can’t See Yourself
June 17, 2025
By Charley Locke. Art and animation by Daniel Savage.
Picture a human face. The eyebrows are furrowed and the eyes wide. The lips are pressed together, turning downward at one corner.
When A.I. sees this face, it evaluates the features using a combination of metrics: how wrinkled the nose is, how squinted the eyes are, whether the jaw is clenched. Then it correlates those features with a range of emotions along with other states such as confusion and engagement.
The conclusion: This expression is associated with anger, sadness and surprise.
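The kind of feature-to-emotion mapping described above can be sketched in a few lines. This is a toy illustration, not any real product's code: the feature names, measurements and weights are all invented for the example.

```python
# Hypothetical facial "action" measurements, each normalized to 0.0-1.0.
features = {
    "brow_furrow": 0.8,
    "eye_openness": 0.9,
    "lip_press": 0.7,
    "mouth_corner_down": 0.6,
}

# Hand-picked weights linking features to emotions (illustrative only;
# real systems learn these associations from large labeled data sets).
weights = {
    "anger":    {"brow_furrow": 0.9, "lip_press": 0.6},
    "sadness":  {"mouth_corner_down": 0.8, "brow_furrow": 0.3},
    "surprise": {"eye_openness": 0.9},
}

def emotion_scores(feats):
    """Weighted sum of feature activations for each emotion label."""
    return {
        emotion: sum(w * feats.get(f, 0.0) for f, w in fw.items())
        for emotion, fw in weights.items()
    }

scores = emotion_scores(features)
# The three labels echo the article's conclusion for this face:
print(sorted(scores, key=scores.get, reverse=True))
# → ['anger', 'surprise', 'sadness']
```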
Not only can A.I. now make these assessments with remarkable, humanlike accuracy; it can make millions of them in an instant. A.I.’s superpower is its ability to recognize and interpret patterns: to sift through raw data and, by comparing it across vast data sets, to spot trends, relationships and irregularities.
As humans, we constantly generate patterns: in the sequence of our genes, the beating of our hearts, the repetitive motion of our muscles and joints. Everything about us, from the cellular level to the way our bodies move through space, is a source of grist for A.I. to mine. And so it’s no surprise that, as the power of the technology has grown, some of its most startling new abilities lie in its perception of us: our physical forms, our behavior and even our psyches.
Consider a small part of the human face: the eyes.
A.I. ocular tracking has gotten very good at analyzing tiny, involuntary eye movements such as pupil dilation and blinking. In some cases, it can predict where our gaze is drifting before we even know it ourselves.
By noting how our eyes move and comparing that information with thousands of recordings of people driving, A.I. can help bus and truck manufacturers determine when drivers are — or are about to be — distracted. By tracking where we look and how long our gaze lingers, it allows video-game developers to fine-tune user experience so minutely that players can aim a weapon using only their eyes.
EYE-TRACKING SIGNALS: GAZE DURATION, BLINK RATE, DIRECTION OF GAZE, PUPIL DILATION
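A distraction monitor built on the eye signals above can be pictured as a set of thresholded checks. This is a minimal, rule-based sketch; the thresholds are invented, and real driver-monitoring systems learn far subtler statistical patterns from thousands of recordings.

```python
def is_distracted(gaze_off_road_ms, blink_rate_per_min, pupil_dilation_mm):
    """Flag likely distraction when the gaze leaves the road too long,
    blinking slows sharply (a fatigue cue), or pupils dilate strongly.
    All thresholds here are illustrative, not calibrated values."""
    if gaze_off_road_ms > 2000:      # eyes off the road for over 2 seconds
        return True
    if blink_rate_per_min < 5:       # unusually low blink rate
        return True
    if pupil_dilation_mm > 6.5:      # strong dilation spike
        return True
    return False

print(is_distracted(2500, 15, 4.0))  # True: a long off-road glance
print(is_distracted(300, 15, 4.0))   # False: attentive driving
```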
A.I. has also made lie-detection technology more sophisticated. A traditional polygraph measures changes in the subject’s body — heart rate, sweating and blood pressure — as they answer a series of questions. But now an infrared camera can measure pupil dilation and eye movements during questioning, and that data can be uploaded to a server, where A.I. analyzes it and produces a credibility score. The technology has been used in hiring United Nations personnel and in questioning suspects in police investigations.
A.I. is already revolutionizing the way doctors look for patterns inside our bodies.
A radiologist, screening for tuberculosis, might look at an X-ray and notice, for example, enlarged lymph nodes or fluid around the lungs. Or, in the image below, they might see a tiny nodule — about the size of a pencil eraser — that could possibly be lung cancer.
X-RAY OF A CHEST
But A.I. can look at a grayscale image and detect differences beyond the ability of the naked human eye. It can spot the initial shape of a forming nodule and predict whether it is cancerous or benign, or examine a patchy white area in the X-ray and determine whether it is more likely associated with tuberculosis or pneumonia.
A.I. can also help screen for cardiovascular disease, which is particularly meaningful for patients whose symptoms, like shortness of breath, are so common that they are often missed by doctors. Developers have trained one A.I. model to specifically predict those often-misdiagnosed cases and recommend certain patients for further care before they leave the hospital or clinic.
Blood work is one of our most important forms of bodily data.
It’s easy to see why this has proved irresistible as a target for health care start-ups: Blood testing is such a rich source of information that even marginal improvements could revolutionize the medical field and save lives.
Researchers are training A.I. to spot patterns in our blood, diagnosing cancer and heart disease. A.I. can also streamline the process of delivering blood transfusions, predicting supply from blood banks and demand from patients.
Our blood keeps an internal log of all the pathogens that have breached our immune defenses, recorded in the form of antibodies and receptors on white blood cells that can recognize those antibodies. One A.I. model has been trained to find patterns in the DNA of the receptors, which indicate responses to specific diseases.
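The pattern-finding idea behind that model can be illustrated with a toy example: look for short sequence motifs that appear in receptor DNA across every patient with the same disease. The sequences and motif length below are invented for the sketch; real immune repertoires involve millions of far longer sequences.

```python
def kmers(seq, k=4):
    """All overlapping substrings of length k in a DNA sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

# Hypothetical receptor snippets from three patients with the same infection.
patients = ["ACGTTGCA", "TTGCAGGA", "GATTGCAT"]

# Motifs present in every patient's receptors — candidate disease signatures.
shared = set.intersection(*(kmers(p) for p in patients))
print(shared)
```

Here the intersection surfaces the motifs common to all three patients, a crude stand-in for the shared receptor patterns the researchers’ model learns to associate with specific diseases.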
Scientists have struggled to figure out exactly how our immune systems work.
Our immune systems are made up of almost two trillion blood cells.
Each cell has receptors that detect dangers and release attacks.
Each receptor is made up of a person’s unique DNA …
... making it hard to find patterns among people.
With autoimmune diseases like lupus, the immune system attacks itself. Diagnosis can be a lengthy process, requiring physical exams, a range of lab tests and ruling out other possible conditions. But researchers are using A.I. to sequence the immune system’s receptors and figure out which cells protect someone from tetanus or the flu and which are attacking their own body. The goal is to diagnose certain autoimmune diseases more quickly and accurately.
Our bodies also generate countless patterns with our physical movement.
A.I. is currently being used to optimize training for athletes, to improve prosthetic limbs, to correct movements in physical therapy and even to analyze many bodies at once — for example, to monitor crowd density and flow during public gatherings.
But the most novel and creative use of A.I. and biomechanics might be in animation.
Traditionally, animators had to create a digital skeleton beneath a character’s outer layer, then manually set the character’s poses. A.I. can now generate remarkably realistic, fluid movements that animators can manipulate quickly and experiment with before committing to a pose.
With one A.I. tool, you can move a character’s foot and the rest of the body will move accordingly, much like a human’s does. The animator acts almost like a choreographer, setting a few poses and allowing the A.I. to fill in the frames between them. Animators can easily generate different kinds of movement, like walking, running or jumping.
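The idea of setting a few poses and letting software fill in the frames between them can be sketched with simple linear interpolation. Real animation tools learn far richer, physically plausible motion; the joint names and angles here are invented for the example.

```python
def interpolate_pose(pose_a, pose_b, t):
    """Blend two poses (dicts of joint angles) at time t in [0, 1]."""
    return {joint: (1 - t) * a + t * pose_b[joint]
            for joint, a in pose_a.items()}

# Hypothetical key poses set by the animator, joint angles in degrees.
standing = {"hip": 0.0, "knee": 0.0, "ankle": 0.0}
crouched = {"hip": 60.0, "knee": 90.0, "ankle": 30.0}

# Generate five frames from the first key pose to the second.
frames = [interpolate_pose(standing, crouched, t / 4) for t in range(5)]
print(frames[2])  # → {'hip': 30.0, 'knee': 45.0, 'ankle': 15.0}
```

The animator-as-choreographer workflow the article describes replaces this naive straight-line blend with learned motion, but the division of labor is the same: humans set the key poses, software supplies the in-betweens.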
PHYSICS APPLIED IN GREEN
A.I. also allows an animator to apply physics to a character’s body, automatically calculating the center of mass, weight distribution and momentum. For a character doing a backflip, for example, an animator can adjust where the figure lands, which automatically changes the arc of his flip. If an animator wants a character to jump higher, he can tweak the maximum height; the model will automatically adjust how forcefully the character jumps from the ground.
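The link between jump height and takeoff force follows from basic projectile physics: from v² = 2gh, raising the target height automatically dictates a faster launch. A short worked example, with heights chosen for illustration:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def takeoff_speed(jump_height_m):
    """Vertical speed needed at launch to reach the given peak height,
    from the kinematic relation v**2 = 2 * g * h."""
    return math.sqrt(2 * G * jump_height_m)

print(round(takeoff_speed(1.0), 2))  # → 4.43 m/s for a 1-meter jump
print(round(takeoff_speed(2.0), 2))  # → 6.26 m/s; doubling the height
                                     #   raises the speed by only sqrt(2)
```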
The foundational data set in our bodies is the firing of neurons, which creates everything we think and are.
Researchers are experimenting with A.I. to produce sentences from the brains of A.L.S. patients who have lost the ability to speak; to objectively measure how much pain a person is experiencing; and to treat Parkinson’s with “brain pacemakers.”
In one fascinating 2024 study that built on earlier research, A.I. was fed an image of blood flow in the brain that had been generated by functional magnetic resonance imaging (f.M.R.I.). When a person looked at an image of a cat, the A.I. was able to read the f.M.R.I. data and generate a very realistic image of a similar cat.
Our brains are so vast that they have been compared to the universe.
The brain is made up of billions of cells called neurons.
A single thought requires messages sent between countless neurons.
Understanding how we think requires recognizing patterns ...
... across a vast network, all at once.
Visual-reconstruction A.I. still has limitations, but the study’s suggestion felt profound: What if this technology could ultimately work as a kind of translator for our brains, recreating detailed memories or dreams, allowing others to see what we see?
The patterns generated by our neurons are the ultimate frontier in human self-knowledge — a system so complex that even neuroscientists haven’t fully deciphered it. The human brain, like A.I., has been compared to a “black box”; while we can comprehend its inputs and outputs, its exact machinery remains mysterious to us, too intricate and dynamic to map.
Will A.I., through its phenomenal powers of pattern recognition, be able to shed light on that mystery? If so, regardless of whether the technology ever achieves so-called “superintelligence,” its great legacy could be what it tells us about our own minds.
Charley Locke is a contributing writer for the magazine. She has recently written about what brings teenagers joy, how they are experiencing climate change, retirement rituals and memorizing poetry.