A new paper published in Neuron argues that current AI systems lack an awareness of “internal embodiment,” a gap the authors say underlies several of their shortcomings.
The authors write that AI systems are missing “a body that interacts with the physical world and an internal awareness of that body’s own states such as fatigue, uncertainty, or physiological need.” Building functional analogues to these capacities, they say, is “one of the most crucial and underexplored frontiers in the field.”
The paper, which focuses specifically on multimodal large language models, frames the gap this way: such systems can “process and generate text, images, and video to describe a cup of water,” but they cannot “know what it feels like to be thirsty.” For instance, several models failed to identify a small number of dots “arranged to suggest a human figure in motion,” a pattern that humans, drawing on a lifetime of embodied perception, recognize easily.
AI systems have no mechanism equivalent to the body’s regulation of its internal states through organs, hormones, and the nervous system. To gauge progress on this front, the authors propose a new “class of tests, or benchmarks, designed to measure a system’s internal embodiment.”