Half a century ago, one of the hottest questions in science was whether humans could teach animals to talk. Scientists attempted to converse with apes in sign language and trained parrots to use a growing English vocabulary.
The work quickly attracted media attention and controversy. The research lacked rigor, critics argued, and what looked like animal communication could simply have been an illusion, with researchers subconsciously cuing their animals to respond in certain ways.
In the late 1970s and early 1980s, the research fell out of favor. “The entire field completely disintegrated,” said Irene Pepperberg, a comparative cognition researcher at Boston University, who became known for her work with an African gray parrot named Alex.
Today, technological advances and a growing appreciation for the sophistication of animal minds have renewed interest in finding ways to bridge the gap between species. Pet owners are teaching their dogs to press “talking buttons” and zoos are training their apes to use touch screens.
In a cautious new paper, a team of scientists outlines a framework for evaluating whether such tools could give animals new ways to express themselves. The research is designed “to overcome some of the things that have been controversial in the past,” said Jennifer Cunha, a visiting research associate at Indiana University.
The paper, which will be presented at a scientific conference on Tuesday, focuses on Ms. Cunha’s parrot, an 11-year-old Goffin’s cockatoo named Ellie. Since 2019, Ms. Cunha has been teaching Ellie to use an interactive “voice board,” a tablet app containing more than 200 illustrated icons corresponding to words and phrases including “sunflower seeds,” “happy” and “feel.” When Ellie presses an icon with her tongue, a computerized voice speaks the word or phrase out loud.
In the new study, Ms. Cunha and her colleagues did not set out to determine whether Ellie’s use of the voice board amounted to communication. Instead, they used quantitative computational methods to analyze Ellie’s icon presses and learn more about whether the board had what they called “expressive and enrichment potential.”
“How can we analyze the expression to see if there may be a space for intention or communication?” Ms. Cunha said. “And then, secondly, the question is: Could her selections give us insight into her values, the things that she finds meaningful?”
The scientists analyzed nearly 40 hours of video, collected over seven months, of Ellie using the voice board. They then compared the icons she pressed with several simulations of a hypothetical user selecting icons on the board at random.
“Ultimately, they were all significantly different at multiple points from the real data,” said Nikhil Singh, a doctoral student at MIT who created the models. “This virtual user we had couldn’t fully capture what the real Ellie was doing when she used this tablet.”
In other words, whatever Ellie was doing, she didn’t seem to be simply mashing random icons. The researchers found that the design of the voice board, including the brightness and placement of the icons, also could not fully explain Ellie’s selections.
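The paper’s exact models aren’t described here, but the general idea of comparing a real user’s icon presses against a random-pressing baseline can be sketched with a simple Monte Carlo test. Everything below, including the test statistic (how concentrated presses are on the five most-used icons) and the toy data, is an illustrative assumption rather than the study’s actual method:

```python
import random
from collections import Counter

def simulate_random_user(n_presses, n_icons, rng):
    """A hypothetical user who presses one of n_icons uniformly at random."""
    return [rng.randrange(n_icons) for _ in range(n_presses)]

def top5_concentration(presses):
    """Share of all presses going to the five most-used icons.

    A user with preferences concentrates presses on a few icons; a random
    user spreads them thinly across the whole board.
    """
    top5 = Counter(presses).most_common(5)
    return sum(count for _, count in top5) / len(presses)

def monte_carlo_p_value(observed, n_icons, n_sims=2000, seed=42):
    """One-sided Monte Carlo p-value: fraction of random-user simulations
    whose concentration matches or exceeds the observed one."""
    rng = random.Random(seed)
    threshold = top5_concentration(observed)
    hits = sum(
        top5_concentration(simulate_random_user(len(observed), n_icons, rng))
        >= threshold
        for _ in range(n_sims)
    )
    return (hits + 1) / (n_sims + 1)

# Toy log: 100 presses concentrated on just three of 200 icons.
observed = [3] * 60 + [17] * 25 + [42] * 15
p = monte_carlo_p_value(observed, n_icons=200)
```

A small p-value here would only say the presses are unlikely under uniform random pressing, which, as the researchers quoted below note, is a starting point rather than evidence of communication.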
Determining whether or not Ellie’s selections were random “is a very good starting point,” said Federico Rossano, a comparative cognition researcher at the University of California, San Diego, who was not involved in the research. “The problem is that randomness is very unlikely.”
Just because Ellie wasn’t touching random icons doesn’t mean she was actively and deliberately trying to communicate her true desires or feelings, Dr. Rossano said. It is possible that she was simply repeating sequences she had learned during training. “It’s like a vending machine,” he said. “You can learn to push a sequence of numbers and get a certain type of reward. It doesn’t mean you’re thinking about what you’re doing.”
To further investigate the possibilities, the research team looked for signs of what they called “corroboration.” If Ellie selected the apple icon, did she eat the apple she was given? If she selected a reading-related icon, did she engage with the book for at least one minute?
“You can give something to a bird and it will throw it or touch it,” Ms. Cunha said. “But for us it was about: Did she commit to it?”
Not all of Ellie’s selections could be evaluated in this way; the researchers found it impossible to determine, for example, whether she was actually feeling happy or hot at any given moment. But of the nearly 500 selections that could be evaluated, 92 percent were corroborated by Ellie’s subsequent behavior.
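The corroboration figure is a simple proportion over the evaluable selections. A minimal sketch of that bookkeeping, with an entirely made-up event log standing in for the study’s coded video data:

```python
# Hypothetical log of (icon pressed, whether subsequent behavior corroborated
# the selection): eating the apple she asked for, engaging with a book for at
# least a minute, and so on. These records are invented for illustration.
event_log = [
    ("apple", True),
    ("book", True),
    ("apple", True),
    ("play", False),
    ("book", True),
]

def corroboration_rate(events):
    """Fraction of evaluable selections that were followed through on."""
    if not events:
        return 0.0
    return sum(1 for _, corroborated in events if corroborated) / len(events)

print(f"{corroboration_rate(event_log):.0%} of selections corroborated")
# → 80% of selections corroborated
```

Selections that can’t be checked against behavior (abstract icons like “happy”) would simply be excluded from the log before the rate is computed, as the researchers did.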
“It’s clear that there is a good correlation,” said Dr. Pepperberg, who was not involved in the research.
But proving that Ellie really understands what the icons mean will require additional testing, she said, suggesting that the researchers try deliberately bringing Ellie the wrong object to see how she responds. “It’s just another check to make sure the animal really understands what the label represents,” Dr. Pepperberg said.
Finally, the researchers attempted to assess whether the voice board was serving as a form of enrichment for Ellie by analyzing the types of icons she selected most frequently.
“If it’s a means to an end, what is the end?” said Rébecca Kleinberger, an author of the paper and a researcher at Northeastern University, where she studies how animals interact with technology. “It seems there was a bias toward social activities, or activities that involve staying in interaction with the caregiver.”
About 14 percent of the time, Ellie selected icons for food, drinks or treats, the researchers found. By contrast, about 73 percent of her selections were for activities that provided social or cognitive enrichment, such as playing, visiting another bird or simply communicating with Ms. Cunha. Ellie also initiated use of the voice board 85 percent of the time.
“Ellie the cockatoo consistently interacted with her device, suggesting that it remained attractive and reinforcing to her for several months,” said Amalia Bastos, a comparative cognition researcher at Johns Hopkins University, who was not an author on the paper.
The study has limitations. There’s a limit to what scientists can extrapolate from a single animal, and it’s hard to rule out the possibility that Ms. Cunha may have been subconsciously cuing Ellie to respond in certain ways, outside experts said. But the scientists also praised the researchers’ systematic approach and their modest claims.
“They’re not saying, ‘Can the parrot talk?’” Dr. Rossano said. “They say, ‘Can this be used for enrichment?’”
Dr. Bastos agreed. “This work is a crucial first step,” she said. It is also an example of how the field has changed, for the better, since the 1970s.
“Researchers currently working in the area are not making the same assumptions,” Dr. Bastos said. “We don’t expect animals to understand or use language the way humans do.” Instead, she added, scientists are interested in using communication tools to “improve the well-being of captive animals and their relationships with their caretakers.”