Machine learning gives us a glimpse into the canine mind


Dog minds have been read! Sort of.

The researchers used functional magnetic resonance imaging (fMRI) of dogs’ brains and a machine learning tool to reconstruct what the dog sees. The results suggest that dogs care more about what is going on than who or what is involved.

The results of the experiment, conducted at Emory University in Georgia in the United States, are published in the Journal of Visualized Experiments.

Two awake, unrestrained dogs were shown three 30-minute videos. Their neural data was recorded with fMRI, and a machine learning algorithm was used to analyse patterns in the scans.

“We’ve shown that we can monitor activity in a dog’s brain while they are watching a video and, to at least a limited degree, reconstruct what they are looking at,” says Gregory Berns, professor of psychology at Emory. “The fact that we are able to do this is remarkable.”


fMRI has only recently been used to study cognition beyond humans, in just a handful of other species including some primates.

“While our work is based on only two dogs, it provides proof of concept that these methods work in dogs,” says lead author Erin Phillips, of the University of St Andrews in Scotland, who conducted the research as a specialist in Berns’ Canine Cognitive Neuroscience Laboratory. “I hope this paper will help pave the way for other researchers to apply these methods to dogs, as well as to other species, so that we can gain more data and greater insights into how the brains of different animals work.”

Machine learning, notably, is a technology loosely inspired by the neural networks in our own brains: it learns by recognising patterns in vast amounts of data.

The technology “reads minds” by detecting patterns in the brain data that correlate with what is playing in the video.

By attaching a video recorder to a selfie stick held at a dog’s eye level, the researchers filmed scenes relatable to their canine audience.


Recorded activities included dogs being petted and receiving treats from people.

Scenes with dogs showed them sniffing, playing, eating or walking. Other objects and animals in the scenes included cars, bikes, scooters, cats and deer, as well as people sitting, hugging, kissing, offering a toy to the camera and eating.

Timestamps on the videos helped categorise them into objects (e.g. dog, car, human, cat) and actions (e.g. sniffing, eating, walking).
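The timestamp-based labelling can be sketched in a few lines. This is a hypothetical illustration only: the segment boundaries and labels below are invented, not taken from the study’s actual annotations.

```python
# Hypothetical sketch: labelling video segments by timestamp, in the
# spirit of the researchers' object/action categorisation.
# All times and labels here are invented for illustration.

segments = [
    (0.0, 4.2, {"objects": ["dog"], "actions": ["sniffing"]}),
    (4.2, 9.8, {"objects": ["human", "dog"], "actions": ["eating"]}),
    (9.8, 15.0, {"objects": ["car"], "actions": ["walking"]}),
]

def labels_at(t):
    """Return the object/action labels active at time t (in seconds)."""
    for start, end, labels in segments:
        if start <= t < end:
            return labels
    return None  # no segment covers this timestamp

print(labels_at(5.0))  # {'objects': ['human', 'dog'], 'actions': ['eating']}
```

Each fMRI scan volume can then be paired with the labels active at its acquisition time, giving the training data for the classifiers described below.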

Only two dogs had the patience to sit through the feature-length viewing. For comparison, two humans also underwent the same experiment. Both species, most likely, were persuaded by treats and belly rubs.

The Ivis machine learning algorithm was applied to the data. Ivis was first trained on the human data, and the resulting model was 99% accurate in mapping the brain data onto both the object and action classifiers.

In the case of dogs, however, the model did not work for the object-based classifiers, but it was between 75% and 88% accurate in decoding the action-based classifiers from the canine fMRI scans.
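To make the idea of decoding concrete, here is a toy sketch of the general technique: classify each “scan” by which action-related activity pattern it most resembles. This is not the authors’ Ivis pipeline; it uses a simple nearest-centroid classifier on synthetic data, with the voxel count, signal strength and action categories all invented for illustration.

```python
# Toy illustration of fMRI decoding (NOT the study's Ivis pipeline):
# synthetic "scans" where each action boosts a different block of voxels,
# decoded with a simple nearest-centroid classifier.
import random

random.seed(0)
ACTIONS = ["sniffing", "eating", "walking"]
N_VOXELS = 30  # invented; real scans have many thousands of voxels

def make_scan(action_idx):
    """Synthetic scan: Gaussian noise plus a bump in action-linked voxels."""
    scan = [random.gauss(0, 1) for _ in range(N_VOXELS)]
    for v in range(action_idx * 10, action_idx * 10 + 10):
        scan[v] += 2.0
    return scan

train = [(make_scan(a), a) for a in range(3) for _ in range(40)]
test = [(make_scan(a), a) for a in range(3) for _ in range(20)]

# Average the training scans per action to get one "template" per class.
centroids = []
for a in range(3):
    scans = [s for s, label in train if label == a]
    centroids.append([sum(col) / len(scans) for col in zip(*scans)])

def decode(scan):
    """Label a scan with the action whose centroid is closest."""
    dists = [sum((x - c) ** 2 for x, c in zip(scan, cen)) for cen in centroids]
    return dists.index(min(dists))

correct = sum(decode(scan) == label for scan, label in test)
accuracy = correct / len(test)
print(f"decoding accuracy: {accuracy:.0%}")
```

With strongly separated synthetic classes the toy decoder scores near 100%; real neural data is far noisier, which is why the reported canine accuracies sit in the 75–88% range.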

Bhubo wearing ear protection, with the scanner in the background
Bhubo, appearing with owner Ashwin, prepares for a video viewing session in the fMRI scanner. The dog’s ears are taped to hold in ear plugs that muffle the noise of the fMRI machine. Credit: Emory Canine Cognitive Neuroscience Lab.

“We humans are very object-oriented,” says Berns. “There are 10 times as many nouns as verbs in the English language because we have a particular obsession with naming things. Dogs seem to be less interested in who or what they see and more interested in the action itself.”

Dogs see only in shades of blue and yellow, but have a slightly higher density of vision receptors designed to detect motion.

“It makes perfect sense that dogs’ brains are so attuned to actions first and foremost,” Berns adds. “Animals must pay close attention to things going on in their environment to avoid being eaten or to keep an eye on animals they might want to hunt. Action and movement are paramount.”

Phillips believes that understanding how animals perceive the world is important to her own research into how predator reintroduction in Mozambique affects ecosystems.

“Historically, there hasn’t been much overlap between computer science and ecology,” she says. “But machine learning is a growing field that is beginning to find wider applications, including in ecology.”



