It might sound like a fun idea to put cute dogs in an MRI machine and have them watch home movies, but it can also be educational.

The scientists used machine learning to decode the visual processing taking place inside the minds of a pair of dogs. They found that, when it comes to vision, dogs are tuned more to actions than to who or what is performing them, in contrast to humans.

The work shows what a dog's brain prioritizes when it comes to vision.

"While our work is based on just two dogs, it offers proof of concept that these methods work on canines", says neuroscience PhD student,ErinPhillips, now atPrinceton

The paper helps pave the way for other researchers to apply these methods to dogs, as well as to other species, so we can gather more data and gain deeper insights into how the minds of different animals work.

The research was conducted on two dogs. Three 30-minute videos were filmed using a gimbal and a selfie stick, showing dogs interacting with humans, humans interacting with each other, a deer crossing a path, a cat in a house, and dogs walking on leashes.

Bhubo and his human, Ashwin Sakhardande, preparing for a movie. Bhubo's ears are taped down to keep noise-dampening earplugs in place, because MRIs are very loud. (Emory Canine Cognitive Neuroscience Lab)

While relaxing in an fMRI machine, the dogs, Daisy and Bhubo, were shown the three 30-minute movies. This remarkable feat was accomplished using training techniques designed by psychologist Gregory Berns.

The researchers scanned the brains of Daisy and Bhubo as they sat awake, alert, and comfortable in the machine, watching what were, for them, home movies. It sounds rather nice.

The dogs didn't need anything to keep them engaged. It was amusing: this is serious science, and a lot of time and effort went into it, but it came down to these dogs watching videos of other dogs and humans acting silly.

Daisy the dog taking a turn in the fMRI machine. Her human, Rebecca Beasley, is not pictured. (Emory Canine Cognitive Neuroscience Lab)

The video data was segmented by timestamps into object classifiers and action classifiers. That information, along with the brain activity of the two dogs, was fed into a neural network called Ivis that was designed to map the brain activity to those classifiers.
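As a rough illustration of what timestamped object and action labels might look like, here is a minimal sketch in Python. The time windows, label names, and table layout are assumptions made for the example, not details from the study.

```python
import pandas as pd

# Hypothetical timestamped annotations for a stretch of video (illustrative only;
# each row marks a time window and the object/action classifiers present in it).
annotations = pd.DataFrame([
    {"start_s": 0.0,  "end_s": 4.5,  "objects": ["dog", "human"], "actions": ["playing"]},
    {"start_s": 4.5,  "end_s": 9.0,  "objects": ["deer"],         "actions": ["walking"]},
    {"start_s": 9.0,  "end_s": 12.0, "objects": ["cat"],          "actions": ["sniffing"]},
])

def labels_at(t_seconds: float) -> dict:
    """Return the object and action classifiers covering a given time point."""
    row = annotations[(annotations.start_s <= t_seconds) & (annotations.end_s > t_seconds)]
    if row.empty:
        return {"objects": [], "actions": []}
    return {"objects": row.iloc[0]["objects"], "actions": row.iloc[0]["actions"]}

print(labels_at(5.0))  # -> {'objects': ['deer'], 'actions': ['walking']}
```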

Ivis was also given data from two humans who watched the same videos while in an fMRI machine.

For the humans, the artificial intelligence was able to map the brain data to both the object and action classifiers. For the dogs, the results were patchier: the model did not work for the object classifiers, but it mapped the action classifiers to brain activity with an accuracy of between 75 and 88 percent.
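To make the idea of "mapping brain activity to action classifiers" concrete, here is a minimal decoding sketch on synthetic data. It uses scikit-learn's logistic regression as a stand-in for the study's Ivis-based model, and the voxel features, labels, and printed accuracy are invented for the example (random features should decode at roughly chance level).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for fMRI data: one vector of voxel activations per video
# time window, each paired with the action label shown during that window.
n_windows, n_voxels = 600, 200
X = rng.normal(size=(n_windows, n_voxels))
y = rng.choice(["sniffing", "playing", "walking"], size=n_windows)

# Hold out some windows, train a decoder, and score it, analogous to the
# 75-88 percent action-decoding accuracy reported for the dogs.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("decoding accuracy:", accuracy_score(y_test, decoder.predict(X_test)))
```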

Berns says humans are very object-oriented; our obsession with naming objects means we have 10 times as many nouns as verbs. Dogs seem to be less interested in who or what they are seeing and more interested in the action itself.

He said that dogs and humans also see the world differently: dogs distinguish only the blue and yellow parts of the color spectrum.

Perhaps that is because dogs need to be more aware of threats in their environment than humans do, or perhaps it is because they rely more on other senses; dogs devote a larger portion of their brain to processing olfactory information than humans do.

Mapping brain activity to olfactory input would be difficult, but it could be enlightening, and more detailed research into the vision of dogs and other animals may be possible in the future.

Berns says the team has shown it can monitor the activity in a dog's brain while the dog is watching a video, and that it is remarkable we are able to do that at all.

The research has been published in a peer-reviewed journal.