There is some recent news about artificial intelligence, and it involves horses. A team of researchers has devised an AI model that can read and interpret the body language of horses in 3D, helping humans better understand when the animals are in pain or unwell.
While it won't be taking the place of X-rays or ultrasounds any time soon, the new model, called 'Dessie' after the famous English racehorse Desert Orchid, can transform 2D images from video into 3D, showing a horse's gait and shape in real time.
According to Earth.com, Dessie offers a new level of insight for owners and perhaps vets by translating changes in gait or other physical cues into 3D metrics. By isolating movement patterns from the video and turning them into a 3D model, Dessie can show changes in gait more precisely. It does this through a process known as 'disentangled learning', which means it can separate pose and movement from other elements in the video, such as the background, a factor that scientists think makes Dessie's modeling more reliable.
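To make the "disentangled learning" idea concrete: the core intuition is that a single learned representation is partitioned into independent factors, so that pose can be read out separately from irrelevant factors like background. The toy Python sketch below is purely illustrative, with made-up numbers and an identity stand-in for the encoder; it is not Dessie's actual architecture, which the article does not detail.

```python
# Toy illustration of disentanglement: one latent vector is split into
# named, non-overlapping factors (pose, shape, background), so downstream
# gait analysis can use pose alone. Hypothetical sizes and values.

def encode(frame_features):
    """Stand-in encoder: a real system would use a learned network."""
    return list(frame_features)

def disentangle(latent, sizes):
    """Split the latent into named, non-overlapping factor slices."""
    factors, start = {}, 0
    for name, size in sizes:
        factors[name] = latent[start:start + size]
        start += size
    return factors

# A fake 8-dimensional latent for one video frame.
latent = encode([0.1, 0.4, -0.2, 0.9, 0.0, 0.3, 0.7, -0.5])

# Arbitrary split: first 3 dims as pose, next 3 as shape, last 2 as background.
factors = disentangle(latent, [("pose", 3), ("shape", 3), ("background", 2)])

# Gait analysis would read only the pose factor, ignoring the background.
print(factors["pose"])  # -> [0.1, 0.4, -0.2]
```

In a trained model, the training objective (not shown here) is what forces each slice to capture only its intended factor; the payoff is exactly what the article describes: pose estimates that are not contaminated by background clutter.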
Given a single image of a horse, in any visual style, Dessie reconstructs the animal's articulated 3D shape and pose.
Another benefit of Dessie is that it can generate 3D models from a simple phone video; no special high-tech gear is required. This means that rural horse farms and vet clinics don’t need expensive imaging equipment to assess an injury or lameness.
And because Dessie's model is a digital record, it can be shared with other veterinarians and compared with records from other dates, whether earlier ones or later ones taken after an injury to assess healing.
“We say we created a digital voice to help these animals break through the barrier of communication between animals and humans. To tell us what they are feeling,” Elin Hernlund, associate professor at the Swedish University of Agricultural Sciences and an equine orthopedics clinician, told Earth.com. “It’s the smartest and highest resolution way to extract digital information from the horse’s body – even their faces, which can tell us a great deal.”