University of Veterinary Medicine

Using artificial intelligence to recognize pain in cats


Artificial intelligence (AI) provides great opportunities to improve recognition of pain in cats, thus allowing for more humane treatment. A joint research team – from the University of Veterinary Medicine Hannover (TiHo) and the Department of Information Systems at the University of Haifa in Israel – is convinced of this.

One of the study’s findings: for precise recognition of pain by AI systems, the nose and mouth are more important than the ear region.

TiHo reported that the research team compared two AI-based systems that automatically recognize and evaluate pain in cats on the basis of facial expression. The findings were published in the journal Scientific Reports.

"AI systems provide us with a tremendous opportunity in veterinary practice”

Professors Sabine Kästner and Holger Volk, of the Department of Small Animal Medicine and Surgery at TiHo, headed the study in conjunction with Professor Anna Zamansky of the University of Haifa. Volk comments: "Our aim is to improve the evaluation of pain in cats so that we can treat them more humanely. AI systems provide us with a tremendous opportunity for veterinary practice to improve the medical care of cats." Kästner adds: "It is already possible in certain animal species to evaluate pain with reference to facial characteristics. This involves measurement and categorization at selected points on the animals' faces in different states of pain. A scientific system for pain assessment in cats already exists, known as the Feline Grimace Scale." However, assigning grimace scores takes a good deal of experience and expertise, and Zamansky adds that the method remains subjective and prone to bias: "That's why we're working on automated, AI-based systems that allow objective evaluation."

Testing AI-based methods in real-world conditions 

A previous study on automated pain recognition in cats used images of a highly homogeneous cat population. "To examine whether this AI-based method also works in the real world, we tested it on 84 randomly selected cats presented at TiHo's Department of Small Animal Medicine and Surgery," says Volk. The sample included animals of different breeds, ages, sexes and health statuses.

Two different models

The researchers assessed the cats' pain using a validated pain assessment scale together with clinical data on the animals in question. They then tested two different AI models: one based on manually annotated facial landmarks, the other on facial landmarks annotated automatically by artificial intelligence. The first approach achieved pain-recognition accuracy of over 77 per cent; the fully automated machine-learning approach reached over 65 per cent, making it slightly less accurate. Kästner, who is Professor of Veterinary Anaesthesia and Analgesia at TiHo, notes: "Our results are promising, and the level of accuracy is already excellent. These systems are opening up new avenues to us in assessing pain felt by cats." Marcelo Feighelstein of the University of Haifa added: "The study also showed that varied data sets are needed if we are to obtain robust AI models."
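The article does not include the study's code, but the general shape of a landmark-based pain classifier can be sketched. The following minimal Python example is purely illustrative: it assumes a hypothetical scheme of 48 (x, y) facial landmarks flattened into a feature vector, uses entirely synthetic data with an artificially injected "pain" signal, and stands in for whichever model the researchers actually used.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical setup: 48 (x, y) facial landmarks per cat, flattened to 96 features.
# In a real study these would come from manual annotation or an automatic detector;
# here they are simulated, with a synthetic "pain" signal added so the toy
# classifier has something to learn.
n_cats, n_landmarks = 400, 48
X = rng.normal(size=(n_cats, n_landmarks * 2))
y = rng.integers(0, 2, size=n_cats)   # 1 = pain, 0 = no pain (labels from a pain scale)
X[y == 1, 48:] += 0.8                 # synthetic signal in the later landmark coordinates

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(f"toy pain-recognition accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")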

Which facial features are most important in pain recognition?

The team also investigated which facial features are crucial for precise pain recognition by the AI-based systems. They found that the nose and mouth region plays a vital role in machine pain classification, whereas the ear region, which previous work had often considered relevant to pain recognition, is less important. These observations held for both AI models. Zamansky says: "Knowing which facial features are important in machine pain recognition, we can now specifically refine the systems."
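The article does not say how the relevance of individual face regions was quantified. One standard way to probe this in a landmark-based classifier is permutation importance, aggregated over the coordinates of each region. The sketch below continues the toy example above (reusing clf, X_test and y_test), with an invented mapping of landmark indices to facial regions; because the synthetic signal was placed in the later landmarks, the nose/mouth region dominates here by construction, not as a finding.

from sklearn.inspection import permutation_importance

# Hypothetical grouping of landmark indices into facial regions
# (the article does not specify the landmark scheme).
regions = {
    "ears": range(0, 12),
    "eyes": range(12, 24),
    "nose/mouth": range(24, 48),
}

result = permutation_importance(clf, X_test, y_test, n_repeats=20, random_state=0)

# Sum the per-coordinate importances over each region's x/y columns.
for name, idx in regions.items():
    cols = [2 * i + d for i in idx for d in (0, 1)]
    print(f"{name:>10}: {result.importances_mean[cols].sum():.3f}")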
