Digi Tails: Auto-Prediction of Street Dog Emotions

This project is led by EASE's Dr Steve North and is a sub-theme of the 'Tails from the Street' project (using trans-species ethnography to document, understand and help mitigate the 'stray dog problem' in Europe).

It represents a contribution to Steve's EASE research into Computational Anthrozoology.

Please note that there is a book section being prepared, describing this work:

North, S. in preparation. Who’s a happy boy? Developing digital era technologies to automatically predict emotional states from the faces of street dogs. In: Hurn, S. (ed.) Tails from the Street: Kinship with dogs and the politics of canine identity. UK: Routledge.

The focus is on using Artificial Intelligence (AI) to predict canine emotions from videography of street dogs.

This work developed digital tools to analyse video data collected during EASE’s 2017 'Tails from the Street' Phase 1 fieldwork in Romania.

Using the open-source toolkit DeepLabCut, a deep neural network (DNN) was ‘trained’, initially using example video clips of Facial Action Units and Action Descriptors from the Dog Facial Action Coding System (DogFACS), and then adding video of street dogs from 'Tails from the Street'.

After training, the network can analyse novel video clips (either from ‘Tails’ data or other sources).

The output from this process is a new video file with coloured marker labels, indicating AI predictions about the moment-to-moment positions of key canine facial landmarks.
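Alongside the labelled video, DeepLabCut also writes a per-frame table of landmark coordinates and confidence scores, with a three-level column layout of (scorer, bodyparts, coords). As a minimal sketch of working with that output (the scorer name and landmark names below are hypothetical, and the data is synthetic), low-confidence predictions can be masked before any further analysis:

```python
import numpy as np
import pandas as pd

# Hypothetical scorer and facial landmark names; the real ones come from
# the trained network's project configuration.
scorer = "EASE_DLC_net"
bodyparts = ["left_ear_base", "right_ear_base", "nose_tip", "upper_lip"]
coords = ["x", "y", "likelihood"]

columns = pd.MultiIndex.from_product(
    [[scorer], bodyparts, coords], names=["scorer", "bodyparts", "coords"]
)

# Three synthetic frames of predictions (x, y positions plus a
# likelihood in [0, 1]), standing in for DeepLabCut's real output table.
rng = np.random.default_rng(0)
data = rng.uniform(0, 1, size=(3, len(bodyparts) * len(coords)))
df = pd.DataFrame(data, columns=columns)

# Blank out landmark positions the network is unsure about.
pcutoff = 0.6
for bp in bodyparts:
    unsure = df[(scorer, bp, "likelihood")] < pcutoff
    df.loc[unsure, [(scorer, bp, "x"), (scorer, bp, "y")]] = np.nan
```

The masked table can then drive downstream steps (such as the emotion mapping described below) without low-confidence landmarks distorting the result.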

The EMDOGFACS literature allows a mapping to be made between the DogFACS alphabet of facial actions and a basic set of emotions.

EASE developed a software prototype to automate this mapping.
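To illustrate what such an automated mapping involves (the specific action-to-emotion pairings below are placeholders for illustration only, not the published EMDOGFACS mapping, and the function name is hypothetical), the DogFACS actions coded in a frame can be tallied into candidate emotions:

```python
from collections import Counter

# Placeholder pairings: the actual correspondence is defined in the
# EMDOGFACS literature (Meridda et al. 2014).
ACTION_TO_EMOTIONS = {
    "AU101": ["sadness"],   # inner brow raiser (illustrative pairing)
    "AU12": ["happiness"],  # lip corner puller (illustrative pairing)
    "EAD103": ["fear"],     # ears flattener (illustrative pairing)
    "AD19": ["stress"],     # tongue show (illustrative pairing)
}

def score_emotions(observed_actions):
    """Tally candidate emotions for the DogFACS actions coded in one frame."""
    tally = Counter()
    for action in observed_actions:
        for emotion in ACTION_TO_EMOTIONS.get(action, []):
            tally[emotion] += 1
    return dict(tally)

frame_actions = ["AU12", "AD19", "AU12"]
print(score_emotions(frame_actions))  # {'happiness': 2, 'stress': 1}
```

Applied frame by frame, per-emotion tallies like these could feed the animated chart lines that the prototype plays back alongside the labelled video.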

This tool displays predicted canine emotions as a video featuring an animated series of basic chart lines, which can run in synchronisation with the labelled output video from EASE’s trained deep neural network.

Figure: Example Digi Tails analysis of a street dog from the 'Tails from the Street' Phase 1 2017 fieldwork in Romania.

References

Mathis, A., Mamidanna, P., Cury, K. M., Abe, T., Murthy, V. N., Mathis, M. W. & Bethge, M. 2018. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nature Neuroscience, 21, pp.1281-1289. http://dx.doi.org/10.1038/s41593-018-0209-y. ISSN: 1546-1726 Available from: http://catniplab.github.io/journalclub/JCpapers/Mathis_markerless_defined_deeplearning.pdf

Meridda, A., Gazzano, A. & Mariti, C. 2014. Assessment of dog facial mimicry: Proposal for an emotional dog facial action coding system (EMDOGFACS). Journal of Veterinary Behavior: Clinical Applications and Research, 9, 6, p.e3. ISSN: 1558-7878

North, S. 2019. Software Program: Create EASE dog emotions video v1.0. DOI: 10.5281/zenodo.2638284. Available from: https://github.com/EASE-University-of-Exeter/create_ease_dog_emotions_video/releases/tag/v1.0

North, S. 2019. Software Program: EASE dog facial expression estimation DeepLabCut trained neural net v1.0. DOI: 10.5281/zenodo.2638281. Available from: https://github.com/EASE-University-of-Exeter/ease_dog_facial_expression_estimation_deeplabcut_trained_neural_net/releases/tag/v1.0

North, S. 2018. Computational Anthrozoology - a manifesto: ‘as the lens’ and ‘under the lens’. In Proceedings of the 27th International conference of the International Society for Anthrozoology (ISAZ 2018): 'Animals in Our Lives: Multidisciplinary Approaches to the Study of Human–Animal Interactions' (Charles Perkins Centre, University of Sydney, Australia. 2 - 5 July 2018). 83.  http://dx.doi.org/10.5281/zenodo.1319034

Waller, B. M., Peirce, K., Caeiro, C. C., Scheider, L., Burrows, A. M., McCune, S. & Kaminski, J. 2013. Paedomorphic Facial Expressions Give Dogs a Selective Advantage. PLOS ONE, 8, 12, p.e82686.  http://dx.doi.org/10.1371/journal.pone.0082686. Available from: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.920.348&rep=rep1&type=pdf

Waller, B. M. 2017. Dog Facial Action Coding System (DogFACS) [Online]. Available: http://dogfacs.com.