Multi-view Learning with Perceptron for Dog Tail Displacement Identification

Carlos Enrique Greene Mex, Antonio Armando Aguileta Güemez, Jorge Alberto Ríos Martínez, Francisco Alejandro Madera Ramírez, Raul Antonio Aguilar Vera

Abstract


This paper presents a novel approach to enhancing human-canine communication through data fusion techniques that analyze tail displacement patterns in dogs. By integrating data from multiple viewpoints, specifically the tail tip, hip, and neck, this study aims to improve the automatic interpretation of canine signals. Tail displacements to the right are generally associated with positive emotions, while leftward displacements suggest negative emotions. A Perceptron model was developed using this fused data and compared with a previous Perceptron model that used only tail-tip data. Various performance metrics, including accuracy, precision, recall, and F1-score, were computed, and a statistical test was performed to determine which Perceptron model is better at identifying these displacements.
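The fusion strategy described above can be sketched as early fusion: per-view feature vectors are concatenated and fed to a single Perceptron. This is a minimal illustrative sketch only; the feature names, toy data, and single horizontal-displacement feature per view are assumptions, not the paper's actual dataset or feature set.

```python
# Illustrative sketch of multi-view early fusion + Perceptron.
# The three views (tail tip, hip, neck) and the toy data are hypothetical.

def fuse(tail_tip, hip, neck):
    """Early fusion: concatenate the per-view feature vectors."""
    return list(tail_tip) + list(hip) + list(neck)

def train_perceptron(X, y, epochs=20, lr=0.1):
    """Classic Perceptron with labels in {-1, +1}
    (e.g. -1 = leftward displacement, +1 = rightward)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b >= 0 else -1
            if pred != yi:  # update weights only on misclassification
                w = [wj + lr * yi * xj for wj, xj in zip(w, xi)]
                b += lr * yi
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy example: one hypothetical horizontal-displacement feature per view,
# positive values meaning rightward motion, negative meaning leftward.
X = [fuse([d], [0.5 * d], [0.1 * d]) for d in (-2.0, -1.0, 1.0, 2.0)]
y = [-1, -1, 1, 1]
w, b = train_perceptron(X, y)
```

A tail-tip-only baseline, as in the earlier model the paper compares against, would simply omit the `hip` and `neck` vectors before training.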

Keywords


Dog, dog tail displacement, multi-view learning, perceptron
