Computer-aided method for calculating animal configurations during social interactions from two-dimensional coordinates of color-marked body parts
Language: English
Country: United States
Media: print
Document type: Journal Article; Research Support, Non-U.S. Gov't
PubMed: 11591068
DOI: 10.3758/bf03195390
MeSH terms:
- Behavior, Animal *
- Observer Variation
- Computing Methodologies *
- Observation / methods
- Swine
- Spatial Behavior *
- Reproducibility of Results
- Social Behavior *
- Software
- Videotape Recording
- Animals

Publication type:
- Journal Article
- Research Support, Non-U.S. Gov't
In an experiment investigating the impact of preweaning social experience on later social behavior in pigs, we were interested in the mutual spatial positions of pigs during paired social interactions. To obtain these data, we applied a different colored mark to the head and back of each of 2 pigs per group and videotaped the pigs' interactions. We used the EthoVision tracking system to provide x, y coordinates of the four colored marks every 0.2 sec. This paper describes the structure and functioning of a FoxPro program designed to clean the raw data and use them to identify the mutual body positions of the 2 animals at 0.2-sec intervals. The data were cleaned by identifying invalid data points and replacing them with interpolated values. An algorithm was then applied to extract three variables from the coordinates: (1) whether the two pigs were in body contact; (2) the mutual orientation (parallel, antiparallel, or perpendicular) of the two pigs; and (3) whether the pig in the "active" position made snout contact in front of, or behind, the ear base of the other pig. Using these variables, we were able to identify five interaction types: Pig A attacks, Pig B attacks, undecided head-to-head position, "clinch" resting position, or no contact. To assess the reliability of the automatic system, a randomly chosen 5-min videotaped interaction was scored for mutual positions both visually (by 2 independent observers) and automatically. Good agreement was found between the data from the 2 observers, and between each observer's data and the data from the automated system, as assessed using Cohen's kappa coefficients.
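The original program was written in FoxPro and is not reproduced here, but the two core steps the abstract describes — replacing invalid tracking points by interpolation, and classifying mutual orientation from the head/back marks of each animal — can be sketched as follows. This is a minimal illustration in Python: the function names, the linear-interpolation scheme, and the angular tolerance `tol` are assumptions for the sketch, not details taken from the paper.

```python
import math

def interpolate_invalid(track, valid):
    """Replace (x, y) samples flagged invalid with linear interpolation
    between the nearest valid neighbors (runs at either end are left as-is)."""
    track = list(track)
    n = len(track)
    i = 0
    while i < n:
        if not valid[i]:
            j = i
            while j < n and not valid[j]:   # find the invalid run [i, j)
                j += 1
            if i > 0 and j < n:
                x0, y0 = track[i - 1]
                x1, y1 = track[j]
                span = j - i + 1
                for k in range(i, j):
                    t = (k - i + 1) / span
                    track[k] = (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
            i = j
        else:
            i += 1
    return track

def heading(back, head):
    """Body-axis angle in radians, pointing from the back mark to the head mark."""
    return math.atan2(head[1] - back[1], head[0] - back[0])

def mutual_orientation(back_a, head_a, back_b, head_b, tol=math.pi / 8):
    """Classify the two body axes as parallel, antiparallel, or perpendicular.
    `tol` is an illustrative angular tolerance, not the paper's threshold."""
    d = abs(heading(back_a, head_a) - heading(back_b, head_b)) % (2 * math.pi)
    if d > math.pi:
        d = 2 * math.pi - d                 # fold the difference into [0, pi]
    if d < tol:
        return "parallel"
    if d > math.pi - tol:
        return "antiparallel"
    return "perpendicular"
```

For example, two animals both facing along the x-axis classify as "parallel", while animals facing each other classify as "antiparallel"; the remaining middle band of angle differences maps to "perpendicular".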