
Animal behaviour studies using machine learning to speed up video processing

Video data is crucial for fine-scale animal behaviour studies, but processing the footage collected can be incredibly time-consuming for researchers. This bottleneck can be overcome by using machine learning algorithms to rapidly extract the necessary information from the videos, generating a workflow that can be used by other researchers. This theme is being investigated in two projects: by Eliza Fernandez-Fueyo in her studies of Chacma baboon gestural communication, and by Larissa Barker in her assessment of animal responses to humans using Automated Behavioural Response camera traps.


Larissa's project is funded by the Leverhulme Trust and follows on from the research she did during her PhD. Larissa will be developing a machine learning pipeline to recognise pygmy marmosets in camera trap footage and then classify their behaviours. These data will be used to test contrasting hypotheses about how animals respond to humans. Larissa will then extend the pipeline by applying it to new species, specifically red deer in the UK and white-lipped peccaries in Peru.
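To give a flavour of what such a pipeline involves, below is a minimal Python sketch of the two-stage idea described above: detect the animal in each sampled video frame, then pass the detection on to a behaviour classifier. It is not the project's actual code; the use of an off-the-shelf torchvision detector, the frame-sampling interval, the example behaviour labels, and the commented-out classifier model are all illustrative assumptions.

```python
"""Sketch of a camera-trap pipeline: detect an animal per frame, then
(in the full version) classify its behaviour from the cropped detection.
All model choices, labels and thresholds here are assumptions."""

import cv2
import torch
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)

# Stage 1: a generic COCO-pretrained detector to localise the animal.
weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
detector = fasterrcnn_resnet50_fpn(weights=weights).eval()

# Stage 2: a behaviour classifier -- hypothetical fine-tuned model and labels.
BEHAVIOURS = ["foraging", "vigilance", "travelling", "resting"]
# behaviour_model = torch.load("behaviour_classifier.pt").eval()


def process_video(path, every_n_frames=25, score_threshold=0.8):
    """Yield (frame_index, bounding_box, score) for sampled frames
    in which the detector finds a confident detection."""
    cap = cv2.VideoCapture(path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % every_n_frames == 0:
            # OpenCV gives BGR uint8; the detector expects RGB floats in [0, 1].
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
            with torch.no_grad():
                detections = detector([tensor])[0]
            for box, score in zip(detections["boxes"], detections["scores"]):
                if score >= score_threshold:
                    # In the full pipeline the crop inside `box` would be
                    # fed to the behaviour classifier at this point.
                    yield idx, box.tolist(), float(score)
        idx += 1
    cap.release()
```

In practice the detector would be filtered to (or fine-tuned on) the target species, and the per-frame predictions would be aggregated over time before testing the behavioural hypotheses, but the detect-then-classify structure is the core of the workflow.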
