
Dataset

Data and R code for What you see is where you go: visibility influences movement decisions of a forest bird navigating a 3D structured matrix

Animal spatial behaviour is often presumed to reflect responses to visual cues. However, inferring behaviour in relation to the environment is challenged by the lack of objective methods to identify the information that is effectively available to an animal from a given location. In general, animals are assumed to have unconstrained information on the environment within a detection circle of a certain radius (the perceptual range; PR). However, visual cues are only available up to the first physical obstruction within an animal’s PR, making information availability a function of an animal’s location within the physical environment (the effective visual perceptual range; EVPR). Using LiDAR data and viewshed analysis, we model forest birds’ EVPRs at each step along a movement path. We found that the EVPR was on average 0.063% of that of an unconstrained PR and, by applying a step-selection analysis, that individuals are 1.57 times more likely to move to a tree within their EVPR than to an equivalent tree outside it. This demonstrates that behavioural choices can be substantially impacted by the characteristics of an individual’s EVPR and highlights that inferences made from movement data may be improved by accounting for the EVPR.
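The step-selection analysis described above pairs each observed movement step with alternative candidate steps and tests whether trees inside the bird's EVPR are chosen more often than equivalent trees outside it. A minimal sketch of that idea (not the authors' code) is shown below, fitting a conditional logistic regression with the survival package; the column names (step_id, case, in_evpr, dist) and the simulated data are hypothetical placeholders for illustration.

```r
library(survival)

## Example data: one stratum per observed step, containing the chosen
## tree (case = 1) and available alternative trees (case = 0).
steps <- data.frame(
  step_id = rep(1:100, each = 5),                # stratum identifier per step
  case    = rep(c(1, 0, 0, 0, 0), times = 100),  # used vs. available candidate
  in_evpr = rbinom(500, 1, 0.4),                 # is the tree visible from the start point?
  dist    = runif(500, 1, 50)                    # distance to candidate tree (m)
)

## Conditional logistic regression: strata() conditions on each step, so the
## coefficients describe selection among the candidates available at that step.
fit <- clogit(case ~ in_evpr + dist + strata(step_id), data = steps)
summary(fit)

## exp(coef) for in_evpr is the relative selection strength; a value near 1.57
## would correspond to the effect size reported in the abstract.
exp(coef(fit))["in_evpr"]
```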
Publication year: 2020
Accessibility: open
Publisher: Dryad
License: CC0-1.0
Format: pdf, txt
Keywords: Biology