How do birds stabilize their heads?

1Biomotion Lab, Department of Psychology, Department of Biology, School of Computing, Queen's University, Kingston, ON, Canada

2Applied Cognitive Psychology, Faculty of Engineering, Computer Science and Psychology, Institute of Psychology and Education, Ulm University, Ulm, Germany

Stabilization of the head in animals with limited capacity to move their eyes is key to maintaining a stable image on the retina. In many birds, including pigeons, a prominent example of the important role of head stabilization is the characteristic head-bobbing behavior observed during walking. Multimodal sensory feedback from the eyes, the vestibular system and proprioceptors in body and neck is required to control head stabilization. Here, we trained unrestrained pigeons (Columba livia) to stand on a perch that was sinusoidally moved with a motion platform along all three translational and three rotational degrees of freedom. We varied the frequency of the perturbation and recorded the pigeons' responses under both light and dark conditions. Head, body, and platform movements were assessed with a high-speed motion capture system and the data were used to compute gain and phase of head and body movements in response to the perturbations. Comparing responses under dark and light conditions, we estimated the contribution of visual feedback to the control of the head. Our results show that the head followed the movement of the motion platform to a large extent during translations, but it was almost perfectly stabilized against rotations. Visual feedback improved head stabilization only during translations, not during rotations. The body compensated rotations around the forward-backward and the lateral axis, but did not contribute to head stabilization during translations and rotations around the vertical axis. From these results, we conclude that head stabilization in response to translations and rotations depends on different sensory feedback and that visual feedback plays only a limited role in head stabilization during standing.

Vision is the most important sensory modality to obtain distant information about our environment. To ensure the perception of relevant information, animals do not only need specialized eyes (Jones et al., 2007), but also appropriate control mechanisms to coordinate the movements of their eyes in external space. This involves pursuit eye movements, exploratory saccades, but also eye and head movements to maintain fixation while compensating for body movements in space.

Humans can actively move their eyes over a wide range. They perform saccades to change fixation from one object to another and smooth pursuit eye movements to keep moving objects in focus. To stabilize the eyes, two reflexes play major roles: the vestibulo-ocular reflex (VOR) and the optokinetic reflex (OKR, Angelaki and Cullen, 2008). The OKR, which is based on visual feedback, is especially helpful to fixate an object while there might be relative motion between the object and the observer. In contrast, the VOR can be seen as a feed-forward mechanism that uses linear and angular acceleration measured by the vestibular system to control the eye muscles in order to stabilize the image on the retina.

Fixating by means of eye movements that compensate translations of the head and body through space achieves stabilization only on the part of the retina that is fixating, generally the center of the fovea. Perifoveal regions will experience parallactic movements that increase with eccentricity and also depend on the distance of visual objects from the observer.

In humans, high acuity is concentrated in the relatively small area of the central fovea. Many vertebrates, particularly those with laterally placed eyes and those under heavy predatory pressure, distribute visual acuity over much larger areas of the visual field. This is particularly true for many birds (Jones et al., 2007). Pigeons, for instance, have two areas of high photoreceptor density in each eye, and both of them are much larger than the four-degree fovea of the human eye (Land and Nilsson, 2012).

The only way to maintain a perfectly stable image over extended areas of the visual field is to not move the eye at all with respect to the visual environment, at least for short periods of time. Many birds adopt that strategy. A walking pigeon, for example, locks its head in space as long as possible and then thrusts it with one sudden, ballistic movement into a new position, where it again becomes motionless while the body catches up in a more continuous movement. The alternation of these so-called hold and thrust phases is called “head-bobbing” (Dunlap and Mowrer, 1930; Friedman, 1975; Frost, 1978; Troje and Frost, 2000). Head-bobbing occurs mainly during walking, but can also be observed in swimming birds.

While the saccadic head-movements during head-bobbing are mainly translatory, birds also rotate their head with sudden fast movements and then keep it relatively stable between these rotational thrusts (Kress et al., 2015). At least during locomotion on the ground and while perching, rotational head saccades are observed in all bird species. In contrast, the saccadic translational movements characteristic for head-bobbing are only observed in some bird species, but not in others.

Of course, there is a major difference between rotational and translational perturbations: for rotations, the angular compensation of gaze must always be as large as the angular perturbation to ensure a stable image on the retina, whereas during translation the required angular adjustments are generally small, particularly when fixating distant objects (Land, 1999).
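A back-of-the-envelope calculation (ours, with illustrative numbers not taken from the source) makes the asymmetry concrete: for a lateral eye translation d while fixating an object at distance D, the required compensatory gaze rotation is

```latex
% Illustrative geometry, not from the source:
\theta = \arctan\left(\frac{d}{D}\right) \approx \frac{d}{D} \quad (D \gg d)
```

For d = 5 cm, this gives θ ≈ 2.9° at D = 1 m but only about 0.29° at D = 10 m, whereas a 10° rotation of the head always demands the full 10° of angular compensation, regardless of distance.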

Given their physical nature, rotations, where retinal velocity is perfectly predictable and does not depend on distance, and translations, where the velocity of the retinal flow depends on the distance to the observed object, require very different sensory systems to respond to the corresponding perturbations. In general, it is advantageous if the respective sensory system is located on the structure that is perturbed. For instance, the VOR in humans is so fast because the vestibular sensors are located on the head (the structure that is perturbed) rather than on the eye (the part that needs to be controlled). Direct measurements of the cause of the perturbation can be used for fast feed-forward control, while measurement of the structure that is controlled can only provide an error signal for slower feedback control (Wolpert et al., 1998).

The VOR in humans and other vertebrates clearly implements a feed-forward system that efficiently controls eye movements (Angelaki and Cullen, 2008). Likewise, in flies the halteres located on the body are used for feed-forward control of head movements (Sandeman and Markl, 1980; Hengstenberg, 1988). However, it is not clear how birds control their head movements to maintain a stable image on the retina. Visual feedback alone is comparatively slow, leading to blurred images, and if vestibular sensory information had to be used to stabilize position and orientation of the head (VCR), the sensors would move along with the structure that is meant to be controlled, i.e., the head (Wilson et al., 1994), and could therefore only be used for feedback control. In pigeons, a good candidate to provide direct information about body orientation is the lumbosacral vertebral canal system, which has been suggested to work similarly to the vestibular system as an equilibrium sense and might be able to measure rotational movements (Necker et al., 2000; Necker, 2005, 2006).

In pigeons, head-bobbing can be observed in response to visual flow (Friedman, 1975; Frost, 1978), which seems to be sufficient to elicit eye movements (optokinetic nystagmus, OKN; Mowrer, 1936; Nye, 1969; Wallman and Velez, 1985) as well as head rotations (opto-collic reflex, OCR; Gioanni, 1988a; Wallman and Letelier, 1993; Maurice et al., 2006). Head rotations based on the OCR account for about 80–90% of the gaze response if the head is free to move (Gioanni, 1988a). In addition, vestibular reflexes play a major role in gaze stabilization. The VOR (similar to humans but with smaller amplitudes) can induce rotational eye movements (Dickman and Angelaki, 1999; Dickman et al., 2000) and the vestibulo-collic reflex (VCR) serves head stabilization (Gioanni, 1988b; Goode et al., 1999). Although the VOR is under-compensatory in response to head rotations in the dark, gaze was well stabilized in a head-free condition (Haque and Dickman, 2005). Again, head movements contributed more to gaze compensation than eye movements. In a follow-up study involving vestibular lesions, it was shown that the vestibular system plays a major role in gaze compensation (Haque et al., 2008). Furthermore, head stabilization is state-dependent: when a pigeon is in a simulated flight mode, head stabilization in response to body rotations in darkness is much better than in the non-flying condition (Maurice et al., 2006; McArthur and Dickman, 2011). These examples show that gaze is mainly stabilized through head stabilization in response to rotational perturbations.

Overall, optokinetic compensation of rotational perturbations can be found in almost any animal. Often it is not exclusively performed by eye movements but includes head or even whole-body movements, e.g., in birds (Gioanni, 1988a), flies (Land, 1973) and even water fleas (Frost, 1975). In contrast, translational perturbations are more often compensated with eye movements. The difference in the response to rotational and translational perturbations is particularly obvious in birds: as far as we know, all birds conduct head rotations by means of sudden saccadic movements, whereas only some birds perform head-bobbing.

Here, we compared head stabilization in response to translational and rotational perturbations in pigeons. In contrast to most previous studies, where restrained birds were rotated in the dark or head movement was analyzed in response to visual motion, we used freely standing pigeons to allow the whole body, including legs, trunk and head to respond to perturbations of the platform on which they were standing either in light or dark conditions. This setup models the natural situation when a pigeon perches on a moving branch. To investigate head and body stabilization during translations and rotations, we trained pigeons to stand on a perch, which was sinusoidally translated along and rotated around the three spatial dimensions with different frequencies and amplitudes.

For the experiment, we used 10 (7 male, 3 female) rock doves (Columba livia) that were originally obtained from pigeon breeders in Ontario, Canada (Limestone City Flyers) and kept at Queen's University under veterinary supervision. The birds were between 4 and 10 years old, weighed between 380 and 520 g and were housed in a separate aviary for the duration of the experiment. Food and water were available ad libitum and light was kept on a 12 h on/off cycle. All experiments conformed to the ethics requirements for animal research of the Canadian Council on Animal Care (CCAC) and were approved by the Queen's Animal Care Committee.

The experimental setup consisted of a six degree of freedom motion platform (116 × 116 × 36 cm, W3s 6DOF Motion System, CKAS, Australia) onto which we had mounted a polyvinyl chloride box frame (41 × 20.5 × 41.5 cm) that was covered with mist netting (Figure ). The pigeons stood on a perch (19 × 1.5 × 1.5 cm) in the center of the frame. The perch was located 5 cm above the platform and was aligned with the lateral axis (y-axis) of the coordinate system used to describe and measure movements of the motion platform, as well as head and body of the pigeon. The coordinate system was therefore defined as follows: The x-axis was the platform dimension along which the bird was perched (forward-backward). The y-axis was the horizontal axis perpendicular to x, which coincides with the lateral axis of the perched bird. The z-axis was the vertical axis (Figure ).

The platform was placed in the middle of a window-less room (3.2 × 3.4 m) in which the door, the cameras with tripods (Figure ), a table with metal boxes and a table with the measurement computer provided a rich visual environment during light conditions. In dark conditions, the room was completely darkened, with every light source (e.g., power lights from the cameras and the computer) covered with tape and the computer monitor switched off. The only light in the room was the infrared light required by the optical motion capture system. The LEDs of that system are invisible to the human visual system, given that their light is entirely in the infrared part of the spectrum. The same is probably true for pigeons, because the absorption spectrum of the cone class with the longest wavelength sensitivity in the pigeon retina peaks at 565 nm, which is almost exactly where the human L-cone peaks (Bowmaker, 1977; Bowmaker et al., 1996). Movement of the motion platform was controlled with customized Python scripts that also synchronized the start of the motion capture recording via an analog signal sent to the motion capture system.

The motion platform was surrounded by a five-camera motion capture system (Oqus 300 series, Qualisys AB, Sweden) that recorded three-dimensional positions of passive retro-reflective markers at a sample rate of 360 Hz with a spatial resolution (RMS error) smaller than 1 mm. Four markers were attached to the motion platform (Figure ) to record its movements with similar accuracy as the pigeon's movements.

To record the movements of the pigeons, we created two custom-made lightweight rigid frames, a triangle for the head (2 g) and a rectangle for the back (3 g, see Figure ). Each frame had four retro-reflective markers (3 × 3 Designs Ltd., 4 mm in diameter) attached to it. In order to attach the frames to the birds, two areas of feathers (one on the head and one on the back) were trimmed and patches of Velcro were attached to them using white crafts glue. These patches remained on the birds until data collection was finished. The frames with the motion capture markers were attached to the birds via the Velcro patches prior to an experimental session, and were removed once the session was completed.

Perturbations consisted of sinusoidal movements of the motion platform with five different frequencies (0.25, 0.5, 1, 2, 4 Hz) for translations (x: forward-backward, y: lateral, z: vertical) and four different frequencies (0.25, 0.5, 1, 2 Hz) for rotations around x, y and z. By adjusting the amplitude of the movements to the different frequencies (translations: 50, 25, 12.5, 6.25, 3.125 mm; rotations: 9, 4.5, 2.25, 1.125 degrees) we kept the maximum velocity constant at 78.5 mm/s and 14 deg/s for translations and rotations, respectively. These amplitudes were chosen due to the limitations in amplitude (50 mm; 10 degrees), velocity (100 mm/s; 15 deg/s) and acceleration (2942 mm/s²; 150 deg/s²) that the motion platform was able to handle. The motion capture data revealed that the measured amplitudes, velocities and accelerations were close to the intended values for low frequencies, but a bit smaller for 2 and 4 Hz perturbations (Supplementary Table 1). The rotational acceleration limit (150 deg/s²) prevented us from rotating the platform at 4 Hz, because the maximum amplitude would have been unreasonably small (only 0.02 degrees).
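The constant-peak-velocity design can be checked with a short sketch (ours, not from the authors' scripts): for a sinusoid x(t) = A sin(2πft), the peak velocity is 2πfA, so the amplitude that holds peak velocity at v_max is A = v_max/(2πf).

```python
import math

def amplitude_for_peak_velocity(v_max, f):
    """Amplitude A of x(t) = A*sin(2*pi*f*t) whose peak velocity
    2*pi*f*A equals v_max (units follow the inputs)."""
    return v_max / (2 * math.pi * f)

# Translations at 78.5 mm/s peak velocity:
# f = 0.25, 0.5, 1, 2, 4 Hz -> A ~ 50, 25, 12.5, 6.25, 3.125 mm
for f in (0.25, 0.5, 1, 2, 4):
    print(f, round(amplitude_for_peak_velocity(78.5, f), 2))
```

Plugging in the rotational peak velocity of 14 deg/s recovers the stated rotation amplitudes (9, 4.5, 2.25, 1.125 degrees) in the same way, up to rounding.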

The experimenter attached the two marker frames on the head and the back of the bird and placed the bird on the perch. The birds had been trained to stay on the perch and were not restrained. In the rare case that a bird stepped off the perch, the recording was stopped and the whole session was repeated after the bird was placed back on the perch.

In total, we recorded 40 sessions (20 light and 20 dark) per bird within the same day. Light and dark conditions were blocked, such that five sessions in the light condition, each containing perturbations in all six degrees of freedom (translations along and rotations around x, y and z) at the same frequency, were followed by five sessions in the dark condition. Sessions with low frequencies (0.25, 0.5, and 1 Hz) were recorded separately, whereas the two higher frequencies (2 and 4 Hz) were recorded together to reduce the total duration of the experiment. Given that every direction was presented once per frequency, sessions consisted of six (low frequencies) or nine (high frequencies) trials, since at 4 Hz only translations were presented.

A trial was defined as a perturbation along one direction and consisted of 10 sinusoidal cycles, except for trials of the highest frequency (4 Hz), which consisted of 20 cycles to provide 5 s of stimulation. This resulted in trial durations of 40, 20, 10, and 5 s, respectively. The order of directions was randomized, and between directions there was an interval of 2 s without perturbation.

To analyse the movements of the platform, the head and the body, a rigid body model was assigned for each marker set (platform, head and body), marker trajectories were semi-automatically labeled and gaps were filled automatically using Qualisys QTM software. Position and orientation data of each rigid body model and marker data were exported to Matlab (The Mathworks, USA), where they were combined with data about timing and direction of each perturbation.

Before calculating gains and phases of the head with respect to the platform and body movement, we identified saccadic head movements. A saccadic head movement was identified when the Euclidean difference between head and body velocity exceeded three times the maximum velocity of the platform for at least nine consecutive frames (25 ms). The beginning and end of each saccade were then adjusted using a model consisting of a concatenation of individual constant-velocity movements, optimized with the simplex search method (Lagarias et al., 1998) to minimize the area between the original and the modeled data.
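The detection rule (head-minus-body speed above three times the platform's maximum velocity for at least nine consecutive frames) can be sketched as follows; the function name and data layout are ours, not the authors' code.

```python
import math

def detect_saccades(head_vel, body_vel, v_platform_max,
                    factor=3.0, min_frames=9):
    """head_vel, body_vel: per-frame (vx, vy, vz) velocity tuples.
    Returns (start, end) frame ranges where the Euclidean norm of the
    head-minus-body velocity exceeds factor * v_platform_max for at
    least min_frames consecutive frames (9 frames = 25 ms at 360 Hz)."""
    threshold = factor * v_platform_max
    saccades, start = [], None
    for i, (h, b) in enumerate(zip(head_vel, body_vel)):
        fast = math.dist(h, b) > threshold  # speed of head relative to body
        if fast:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_frames:
                saccades.append((start, i))
            start = None
    if start is not None and len(head_vel) - start >= min_frames:
        saccades.append((start, len(head_vel)))
    return saccades
```

A candidate interval shorter than min_frames is discarded, which mirrors the "at least nine consecutive frames" criterion.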

To calculate gain and phase, we first computed amplitudes and phases of platform, body and head in world coordinates with the help of the discrete Fourier transform:

X_q = \sum_t x_{q,t} \, e^{-i 2\pi f t}    (1)

Here, x_{q,t} denotes the position of q \in \{p = platform, h = head, b = body\} at time t, and f is the perturbation frequency. From this, we can compute amplitude a_q and phase \varphi_q of each motion:

a_q = \sqrt{\Re(X_q)^2 + \Im(X_q)^2} \quad and \quad \varphi_q = \arctan\frac{\Im(X_q)}{\Re(X_q)}    (2)

To quantify the head's response to platform perturbations, we first estimated the gain in world coordinates as g_w = a_h / a_p and the time lag between head and platform motion as \Delta t = (\varphi_h - \varphi_p) / \omega. Here, \omega is the angular frequency \omega = 2\pi f.
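Equations (1) and (2) together with these estimates amount to evaluating a single DFT bin at the perturbation frequency. A minimal sketch (ours; variable names assumed):

```python
import cmath
import math

def amp_phase(x, f, fs):
    """Single-bin discrete Fourier transform at frequency f:
    X = sum_t x_t * exp(-i*2*pi*f*t), for a signal sampled at fs Hz.
    Returns amplitude |X| and phase arg(X)."""
    X = sum(v * cmath.exp(-2j * math.pi * f * (n / fs))
            for n, v in enumerate(x))
    return abs(X), cmath.phase(X)

# Gain and lag of the head relative to the platform (hypothetical data):
# a_h, phi_h = amp_phase(head, f, 360); a_p, phi_p = amp_phase(platform, f, 360)
# g_w = a_h / a_p; the lag follows from (phi_h - phi_p) / (2*pi*f),
# where the sign depends on the chosen DFT convention.
```

Summing over an integer number of stimulus cycles keeps the single-bin estimate free of leakage from other frequencies.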

These estimates were then used to initialize a gradient descent optimization to fit the following model to the original data:

x_h(t) = g_w \, x_p(t - \Delta t) + S(t)    (3)

where S(t) denotes the saccade model that we described above. Initial values of g_w and \Delta t were adjusted with 300 iterations using the simplex search method (Lagarias et al., 1998), similar to the one used for the saccade model, that minimized the area between the original head data and the outcome of the model.

The gain g_w and the corresponding phase \varphi_w = \Delta t \, \omega describe the relation between head and platform in world coordinates. Since we want to understand how the pigeon responds to the perturbation it experiences, we transformed these values into gain and phase of the system's response. The resulting response gain describes the response of the pigeon in a coordinate system that is fixed to the platform. The transformation from movement gains to response gains is based on

x_h^p = x_h^w - x_p^w    (4)
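The conversion to response gain and phase can be expressed with complex amplitudes. The sketch below uses a sign convention chosen to reproduce the worked examples given in the Results (gain 0 = head rides with the platform; gain 1, phase 0 = perfect stabilization; gain 1, phase 180° = doubled world amplitude); it is our reading, not the authors' code.

```python
import cmath
import math

def response_gain_phase(H, P):
    """Complex head (H) and platform (P) motion amplitudes in world
    coordinates -> (gain, phase in degrees) of the head's response in
    a platform-fixed frame.  Convention (an assumption matching the
    worked examples in the text): R = 1 is a perfectly stabilized
    head, R = 0 a head moving in tandem with the platform."""
    R = 1 - H / P  # x_h^p = x_h^w - x_p^w, normalized by -x_p^w
    return abs(R), math.degrees(cmath.phase(R))
```

With this convention, a head that is motionless in world coordinates (H = 0) yields gain 1 and phase 0, and a head whose world amplitude is twice the platform's (H = 2P) yields gain 1 and phase 180°, as described in the Results.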

We used the Mann-Whitney U-test to compare medians of gains between light and dark conditions, because the data were not normally distributed. Phase differences were analyzed with the Watson-Williams test, the equivalent of the Mann-Whitney U-test for circular statistics (Berens, 2009). The number of active head saccades was analyzed with an ANOVA.

Head stabilization depended on the type of movement of the motion platform. It differed between translations and rotations, and it depended on whether or not visual flow information was available.

During translations, body and head movements were similar to the movement of the platform, as shown by representative time courses in the forward-backward direction (Figure ). No compensation was observed and the head was not stabilized. This seemed to be frequency independent, and even at 4 Hz both the body and the head moved with about the same amplitude as the platform. Small differences were observed between light and dark conditions for head motion (Figure ), but not for body motion (Figure ). The amplitude of head motion in the light condition seemed to be slightly smaller than the amplitude of the platform movement.

In the example of Figure , it is also obvious that both head and body lagged behind the platform movement at higher frequencies. A lag of 100 ms, which is about the amount observed in the example in the 4 Hz condition, translates into a considerable phase lag of 145 deg.

In contrast to translational movements, only the body followed rotations (Figure ), whereas the head seemed to be perfectly stabilized aside from active rotational head saccades (Figure ). Here, examples of rotations around the lateral axis are shown; the head was always stabilized independent of light conditions (Figure ).

Accurate head stabilization in response to rotational perturbation

When the platform was rotated, the head was almost perfectly stabilized, in contrast to the observations under translational perturbation. This was true for both light conditions and all frequencies (Figures ). All gains were nearly equal to one and phases were nearly zero during rotations, suggesting that the head was actively stabilized. Response gains for rotations around x and y were frequency independent, while rotations around z showed only a small loss of gain at higher frequencies (Figure ).

We used a two-way ANOVA to assess the number of head saccades per second (Figure ) in addition to gain and phase differences, with type of movement (rotation vs. translation) and light condition (light vs. darkness) as factors. The number of head saccades was significantly influenced by the type of movement [ANOVA, F(1, 36) = 135.5, p < 0.001]. The light condition also had a significant effect, with fewer saccades observed in the dark [ANOVA, F(1, 36) = 64.0, p < 0.001], and there was a substantial interaction between the type of movement and the light condition [ANOVA, F(1, 36) = 40.46, p < 0.001]. When the platform rotated in the light, pigeons produced an average of 1 head saccade per second, approximately ten times more than during translations (0.1 saccades/s; Figure , red). The same trend was observed in darkness (rotation: 0.3 saccades/s; translation: 0.03 saccades/s).

Poor head stabilization in response to translational perturbations

The gains and corresponding phases of the head in response to the six degrees of freedom of platform perturbation for the various frequencies and lighting conditions are summarized in Figure . Recall that a gain of one and a phase of zero indicate perfect head stabilization. When gain and phase are both zero, the head moves in tandem with the platform. With a gain of one and a phase of 60 degrees, the head would likewise move with the same amplitude as the platform, but delayed relative to it. With a gain of one and a phase of 180 degrees, the amplitude of the head in world coordinates would be twice as large as that of the platform. The data plotted in Figure make it evident that the pigeons did a poor job of compensating translations. For low frequencies, gains were close to zero. Gains approached one for 2 Hz perturbations, but phases were in the range of 60 degrees, indicating that head stabilization was not accomplished there either (Figures ; 2 Hz). At 4 Hz, gains significantly exceeded one, indicating once more a lack of stabilization. For vertical perturbations, the gain increased less with frequency than for horizontal perturbations (compare Figure with Figures ). Phase shifts during translations at low frequencies differed between light and dark conditions (Figures ). In all cases, however, head stabilization was virtually absent.

First, watch this chicken video if you haven’t already. Yes, it’s an ad, but it’s completely worth it.

And more importantly, can you learn how to do that in time for your next dance party, because that’s what those chickens are doing?

The hens in the advertisement are merely doing what all birds do naturally: keeping their heads motionless to maintain a stable view of whatever they’re looking at while their bodies move. After watching the advertisement, my main concern was what they were staring at, as it was obvious they were interested in it enough to stare through the distractingly swooping camera. Based on the last scene, where the two rows of chickens face each other, my naive guess is: another chicken.

Thus, you have an agitated creature whose eyes must remain motionless. When the head moves, humans and many other animals respond by moving their eyes in the opposite direction. Your eyes automatically adjust to compensate for head movements, so if you focus on a word in this sentence and move your head side to side, that word remains in the center of your vision. This is made possible by the opto-kinetic reflex, which uses motion detectors in the retina of your eye, and the vestibulo-ocular reflex, which uses rotation detectors in your inner ear. It’s completely subconscious and difficult to override. Simply try to keep your eyes still while slowly rotating your head. You can’t (or I can’t, anyway).

Can you learn to do this move then? Probably not, since birds do it instinctively, just as humans do with our compensatory eye movements. However, I won’t stop you from trying.


How can birds keep their heads still?

When the body of a bird is held in the hand and rotated or moved in different directions the head often appears ‘locked in space’ or glued to the spot, and does not move with the rest of the body. To maintain this stable position the bird has, of course, to make complex compensatory movements of the neck.

What is the stability of a bird’s head?

Stabilization of the head in animals with limited capacity to move their eyes is key to maintain a stable image on the retina. In many birds, including pigeons, a prominent example for the important role of head stabilization is the characteristic head-bobbing behavior observed during walking.

What is a bird head stabilization called?

This phenomenon, known as gaze stabilization, is analogous to the stability of our eyes when we shake our heads. Gaze stabilization is a critical adaptation that enables birds to preserve a consistent visual perspective even as their heads undergo motion.