How do animals use information that may be an important, yet unreliable, cue for something they are searching for? To answer this question, I am studying the behavioral responses of freely walking and flying fruit flies, Drosophila melanogaster, to a range of odors with different levels of specificity to fermenting fruit (a fruit fly's primary food source and social gathering space). Thanks to genetic tools available in the fly, I am able to silence and activate individual neurons in the brain with high temporal and spatial specificity, making it possible to probe which parts of the brain are responsible for their behavioral choices.
Behavior is a complex subject; even with over 16,000 fly-hours of detailed data collected from more than 1,000 individual animals, some questions remain unresolved. Stay tuned for results soon!
In late summer, the shores of Mono Lake, California, are bustling with small flies, Ephydra hians, which dive underwater inside small air bubbles to feed. Despite Mark Twain's charismatic description of them in his book Roughing It, we still do not understand how they are able to perform this entertaining and miraculous feat.
“You can hold them under water as long as you please–they do not mind it–they are only proud of it. When you let them go, they pop up to the surface as dry as a patent office report, and walk off as unconcernedly as if they had been educated especially with a view to affording instructive entertainment to man in that particular way.”
Using a combination of high-speed videography, force measurements, scanning electron microscopy, and manipulations of water chemistry, I am working toward understanding what makes these flies so unique. See this recent press article for a more detailed description: Fly makes air ‘submarine’ to survive deadly lake (Science, 2016).
To find human hosts, mosquitoes must integrate sensory cues that are separated in space and time. To solve this challenge my collaborators Michael Dickinson, Jeff Riffell, and Adrienne Fairhall and I showed that mosquitoes respond to exhaled CO2 by exploring visual features they otherwise ignore. This guides them to potential hosts, where they use cues such as heat and humidity to locate a landing site.
Coauthored with: Jeff Riffell, Adrienne Fairhall, Michael Dickinson. Read more about our work in Current Biology.
Animation: The above animations show a collection of 200–500 mosquito trajectories, each aligned to the moment when they last passed through a CO2 plume. Only trajectories that approach either the room-temperature (blue) or the 37 °C (orange) object are shown. Note that the mosquito trajectories were recorded at different times and superimposed for presentation purposes. Only 20 mosquitoes were released into the wind tunnel (1.5 × 0.3 × 0.3 m) at a time, and rarely were there more than a few flying simultaneously, so in-flight interactions were rare. Note how the mosquitoes spend more time near the warm object.
Vision is arguably the most widely used sense for position and velocity estimation in animals, and it is increasingly used in robotic systems as well. Many animals use stereopsis and object recognition in order to make a true estimate of distance. For a tiny insect such as a fruit fly or honeybee, however, these methods fall short. Instead, an insect must rely on calculations of optic flow, which can provide a measure of the ratio of velocity to distance, but not either parameter independently. Nevertheless, flies and other insects are adept at landing on a variety of substrates, a behavior that inherently requires some form of distance estimation in order to trigger distance-appropriate motor actions such as deceleration or leg extension. Previous studies have shown that these behaviors are indeed under visual control, raising the question: how does an insect estimate distance solely using optic flow? In this paper we use a nonlinear control theoretic approach to propose a solution for this problem. Our algorithm takes advantage of visually controlled landing trajectories that have been observed in flies and honeybees. Finally, we implement our algorithm, which we term dynamic peering, using a camera mounted to a linear stage to demonstrate its real-world feasibility. Read more in Bioinspiration & Biomimetics.
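To build intuition for why optic flow alone is useful even though it only measures the ratio of velocity to distance, consider the landing strategy observed in flies and bees: hold the optic flow r = v/d at a constant negative value. The sketch below (an illustrative toy model, not the dynamic peering implementation from the paper) shows that this rule makes distance decay exponentially, so deceleration happens automatically without the insect ever knowing d or v separately.

```python
import math

def constant_optic_flow_landing(d0, r, dt, t_end):
    """Simulate d' = r*d, i.e. optic flow r = v/d held constant (r < 0)."""
    d, t = d0, 0.0
    traj = []
    while t < t_end:
        v = r * d            # velocity implied by the optic-flow set point
        traj.append((t, d, v))
        d += v * dt          # Euler step: distance shrinks exponentially
        t += dt
    return traj

traj = constant_optic_flow_landing(d0=1.0, r=-0.1, dt=0.01, t_end=10.0)
# Closed form is d(t) = d0 * exp(r*t): after 10 s at r = -0.1 1/s,
# the simulated distance is near exp(-1) of its starting value.
```

Because both d and v shrink together, the ratio v/d stays on the set point throughout the approach, which is exactly the signature the optic-flow sensor can verify.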
Movie: Real-time performance of the dynamic peering estimation algorithm. The video shows the data from figures 4 and 5 as an animation. Bottom left: Camera image sequence showing the visual target and region of interest (red box). Bottom right: Optic flow as a function of camera pixel, calculated using the current and previous frames with OpenCV’s Lucas–Kanade algorithm. For the purposes of control, we calculated a linear fit of the data (red line) over the region of interest. Top row: Dynamic peering performance (red) compared with the ground-truth values (blue) for position, velocity, and optic flow estimates, as well as the applied control effort. After an initial period where the robot accelerates to the steady-state optic flow rate of -0.1 1/s, the estimates lock on to the actual values.
Recent evidence suggests that flies’ sensitivity to large-field optic flow is increased by the release of octopamine during flight. This increase in gain presumably enhances visually mediated behaviors such as the active regulation of forward speed, a process that involves the comparison of a vision-based estimate of velocity with an internal set point. To determine where in the neural circuit this comparison is made, we selectively silenced the octopamine neurons in the fruit fly Drosophila, and examined the effect on vision-based velocity regulation in free-flying flies. We found that flies with inactivated octopamine neurons accelerated more slowly in response to visual motion than control flies, but maintained nearly the same baseline flight speed. Our results are most parsimonious with a circuit architecture in which the internal control signal is injected into the visual motion pathway upstream of the interneuron network that estimates groundspeed.
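The key finding can be illustrated with a deliberately simple first-order regulator (a toy model of my own, not the circuit model from the paper): if octopamine acts as a gain on the visual feedback pathway, lowering that gain slows the approach to the set point but leaves the steady-state speed unchanged, matching what the silenced flies did.

```python
def regulate_speed(v0, set_point, gain, dt=0.05, steps=200):
    """Euler-integrate dv/dt = gain * (set_point - v) and return the history."""
    v = v0
    history = [v]
    for _ in range(steps):
        v += gain * (set_point - v) * dt
        history.append(v)
    return history

# Hypothetical gains: "control" flies with octopamine intact vs. a
# reduced-gain condition standing in for silenced octopamine neurons.
control = regulate_speed(0.0, 1.0, gain=2.0)
silenced = regulate_speed(0.0, 1.0, gain=0.5)
# Early in the response the low-gain trace lags (slower acceleration),
# yet both converge to the same baseline speed.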
Background: For a fruit fly, locating fermenting fruit where it can feed, find mates, and lay eggs is an essential and difficult task requiring the integration of olfactory and visual cues. Here, we develop an approach to correlate flies’ free-flight behavior with their olfactory experience under different wind and visual conditions, yielding new insight into plume tracking based on over 70 hr of data.
Results: To localize an odor source, flies exhibit three iterative, independent, reflex-driven behaviors, which remain constant through repeated encounters of the same stimulus: (1) 190 ± 75 ms after encountering a plume, flies increase their flight speed and turn upwind, using visual cues to determine wind direction. Due to this substantial response delay, flies pass through the plume shortly after entering it. (2) 450 ± 165 ms after losing the plume, flies initiate a series of vertical and horizontal casts, using visual cues to maintain a crosswind heading. (3) After sensing an attractive odor, flies exhibit an enhanced attraction to small visual features, which increases their probability of finding the plume’s source.
Conclusions: Due to plume structure and sensory-motor delays, a fly’s olfactory experience during foraging flights consists of short bursts of odor stimulation. As a consequence, delays in the onset of crosswind casting and the increased attraction to visual features are necessary behavioral components for efficiently locating an odor source. Our results provide a quantitative behavioral background for elucidating the neural basis of plume tracking using genetic and physiological approaches.
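The surge-and-cast reflexes listed under Results can be sketched as a tiny state machine. This is an illustrative toy model (the mode names and the logic are my own simplification); only the two delay values come from the measurements above.

```python
# Reported response delays (from the Results); the behavioral logic below
# is a hypothetical simplification for illustration.
SURGE_DELAY = 0.190   # s after plume onset before the upwind surge begins
CAST_DELAY = 0.450    # s after plume loss before crosswind casting begins

def plume_tracking_mode(time_since_onset, time_since_loss, in_plume):
    """Return the behavioral mode implied by the timing of odor encounters."""
    if in_plume:
        # Before the surge delay has elapsed the fly has not yet reacted.
        return "surge" if time_since_onset >= SURGE_DELAY else "cruise"
    # After losing the plume, the surge persists until the cast delay elapses,
    # which is why flies overshoot and exit the plume before casting.
    return "cast" if time_since_loss >= CAST_DELAY else "surge"
```

The persistence of "surge" for ~450 ms after plume loss is what carries the fly through thin plume filaments, producing the short bursts of odor stimulation described in the Conclusions.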
Coauthored with Michael Dickinson. Read more about my work in Current Biology.
Landing behavior is one of the most critical, yet least studied, aspects of insect flight. In order to land safely, an insect must recognize a visual feature, navigate towards it, decelerate, and extend its legs in preparation for touchdown. Although previous studies have focused on the visual stimuli that trigger these different components, the complete sequence has not been systematically studied in a free-flying animal. Using a real-time 3D tracking system in conjunction with high-speed digital imaging, we were able to capture the landing sequences of fruit flies (Drosophila melanogaster) from the moment they first steered toward a visual target, to the point of touchdown. This analysis was made possible by a custom-built feedback system that actively maintained the fly in the focus of the high-speed camera. The results suggest that landing is composed of three distinct behavioral modules. First, a fly actively turns towards a stationary target via a directed body saccade. Next, it begins to decelerate at a point determined by both the size of the visual target and its rate of expansion on the retina. Finally, the fly extends its legs when the visual target reaches a threshold retinal size of approximately 60°. Our data also let us compare landing sequences with flight trajectories that, although initially directed toward a visual target, did not result in landing. In these ‘fly-by’ trajectories, flies steer toward the target but then exhibit a targeted aversive saccade when the target subtends a retinal size of approximately 33°. Collectively, the results provide insight into the organization of sensorimotor modules that underlie the landing and search behaviors of insects.
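The retinal-size thresholds above translate directly into distances via simple geometry. The sketch below is an illustrative calculation (the function names and the example post width are my own, not from the study): an object of width w at distance d subtends an angle of 2·atan(w/2d), and the ~60° leg-extension threshold then implies a specific trigger distance for any given target size.

```python
import math

def retinal_size_deg(width, distance):
    """Angle (degrees) subtended on the retina by an object of a given
    width viewed face-on at a given distance."""
    return math.degrees(2.0 * math.atan(width / (2.0 * distance)))

def should_extend_legs(width, distance, threshold_deg=60.0):
    """Toy trigger rule using the ~60 deg threshold reported in the text."""
    return retinal_size_deg(width, distance) >= threshold_deg

# Example with a hypothetical 10 mm wide post: it subtends 60 deg at
# d = w / (2 * tan(30 deg)), i.e. roughly 8.7 mm from the fly's eye.
```

The same geometry applied to the ~33° aversive-saccade threshold gives the larger stand-off distance at which fly-by trajectories veer away.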
High-speed video: A fruit fly (Drosophila melanogaster) approaches and lands on a vertical post, filmed at 5,000 frames per second. To keep the fly in focus, I built a custom feed-forward focus control system for the camera, which used 3D information from a real-time, computer-vision-based tracking system.