Open Access
Very-High-Resolution Time-Lapse Photography for Plant and Ecosystems Research
Mary H. Nichols, Janet C. Steven, Randy Sargent, Paul Dille, Joshua Schapiro

The information contained in traditional single-frame photographs is a compromise between image detail and the amount of area covered. Panoramic tripod mounts can assist with taking sequences of images that can be stitched to produce composite images. A variety of manual and robotic mounts are commercially available. Ongoing advances in technologies for capturing and viewing very-high-resolution images have greatly expanded the capacity to study a broad range of biotic and abiotic ecosystem processes across spatial scales (Sargent et al., 2010a; Brown et al., 2012). The GigaPan system ( http://gigapansystems.com) consists of hardware to capture multiple images on a grid through controlled positioning, software to stitch the images and create very-high-resolution panoramas, and a website to facilitate viewing, searching, exploring, and annotating, and to encourage discussion. Very-high-resolution photography has proven valuable for the study of processes in extreme detail over space in a number of scientific disciplines (Sargent et al., 2010a) including geology, archaeology, biodiversity, glaciology, and rangeland ecosystem research (Nichols et al., 2009).

A logical extension of the ability to capture high-resolution spatial detail is to capture repeat images over time to develop time-lapse sequences. Repeat photography plays an important role in quantitatively assessing changes and processes across many areas of ecosystem research. Examples include landscape change detection and analyses (Turner et al., 2003; Webb et al., 2007; Villarreal et al., 2013) and phenological responses to abiotic drivers (Crimmins and Crimmins, 2008; Richardson et al., 2009; Kurc and Benton, 2010; Sonnentag et al., 2012). Time-lapse imagery is also an important tool in studies of plant behavior (Trewavas, 2009). Time-lapse imagery at the level of a single plant has revealed searching behavior by parasitic plants (Runyon et al., 2006), solar tracking (Hangarter, 2000), and other plant responses to the environment. Typically, only one or a few plants can be captured with a single time-lapse camera at the desired level of detail, limiting sample size and often confining observations to a laboratory setting. The advantage of imagery captured with the GigaPan system over traditional time-lapse photography is the large increase in resolution over a broader spatial scale, which enables the viewer to make observations both at the level of the individual plant and at the ecosystem level. This increased resolution allows a researcher to capture both variation within a population (Fig. 1) and interactions between the environment and the population within a single sequence, a combination not possible with traditional time-lapse photography.

The goals of our project were to produce a very-high-resolution, zoomable, time-lapse video in the laboratory and subsequently to build a reliable, weatherized, solar-powered system for capturing images in the field. Here we describe example applications showing individual plant growth in a laboratory and landscape-scale vegetation change at a remote rangeland site. Very-high-resolution panoramic photography provides a low-cost (<$2000) method for collecting large amounts of data across a range of spatial and temporal scales.

Fig. 1.

Frames from a GigaPan time-lapse sequence of Wisconsin Fast Plants showing (A) the entire group of plants and (B) flowers on three plants. The user can zoom from one level of detail to another while the sequence is running. The full sequence is at  http://timemachine.gigapan.org/wiki/Plant_Growth.


METHODS AND RESULTS

Laboratory application —A GigaPan EPIC Pro robotic camera mount (GigaPan Systems, Portland, Oregon, USA) and a Canon PowerShot G10 camera (Canon USA, Melville, New York, USA) with a 58-mm telephoto lens were used to capture the growth and movement of Wisconsin Fast Plants, a quick-growing variety of Brassica rapa L. developed by Paul Williams at the University of Wisconsin-Madison. Although a wide range of cameras are compatible with the robotic mount, the Canon G10 was selected because it is a moderately priced camera that allows manual control of white balance, aperture, shutter speed, and focus. The general procedure for setting up the hardware was to level the robotic mount and set the field of view following the onscreen instructions that guide the user through aligning the top and the bottom of the LCD screen with a point in the field of view. The camera was then set to full zoom, and the top left and bottom right corners of the overall image were selected by rotating the robotic mount. Overall image geometry, including the upper left and lower right corners of the scene to be captured and the position of a visually identifiable reference point, was stored in memory to support capturing repeat sets of the image sequences as the Wisconsin Fast Plants grew. A generalized set of procedures for setting up and capturing time-lapse panoramas can be found in Appendix 1, with complete details at http://wiki.gigapan.org. The plants were grown under 24 h of fluorescent light, and cabbage white butterfly (Pieris rapae) eggs were introduced on day 11.
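
For readers who want to reason about the capture grid before setup, the Python sketch below illustrates how shot positions can be derived from the two stored corners and a per-shot field of view with overlap. This is an illustration of the geometry, not the GigaPan firmware; the angles, field of view, and overlap fraction in the example are assumed values chosen only for demonstration.

import math

def grid_layout(ul, lr, fov_h, fov_v, overlap=0.3):
    """ul, lr: (pan, tilt) angles in degrees for the upper-left and lower-right
    corners (pan increasing to the right, tilt increasing upward). fov_h, fov_v:
    the camera's per-shot field of view; overlap is the assumed fraction shared
    between neighboring shots."""
    pan_span = lr[0] - ul[0]
    tilt_span = ul[1] - lr[1]
    cols = math.ceil(pan_span / (fov_h * (1 - overlap))) + 1
    rows = math.ceil(tilt_span / (fov_v * (1 - overlap))) + 1
    # Distribute shot centers evenly between the two stored corners.
    positions = [(ul[0] + c * pan_span / max(cols - 1, 1),
                  ul[1] - r * tilt_span / max(rows - 1, 1))
                 for r in range(rows) for c in range(cols)]
    return rows, cols, positions

# Example with assumed numbers: a scene spanning 60 degrees of pan and 20 degrees
# of tilt, shot at full zoom with an assumed 10 x 7.5 degree per-shot field of view.
rows, cols, shots = grid_layout(ul=(-30.0, 15.0), lr=(30.0, -5.0), fov_h=10.0, fov_v=7.5)
print(rows, "rows x", cols, "columns =", rows * cols, "photos")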

The EPIC Pro is battery operated, but includes an external power adapter for using AC power. Both the camera and the GigaPan were continuously powered using commercially available AC adapters, and a 32-GB card was used to store images. The card was changed every other day. To test and demonstrate the system, a grid of images seven photos wide by three photos high was taken at 15-min intervals over the course of 32 d to create single panoramic images. Capturing the 21-image grid at a 15-min interval recorded enough change to make the final time-lapse video fluid. This combination also meant that the memory card filled in about 48 h. The appropriate number of images to capture and the time step will vary depending on the scientific application and choice of camera and component parts.

The resolution achievable within a time lapse is determined by camera equipment, image storage, and image acquisition and processing time, and the resolution desired by the researcher is determined by the subject matter and questions asked. The cost of the system is therefore partially determined by the camera equipment used. Any camera that can be connected to a shutter release cable can be used with the GigaPan mount. The interval between card changes is determined by the size and frequency of panoramas, and by the size of the memory card used in the camera. Cards as large as 256 GB are available, but not all camera models can read them. A 25-image panorama taken once an hour, with images that are 14 MB each, will take about 91 h to fill a 32-GB card. A camera with a telephoto lens will take pictures with a smaller field of view, which will increase resolution in the overall image. However, the number of images acquired to cover the same area will increase, as will both the time it takes for the GigaPan to acquire the images and the time it takes to stitch an image. Cameras that take pictures with more megapixels will also provide higher resolution. The time interval between images affects the speed at which the card fills as well as the amount of change captured by the time lapse. The time interval should be determined by the rate at which the subject of interest is changing and the level of detail of that change the researcher wants to capture. Ultimately, the spatial and temporal resolution within a panorama is a balance among the area the researcher wishes to cover, the detail needed within the larger image, the camera equipment available, the frequency with which memory cards can be swapped, and the image storage space available.
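
As a worked check of this storage arithmetic, the short Python sketch below estimates the time to fill a card from the grid size, capture frequency, and per-image file size. It reproduces the roughly 91-h figure quoted above; the 8-MB per-image size used for the laboratory configuration is an assumption for illustration, because the actual file size depends on the scene and camera settings.

def hours_to_fill_card(card_gb, images_per_panorama, panoramas_per_hour, image_mb):
    """Rough estimate of how long a memory card lasts. Uses 1 GB = 1000 MB;
    real cards and cameras differ slightly, so treat this as a planning figure."""
    mb_per_hour = images_per_panorama * panoramas_per_hour * image_mb
    return card_gb * 1000.0 / mb_per_hour

# Example from the text: a 25-image panorama once an hour, 14-MB images, 32-GB card.
print(round(hours_to_fill_card(32, 25, 1, 14)))   # -> 91 hours

# Laboratory setup: a 21-image panorama every 15 min (4 per hour); the ~48-h
# figure quoted above is consistent with images averaging roughly 8 MB (assumed).
print(round(hours_to_fill_card(32, 21, 4, 8)))    # -> ~48 hours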

Individual sets of images were aligned and stitched to create panoramas using software developed at the Community Robotics, Education and Technology Empowerment (CREATE) Laboratory within the Robotics Institute at Carnegie Mellon University. The software for stitching single panoramic images is included with the GigaPan hardware. Alignment through time is accomplished based on overlap in the images and the geometry of the GigaPan robot (see Sargent et al., 2010b for additional detail). Directions and software for stitching panoramas into a time-lapse sequence can be found at http://wiki.gigapan.org/creating-time-machines. The plant growth sequence was played back at 24 frames/s, accelerating plant growth and movement to 21,600 times its natural speed. The resultant time-lapse sequence (http://timemachine.gigapan.org/wiki/Plant_Growth) captures details such as circumnutation and caterpillar feeding and growth. The very high resolution of the images allows the viewer to focus on individual leaves and observe the caterpillars while they are very small. The size of the panorama also allows for comparison among plants and captures behavioral changes as the plants grow and come into contact with one another.
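
The quoted acceleration follows directly from the 15-min capture interval and the 24 frames/s playback rate, as the minimal check below shows; the total frame count and video length are derived figures that assume capture ran continuously for the full 32 d.

capture_interval_s = 15 * 60            # one panorama every 15 minutes
playback_fps = 24                       # playback rate in frames per second
print(capture_interval_s * playback_fps)            # -> 21600, i.e., 21,600x natural speed

frames = 32 * 24 * 60 // 15             # ~3072 panoramas over 32 d of continuous capture
print(round(frames / playback_fps / 60, 1))         # -> ~2.1 minutes of video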

Time-lapse sequences can be viewed at http://timemachine.gigapan.org through browsers supporting HTML5 video, such as Google Chrome. The viewer allows for seamless exploration through both space and time. Images can be zoomed to reveal spatial detail as the image sequence is stepped forward or backward. The website supports the creation of time warps, which allow a user to save and replay specific areas within a larger panorama. This is useful for highlighting a particular feature of interest. Currently, extracting quantitative data from time-lapse sequences involves user interpretation and manipulation of individual frames using software such as ImageJ (http://rsbweb.nih.gov/ij/) or KineRoot (Basu et al., 2007), and there is a distinct need for tools to automate quantitative data extraction from high-resolution images (Brown et al., 2012).
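
One simple way to prepare frames for such manual analysis is to crop the same region of interest (for example, a single plant) out of every stitched frame before loading the crops into ImageJ or similar tools. The Python sketch below does this with the Pillow imaging library; the directory names and pixel coordinates are placeholders, not values from our study.

from pathlib import Path
from PIL import Image  # Pillow

# Very large stitched frames exceed Pillow's default decompression-bomb limit,
# so lift it for trusted, locally generated panoramas.
Image.MAX_IMAGE_PIXELS = None

frames_dir = Path("stitched_frames")      # one full-resolution image per time step (placeholder)
out_dir = Path("plant_07_crops")          # where the per-frame crops go (placeholder)
out_dir.mkdir(exist_ok=True)
box = (12400, 3300, 13400, 4300)          # (left, upper, right, lower) in pixels (placeholder)

for frame_path in sorted(frames_dir.glob("*.jpg")):
    with Image.open(frame_path) as frame:
        frame.crop(box).save(out_dir / frame_path.name)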

Field application —Following the demonstrated success in the laboratory, the system was modified for use in the absence of AC power. We designed a prototype solar charging system and electronics to provide regulated voltage to the camera and the robotic mount. A 20-W solar panel was connected to a charge controller and a 12-V deep-cycle marine battery. Power from the 12-V battery was supplied to both the camera and the GigaPan camera mount through an isolated 12-V DC power supply feeding two 7.4-V regulators (one for the camera and one for the robotic mount). In addition, a small fan was powered by the solar panel to provide airflow through the support base to facilitate cooling. A list of parts and costs is included in Appendix 2. The prototype was set up in a weatherproof housing behind an acrylic dome. The solar panel was mounted above the acrylic dome and oriented to shade the robotic mount and camera. The system was subsequently modified to include a Pelican case for the housing and a tripod base to facilitate moving the system among field sites. Additional details can be found at http://wiki.gigapan.org/outdoor-time-machine-capture-v1-0.
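
When sizing the panel and battery for a system like this, a back-of-the-envelope energy budget is helpful. The Python sketch below is illustrative only: apart from the 20-W panel, the later 60-W panel, and the 12-V battery described in this article, the load, battery capacity, peak sun hours, and derating values are placeholder assumptions rather than measurements from our system.

def days_of_autonomy(battery_ah, battery_v, avg_load_w, usable_fraction=0.5):
    """Days the battery alone can run the load; usable_fraction reflects not
    fully discharging a deep-cycle battery (assumed value)."""
    return battery_ah * battery_v * usable_fraction / (avg_load_w * 24)

def daily_energy_balance_wh(panel_w, peak_sun_hours, avg_load_w, panel_derate=0.75):
    """Energy harvested minus energy consumed per day; a negative value means
    the battery drains. Derating and sun hours are assumptions."""
    return panel_w * peak_sun_hours * panel_derate - avg_load_w * 24

# Placeholder numbers: a 12-V, 100-Ah battery, a 6-W average draw, and an
# assumed 4 peak sun hours on a cloudy monsoon day.
print(round(days_of_autonomy(100, 12, 6), 1))         # ~4.2 days on battery alone
print(round(daily_energy_balance_wh(20, 4, 6), 1))    # 20-W panel: -84 Wh/day (drains)
print(round(daily_energy_balance_wh(60, 4, 6), 1))    # 60-W panel: +36 Wh/day (recharges)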

The solar-powered time-lapse system was deployed in an instrumented subwatershed within the Walnut Gulch Experimental Watershed (WGEW) in southern Arizona (Moran et al., 2008). At this site, the Kendall grassland subwatershed, 28 pictures were taken every 2 h with a Canon PowerShot G10 to create panoramas from a grid seven photos wide by four photos high. Images were stored on the camera's 32-GB Secure Digital High-Capacity (SDHC) flash card. Although the 32-GB card can hold approximately 5000 images taken at the highest resolution with the G10, we visited the site once per week to verify that the hardware was working and downloaded the images during these visits. The solar-powered GigaPan system was successfully used to capture summer monsoon season precipitation-induced vegetation changes on the WGEW. In outdoor settings, diurnal lighting variations and changing weather conditions create panoramas with substantial differences in exposure. These differences can affect algorithms for quantifying ecological processes, such as those that rely on red, green, and blue intensity values for assessing phenological changes (Kurc and Benton, 2010). Within an individual GigaPan panorama, the shutter speed, aperture, and ISO stored in exchangeable image file format (Exif) tags associated with each component image are used to adjust exposures to the average exposure for the overall image. Lighting variations between panoramas are compensated for algorithmically during the stitching process by aligning each panorama with those having the most similar lighting. The resultant time-lapse video, which runs from 11 July 2011 through 8 August 2011, can be seen at http://timemachine.gigapan.org/wiki/Arizona_Grasslands. Rainfall and runoff data collected at the site provide supplemental information that can be used to interpret the qualitative information in the video. A total of 208 mm of rain was recorded during 31 precipitation events from June through September. There was one runoff event on 29 July 2011 during the video, and an additional runoff event occurred on 9 July 2011, two days prior to the start of image capture. The video captures the temporal dynamics of vegetation change, including the greening of the landscape in response to rainfall events between 20 July and 30 July and the accompanying reduction in bare soil. In addition, the dynamism and range of scales accommodated within the time-lapse video are exemplified by viewing a cholla cactus (Cylindropuntia) as it hydrates through the monsoon season (Fig. 2 and http://timemachine.gigapan.org/wiki/Arizona_Grasslands).
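
Greenness indices of the kind used in the studies cited above are computed from the red, green, and blue intensities of a region of interest. As an example, the Python sketch below computes the green chromatic coordinate, G / (R + G + B), for one region of a single frame; applied to the same region in every frame it yields a greenness time series. The file name and pixel coordinates are placeholders, and this is a generic illustration rather than the exact workflow of the cited studies.

import numpy as np
from PIL import Image  # Pillow

def green_chromatic_coordinate(image_path, box):
    """Mean gcc = G / (R + G + B) for a rectangular region of interest.
    box = (left, upper, right, lower) in pixels."""
    with Image.open(image_path) as img:
        region = np.asarray(img.convert("RGB").crop(box), dtype=float)
    r, g, b = region[..., 0].mean(), region[..., 1].mean(), region[..., 2].mean()
    return g / (r + g + b)

# Placeholder file name and coordinates for one frame of the Kendall sequence.
print(green_chromatic_coordinate("kendall_2011-07-13_1200.jpg", (5000, 2000, 6000, 3000)))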

During the monsoon season, clouds typically formed midday and were persistent during days with rainfall. On such days, the 20-W solar panel was not sufficient to run the system without draining the battery, so a 60-W panel was installed. Because we have not yet incorporated a timer to cycle the power, the system also ran through the night (nighttime images were deleted), so our power draw was greater than necessary. Additional work to improve the efficiency of solar charging is needed and is ongoing. This improvement will also allow a greater number of images to be taken during daylight hours. Each of the stitched panoramic images taken every 2 h is 0.20 gigapixels. The overall resolution (and thus the detail captured) can be increased by capturing a larger number of individual images at a higher zoom. The capture of the 28 images that make up a single panorama took between 2 and 4 min, indicating that a much larger set of images could be captured on the 2-h time step.
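
The stitched panorama size and the slack in the 2-h schedule can be estimated from the grid dimensions, the camera's 14.7-megapixel sensor, and an assumed overlap between neighboring shots. The sketch below reproduces the approximate 0.20-gigapixel figure under an assumed 30% overlap and shows how many shots a 2-h cycle could accommodate at an assumed 8 s per shot (the observed 2-4 min for 28 shots corresponds to roughly 4-9 s per shot).

def stitched_gigapixels(cols, rows, sensor_mp, overlap=0.3):
    """Rough stitched size: each shot contributes only its non-overlapping part.
    The overlap fraction is assumed; the stitcher also crops ragged edges."""
    effective_mp = sensor_mp * (1 - overlap) ** 2
    return cols * rows * effective_mp / 1000.0

# Field setup: a 7 x 4 grid with the 14.7-megapixel PowerShot G10.
print(round(stitched_gigapixels(7, 4, 14.7), 2))   # -> ~0.20 gigapixels

def shots_per_cycle(cycle_min, seconds_per_shot):
    """How many individual photos fit in one capture cycle; seconds_per_shot
    is an assumed average that includes mount movement."""
    return cycle_min * 60 // seconds_per_shot

print(shots_per_cycle(120, 8))                      # -> ~900 shots even at 8 s per shot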

Fig. 2.

Frames from a GigaPan time-lapse sequence of summer monsoon season precipitation-induced vegetation change. A, B. Landscape views on (A) 13 July 2011 and (B) 8 August 2011. C–E. The level of image detail captured is illustrated by the sequence of photos showing the response of a cholla cactus (Cylindropuntia) (detail from white box in parts A and B) to precipitation. The full sequence is at http://timemachine.gigapan.org/wiki/Arizona_Grasslands.


CONCLUSIONS

Very-high-resolution photography can be employed to collect very large amounts of qualitative data across a range of spatial and temporal scales at low cost. We have described and demonstrated a system for acquiring very-high-resolution time-lapse imagery using commercially available hardware and off-the-shelf digital cameras, with modifications to accommodate use at remote field sites. Both the laboratory and remote field site examples provided clear images at multiple scales that contain data on size, color, growth, and movement for a large number of plants. Time-lapse photography can be used to document conditions and provide baseline data against which to compare future conditions. The current system can be employed to capture processes and responses that occur at time scales ranging from minutes and hours (e.g., flash floods), to months (e.g., seasonal phenologic responses), to years and decades (e.g., postfire recovery). Very-high-resolution time-lapse photography is being used in research to study habitat and ecosystem dynamics (Smith, 2010), and field research is ongoing to study plant responses to grazing and precipitation patterns using the GigaPan system. Coupling data extracted from the images with environmental data is likely to be a powerful technique for collecting large data sets over time with a minimum of manpower. There is an opportunity to build on ongoing research in the remote sensing, phenology, visualization, and engineering communities to extract data from time-lapse and repeat photographic series of very-high-resolution ground-based images.

The GigaPan robotic camera mount and stitching software are commercially available. Development of value-added software and analysis tools is ongoing. Example time-lapse images and information on the current status of the viewer are available at  http://timemachine.gigapan.org, a website and wiki for developing collaborative content.

LITERATURE CITED

1. P. Basu, A. Pal, J. P. Lynch, and K. M. Brown. 2007. A novel image-analysis technique for kinematic study of growth and curvature. Plant Physiology 145: 305–316.

2. T. B. Brown, C. Zimmermann, W. Panneton, N. Noah, and J. Borevitz. 2012. High-resolution, time-lapse imaging for ecosystem-scale phenotyping in the field. In J. Normanly [ed.], High-throughput phenotyping in plants: Methods in molecular biology, 71–96. Springer, New York, New York, USA.

3. M. A. Crimmins and T. M. Crimmins. 2008. Monitoring plant phenology using digital repeat photography. Environmental Management 41: 949–958.

4. R. Hangarter. 2000. Plant tropic responses. Website http://plantsinmotion.bio.indiana.edu/plantmotion/movements/tropism/tropisms.html [accessed 2 April 2013].

5. S. A. Kurc and L. M. Benton. 2010. Digital image-derived greenness links deep soil moisture to carbon uptake in a creosotebush-dominated shrubland. Journal of Arid Environments 74: 585–594.

6. M. S. Moran, W. E. Emmerich, D. C. Goodrich, P. Heilman, C. D. Holifield Collins, T. O. Keefer, M. A. Nearing, et al. 2008. Preface to special section on fifty years of research and data collection: U.S. Department of Agriculture Walnut Gulch Experimental Watershed. Water Resources Research 44(5): 1–3.

7. M. H. Nichols, G. B. Ruyle, and I. R. Nourbakhsh. 2009. Very high resolution panoramic photography to improve conventional rangeland monitoring. Rangeland Ecology and Management 62: 579–582.

8. A. D. Richardson, B. H. Braswell, D. Y. Hollinger, J. P. Jenkins, and S. V. Ollinger. 2009. Near-surface remote sensing of spatial and temporal variation in canopy phenology. Ecological Applications 19: 1417–1428.

9. J. B. Runyon, M. C. Meschler, and C. M. De Moraes. 2006. Volatile chemical cues guide host location and host selection by parasitic plants. Science 313: 1964–1967.

10. R. Sargent, C. Bartley, P. Dille, J. Keller, R. LeGrand, and I. Nourbakhsh [eds.]. 2010a. Proceedings of the Fine International Conference on Gigapixel Imaging for Science, 11–13 November 2010, Pittsburgh, Pennsylvania, USA. Website http://gigapixelscience.gigapan.org/papers-2 [accessed 2 April 2013].

11. R. Sargent, C. Bartley, P. Dille, J. Keller, R. LeGrand, and I. Nourbakhsh. 2010b. Timelapse GigaPan: Capturing, sharing, and exploring time-lapse gigapixel imagery. Website http://gigapixelscience.gigapan.org/papers-2/timelapsegigapan [accessed 14 May 2012].

12. A. Smith. 2010. A year in an urban forest: Dairy Bush GigaPan 2009–2010. Website http://gigapixelscience.gigapan.org/papers-2/ayearinanurbanforestdairybushgigapan2009-2010 [accessed 17 July 2013].

13. O. Sonnentag, K. Hufkens, C. Teshera-Sterne, A. M. Young, M. Friedl, B. H. Braswell, T. Milliman, et al. 2012. Digital repeat photography for phenological research in forest ecosystems. Agricultural and Forest Meteorology 152: 159–177.

14. A. Trewavas. 2009. What is plant behavior? Plant, Cell & Environment 32: 606–616.

15. R. M. Turner, R. H. Webb, J. E. Bowers, and J. R. Hastings. 2003. The changing mile revisited: An ecological study of vegetation change with time in the lower mile of an arid and semiarid region. The University of Arizona Press, Tucson, Arizona, USA.

16. M. L. Villarreal, L. M. Norman, R. H. Webb, and R. M. Turner. 2013. Historical and contemporary geographic data reveal complex spatial and temporal responses of vegetation to climate and land stewardship. Land 2: 194–224.

17. R. H. Webb, S. A. Leake, and R. M. Turner. 2007. The ribbon of green: Change in riparian vegetation in the southwestern United States. The University of Arizona Press, Tucson, Arizona, USA.

Appendices

APPENDIX 1.

Generalized procedure for setting up and capturing time-lapse panoramas; see  http://wiki.gigapan.org/home for details.


APPENDIX 2.

Parts list, costs, and example sources for components used to build solar-powered very-high-resolution photo system.


Notes

[1] The authors thank B. Freniere and the field staff of the Walnut Gulch Experimental Watershed field station whose assistance made this research possible. Funding was provided through the Fine Outreach for Science Fellows Program ( http://www.cs.cmu.edu/~fofs/fofs.html).

[2] Disclaimer: Mention of trade names or commercial products in this article is solely for the purpose of providing specific information and does not imply recommendation or endorsement by the U.S. Department of Agriculture.

Mary H. Nichols, Janet C. Steven, Randy Sargent, Paul Dille, and Joshua Schapiro "Very-High-Resolution Time-Lapse Photography for Plant and Ecosystems Research," Applications in Plant Sciences 1(9), (2 September 2013). https://doi.org/10.3732/apps.1300033
Received: 19 April 2013; Accepted: 5 August 2013; Published: 2 September 2013
KEYWORDS
digital photography
phenology
plant behavior
visualization