J. Alex Thomasson and John Valasek, 2016-06-29
As the world population grows rapidly from its current 7 billion toward a plateau of roughly 12 billion by the year 2100, we must roughly double food production per unit land area. Two major tools at our disposal are optimization of crop production through precision agriculture (PA) and crop improvement through breeding supported by high-throughput phenotyping (HTP). Advances in both areas rely on the development of rapid, consistent, and reliable sensing technologies. One of the foremost new sensing technologies for PA and HTP is high-resolution imaging with small unmanned aerial systems (sUAS). Compared to manned aircraft systems, sUAS are much less expensive, are far more flexible to schedule, and can fly at lower altitudes and speeds, ultimately providing much better spatial resolution in the resulting images. The main disadvantage is that only relatively small areas can be imaged at those lower altitudes and speeds. However, on large farms with contiguous fields, sUAS images can be collected with sufficient metadata and ground-control points to create high-quality image mosaics.

TAMU and sUAS

Texas A&M University (TAMU) and its associated agencies, Texas A&M AgriLife Research and the Texas A&M Engineering Experiment Station (TEES), initiated a comprehensive research project in 2015 on agricultural remote sensing with sUAS. The initial objective is to provide geometrically corrected and radiometrically calibrated large-area image mosaics of high-resolution field and plot data to breeders and agronomic researchers within 48 hours of image acquisition. The project has been conducted at AgriLife's research farm near College Station, Texas, about 10 km from TAMU's main campus. The farm comprises 568 ha of crop fields and plots where corn, cotton, sorghum, and wheat are the main crops grown for breeding and agronomic research.
The research group is made up of five teams involving over 40 scientists and engineers. The Administration team provides and manages funds, coordinates meetings and initiatives, and assists and encourages faculty members in garnering external funding. The Flight Operations team conducts UAV flights to provide remote-sensing data to the field researchers. The Sensors team manages the sensors carried onboard the sUAS and ensures that the imagery received by the researchers is of high quality. The Data Management team stores and manages the data and conducts preprocessing to geographically correct images and construct image mosaics. Both the Sensors and Data Management teams work with the field researchers to develop analytic techniques. The Field Research team evaluates the data against ground truth and develops and uses analytic tools to facilitate breeding and agronomic research.

Three flight subgroups focus on specific types of aircraft. The principal subgroup focuses on non-commercial and custom fixed-wing sUAS as well as overall data workflow. The second subgroup focuses on rotary-wing aircraft and very high-resolution data. The third subgroup focuses on commercially available fixed-wing sUAS. The sensors carried on these aircraft include numerous multispectral cameras; other available sensors include hyperspectral cameras, LiDAR sensors, and high-quality thermal-infrared cameras.

Flight operations require obtaining flight authorizations, planning, preparation, and execution. The project team has worked closely with the Lone Star Unmanned Aerial Systems (UAS) Center of Excellence, one of six FAA test centers for UAS, to obtain Certificates of Authorization from the FAA for sUAS flights. During mission planning, the altitude, airspeed, camera resolution, and camera exposure time must be balanced to achieve the desired image quality. In addition, to create good mosaics from many images, flight paths must be designed to provide adequate image overlap.
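To make the mission-planning trade-offs concrete, the following Python sketch relates altitude and lens field of view to ground sample distance and to the flight-line spacing needed for a chosen side overlap. The camera and flight numbers are illustrative assumptions, not the project's actual hardware or settings:

```python
import math

def swath_width(altitude_m, fov_deg):
    """Ground width imaged across-track at a given altitude and lens field of view."""
    return 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)

def ground_sample_distance(altitude_m, fov_deg, pixels_across):
    """Ground sample distance (m/pixel): swath width divided by detector pixel count."""
    return swath_width(altitude_m, fov_deg) / pixels_across

def flight_line_spacing(altitude_m, fov_deg, side_overlap):
    """Spacing between adjacent flight lines for a given fractional side overlap."""
    return swath_width(altitude_m, fov_deg) * (1 - side_overlap)

# Illustrative values: 120 m altitude, 47-degree horizontal FOV, 4000-pixel-wide detector
gsd = ground_sample_distance(120, 47.0, 4000)   # ~0.026 m/pixel (26 mm)
spacing = flight_line_spacing(120, 47.0, 0.70)  # ~31 m between lines at 70% side overlap
```

Flying lower or using a narrower lens improves the ground sample distance but shrinks the swath, which in turn forces closer flight lines and longer flight times per field.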
The research farm has been divided into "route packs": contiguous groups of fields and plots that can be covered efficiently during a single flight. Route packs are flown by one or more of the flight subgroups at least once per week. Each flight subgroup maintains and installs sensors on its own aircraft. The Sensors team keeps the cameras and other sensors in proper working order and has developed methods to enable radiometric calibration of images. In 2015, the project team accomplished over 100 flights, and work is well underway in 2016.

After images are collected, they are geographically registered and mosaicked together. Geographic registration uses software to adjust pixel positions based on positioning data collected onboard the sUAS during flight as well as ground control points (GCPs) visible in the images. In addition to natural features that can serve as GCPs, the Sensors team installed numerous pairs of 24 in. (61 cm) square tiles throughout the farm so that every mosaicked image of a route pack includes at least two of these pairs. Mosaicking uses software to geometrically correct the individual images and then "stitch them together" into one large image of each route pack. The positioning and inertial sensors integrated with the camera systems onboard the aircraft, along with the GCPs, provide the metadata used in this process. Finally, radiometric correction uses software to calibrate pixel values against objects of known spectral reflectance that appear in the images. The pairs of tiles used as GCPs have been painted to provide upper and lower calibration values for this purpose. After installation and painting, the tiles' precise field positions were measured with RTK GPS; their reflectance values are measured weekly with a handheld spectroradiometer, and the tiles are cleaned regularly.
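The tile-based radiometric correction can be illustrated as a two-point linear fit (often called the empirical line method): the dark and bright tiles' known reflectances and their observed digital numbers define a gain and offset that map every pixel value to reflectance. The tile readings below are hypothetical, and a real workflow would fit the model per spectral band and per flight:

```python
import numpy as np

def empirical_line(dn_dark, refl_dark, dn_bright, refl_bright):
    """Gain and offset of a linear DN-to-reflectance model from two references."""
    gain = (refl_bright - refl_dark) / (dn_bright - dn_dark)
    offset = refl_dark - gain * dn_dark
    return gain, offset

def to_reflectance(dn, gain, offset):
    """Convert an array of digital numbers (DN) to reflectance."""
    return gain * np.asarray(dn, dtype=float) + offset

# Hypothetical tile readings: dark tile (5% reflectance) observed at DN 40,
# bright tile (55% reflectance) observed at DN 210
gain, offset = empirical_line(40, 0.05, 210, 0.55)
refl = to_reflectance([40, 125, 210], gain, offset)   # → [0.05, 0.30, 0.55]
```

Because only a few tile pairs appear in each route pack, the fitted model is applied across the whole mosaic, which is why mosaicking quality directly affects confidence in the radiometric values.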
Data collection and other issues

Several data collection issues came to light in 2015, the project's first year. First, image spatial resolution depends on the resolution of the camera detector, the field of view (FOV) provided by the lens, the camera exposure time, and the altitude and ground speed of the aircraft. Camera resolution depends on the number of photosites on the detector and the spatial frequency at which the detector is sampled. While camera resolution, FOV, and altitude determine the theoretical image spatial resolution, the camera exposure time and aircraft ground speed determine the amount of "pixel smear," which is essentially a reduction in resolution in the direction of travel. To maintain the camera's resolution at a given altitude, ground speed and exposure time must be balanced so that the distance traveled during the exposure is small compared to the spatial resolution.

One issue is that the accuracy of the inertial sensors and GPS is essential for precise control of the aircraft and accurate calculation of the camera angle relative to nadir. Accurate optical-angle calculations are critical for accurate mosaicking of images. Accurate mosaicking also requires a high level of overlap among images. Ideally, GCPs are clearly visible in the images to align them and support geographic registration.

Another issue is airframe stability, particularly for fixed-wing aircraft, which is critical for collecting high-quality images that can be processed into high-quality orthomosaics. Ideally, the camera is positioned so that its viewing angle is at nadir, i.e., directly downward. There are two ways to achieve this. One method is to use an aircraft with controls that can keep it level even when the winds aloft are variable. Another method is to use a gimbaled camera that can maintain a nadir angle even when the aircraft is not level.
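The pixel-smear balance described at the start of this section reduces to a simple check: the ground distance traveled during the exposure should be a small fraction of the ground sample distance. A minimal sketch with hypothetical aircraft and camera settings:

```python
def smear_fraction(ground_speed_mps, exposure_s, gsd_m):
    """Ground distance traveled during the exposure, as a fraction of the GSD."""
    return ground_speed_mps * exposure_s / gsd_m

# Illustrative: 15 m/s ground speed, 25 mm ground sample distance
frac_slow = smear_fraction(15.0, 1 / 1000, 0.025)   # 0.6  -> smear is 60% of a pixel
frac_fast = smear_fraction(15.0, 1 / 4000, 0.025)   # 0.15 -> much less smear
```

Shortening the exposure reduces smear but also reduces the light reaching the detector, which is why exposure time, ground speed, and altitude must be balanced together rather than set independently.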
Without either of these methods, many images will be collected with the camera significantly off nadir, resulting in images and mosaics of poor quality.

Two other issues relate to the radiometric properties of the images. First, some research applications require radiometric correction of images so that pixel values relate directly to reflectance values. With manned aircraft imagery, this is commonly done by laying out large tarps of known low and high reflectance, allowing each pixel in the image to be corrected according to a linear model. When UAV images are mosaicked into a larger image of a field or route pack, each individual image covers a relatively small area, so it is impossible to include a calibration reference in every image. As mentioned, painted tiles have been laid out strategically across the route packs, and these tiles are used for radiometric correction (as well as serving as GCPs) after mosaicking. This process requires excellent mosaicking to provide a high level of confidence in the radiometric values.

Moving forward

Texas A&M's comprehensive project for agricultural remote sensing and high-throughput phenotyping with sUAS had a very successful first year in 2015, with numerous processes developed and refined to help us meet project objectives. We have developed the capability to provide image data of complete fields and plots to researchers in breeding, genetics, and agronomy at high resolution, with pixel dimensions commonly less than 50 mm. We are still working to improve the workflow so that we can provide preprocessed data by the next day, but providing data in less than a week is now common. These data include high-quality, geometrically corrected mosaics of large fields.

Current work in 2016 includes the following:
• Development and evaluation of data for decision-making in precision agriculture and high-throughput phenotyping.
• Further refinement of methods to improve the timeliness of data delivery to participating scientists.
• Further development and evaluation of techniques for radiometric calibration.
• Consideration of techniques to account for cloud shadows in images.
• Improvement of aircraft control systems to enhance stability.
• Real-time calculation of image area coverage to ensure adequate overlap for mosaicking.
• Testing of new sensors to improve the spatial and spectral resolution of images.

ASABE member J. Alex Thomasson, Professor, Department of Biological and Agricultural Engineering, Texas A&M University, College Station, USA, email@example.com. ASABE member John Valasek, Professor, Department of Aerospace Engineering and Center for Autonomous Vehicles and Sensor Systems, Texas A&M University, College Station, USA, firstname.lastname@example.org.
Published by ASABE.