Robots Afield

Improving Aerial Imagery for Agriculture through Robotics

By: Meg Henderson

Collin McLeod (left), research associate, and Kha Dan (right), research engineer, test the all-terrain robot in an agricultural field at Mississippi State. (Photo by Dominique Belcher)


The practice of precision agriculture has been in place since the late 1980s, when soil analysis data from GPS-determined locations began allowing farmers to fertilize a field according to its varied needs, still adjusting their machines manually, instead of using a uniform application. Today, with farm labor difficult to find, the agriculture industry is becoming less labor-dependent and more data-driven than ever, thanks to unmanned aerial vehicles (UAVs) and robotic technology.

Dr. Alex Thomasson, a precision agriculture and cotton ginning expert and head of the Department of Agricultural and Biological Engineering, specializes in agricultural autonomy, the latest development in precision agriculture. The MAFES scientist has spent the last several years working with UAVs, commonly known as drones, and his latest research focuses on developing a ground-based autonomous robot that works in tandem with a drone to provide more accurate data on row crops.

Drones equipped with remote-sensing systems are commonly used to collect image data that provide essential feedback on plant health, growth, and response to environmental stresses. This information can help farmers detect diseased plants, schedule irrigation, and evaluate fruit production more quickly and accurately. Thomasson explained that a commercial drone can fly over a hundred-acre crop field in twenty minutes and collect about a thousand images, which are then "stitched" together by a software program.
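
The article does not name the team's stitching software, but the general step can be illustrated with open-source tools. The sketch below is a minimal example using OpenCV's high-level Stitcher API; the file paths and stitching mode are assumptions for illustration, not the team's actual workflow.

```python
# Minimal illustration of mosaicking overlapping drone frames into one image.
# Assumption: OpenCV's Stitcher stands in for whatever commercial package
# the team actually uses; file paths are hypothetical.
import glob
import cv2

# Load the individual aerial frames captured along the flight path.
images = [cv2.imread(p) for p in sorted(glob.glob("flight/*.jpg"))]

# SCANS mode is intended for flat, nadir-looking imagery such as field surveys.
stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
status, mosaic = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("field_mosaic.jpg", mosaic)
else:
    print(f"Stitching failed with status code {status}")
```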

"Most people doing this work use manufacturer-recommended protocols to capture images from the drone's built-in camera and stitch them together with commercial software, but we have determined that there is a lot of room for error in that data," he said.

Because scientists use these stitched aerial images from commercial drones to capture data on the temperature, color, and height of plants in a field, it is important that the data be as accurate as possible. Thomasson and his research team, including Dr. Xiongzhe Han, then a postdoctoral researcher at Texas A&M University, set out to quantify the margins of error in the drones' measurements and to develop solutions to remedy them. That initial work resulted in a robot that could collaborate with a drone, providing the reference input needed to calibrate its image data.

The robot looks like a tall all-terrain vehicle and carries four panels on its roof: two for calibrating temperature and two for calibrating color. The roving robot communicates its GPS position to the drone (and vice versa) over a wireless link, so each knows the other's position in the field at any given moment. As the drone flies a programmed route across the field, the robot moves autonomously to meet it at set locations, where the drone captures images of the robot's panels.
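
As a rough sketch of that coordination logic, the snippet below checks whether the two vehicles' GPS fixes are close enough to trigger an imaging pass. The coordinates, tolerance, and function names are illustrative assumptions, not the project's actual control code.

```python
# Simplified sketch of the drone-robot rendezvous check.
# Coordinates, tolerance, and names are illustrative assumptions.
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def at_rendezvous(drone_fix, robot_fix, tolerance_m=2.0):
    """True when the vehicles are close enough to image the calibration panels."""
    return haversine_m(*drone_fix, *robot_fix) <= tolerance_m

# Hypothetical fixes exchanged over the wireless link.
drone = (33.45520, -88.79440)
robot = (33.45520, -88.79441)
if at_rendezvous(drone, robot):
    print("Hold position: drone is imaging the panels.")
```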

Some of the most difficult data to capture accurately are plant temperatures, which provide strong feedback about plant health, Thomasson noted.

"There may be a temperature difference in plants of more than ten degrees Fahrenheit across a field, but when the sensing camera has a 20-degree margin of error, you know almost nothing about what's going on in the field temperature-wise," he said.

Thomasson added that the robot has a known height of six feet, so images captured of the robot are used to calibrate the height data for other objects in the field, namely plants, whose heights are unknown.
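
In its simplest form, that correction is just a scale factor derived from the reference object. The numbers below are hypothetical, and a single linear scale is an assumed simplification of the team's photogrammetric workflow.

```python
# Toy example of reference-height calibration against the six-foot robot.
# A single linear scale factor is an assumed simplification.
ROBOT_TRUE_HEIGHT_FT = 6.0      # known robot height (from the article)
robot_measured_ft = 6.8         # hypothetical height the drone's 3D model reports

scale = ROBOT_TRUE_HEIGHT_FT / robot_measured_ft  # ~0.88 correction factor

plant_measured_ft = 3.4         # hypothetical uncorrected plant height
plant_corrected_ft = plant_measured_ft * scale
print(f"Corrected plant height: {plant_corrected_ft:.2f} ft")  # -> 3.00 ft
```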

"Having very accurate measurements of the plants in terms of color, height, and temperature is critical," Thomasson said. "These data tell us a lot about the current health of the plants and about their growth over a period of time. We're able to do that much more accurately with this system than with an off-the-shelf drone system."

The team's studies have shown promising results from the drone-robot collaboration strategy. After calibrating the drone's camera to the robot's fixed outputs, the margin of error for temperature measurement dropped from 20 degrees Fahrenheit to just two degrees, and the margins of error in height and color measurement were cut in half.
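
One plausible way to read that calibration step: the known panel temperatures give two reference points, which pin down a gain-and-offset correction for the whole thermal image. The panel values below, and the assumption of a simple linear fit, are illustrative rather than the team's published procedure.

```python
# Sketch of calibrating drone thermal readings against the robot's two
# temperature-reference panels. Values are hypothetical; a linear
# gain/offset model is an assumption, not the team's exact method.
import numpy as np

panel_true = np.array([77.0, 104.0])     # known panel temperatures (deg F)
panel_sensed = np.array([85.3, 118.9])   # hypothetical raw camera readings

# Fit sensed -> true as a line: true = gain * sensed + offset.
gain, offset = np.polyfit(panel_sensed, panel_true, 1)

# Apply the same correction to raw canopy temperatures in the image.
canopy_raw = np.array([92.1, 95.7, 101.4])
canopy_cal = gain * canopy_raw + offset
print(np.round(canopy_cal, 1))
```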

Thinking about his study's practical applications, Thomasson stated that breeders, mainly those developing improved varieties of crops such as cotton, corn, and soybeans, would be the early beneficiaries of the findings.

"Having better information will help breeders make more informed decisions on which varieties they want to continue to work with because they will be working with much more accurate data," he said.

However, as these technologies mature and become more affordable over time, they could also provide actionable information for growers deciding where and when to apply inputs like irrigation and fertilizer.

Thomasson began this study in 2018, partnering with scientists from Texas A&M University and the University of Illinois on a USDA-NIFA grant. Having completed the original project, he is now working on another USDA-NIFA project with Texas A&M and the University of Nebraska, which runs through 2024. MSU Research Associate Collin McLeod and Research Engineer Kha Dan are working under Thomasson on the current project, and Dr. Robert Moorhead has provided additional support from MSU's Geosystems Research Institute. While the projects and partners have shifted, Thomasson noted that his research focus has remained the same.

"We'll be testing our latest version of the robot in conjunction with a drone soon here in Starkville," he said. "And when the operation becomes seamless, we'll also test at Nebraska and Texas A&M."

In addition to the imaging project, Thomasson is working with Hussein Gharakhani, a doctoral student, to build a robotic cotton harvester that plucks cotton from individual bolls with an "end-effector," a picking device on a robotic arm. This work is funded by Cotton Incorporated.

"This work is very futuristic. Today's machines can harvest a large field of cotton in a relatively short amount of time, but they have some drawbacks. They are prohibitively expensive for small farmers, they are heavy and compact the soil where the wheels roll, and they rely on human operation," Thomasson said.

He added that current harvesting practices require farmers to wait until the end of the growing season and harvest all the bolls at once. Bolls that form low on the plant early in the season are exposed to the elements for the rest of the growing season and, as a result, yield a lower-quality product. Waiting all season to harvest also carries the risk of losses from hurricanes and other destructive weather events.

"Research has shown that there is a significant improvement of quality of the cotton fiber if it can be harvested as early as the boll opens," Thomasson said. "We could potentially send several little robots with these end-effectors into a field to harvest multiple times, starting much earlier in the growing season."

The future of agriculture rests in the research of scientists like Thomasson and his colleagues and students. With agriculture being the state's top industry, robotic devices will help farmers of the future work more efficiently and make the most of Mississippi's long growing season.

"Mississippi State University has led the way in precision agriculture for years, and I believe we are poised to be a major player in this new field of agricultural autonomy," Thomasson said.

This research is funded by USDA-NIFA. Collaborators include Texas A&M University, the University of Illinois, and the University of Nebraska. MSU collaborators include the Geosystems Research Institute.
