Precision agriculture gets a boost from multirotors
Here at the Roswell Flight Test Crew, we typically focus on the “sexy” applications for rotor drones: flying a First-Person View (FPV) machine over a burning building to support firefighters battling the blaze on the ground below, or using an aerial cinema rig to capture a dynamic shot that would be impossible by any other means, including a manned helicopter.
However, according to an estimate from the Association for Unmanned Vehicle Systems International (AUVSI), 80 percent of the civilian drones that will be deployed over the coming decade will serve a single, decidedly un-sexy industry: agriculture. Flying an autonomous, waypoint-driven mission over a field of wheat will never deliver the same jolt of adrenaline as a nighttime tactical mission using a thermal imaging camera, but it could deliver something even more valuable: improved crop yields on an increasingly hungry planet.
Enter the AgBOT
Based in Clackamas County, Oregon, in the heart of the state’s flourishing wine country, Aerial Technology International (ATI) has developed the AgBOT: a rotor drone purpose-built to assess crop health via airborne multispectral imagery.
“We designed the AgBOT in response to the needs of the agriculture industry,” ATI CEO Stephen Burtt told me. “There are many issues with modern-day farming that can be resolved with new types of sensors mounted on an aerial platform.
“However, to be practical, we knew it had to be easy to use, it had to fly for a long time on a single battery charge, it had to be quick to deploy and it had to be rugged. Basically, it had to be something that a guy could haul around in the back of his pickup truck, take out in the field and launch in less than five minutes without any pre-planning or RC flying experience.”
Burtt and ATI co-founder Lawrence Dennis first conceived of the AgBOT after meeting a multispectral sensor manufacturer at a precision agriculture conference.
“He approached us and said that his company, MicaSense, had a new product that he thought would really work well on a drone,” Burtt recalled. “We went ahead and developed the system, then spent the next year flying it over vineyards: testing it, refining it and getting feedback from end-users.”
Owing to the Federal Aviation Administration’s failure to deliver workable rules for the commercial use of small drone aircraft, most of ATI’s customers for the AgBOT are overseas.
“The U.S. market has been really tough because of the regulatory environment,” said Burtt. “We’ve sold a lot of units into South America—Uruguay, Chile and Peru—and Indonesia has also emerged as a real hotbed for some reason.”
Flying over a vineyard in West Linn, Oregon, the AgBOT measures the health of the grape vines in the field below by comparing the amount of visible red light they reflect with the amount of invisible near-infrared light they reflect.
In the visible light image of a vineyard, above left, notice that the vines appear to be uniformly vibrant and green—suggesting that they are all equally healthy. However, in the false-color, multispectral image, above right, distinct differences in plant health emerge. The vines at the bottom of the frame show much greater signs of stress, as indicated by the yellow and red highlights, than the vines at the top of the image.
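The article doesn’t name the specific metric the AgBOT’s software uses, but the standard way to formalize this red-versus-near-infrared comparison is the Normalized Difference Vegetation Index (NDVI). A minimal sketch, with hypothetical reflectance values:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, computed per pixel.

    Healthy vegetation reflects strongly in the near-infrared and
    absorbs most red light, pushing NDVI toward +1; stressed plants
    or bare soil trend toward 0 or below.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

# Hypothetical per-pixel reflectance fractions (0.0 to 1.0)
healthy  = ndvi([0.50], [0.08])  # strong NIR, little red: high NDVI
stressed = ndvi([0.30], [0.15])  # NIR reflectance dropping: lower NDVI
```

In a real workflow the two arrays would be whole image bands from the multispectral sensor, and the resulting NDVI raster is what gets rendered as the false-color map described above.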
Blinded by Invisible Light
Multispectral imagery is such an effective tool in agriculture because it allows the farmer to detect subtle changes in plant health up to two weeks before they become visible to the naked eye—providing an early warning that crops are being affected by pests, disease or a lack of water.
Built to be a single-case solution for working farmers, the AgBOT records its sensor data onto a standard SD memory card. After the aircraft lands, the data can either be uploaded to the cloud for analysis, which yields results in 24 to 48 hours, or downloaded directly onto the farmer’s computer for immediate results.
The key, according to Gabriel Torres, the CEO and co-founder of multispectral sensor manufacturer MicaSense, is to compare the amount of light that a plant reflects in the visible wavelengths—like red, green and blue—with the amount it reflects in the invisible “near infrared” wavelengths that lie just outside the frequencies of light that the human eye can perceive.
“A plant reflects more of the green light that falls upon it than the red or blue light, which is why plants appear to be green to us,” Torres explained to me. “The amount of reflected light jumps way, way up when you move into the near-infrared. If you could see those wavelengths, you would be blinded every time you looked at a plant.”
As a plant becomes more stressed, it reflects less and less near-infrared light—a change that the multispectral sensor can detect. According to Torres, however, generating false-color images to illustrate these differences is of limited value without the ability to assess how they change over time.
“We’re trying to educate everybody that those pretty pictures are nice to look at, but the key is to provide a quantified, calibrated scale along with those images, so that they mean the same thing even if they are captured under different lighting conditions, with a different aircraft or a different camera serial number,” Torres said.
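One common way to produce the calibrated scale Torres describes—assumed here for illustration, since the article doesn’t detail MicaSense’s actual pipeline—is to image a panel of known reflectance before each flight and use it to convert raw sensor counts into reflectance values that are comparable across flights and lighting conditions:

```python
# Single-point reflectance calibration sketch. The panel albedo and
# raw-count figures below are hypothetical, not MicaSense specifications.

PANEL_REFLECTANCE = 0.5   # known albedo of the calibration panel
panel_raw = 20000.0       # mean raw sensor count over the panel this flight

def to_reflectance(raw_counts):
    """Scale raw sensor counts to estimated surface reflectance (0.0-1.0)."""
    return raw_counts * (PANEL_REFLECTANCE / panel_raw)

# A pixel reading 8000 raw counts maps to roughly 0.2 reflectance
# under this flight's calibration, regardless of how bright the day was.
pixel_reflectance = to_reflectance(8000.0)
```

Because each flight is normalized against the same physical panel, an NDVI of 0.6 measured in June means the same thing as an NDVI of 0.6 measured in August—which is what makes change-over-time comparisons trustworthy.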
Of course, the data is of no value at all if the farmer can’t actually see it, so Burtt described how the easy-to-use design of the AgBOT extends from flight operations to analysis.
“The back-end processing is really simple,” he said. “The sensor has its own Inertial Measurement Unit and GPS antenna, so all you have to do is pull the SD card and upload the data to the cloud, and within 48 hours you get maps back with all of the various layers and indicators. Or, you can install the analytic software right on your own computer and do the processing yourself.”
Aerial Technology International (ATI) of Clackamas, Oregon designed the AgBOT specifically to fulfill the needs of precision agriculture with features that include a multispectral sensor, extended flying time and tablet-based, autonomous piloting. Mounted on a two-axis gimbal to maintain its nadir orientation, the RedEdge multispectral sensor from MicaSense gives the AgBOT its ability to gauge crop health from the air.
Machine Vision, Human Judgment
“UAVs and multispectral imagery do not automatically solve all of the problems in agriculture,” said Torres. “It’s another tool that will help them solve problems, along with everything else they are already doing.”
Leigh Bartholomew, a viticulturist with Results Partners in Oregon, agrees with that assessment.
MicaSense developed the RedEdge sensor specifically for airborne multispectral crop imagery. In addition to an integrated Inertial Measurement Unit and GPS receiver, it has five separate lenses, each keyed to a specific wavelength of light: red, green, blue, near-infrared and the “red edge”—the range of wavelengths between red and near infrared. It weighs just 150 grams—the same as a GoPro Hero 4 sports camera in its waterproof housing.
“You’ve still got to walk through the fields and do the ground-truthing, if you’re really going to understand what those results are telling you,” she said.
Also, according to Bartholomew, the happiest plant is not necessarily the most productive plant.
“It’s a weird thing about growing grapes,” she explained. “If a vine is getting just the perfect amount of water and sunlight, it spends all of its energy growing leaves, instead of growing and ripening fruit. There is an old saying that in order to produce the best fruit, the vine has to experience a little bit of stress.
“This technology really helps us fine-tune those results.”
For Burtt and his colleagues at ATI, the AgBOT is just the first part of an ongoing effort to develop rotor drones that are purpose-built to fulfill specific missions.
“Our view is that when the car was invented, the Model Ts just came rolling off the assembly line and everybody loved them—that’s what the DJI Phantom is today,” he said. “But look at all the different kinds of automobiles there are today: we have minivans, pickup trucks and compacts. It’s going to be the same thing with drones. If you’re doing aerial cinematography, you’re going to need a very different feature set than if you’re doing precision agriculture.
“Our goal is to build aircraft that are well-suited to each of those specific applications.”
By Patrick Sherman