Black Swift Technologies (BST), a specialized engineering firm based in Boulder, CO, today announced its revolutionary Automated Emergency Safe Landing (AESL) functionality for UAS. The technology integrates state-of-the-art machine learning algorithms and cutting-edge onboard processors into the Black Swift S2™ UAS to capture and classify images at altitude, enabling the aircraft to autonomously identify a safe landing area in the event of a catastrophic failure, a key enabler for safe beyond-line-of-sight flights. The system processes large volumes of imagery quickly and efficiently, identifying objects and terrain to avoid so the aircraft can land without harm to people or property.
“Our emphasis is to make UAS operations safer for both operators and the public,” emphasizes Jack Elston, Ph.D., CEO of Black Swift Technologies. “The goal of AESL is to be able to take a snapshot and within 60 seconds of something like a catastrophic engine failure, be able to identify a landing zone, calculate a landing trajectory, and safely land a UAS away from people and obstacles. We remain convinced that a thorough understanding and integration of artificial intelligence and machine learning can help serve as a catalyst for accelerating UAS growth and adoption industry-wide.”
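The 60-second sequence Elston describes (snapshot, identify a landing zone, calculate a trajectory, land) can be sketched as a staged, time-budgeted pipeline. This is purely illustrative: the stage names, placeholder functions, and budget check below are assumptions for the sketch, not BST's actual flight software.

```python
import time

# Hypothetical staging of the AESL sequence described above.
# Each stage function here is a placeholder, not real perception
# or planning code.

TIME_BUDGET_S = 60.0  # stated goal: a landing decision within 60 seconds


def run_aesl(snapshot, stages, budget_s=TIME_BUDGET_S):
    """Run each stage in order, threading the result through,
    and abort if the time budget is exhausted before a stage starts."""
    start = time.monotonic()
    result = snapshot
    for name, stage in stages:
        if time.monotonic() - start > budget_s:
            raise TimeoutError(f"budget exceeded before stage {name!r}")
        result = stage(result)
    return result


# Placeholder stages standing in for the real perception/planning steps.
stages = [
    ("classify_terrain", lambda img: {"zones": ["field_A", "field_B"]}),
    ("select_landing_zone", lambda out: out["zones"][0]),
    ("plan_trajectory", lambda zone: {"target": zone, "waypoints": 12}),
]

plan = run_aesl(snapshot=b"raw-image-bytes", stages=stages)
print(plan)  # {'target': 'field_A', 'waypoints': 12}
```

The budget check matters in an emergency-landing context: a late answer is as unusable as no answer, so each stage is gated on the remaining time rather than allowed to run open-ended.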
AESL functionality is the result of a NASA SBIR Grant awarded to BST, and an ongoing collaboration with Luxonis LLC, a Colorado-based technology company that specializes in embedded machine learning, artificial intelligence, and computer vision, from concept through custom hardware, firmware, software and UI/UX.
“This technology uses video or still imagery of the ground to determine what those objects are (Figure 1), and classifies them as humans, vehicles, and/or structures—things you have to avoid at all costs, even if it’s at the expense of the aircraft—to identify safe landing areas for a UAS in distress,” states Brandon Gilles, CEO, Luxonis LLC. “Leveraging machine vision and artificial intelligence, AESL enables a human-like perception of the world where autonomy doesn’t have to rely entirely on GPS, altimeters, or the like. This system can visually understand what’s around it and make decisions accordingly, in real-time.”
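The zone-selection step Gilles describes can be illustrated with a minimal sketch: classify regions of the ground image, mask out anything near a human, vehicle, or structure, and pick the clearest remaining spot. Everything below is a toy assumption made for the sketch (the coarse label grid, the class names, the one-cell safety margin, and the Manhattan-distance clearance rule), not Black Swift's or Luxonis's implementation.

```python
# Illustrative sketch only: pick a landing cell from a coarse grid of
# per-region class labels. The grid, labels, and margin rule are
# hypothetical stand-ins for real detector output.

AVOID = {"human", "vehicle", "structure"}  # avoided at all costs


def safe_cells(grid):
    """Return cells whose own label and all 8 neighbors are clear
    of avoid-listed objects (a one-cell safety margin)."""
    rows, cols = len(grid), len(grid[0])
    safe = []
    for r in range(rows):
        for c in range(cols):
            neighborhood = [
                grid[rr][cc]
                for rr in range(max(0, r - 1), min(rows, r + 2))
                for cc in range(max(0, c - 1), min(cols, c + 2))
            ]
            if not AVOID.intersection(neighborhood):
                safe.append((r, c))
    return safe


def pick_landing_zone(grid):
    """Pick the safe cell with the greatest Manhattan distance to any
    avoid-listed object; returns None if no cell qualifies."""
    hazards = [(r, c) for r, row in enumerate(grid)
               for c, label in enumerate(row) if label in AVOID]
    candidates = safe_cells(grid)
    if not candidates:
        return None
    if not hazards:
        return candidates[0]  # everything is clear; any cell works

    def clearance(cell):
        return min(abs(cell[0] - h[0]) + abs(cell[1] - h[1])
                   for h in hazards)

    return max(candidates, key=clearance)


grid = [
    ["field", "field", "field",   "field"],
    ["field", "field", "vehicle", "field"],
    ["field", "field", "field",   "field"],
    ["field", "field", "field",   "human"],
]
print(pick_landing_zone(grid))  # (0, 0): clear corner farthest from hazards
```

A real system would operate on detector bounding boxes and georeferenced terrain rather than a label grid, but the core decision is the same: hard-exclude protected classes first, then optimize clearance among what remains.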
While AESL functionality can serve as a significant stepping stone toward FAA exemptions for safe beyond-line-of-sight flights, observers and users describe its most striking feature as the small size and low power requirements of the components performing image capture and processing onboard the aircraft.
“This is a very complex solution,” says Austin Andersen, Machine Learning Lead Engineer, Black Swift Technologies. “It is a robust, onboard data collection system with a very small footprint and low power requirement. It enables real-time data collection and the ability to review massive amounts of imagery that a human alone could not. Now users can gather this high level of intelligence without having to lug around a giant, power-hungry system.”
While AESL functionality is currently exclusive to BST’s purpose-built UAS platform, Elston notes that this doesn’t preclude pairing the technology with third-party systems.
“Our technology is intended to help enable a safer flight experience both for UAS operators and the general public,” Elston emphasizes. “All the data products that we’d be accessing are going to be common across any autopilot. We don’t want to give too much decision-making data to the autopilot—just enough to know what to avoid and where to safely land.”