This Phase I SBIR project will determine the technical feasibility of using a newly available LiDAR on a chip (LoC) to detect insect pollinators in the field. There are substantial commercial and ecological needs to detect, quantify and characterize the effectiveness of pollinators on crops and critical wild plants. Current time-consuming methods that rely on manual observation are being augmented by powerful new machine vision, machine learning and other AI-powered technologies. As promising as these new image-based technologies are, they are limited by the cost, complexity and electrical power required to bring AI-enhanced computer vision to remotely located agricultural lands.

If a producer wants to respond to a pollination deficit in time to preserve adequate yields, the producer needs to know the current state of pollination in near real time during critically short blooms. Ineffective commercial hives can be mitigated on the spot, and building a record of the year-to-year effectiveness of commercial, feral and native pollinators can help formulate site-specific pollinator management plans that account for weather and local pollinator habitat conditions.

Leveraging rapidly evolving computer vision technologies to meet these producer needs will require new low-cost, low-power ways to bring these technologies to all corners of remote croplands. Running power-hungry computer vision cameras full time in the field will fail to meet the logistical constraints of real-world food production. Commercial camera traps, also known as trail cameras, are designed to acquire images of critical events without the need for large batteries, solar panels or cloud connectivity. Camera traps achieve this by leaving the power-hungry image sensor turned off most of the time; only one small micro-powered electrical component, a passive infrared (PIR) sensor, is left on to watch for movement.
Only after the PIR sensor detects motion is the power-hungry image sensor turned on to acquire images. After the images are acquired and stored, the image sensor and main processor are turned off again to save battery power. This project will take a similar approach to make computer vision technologies feasible for monitoring pollinators.

The micro-powered PIR sensors used in trail cameras would solve the power supply issue, except that PIR sensors trigger only when a moving target presents a temperature differential against a static background. Camera traps work well on large warm-blooded animals moving against a cooler background, but PIR sensors are not always effective for detecting flying insects. PIR sensors are also susceptible to false triggers from sunlight reflecting off vegetation moving in the wind.

Another way to detect a moving object is to use the image sensor itself, enhanced by onboard digital image processing. Onboard image processing can acquire a series of baseline images to determine whether a new object has moved into the field of view. This process is called background subtraction: when a new image is taken, it is subtracted from the previously acquired baseline, pixel by pixel, to identify clusters of pixels that are new and different from the baseline. This method does not require a temperature differential in the moving target, but it has two disadvantages: 1) it requires that the power-hungry image sensor be turned on almost continuously, and 2) it is susceptible to false triggers from background vegetation moving in the wind.

This project will evaluate a new and inexpensive electrical component for detecting insect pollinators that might replace or augment PIR sensors and image background subtraction, enabling cost-effective monitoring of remote pollination events.
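The background subtraction described above can be sketched in a few lines. This is a minimal illustration of the pixel-by-pixel differencing idea, not the project's actual image pipeline; the threshold and minimum-pixel-count values are hypothetical placeholders.

```python
import numpy as np

def detect_new_object(frame, baseline, threshold=25, min_pixels=50):
    """Flag a frame as containing a new object when enough pixels
    differ from the baseline image by more than `threshold`."""
    # Widen the type before subtracting so uint8 arithmetic cannot wrap.
    diff = np.abs(frame.astype(np.int16) - baseline.astype(np.int16))
    changed = diff > threshold          # per-pixel change mask
    return bool(changed.sum() >= min_pixels)

# Toy example: a flat baseline and a frame with one small bright blob.
baseline = np.full((64, 64), 100, dtype=np.uint8)
frame = baseline.copy()
frame[10:20, 10:20] = 200               # 100 changed pixels
print(detect_new_object(frame, baseline))      # True
print(detect_new_object(baseline, baseline))   # False
```

In practice the baseline would be refreshed continuously (for example as a running average of recent frames), which is exactly why the image sensor must stay powered almost all the time.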
In late 2020 STMicroelectronics introduced the first commercialized multi-zone rangefinder on a chip. This micro-powered $9 chip produces an 8x8 grid of LiDAR distance measurements by emitting an 8x8 grid of tiny invisible infrared laser pulses. The chip measures the time between emitting a pulse and the return of its reflection, and from these measurements it determines the distance (to within a couple of millimeters) to the nearest object in each of the 64 zones in its field of view.

This LoC sensor has a field of view of about 45 degrees and can measure distances of up to several meters, a combination well suited for detecting medium to large insect pollinators. An LoC sensor uses slightly more battery power than a PIR sensor, but it uses orders of magnitude less than operating an image sensor full time for background subtraction and image object recognition.

The enticing potential of leveraging LoCs for monitoring pollinators warrants further investigation. However, because this is a new and undocumented application for LoCs, our team must first conduct substantial feasibility testing. The goal of this Phase I project is to measure and define the envelope of conditions under which LoCs can solve the problem of cost-effective use of machine vision technologies for monitoring insect pollination of agricultural crops and critical wild plants.

To determine this envelope, an automated test system will be built that moves variously sized targets through an LoC sensor's field of view. The automated test system will compile extensive and precise tabulations of true detections, false negatives and false positives. These test data will be taken while varying target reflectivity, speed, size and distance from the sensor.
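The measurement principle behind each of the 64 zones, and the kind of low-power trigger it enables, can be sketched as follows. This is only an illustration of the time-of-flight arithmetic and a simple zone-change trigger, not the chip's actual driver interface; the baseline distances, margin, and readings are hypothetical.

```python
# Time-of-flight ranging: distance is half the round-trip time of a
# light pulse multiplied by the speed of light.
C_MM_PER_NS = 299.792458  # speed of light in mm per nanosecond

def tof_distance_mm(round_trip_ns):
    return round_trip_ns * C_MM_PER_NS / 2.0

# A 3 ns round trip corresponds to roughly 450 mm.
print(round(tof_distance_mm(3.0)))  # 450

# Simple trigger over the 8x8 zone grid: wake the power-hungry camera
# when any zone's range drops well below its calibrated baseline,
# meaning something has moved into that zone.
def zones_triggered(distances, baseline, margin_mm=100):
    """distances, baseline: 8x8 nested lists of per-zone ranges in mm."""
    return [(r, c)
            for r in range(8) for c in range(8)
            if baseline[r][c] - distances[r][c] > margin_mm]

baseline = [[2000] * 8 for _ in range(8)]
reading = [row[:] for row in baseline]
reading[3][4] = 600                        # an insect-sized intrusion
print(zones_triggered(reading, baseline))  # [(3, 4)]
```

Because only one zone fires while the rest stay at baseline, a trigger like this can distinguish a small object crossing the field of view from uniform background changes, while the chip itself continues to draw only microwatt-scale power.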
Background conditions will also be manipulated and tested, including moving vegetation and a range of realistic outdoor lighting conditions. The goal of these tests is to narrow and carefully define the range of insect characteristics and conditions for which LoCs will be effective. From this information an optimized device can be designed and evaluated for commercial potential and Phase II project feasibility.

The following objectives were developed to meet the project goals:

1) Identify and mitigate crosstalk noise that might affect LoC performance. This objective involves determining the impact on LoC performance of electronic noise from other subsystems in a smart camera system, such as high-speed clocks, USB, GPS and the main processor.

2) Measure and mitigate unit-to-unit variation of LoC sensors. Because LoC sensors are a newly commercialized component, it is prudent to look for unit-to-unit variation and, if found, mitigate it by screening components or other measures.

3) Experimentally verify that insect detection accuracy with an LoC or an LoC plus smart camera (LoC+Cam) system is as good as or better than that of existing machine vision devices.

4) Experimentally verify that honey-bee-sized insect pollinators can be reliably detected within a specified range of sizes, flying speeds and distances.

5) Experimentally verify that software analysis of LoC and LoC+Cam sensor data can distinguish between major insect taxa.

6) Experimentally verify end-to-end system performance of battery, data acquisition and analysis appropriate for remote field deployments.

7) Experimentally verify that software analysis of LoC and LoC+Cam sensor data can detect insect pollination events.

8) Resources permitting, we will address two stretch goals: A) determine whether an array of LoCs is substantially better than a single LoC unit, and B) determine whether LoC-assisted focus of a high-end camera board can improve detection of species and behaviors.
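The automated test system's tabulations of true detections, false negatives and false positives reduce to a few standard accuracy metrics per test condition. A minimal sketch, with hypothetical tallies standing in for real test data:

```python
def detection_metrics(true_pos, false_pos, false_neg):
    """Summarize one test condition (e.g. one combination of target
    size, speed, reflectivity and distance) as precision, recall,
    and miss rate."""
    precision = true_pos / (true_pos + false_pos) if (true_pos + false_pos) else 0.0
    recall = true_pos / (true_pos + false_neg) if (true_pos + false_neg) else 0.0
    return {"precision": precision,   # fraction of triggers that were real
            "recall": recall,         # fraction of real passes detected
            "miss_rate": 1.0 - recall}

# Hypothetical tallies from one automated run: 90 detected passes,
# 10 false triggers, 10 missed passes.
m = detection_metrics(true_pos=90, false_pos=10, false_neg=10)
print(m)  # precision 0.9, recall 0.9, miss rate 0.1
```

Sweeping these metrics across target reflectivity, speed, size, distance, and background condition is what defines the operating envelope the Phase I objectives aim to establish.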

Bonham, D.; Woodman, CO.