2024
Petti, Daniel; Zhu, Ronghang; Li, Sheng; Li, Changying
Graph Neural Networks for lightweight plant organ tracking (Journal Article)
In: Computers and Electronics in Agriculture, vol. 225, art. no. 109294, 2024, ISSN: 0168-1699.
@article{PETTI2024109294,
title = {Graph Neural Networks for lightweight plant organ tracking},
author = {Daniel Petti and Ronghang Zhu and Sheng Li and Changying Li},
url = {https://www.sciencedirect.com/science/article/pii/S0168169924006859},
doi = {10.1016/j.compag.2024.109294},
issn = {0168-1699},
year = {2024},
date = {2024-01-01},
journal = {Computers and Electronics in Agriculture},
volume = {225},
pages = {109294},
abstract = {Many specific problems within the domain of high throughput phenotyping require the accurate localization of plant organs. To track and count plant organs, we propose GCNNMatch++, a Graph Convolutional Neural Network (GCNN) that is capable of online tracking objects from videos. Based upon the GCNNMatch tracker with an improved CensNet GNN, our end-to-end tracking approach achieves fast inference. In order to adapt this approach to flower counting, we collected a large, high-quality dataset of cotton flower videos by leveraging our custom-built MARS-X robotic platform. Specifically, our system can count cotton flowers in the field with 80% accuracy, achieving a Higher-Order Tracking Accuracy (HOTA) of 51.09 and outperforming more generic tracking methods. Without any optimization (such as employing TensorRT), our association model runs in 44 ms on a central processing unit (CPU). On appropriate hardware, our model holds promise for achieving real-time counting performance when coupled with a fast detector. Overall, our approach is useful in counting cotton flowers and other relevant plant organs for both breeding programs and yield estimation.},
keywords = {Convolutional Neural Network, Graph Neural Network, High-throughput phenotyping, Machine vision, Multi-Object Tracking},
pubstate = {published},
tppubtype = {article}
}
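The entry above describes tracking-by-association: a graph is built over detected flowers, and a GNN (an improved CensNet, per the abstract) decides which detections in the current frame belong to which existing tracks, so each flower is counted exactly once. The paper's GCNNMatch++ code is not reproduced here; as a rough, minimal sketch of the general association step such trackers perform each frame, the Python snippet below scores every track-detection pair on a bipartite graph from hand-crafted edge features (IoU plus appearance similarity) and solves the assignment with the Hungarian algorithm. All function names, feature choices, weights, and thresholds are hypothetical illustrations, not taken from the paper, and GCNNMatch++ learns its edge scores with a GNN rather than using fixed weights.

# Minimal sketch of per-frame detection-to-track association (illustrative only).
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """Intersection-over-union of two boxes given as [x1, y1, x2, y2]."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def associate(track_boxes, track_feats, det_boxes, det_feats, min_score=0.3):
    """Score every (track, detection) edge of the bipartite graph, then solve
    the assignment. Unmatched detections start new tracks (new counts)."""
    n_t, n_d = len(track_boxes), len(det_boxes)
    scores = np.zeros((n_t, n_d))
    for i in range(n_t):
        for j in range(n_d):
            # Edge features: spatial overlap plus appearance (cosine) similarity.
            overlap = iou(track_boxes[i], det_boxes[j])
            cos = float(track_feats[i] @ det_feats[j]) / (
                np.linalg.norm(track_feats[i]) * np.linalg.norm(det_feats[j]) + 1e-9)
            scores[i, j] = 0.5 * overlap + 0.5 * cos  # fixed weights; a GNN would learn this
    rows, cols = linear_sum_assignment(-scores)        # Hungarian matching, maximizing total score
    matches = [(int(r), int(c)) for r, c in zip(rows, cols) if scores[r, c] >= min_score]
    matched_dets = {c for _, c in matches}
    new_tracks = [j for j in range(n_d) if j not in matched_dets]
    return matches, new_tracks

# Usage: two existing tracks, three detections; the third detection overlaps
# neither track, so it starts a new track (a newly counted flower).
tracks = [np.array([10, 10, 50, 50]), np.array([60, 60, 100, 100])]
dets = [np.array([12, 11, 52, 49]), np.array([61, 58, 99, 102]), np.array([200, 200, 240, 240])]
t_feats, d_feats = np.ones((2, 16)), np.ones((3, 16))
matches, new_tracks = associate(tracks, t_feats, dets, d_feats)
print(matches, new_tracks)  # [(0, 0), (1, 1)] [2]

For scale, the reported 44 ms per frame for the association model corresponds to roughly 22 frames per second on a CPU before detector cost, which is why the authors see real-time counting as feasible once the model is paired with a fast detector.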