
A new study has unveiled a drone-based system that, for the first time, enables detailed monitoring of sesame crop health. By combining hyperspectral, thermal, and RGB imagery with deep learning, researchers have developed a method for detecting simultaneous nitrogen and water deficiencies in field-grown sesame. The approach pairs UAV imaging with artificial intelligence to substantially improve the accuracy of crop stress detection.
Integrating multiple data sources allows for the identification of combined nutrient and water deficiencies, an important advance in precision agriculture. This development not only enhances crop management but also promotes more sustainable and efficient use of water and fertilisers, both of which are essential for building climate-resilient food systems.
A team led by Dr Ittai Herrmann at The Hebrew University of Jerusalem, in collaboration with Virginia Polytechnic Institute and State University, the University of Tokyo, and the Volcani Institute, has demonstrated that this drone-based system can accurately detect combined nitrogen and water deficiencies in field-grown sesame. Their work paves the way for more efficient and sustainable farming practices.
Published in the ISPRS Journal of Photogrammetry and Remote Sensing, the study highlights how unmanned aerial vehicles (UAVs) equipped with hyperspectral, thermal, and RGB sensors can work in tandem with artificial intelligence models to diagnose complex crop stress scenarios. Traditional remote sensing methods often struggle to detect combined environmental stresses, such as water and nutrient shortages. This research is among the first to successfully address this challenge in a crop as variable as sesame.
“By integrating data from multiple UAV imaging sources and training deep learning models to analyse it, we can now distinguish between stress factors that were previously difficult to tell apart,” said Dr Herrmann. “This capability is vital for precision agriculture and for adapting to the challenges of climate change.”
The team’s multimodal ensemble approach improved the classification accuracy of combined nutrient and water stress from just 40–55% with conventional methods to an impressive 65–90% using their custom-developed deep learning system.
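To give a sense of what a multimodal fusion model of this kind can look like, the sketch below combines per-plot features from the three UAV sensors into a single stress classifier. It is purely illustrative and is not the architecture published in the paper: the layer sizes, feature counts, and class labels are hypothetical placeholders.

```python
# Illustrative sketch only: a small fusion network that combines per-plot
# features from hyperspectral, thermal, and RGB imagery to classify stress.
# All dimensions, class labels, and names are hypothetical placeholders.
import torch
import torch.nn as nn

CLASSES = ["control", "nitrogen_deficit", "water_deficit", "combined_deficit"]

class MultimodalStressNet(nn.Module):
    def __init__(self, n_hyper_bands=200, n_thermal_feats=4, n_rgb_feats=16):
        super().__init__()
        # One small encoder per modality.
        self.hyper_branch = nn.Sequential(
            nn.Linear(n_hyper_bands, 64), nn.ReLU(), nn.Linear(64, 32), nn.ReLU())
        self.thermal_branch = nn.Sequential(nn.Linear(n_thermal_feats, 8), nn.ReLU())
        self.rgb_branch = nn.Sequential(nn.Linear(n_rgb_feats, 16), nn.ReLU())
        # Fusion head maps the concatenated embeddings to the four classes.
        self.head = nn.Sequential(
            nn.Linear(32 + 8 + 16, 32), nn.ReLU(), nn.Linear(32, len(CLASSES)))

    def forward(self, hyper, thermal, rgb):
        fused = torch.cat([self.hyper_branch(hyper),
                           self.thermal_branch(thermal),
                           self.rgb_branch(rgb)], dim=1)
        return self.head(fused)  # class logits per plot

if __name__ == "__main__":
    model = MultimodalStressNet()
    # Fake batch of 5 plots: mean hyperspectral reflectance, thermal
    # statistics, and RGB colour/texture features per plot.
    logits = model(torch.rand(5, 200), torch.rand(5, 4), torch.rand(5, 16))
    print(logits.argmax(dim=1))  # predicted stress class per plot
```

In practice, an ensemble could average or vote over several such models, or over separate per-sensor classifiers; the study's exact ensembling strategy is described in the paper itself.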
The field experiment took place at the Experimental Farm of the Robert H. Smith Faculty of Agriculture in Rehovot, with seeds provided by Prof Zvi Peleg. Rom Tarshish, then an MSc student, cultivated sesame plants under varied irrigation and nitrogen treatments, collecting plant traits and leaf-level spectral data. Dr Maitreya Mohan Sahoo analysed the UAV imagery using machine learning pipelines to generate maps of leaf nitrogen content, water content, and other physiological traits, helping to identify early stress markers.
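As a rough illustration of how a trait-mapping pipeline can turn pixel spectra into a map of leaf nitrogen content, the sketch below uses partial least squares regression, a widely used method for estimating leaf traits from reflectance. This is not the study's actual pipeline; the band count, image size, and trait values are synthetic placeholders.

```python
# Illustrative sketch only: estimating a leaf trait (e.g. nitrogen content)
# per pixel from hyperspectral reflectance with partial least squares (PLS)
# regression. All data, band counts, and image dimensions are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Training set: leaf reflectance spectra (samples x bands) paired with
# laboratory-measured nitrogen content (% dry weight).
X_train = rng.random((120, 200))
y_train = rng.uniform(1.5, 4.5, size=120)

pls = PLSRegression(n_components=10)
pls.fit(X_train, y_train)

# A mock UAV hyperspectral "image": 50 x 50 pixels x 200 bands,
# flattened so each row is one pixel spectrum.
cube = rng.random((50, 50, 200))
pixels = cube.reshape(-1, 200)

# Predict the trait for every pixel and reshape back into a map.
nitrogen_map = pls.predict(pixels).reshape(50, 50)
print(nitrogen_map.shape, nitrogen_map.mean())
```

Maps like this, built for nitrogen, water content, and other physiological traits, are what allow early stress markers to be picked out across a field rather than at single sampling points.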
Sesame, a climate-resilient oilseed crop with rising global demand, was chosen for its nutritional value and potential for expansion into new agro-ecosystems. This new remote-sensing method could enable growers to reduce fertiliser and water use while maintaining yields, thereby improving both economic and environmental outcomes.
The research paper, “Multimodal ensemble of UAV-borne hyperspectral, thermal, and RGB imagery to identify combined nitrogen and water deficiencies in field-grown sesame”, is available in the ISPRS Journal of Photogrammetry and Remote Sensing and can be accessed at https://doi.org/10.1016/j.isprsjprs.2025.02.011.
Researchers:
Maitreya Mohan Sahoo, Rom Tarshish, Yaniv Tubul, Idan Sabag, Yaron Gadri, Gota Morota, Zvi Peleg, Victor Alchanatis, Ittai Herrmann
Institutions:
- The Hebrew University of Jerusalem, Rehovot, Israel
- The University of Tokyo, Japan
- Agricultural Research Organization – Volcani Institute, Rishon LeZion, Israel
- School of Animal Sciences, Virginia Polytechnic Institute and State University, USA