Vidalia sweet onions, celebrated for their mild flavor and low pungency, are a cornerstone of Georgia’s agricultural economy.
These onions generate over $150 million annually and occupy 5,000 hectares of farmland, representing 40% of the U.S. sweet onion market. Despite the crop's economic importance, traditional farming methods struggle with unpredictability.
Farmers often rely on visual cues, such as leaf fall, to decide when to harvest, but this approach is labor-intensive, subjective, and fails to predict yield or bulb size.
A groundbreaking study published in Smart Agricultural Technology (2025) offers a solution: using drones, multispectral imaging, and machine learning to forecast yield and market class weeks before harvest.
This innovation promises to transform how Vidalia onions are grown, harvested, and sold.
The Challenges of Traditional Onion Farming
For decades, Vidalia onion farmers have faced two major hurdles. First, determining the optimal harvest time is tricky.
Farmers typically wait until 50–80% of the plant’s top leaves fall, but pests, diseases, or weather can disrupt this process, leading to premature or delayed harvesting.
Second, yield and bulb size—critical factors for profitability—are only measured after harvest at grading facilities.
This process destroys the sampled bulbs, delays decision-making, and leaves farmers guessing about their crop's value until the last minute. To address these challenges, researchers from the University of Georgia turned to modern technology.
By combining drone-collected data with machine learning, they developed a non-destructive, scalable method to predict both yield and market class (medium, jumbo, colossal) weeks in advance.
Methodology: How Drones and AI Forecast Yield and Market Class
The research team focused on two commercial Vidalia onion fields in Georgia: one near Glennville and another near Cobbtown. Both fields used a four-row bed system, with plants spaced 30 cm apart and irrigated using pivot systems.
Starting 90 days before harvest, a DJI Mavic 3 Multispectral drone captured images of the fields every two weeks. Equipped with Green, Red, RedEdge, and Near-Infrared (NIR) sensors, the drone collected data on plant health and canopy structure.
After each flight, specialized software stitched thousands of images into detailed maps called orthomosaics. These maps were then processed to remove soil pixels, isolating plant data with 97% accuracy.
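To make the soil-removal step concrete, here is a minimal sketch of one common approach: thresholding an NDVI layer computed from the Red and NIR bands so that low-index (bare-soil) pixels are masked out. The synthetic data and the threshold value are assumptions for illustration; the paper's actual segmentation pipeline may differ.

```python
import numpy as np

# Synthetic stand-ins for two orthomosaic bands (reflectance scaled 0-1).
rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.40, size=(512, 512))
nir = rng.uniform(0.20, 0.60, size=(512, 512))

# NDVI separates vegetation (high values) from bare soil (low values).
ndvi = (nir - red) / (nir + red + 1e-9)

# Hypothetical cutoff: pixels below it are treated as soil and discarded.
SOIL_NDVI_THRESHOLD = 0.30
plant_mask = ndvi > SOIL_NDVI_THRESHOLD

# Keep only plant pixels for downstream feature extraction.
plant_nir = np.where(plant_mask, nir, np.nan)
print(f"Plant pixels retained: {plant_mask.mean():.1%}")
print(f"Mean canopy NIR reflectance: {np.nanmean(plant_nir):.3f}")
```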
At harvest, researchers manually measured yield (tons per hectare) and market class distribution in 50 plots per field. This ground truth data was paired with drone data to train machine learning models.
Eight algorithms were tested, but the Random Forest (RF) model stood out. Known for handling complex, non-linear data, RF achieved the highest accuracy by building multiple decision trees and averaging their results.
This approach minimized errors and provided clear insights into which factors—like canopy texture or NIR reflectance—most influenced predictions.
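For readers curious what that looks like in practice, below is a minimal sketch of training and evaluating a Random Forest on plot-level data with scikit-learn. The feature names, synthetic values, and hyperparameters are illustrative assumptions, not the study's exact configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Synthetic plot-level features standing in for per-plot drone statistics
# (e.g., mean NIR reflectance, canopy contrast, entropy).
rng = np.random.default_rng(42)
n_plots = 100
X = rng.normal(size=(n_plots, 6))  # six hypothetical canopy features
y = 50 + 10 * X[:, 0] - 5 * X[:, 1] + rng.normal(scale=3, size=n_plots)  # yield, t/ha

# A Random Forest averages many decision trees, which suits the
# non-linear feature-to-yield relationships described in the study.
model = RandomForestRegressor(n_estimators=300, random_state=42)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"Cross-validated R^2: {scores.mean():.2f}")

# Feature importances reveal which canopy metrics drive predictions.
model.fit(X, y)
print("Importances:", np.round(model.feature_importances_, 2))
```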
Key Results: Precision Forecasting for Vidalia Onions
The study produced two major breakthroughs.
First, yield could be forecasted with 73% accuracy just 30 days before harvest. Second, the distribution of medium-sized onions (5–7.6 cm) was predicted with an error margin of 8%. Here's how the results unfolded over time:
Yield Forecasting:
Early in the season (90 days before harvest), predictions were unreliable because young plants lacked distinct growth patterns. However, accuracy improved steadily, peaking at 30 days pre-harvest.
During this window, the model explained 73% of yield variability (R² = 0.73) with an average error of 4.8 tons per hectare. Closer to harvest than 30 days, accuracy declined as leaves began to senesce (age), masking critical growth signals.
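The accuracy figures quoted above correspond to standard regression metrics, which can be computed as in this small sketch (the measured and predicted yields here are made-up placeholders, not the study's data):

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, r2_score

# Hypothetical measured vs. predicted yields (t/ha) for held-out plots.
y_true = np.array([52.1, 61.4, 48.9, 70.2, 55.0, 66.7])
y_pred = np.array([55.3, 58.8, 51.0, 65.9, 57.6, 62.1])

# R^2 is the share of yield variability the model explains;
# MAE is the average error in the same units as the yield itself.
print(f"R^2: {r2_score(y_true, y_pred):.2f}")
print(f"MAE: {mean_absolute_error(y_true, y_pred):.1f} t/ha")
```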
Market Class Predictions:
Forecasting bulb size was more complex. Medium onions, which made up 25% of the crop, were predicted most accurately at 45 days pre-harvest, with an 8% error rate.
Jumbo onions (66% of the crop) were harder to predict, likely because their dominance left little plot-to-plot variability for the model to learn from, while colossal onions (9%) showed smaller errors but offered limited data. Texture metrics, which quantify canopy patterns such as contrast and randomness, emerged as vital predictors.
For example, fields with moderate entropy (a measure of randomness) at 45 days pre-harvest tended to have more medium bulbs, likely due to balanced resource distribution.
Why Texture Data is a Game-Changer
While most agricultural studies focus on spectral data (e.g., NIR for chlorophyll levels), this research highlighted the importance of texture metrics.
Unlike spectral data, which can become “saturated” in dense canopies, texture data captures spatial patterns that reflect plant competition, stress, and growth stages.
For instance, contrast—a measure of variation between neighboring pixels—revealed areas of uneven growth. High contrast at 30 days pre-harvest often signaled competition between plants, correlating with higher yields as larger bulbs dominated.
Meanwhile, entropy (randomness in pixel values) helped predict medium onions. Fields with moderate entropy at 45 days pre-harvest had more uniform bulb sizes, likely due to consistent nutrient uptake.
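Contrast and entropy of this kind are typically derived from a gray-level co-occurrence matrix (GLCM), which counts how often pairs of pixel values occur at a given offset. The sketch below shows one standard way to compute them with scikit-image; the patch, offset, and gray-level depth are illustrative assumptions rather than the paper's exact parameters.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Synthetic 8-bit canopy patch, standing in for one plot clipped
# from the orthomosaic.
rng = np.random.default_rng(7)
patch = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

# GLCM: joint frequency of pixel-value pairs one pixel apart, horizontally.
glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)

# Contrast: local variation between neighboring pixels.
contrast = graycoprops(glcm, "contrast")[0, 0]

# Entropy: randomness of the co-occurrence distribution,
# computed directly from the normalized matrix.
p = glcm[:, :, 0, 0]
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))

print(f"Contrast: {contrast:.1f}, Entropy: {entropy:.2f}")
```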
Overcoming Traditional Farming Limitations
Traditional methods, which rely on manual inspections and post-harvest grading, are fraught with inefficiencies. Destructive sampling sacrifices marketable bulbs, while labor costs soar as fields expand.
The drone-based approach solves these issues. For example, a single drone flight covering 50 plots takes just one hour, compared to days of manual work.
Additionally, the non-destructive nature of drone imaging allows farmers to monitor fields repeatedly without damaging crops.
The study also addressed technical challenges. Removing soil pixels from images, a process achieved with 97% accuracy, ensured plant data wasn’t skewed by bare soil.
Similarly, filtering out less useful features (e.g., Green Band dissimilarity) improved model accuracy by 15%.
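One common way to perform this kind of feature filtering, sketched below with scikit-learn on synthetic data, is to rank candidates by Random Forest importance and keep only those above a threshold; this is an illustrative approach, not necessarily the authors' exact selection procedure.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectFromModel

# Hypothetical feature matrix mixing informative canopy metrics with
# weak ones, mimicking the kind of low-value features the study dropped.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))
y = 3 * X[:, 0] + 2 * X[:, 3] + rng.normal(scale=0.5, size=100)

# Keep only features whose Random Forest importance exceeds the mean.
selector = SelectFromModel(
    RandomForestRegressor(n_estimators=200, random_state=1),
    threshold="mean",
)
X_reduced = selector.fit_transform(X, y)
print(f"Kept {X_reduced.shape[1]} of {X.shape[1]} features")
```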
Real-World Applications for Farmers
The implications of this research are vast. Farmers can now make informed decisions weeks before harvest, optimizing both yield and profitability.
Precision Harvest Scheduling:
By identifying high-yield zones early, farmers can allocate resources to areas needing extra attention.
For example, a field with patches yielding 78 tons per hectare might receive targeted irrigation, while low-yield zones (34 tons/ha) are harvested later to allow bulb growth.
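In practice, this triage could be as simple as binning per-plot yield predictions against grower-chosen cutoffs, as in the hypothetical sketch below (the thresholds and predicted values are invented for illustration):

```python
import numpy as np

# Hypothetical per-plot yield predictions (t/ha) from the drone model.
predicted = np.array([78.0, 34.2, 61.5, 45.8, 70.3, 38.9])

# Illustrative cutoffs; real values would come from the grower's
# economics, not the paper.
HIGH, LOW = 65.0, 40.0
for plot_id, t_ha in enumerate(predicted, start=1):
    if t_ha >= HIGH:
        action = "harvest first"
    elif t_ha <= LOW:
        action = "delay harvest to allow bulb growth"
    else:
        action = "monitor"
    print(f"Plot {plot_id}: {t_ha:.1f} t/ha -> {action}")
```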
Supply Chain Efficiency:
Knowing that 66% of the crop will be jumbo onions enables farmers to negotiate bulk contracts with retailers in advance, reducing storage costs and price fluctuations.
Sustainable Practices:
Drone maps can pinpoint nitrogen-deficient areas (via RedEdge contrast metrics), allowing precise fertilizer application. This reduces waste by 20–30% and minimizes environmental impact.
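As a rough illustration of this idea, a RedEdge-based index such as NDRE can flag candidate low-nitrogen zones; note that the study itself used RedEdge texture (contrast) metrics, so the index and quantile cutoff below are stand-in assumptions.

```python
import numpy as np

# Synthetic RedEdge and NIR reflectance maps (scaled 0-1).
rng = np.random.default_rng(3)
rededge = rng.uniform(0.1, 0.4, size=(256, 256))
nir = rng.uniform(0.3, 0.7, size=(256, 256))

# NDRE is a common RedEdge index associated with canopy nitrogen status.
ndre = (nir - rededge) / (nir + rededge + 1e-9)

# Flag the lowest quartile of NDRE as candidate nitrogen-deficient
# zones for targeted fertilization.
threshold = np.quantile(ndre, 0.25)
deficient = ndre < threshold
print(f"Flagged {deficient.mean():.0%} of the field for follow-up")
```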
Future Innovations in Precision Agriculture
This study is just the beginning. Future innovations could integrate hyperspectral sensors (capturing more light bands) to improve chlorophyll monitoring or combine drone data with satellite imagery for statewide yield mapping.
Ground robots equipped with LiDAR might also validate drone predictions and apply treatments in real time.
Conclusion: Pioneering Sustainable Onion Farming
The fusion of drone technology and machine learning marks a turning point for Vidalia onion farming. Farmers no longer need to rely on guesswork or destructive sampling.
Instead, they can forecast yields with 73% accuracy and predict market classes weeks before harvest, all while reducing labor costs by 40%. Supported by the Vidalia Onion Committee, these tools are poised to become industry standards.
As adoption grows, Georgia will solidify its position as a global leader in sustainable, tech-driven agriculture—ensuring the sweetest onions reach tables worldwide, season after season.
Reference:
Barbosa Júnior, M. R., Sales, L. A., Santos, R. G., Vargas, R. B. S., Tyson, C., & Oliveira, L. P. (2025). Forecasting yield and market classes of Vidalia sweet onions: A UAV-based multispectral and texture data-driven approach. Smart Agricultural Technology, 10, 100808. https://doi.org/10.1016/j.atech.2025.100808