A review of augmented reality in embedded systems for smart farming: innovations, applications and future directions

Lauren Genith ISAZA DOMÍNGUEZ , Oscar AGUDELO VARELA , Nestor SUAT ROJAS

Front. Agr. Sci. Eng., 2026, Vol. 13, Issue (2): 25652. DOI: 10.15302/J-FASE-2025652

REVIEW

Abstract

The integration of augmented reality (AR) into embedded agricultural systems is reshaping precision farming by enabling real-time visualizations and interactions with complex environmental data. In the face of mounting global pressures, from climate variability to resource constraints and food system demands, AR-enhanced platforms present a promising pathway toward more efficient farming practices. However, existing research has predominantly treated AR as a standalone tool, overlooking its potential to link and enable the functional integration of diverse embedded technologies. Therefore, the objective of the present review is to investigate how AR visualization can integrate data from Internet of Things devices, unmanned aerial vehicles, farming machinery, robotics, edge computing platforms and artificial intelligence (AI) to enable their coordinated, field-level deployment in precision agriculture. The article offers three primary contributions: a structured synthesis of AR applications across embedded systems, a conceptual architecture for AR-centered smart farming and an integrated analysis of research gaps and future directions. Key research gaps include the lack of studies addressing model interpretability and system interoperability, insufficient exploration of real-time edge AI processing and gesture-based AR controls, and the absence of globally representative data sets for AI image analysis. Future research directions include the development of low-latency data pipelines, explainable AI interfaces, swarm-capable Drone-AR systems, energy-efficient edge AI models, federated learning for data privacy and participatory design strategies tailored for resource-limited contexts. These findings offer valuable insights for researchers, technology developers, policymakers and farmers working to implement scalable, secure and accessible AR-powered agricultural solutions.

Keywords

Augmented reality / embedded systems / smart farming / artificial intelligence / IoT / edge computing

Highlight

● Augmented reality (AR) and artificial intelligence (AI) enhance embedded systems in precision agriculture applications.

● AR integration with the Internet-of-Things improves crop monitoring, irrigation and farm automation.

● Unmanned aerial vehicles with AR enable large-scale disease detection and yield estimation.

● AI-driven AR supports decision-making in smart farm robotics and automation.

● An AR-centered interface model is proposed to unify embedded farming technologies.

Cite this article

Lauren Genith ISAZA DOMÍNGUEZ, Oscar AGUDELO VARELA, Nestor SUAT ROJAS. A review of augmented reality in embedded systems for smart farming: innovations, applications and future directions. Front. Agr. Sci. Eng., 2026, 13(2): 25652. DOI: 10.15302/J-FASE-2025652


1 Introduction

Global agriculture is under increasing pressure due to rapid population growth, climate change and resource depletion, all of which threaten food security and sustainability[1,2]. By 2050, the global population is expected to reach 9.7 billion, necessitating a projected 70% increase in food production to meet rising demand[3]. Current farming methods, which rely on manual labor and standard management strategies, are proving insufficient in addressing these challenges[4]. As a result, Agriculture 4.0 has emerged as a data-driven, automated and intelligent farming paradigm, driven by embedded systems that integrate artificial intelligence (AI), the Internet of Things (IoT), robotics, edge AI, unmanned aerial vehicles (UAVs), cloud computing and augmented reality (AR)[5]. These advancements enable farmers to monitor real-time data, optimize resource usage and implement predictive decision-making, leading to more efficient and sustainable agricultural practices[6].

Of these emerging technologies, AR has gained attention as a powerful tool for precision agriculture, offering interactive visualization, on-the-spot decision support and data-driven analysis for farming practices[6]. AR enables farmers to overlay digital information onto physical surroundings, enhancing spatial awareness and supporting timely, location-based decisions[7]. Although AI systems can generate powerful analytical insights, they often lack the spatial context and environmental interactivity required for practical deployment in the field[8]. Without a contextual interface such as AR, AI outputs remain abstract and disconnected from farm-level actions. AR is essential for localizing, operationalizing and translating AI-generated data into actionable workflows for precision agriculture.

Current applications of AR in Agriculture 4.0 span multiple domains (Fig.1), including fertilizer and water management, AI decision support, ripeness detection, crop recommendation, disease identification and pest analysis[9]. These applications highlight the versatility of AR in modern farming. Technological innovations in smart farming are expanding the potential of AR by enabling integration with AI, IoT, robotics and UAV-assisted sensing technologies, paving the way for more advanced applications in precision agriculture[1,10].

The impact of plant diseases, which cause 10% to 16% yield loss annually and have an estimated economic cost of 220 billion USD globally[11], underscores the importance of early detection to ensure food security and prevent crop failures. Together, AI and AR can transform plant disease detection by providing non-destructive, rapid and precise diagnostic tools. Techniques such as image recognition models integrated with AR, combined with AI decision models, can significantly improve the accuracy of disease identification[10].

The integration of AR with IoT and sensor networks can also transform how farmers monitor soil conditions, crop health and environmental factors. Recent studies show that AR-based decision support systems combined with AI image recognition and deep learning (DL) models can significantly improve pest and disease detection[10]. In addition, UAVs equipped with hyperspectral imaging and AI analytics can work in conjunction with AR to enable large-scale farm mapping, pesticide spraying and site-specific fertilization[9].

Although AR has been studied within the context of individual technologies, such as AI, IoT, UAVs, robotics and edge AI, no existing review has examined how it functions across these systems collectively or considered its potential as a unifying interface within embedded agricultural architectures. To address this gap, this review asks the research question: how can AR visualization and overlays serve as a central interface for integrating and enabling the functional deployment of diverse embedded technologies in data-driven precision agriculture? The novelty of this research lies in framing AR not as a peripheral tool, but as a central interface that spatially and functionally links these components of smart farming systems. The main contributions of this review include: (1) a structured synthesis of AR applications across embedded technologies; (2) a conceptual illustration of the unifying role of AR within precision farming workflows; and (3) the identification of key research gaps and future directions to guide scalable, accessible AR deployments. These findings offer valuable insights for farmers, policymakers and stakeholders seeking to adopt AR-enabled agricultural solutions.

This paper introduces a conceptual AR-centered architecture for embedded smart farming systems, then explores AR integration in IoT-based smart farming, analyzes AR applications in agricultural robotics and machinery, and examines UAV-AR systems for crop monitoring and diagnostics. Next it reviews AI-driven decision support and image recognition in AR contexts, and evaluates AR-enabled edge computing frameworks. The discussion covers contributions, research gaps, study limitations and future research directions, followed by a summary of findings, implications for stakeholders and broader contributions to the field.

2 Augmented reality as a central interface in embedded smart farming systems

To demonstrate the integrative role of AR, this section presents a conceptual illustration of how AR can function as the core interface within embedded smart farming systems. Rather than proposing a new system, the model synthesizes current research to illustrate how AR can coordinate sensing, diagnostics and interventions across multiple embedded technologies. Unlike current implementations that treat AR, AI, IoT, UAVs and edge computing as disconnected tools, the model shown in Fig.2 positions AR as the decision-support layer linking these components, enabling farmers to interact with data and AI recommendations through a field-based interface.

The system, as illustrated in Fig.2, initiates with aerial and ground-based sensors detecting deviations in expected crop conditions and transmitting spatial coordinates through the IoT network. AR-enabled glasses or mobile devices either guide the farmer to the affected area or are used via the AR interface to direct machinery, such as a pesticide drone, to the site. Once at the location, the farmer can scan the problem area through the AR interface, which relays visual data to an edge AI module for analysis. Alternatively, if machinery has been deployed, a connected camera feed, accessed through the AR system, can transmit visual data from the equipment to the edge AI module. If the edge AI confirms an issue, the recommended actions can be relayed through the AR interface, whether on smart glasses or a mobile device, providing the farmer with auditory guidance on appropriate responses or suggesting machine-based interventions. Depending on the situation, the farmer may act directly or use the AR system to guide robotic machinery to perform the necessary tasks. If no issue is detected by the edge AI module, data can be escalated to a cloud-based model for further analysis, if internet connectivity is available. If uncertainty remains, the farmer may proceed with manual inspection as a final step. This forms a continuous loop of detection, diagnosis and response, with AR functioning as the central interface for both human- and machine-led interventions.
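
To make this loop concrete, the sketch below walks through one pass of the detection-diagnosis-response cycle in Python. Every name in it (Alert, edge_ai_diagnose, ar_display) is a hypothetical stand-in for the components in Fig.2, not code from any cited system.

```python
"""Minimal sketch of the AR-centered detection-diagnosis-response loop (Fig.2).

All names are hypothetical illustrations of the workflow, not components
from any system in the reviewed literature.
"""
from dataclasses import dataclass
import random

@dataclass
class Alert:
    lat: float
    lon: float
    source: str                                   # e.g., "UAV" or "soil-sensor"

def edge_ai_diagnose(scan: bytes) -> str:
    """Stand-in for an on-device classifier; returns one of three verdicts."""
    return random.choice(["issue", "no_issue", "uncertain"])

def ar_display(message: str) -> None:
    """Stand-in for an AR overlay or audio prompt on glasses or a phone."""
    print(f"[AR] {message}")

def handle_alert(alert: Alert, cloud_available: bool) -> None:
    ar_display(f"Navigate to ({alert.lat:.4f}, {alert.lon:.4f}), flagged by {alert.source}")
    scan = b"camera-frame"                        # farmer or machine scans the area
    verdict = edge_ai_diagnose(scan)
    if verdict == "issue":
        ar_display("Issue confirmed: apply recommended treatment or dispatch robot")
    elif verdict == "no_issue" and cloud_available:
        ar_display("No local diagnosis: escalating frame to cloud model")
    else:
        ar_display("Uncertain result: manual inspection recommended")

handle_alert(Alert(4.1420, -73.6266, "UAV"), cloud_available=True)
```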

3 Integration of augmented reality with IoT networks

3.1 AR as an interface layer for IoT-based farming

Although IoT systems generate vast streams of agricultural data, their utility often depends on how effectively farmers can interpret and act on that information. AR enhances IoT usability by anchoring sensor data within the spatial layout of the farm, allowing farmers to make immediate, location-specific decisions. In the conceptual model, AR serves as the interface layer through which IoT-based alerts, such as irrigation inefficiencies, nutrient imbalances, or microclimate anomalies, are delivered to the farmer. This section reviews current research demonstrating how AR could improve the interpretability, responsiveness and practical deployment of IoT in precision agriculture.

3.2 AR-integrated smart irrigation systems

An AR-integrated smart irrigation system was developed to optimize water usage and irrigation efficiency by combining AR, IoT and machine learning (ML)[12]. The AR-based user interface of the system enables farmers to visualize real-time soil moisture, temperature and humidity data while adjusting irrigation schedules through AR-enabled controls. IoT sensors continuously monitor environmental conditions, while ML models (SARIMA and exponential smoothing) predict optimal irrigation timing. The AR interface, built using Unity, Vuforia and ARCore, overlays on-the-spot data onto the physical farm environment and demonstrates how IoT and AR could collaboratively enhance irrigation accuracy and minimize water loss.
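
As a minimal illustration of the forecasting step, the sketch below fits a SARIMA model (one of the two forecasters named above) to synthetic hourly soil moisture readings using statsmodels. The model orders, irrigation threshold and data are illustrative assumptions, not parameters from the cited system.

```python
# Minimal sketch (not the cited system): SARIMA forecast of soil moisture
# feeding a simple irrigation decision that an AR overlay could display.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
hours = np.arange(24 * 14)                        # two weeks of hourly readings
moisture = 30 + 5 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.5, hours.size)

# Daily seasonality (period 24); orders chosen for illustration only.
fit = SARIMAX(moisture, order=(1, 0, 1), seasonal_order=(1, 0, 1, 24)).fit(disp=False)

forecast = fit.forecast(steps=12)                 # next 12 hours
IRRIGATION_THRESHOLD = 27.0                       # % volumetric water content (assumed)
if forecast.min() < IRRIGATION_THRESHOLD:
    print("AR overlay: schedule irrigation within the next 12 h")
else:
    print("AR overlay: soil moisture adequate; no irrigation needed")
```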

Another study developed an IoT and ML-based irrigation system using long-range wireless area networks to remotely monitor environmental and agricultural parameters, including air temperature, pressure, humidity and soil moisture[13]. The system incorporated a novel wind-driven optimization algorithm, which improved the prediction of irrigation-related variables, achieving 87.5% accuracy in forecasting soil moisture and irrigation timing. Integrating AR could further enhance usability by overlaying forecast data and soil moisture conditions directly onto the field of view of the user through smart glasses or mobile devices, enabling farmers to make location-specific irrigation decisions in real time without navigating complex dashboards or remote interfaces.

AR could also support IoT-based water management by visually conveying nutrient runoff patterns and environmental risks in real time. When combined with IoT water quality sensors, AR could serve as an interactive decision-support tool for live monitoring of runoff contamination. The Affluent Effluent system, developed for Microsoft HoloLens, integrates AR with a system dynamics model to simulate fertilizer runoff, oxygen depletion and algal bloom formation in water bodies[14]. Built using Unity and ShaderLab, the AR application allows users to manipulate nutrient levels, aeration and hydrological variables. By making water quality changes that are invisible to the eye perceptible, AR has the potential to complement IoT water monitoring and help reduce agricultural runoff pollution.
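
The sketch below gives a toy system-dynamics loop in the spirit of that simulation: a nutrient pool drives algal growth, and algal demand depletes dissolved oxygen. All equations and constants are illustrative placeholders, not parameters of the Affluent Effluent model.

```python
# Toy system-dynamics sketch (Euler integration): fertilizer nutrients feed
# algae; algal oxygen demand competes with atmospheric reaeration.
dt, days = 0.1, 30
nutrient, algae, oxygen = 5.0, 0.5, 9.0           # mg/L (illustrative units)

for _ in range(int(days / dt)):
    growth = 0.4 * algae * nutrient / (nutrient + 2.0)    # Monod-style uptake
    nutrient += -growth * dt
    algae += (growth - 0.1 * algae) * dt                  # growth minus die-off
    oxygen += (0.3 * (9.0 - oxygen) - 0.25 * algae) * dt  # reaeration vs demand
    oxygen = max(oxygen, 0.0)

print(f"after {days} d: algae = {algae:.2f} mg/L, dissolved O2 = {oxygen:.2f} mg/L")
```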

3.3 Optimizing IoT sensor placement and fertigation

Efficient IoT sensor placement is essential for maximizing data accuracy while minimizing costs[15]. A quantum deep reinforcement learning model, enhanced with a modified wild geese algorithm, improved environmental parameter monitoring (temperature, humidity, air and water quality) with 96.4% accuracy while reducing sensor deployment costs by 15%[16]. AR-based visualization tools could complement this approach by enabling interactive assessment of sensor placement, ensuring optimal distribution for large-scale agricultural operations. By overlaying sensor coverage maps onto the physical farm environment, AR interfaces could help farmers identify gaps in data collection and refine sensor deployment strategies.

IoT fertigation automation has demonstrated its potential in optimizing resource efficiency while maintaining high crop yields. A fertigation system designed for banana cultivation found that a −50 kPa irrigation strategy with 50% of the recommended dose of fertilizer resulted in 26.0% water savings while sustaining productivity at 102 t·ha–1[17]. Integrating AR-based interfaces with monitoring systems could enhance decision-making by allowing farmers to visualize live nutrient distribution and soil conditions through augmented overlays, while receiving recommendations from on-device edge AI for optimal fertilizer application.

Fertigation management could further benefit from the ability of AR to facilitate noninvasive trichome density measurement, aiding in the assessment of fertilizer-induced stress in crops such as tomatoes. A smartphone-based AR system was developed that quantifies trichome density using calibration markers on measurement paper, ensuring precise image analysis[18]. Computer vision algorithms process these images, while ML models predict fertilizer stress with high performance, achieving a precision-recall area under the curve of 0.82, a receiver operating characteristic area under the curve of 0.64 and a strong correlation with observed stress levels (r = 0.79). This system operates entirely on-device, eliminating cloud dependency, and could integrate with IoT sensors to enable synchronized detection of nutrient stress conditions in the field.
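
For readers unfamiliar with the reported metrics, the snippet below shows how the precision-recall and receiver operating characteristic areas under the curve are typically computed with scikit-learn. The labels and scores are synthetic stand-ins, not the trichome data from the cited study.

```python
# Computing PR-AUC (average precision) and ROC-AUC on synthetic labels/scores.
import numpy as np
from sklearn.metrics import average_precision_score, roc_auc_score

rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, 200)                  # stressed (1) vs unstressed (0)
y_score = np.clip(y_true * 0.4 + rng.normal(0.4, 0.25, 200), 0, 1)  # model scores

print(f"PR-AUC : {average_precision_score(y_true, y_score):.2f}")
print(f"ROC-AUC: {roc_auc_score(y_true, y_score):.2f}")
```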

3.4 AR and IoT in pest and disease detection

IoT pest and disease detection systems have significantly advanced real-time monitoring in precision agriculture. For example, a convolutional neural network (CNN)-based intelligent pest management system achieved 97.5% accuracy in detecting oriental fruit flies, outperforming existing machine learning models such as support vector machines, k-nearest neighbors and random forest[19]. Combining IoT with AR overlays could further advance functionality by displaying live pest density heatmaps, enabling farmers to visually assess infestation severity and implement AI-suggested intervention strategies more effectively.

IoT systems have also improved crop health assessment by integrating sensor data with AI models. A recursive segmentation model developed for tracking bok choy (a kind of Chinese cabbage) growth achieved high segmentation accuracy, outperforming common image recognition models such as Mask R-CNN and YOLO, which struggle with foliage occlusion[20]. AR-assisted visualization could further improve plant growth monitoring by overlaying segmentation results directly onto the field view, allowing farmers to detect early stress indicators more accurately.

IoT disease monitoring systems have also advanced crop diagnostics by using DL and embedded sensor networks[21]. For example, a smart crop disease monitoring system using deep residual networks and an optimized routing framework demonstrated 94.3% accuracy in classifying rice diseases (brown spot, bacterial leaf blight and leaf smut), outperforming existing models while reducing energy consumption and latency[22]. AR could improve usability by generating on-the-spot severity overlays and interactive treatment recommendations.

3.5 AR-enabled IoT for digital agriculture

IoT wireless sensor networks (WSNs) are key for remote farming environments, where live monitoring is necessary, but internet connectivity is often limited[23]. For example, a low-power, internet-independent WSN using nRF24L01+ transceivers has been shown to effectively monitor air temperature, humidity, soil moisture and battery voltage while ensuring reliable data transmission with minimal power consumption[24]. AR integration could aid these systems by displaying stored sensor data in visual overlays, allowing farmers to assess crop conditions without requiring a constant internet connection.

AR is also being evaluated in combination with digital technologies such as blockchain to improve agricultural traceability. Studies combining AR with AI, IoT, 5G and blockchain have demonstrated how visual overlays can enhance transparency in food systems by linking real-time disease visualization with securely stored agricultural data[25].

3.6 IoT-based crop recommendation and decision support with AR

Integrating AR with IoT crop recommendation systems could enable farmers to make immediate, data-based decisions. A smart crop recommendation system was developed using ESP32 microcontrollers to collect current soil and environmental data, which were then processed by a DL model trained on the Kaggle Crop Recommendation Data set, achieving 97% accuracy[26]. The system utilizes AR to enhance user interaction by overlaying AI crop recommendations onto the physical farm environment. Through a mobile device, farmers can access live data from IoT sensors, visualizing optimal crop choices and soil conditions in an augmented interface. Built with Unity and Vuforia, the AR application provided interactive decision support, allowing users to assess recommended crops and soil management strategies more effectively.
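
A minimal sketch of the classification step is given below, assuming the publicly available Kaggle Crop Recommendation Data set (features N, P, K, temperature, humidity, ph and rainfall). A random forest stands in for the DL model used in the cited work, and the single sensor reading is invented for illustration.

```python
# Sketch: crop recommendation from soil/climate features, with the predicted
# label standing in for what an AR overlay would display to the farmer.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("Crop_recommendation.csv")       # Kaggle Crop Recommendation Data set
X, y = df.drop(columns="label"), df["label"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.3f}")

# One reading as it might arrive from an ESP32 node (values illustrative).
reading = pd.DataFrame([{"N": 90, "P": 42, "K": 43, "temperature": 24.5,
                         "humidity": 80.0, "ph": 6.5, "rainfall": 200.0}])
print("AR overlay suggestion:", clf.predict(reading)[0])
```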

4 Augmented reality in smart farm machinery and robotics

4.1 AR as a control and diagnostic interface for smart machinery

In the integrated AR-centered system described above, machinery and robotics represent one of the key endpoints of the decision-making loop, activated by the farmer through AR-based interfaces. AR facilitates this interaction by acting as both a visualization and control interface for smart agricultural machinery. Whether guiding the farmer through maintenance tasks or interfacing with autonomous equipment, AR overlays key operational metrics, such as fuel levels, machine diagnostics and system alerts, onto the field of view of the user. This includes both semi-automated systems and fully autonomous vehicles such as self-driving tractors and robotic sprayers, which can be directed or monitored via AR interfaces.

This section reviews current research on AR integration in agricultural machinery and robotics, with emphasis on machine monitoring, predictive maintenance, gesture-based control and autonomous field operations. Across these applications, AR functions as an interface layer that enables more responsive, precise and interactive human-machine coordination for farmers.

4.2 AR-enabled tractor interfaces and predictive maintenance

AR integration in tractor systems is emerging as a key application of smart farming machinery, especially for optimizing operation and reducing downtime. Although standard dashboards offer static readouts, AR interfaces allow real-time overlays of navigation guidance, fuel consumption and load balancing directly within the field of view of the operator[27,28]. These systems increasingly incorporate IoT-connected sensors and AI-based diagnostics to deliver contextual alerts and actionable system feedback.

Beyond routine operation, AR-enabled predictive maintenance tools are gaining support. Smart glasses equipped with AR can display internal sensor data directly on tractor components, assisting technicians in fault detection and repair. By highlighting wear indicators, error codes or maintenance checklists in situ, AR can reduce repair time and human error[28]. These interfaces are particularly useful in remote areas where expert technicians may not be readily available, enabling local operators to perform guided repairs or relay live diagnostics to remote experts.

4.3 AR interfaces for gesture control and automation in agricultural machinery and robotics

Researchers have evaluated gesture-based control mechanisms to improve user interaction with AR systems in smart farming. One approach optimized AR usability using Fitts’ Law, refining gesture-based interactions for crop monitoring, machinery management and decision support[29]. A refined 3D spatial interaction model incorporated head movement tracking and ergonomic adjustments, reducing task complexity by 40%. Implemented on Microsoft HoloLens 2 using Unity extended reality tools and simultaneous localization and mapping, the system improved operational efficiency for farm workers through adaptation to user movement using quaternion-based tracking and genetic optimization of interaction parameters.
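
For context, the snippet below computes the quantity such Fitts’ Law tuning optimizes: predicted movement time as a function of target distance and width (Shannon formulation). The coefficients and target geometries are placeholders, not values from the cited study.

```python
# Fitts' Law: predicted movement time MT = a + b * log2(D/W + 1).
import math

def fitts_mt(distance: float, width: float, a: float = 0.2, b: float = 0.15) -> float:
    """Predicted selection time (s) for an AR target of angular width `width`
    at angular distance `distance` (degrees of visual angle); a, b assumed."""
    index_of_difficulty = math.log2(distance / width + 1)   # bits
    return a + b * index_of_difficulty

# Compare two candidate layouts for a gesture-activated AR button.
for d, w in [(30.0, 2.0), (15.0, 4.0)]:
    print(f"D={d:>4}°, W={w}°  ->  predicted MT = {fitts_mt(d, w):.2f} s")
```

Bringing the target closer and making it larger lowers the index of difficulty, which is the geometric intuition behind the 40% task-complexity reduction reported above.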

Gesture-based AR interfaces also show promise in advancing hands-free control of robotic systems. Integrating gesture recognition with AR navigation allows operators to control tractors, drones and robotic tools through simple hand motions, minimizing dependence on standard control panels[30]. Although gesture control supports active operation, AR could also assist in supervisory roles. For example, a spatial AI-enabled robotic platform for wheat farming demonstrated how embedded depth sensing and object recognition could support autonomous navigation, obstacle avoidance and real-time crop row detection[31]. In similar autonomous systems, AR interfaces could provide status updates, alerts and decision logs when operator oversight or intervention is needed.

Recent work has also integrated AR interfaces with mobile field robots to support crop monitoring and control. One system enables growers and technicians to interact with an autonomous robot via an AR headset, facilitating remote operation and spatially guided navigation for field assessment[32]. This platform supports live data exchange and control, allowing users to teleoperate the robot, request graphical crop updates or direct data collection. These advancements highlight how AR-based spatial interaction models and gesture-driven interfaces have the potential to enable more seamless human-machine collaboration across both robotic and mechanized farming systems.

4.4 AR-integrated harvesting systems

AR-assisted harvesting systems could also streamline fruit-picking processes by integrating object detection models with interactive visualization tools. BerryScope, an AR-assisted strawberry picking aid, integrates Microsoft HoloLens 2 with deep neural networks for real-time ripeness detection[33]. The system captures images using an optical see-through head-mounted display, processes them with the QueryInst instance segmentation model and classifies ripeness using ResNet18. Fully ripe strawberries are highlighted with AR overlayed bounding boxes, guiding users during harvesting. In field trials, BerryScope users achieved a 93% success rate in selecting ripe fruit, surpassing current manual harvesting performance.

In mechanized harvesting, AR is also being used to support tractor-based workflows. One system integrates AR with deep reinforcement learning to simulate harvesting scenarios and guide navigation and obstacle avoidance[34]. Through an immersive interface, users can monitor machine status, adjust parameters and initiate harvesting routines, supporting both manual and autonomous operations.

Beyond control and visualization, AR has been shown to improve operator safety and system awareness in large-scale harvesting machines such as combines and loaders. Research on AR applications in heavy machinery shows that integrating see-through interfaces and spatial overlays reduces cognitive load and increases situational awareness for machine operators[35]. These interfaces help monitor alerts, visualize hidden hazards and coordinate with other autonomous or semiautonomous systems in the field, enhancing harvesting efficiency, safety and precision under varying conditions.

5 Augmented reality-integrated UAVs

5.1 AR as a visual analytics interface for UAV-assisted agriculture

In the integrated AR system described above, UAVs serve as a key input layer for detecting anomalies across large-scale farms. By integrating aerial sensing with AR interfaces, UAVs expand the diagnostic reach and spatial coverage of the system. AR could enhance UAV applications by linking aerial data with context-specific visual overlays, transforming passive observation into interactive diagnostics. This allows farmers to visualize vegetation stress, soil variability or crop health conditions overlaid directly onto UAV feeds via AR-enabled devices, supporting rapid intervention.

This section reviews recent advances in AR-UAV integration, highlighting how AR serves both as a visual analytics interface and a control mechanism in tasks such as disease detection, fertilization, soil assessment, navigation and targeted pesticide application.

5.2 Smart farming with AR-integrated UAVs

AI-enabled UAVs, when integrated with AR interfaces, streamline large-scale crop surveillance and enable faster agronomic decision cycles. Standard scouting methods require time-consuming ground assessments, whereas UAVs can efficiently scan large fields while capturing high-resolution images for immediate analysis[36].

The integration of multimodal sensors, such as red, green, blue (RGB), multispectral, LiDAR and thermal cameras, allows UAVs to detect environmental variations and plant health issues with greater accuracy. AI-powered AR overlays could further optimize this process by visualizing crop health indicators directly onto UAV footage, enabling farmers to pinpoint problem areas and take immediate corrective actions[37]. To illustrate this approach, Fig.3 shows a drone image taken by the authors in a remote farming region near the Caribbean coast of Colombia. The red boxes highlight areas of possible vegetation stress, identified using color segmentation and edge detection ML models.
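
A minimal OpenCV sketch of this kind of color segmentation is shown below: it flags yellow-to-brown regions in a hypothetical UAV still and draws red boxes around them, analogous to Fig.3. The hue band, noise filtering and area threshold are illustrative assumptions, not the authors' pipeline.

```python
# Sketch: flag possible vegetation stress by segmenting yellow/brown pixels
# (often associated with chlorosis or senescence) and boxing large regions.
import cv2
import numpy as np

frame = cv2.imread("uav_frame.jpg")               # hypothetical UAV still
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

stress_mask = cv2.inRange(hsv, (10, 60, 60), (35, 255, 255))  # yellow-brown band
stress_mask = cv2.morphologyEx(stress_mask, cv2.MORPH_OPEN,
                               np.ones((5, 5), np.uint8))     # remove speckle

contours, _ = cv2.findContours(stress_mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    if cv2.contourArea(c) > 500:                  # ignore tiny regions (assumed)
        x, y, w, h = cv2.boundingRect(c)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)  # red box

cv2.imwrite("uav_frame_annotated.jpg", frame)
```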

One study used a CNN-based UAV system to detect leaf diseases in rice, potatoes and corn, generating AR disease severity maps directly on the UAV video feed. This improved precision diagnostics and allowed for more targeted pesticide applications[38]. AI-enabled UAVs integrated with AR-generated 3D farm maps have also been shown to optimize soil condition analysis, irrigation planning and preharvest decision-making, providing farmers with the necessary data to maximize crop yields[39].

UAV-based AR visualization may also streamline collaborative decision-making by allowing multiple users (farmers, agronomists and researchers) to interact with the same data sets in real time[40]. This reduces subjective interpretations and improves overall agricultural planning.

5.3 AR-integrated UAVs for fertilization, soil assessment and crop decision-making

AR-integrated UAVs have the potential to improve both accuracy and sustainability in fertilization. AI, IoT and AR-powered drone systems can use markerless AR mapping to scan fields, analyze soil conditions and apply nutrients only where needed[41,42]. This reduces over-application and ensures targeted nutrient delivery, minimizing environmental impact[43].

Multi-UAV coordination is also transforming agricultural operations. AR-integrated digital twins allow farmers to visualize drone activity in the field, supporting coordinated spraying, fertilization and disease monitoring[39]. These AR-enhanced models help optimize drone routes and treatment timing by providing a spatially accurate digital view of the farm.

Beyond fertilization, UAVs equipped with normalized difference vegetation index (NDVI) and LiDAR-based mapping technologies are advancing soil assessment by capturing high-resolution data on nutrient levels, fertility, pH and moisture variation across the field[44]. Although these sensing systems operate primarily through IoT-enhanced UAV platforms, AR could be used to overlay this spatial data onto the visual field of farmers, supporting more precise interpretation and guiding input decisions based on current soil conditions.

Drone-assisted AR soil sampling has been shown to be more efficient than current methods, generating detailed soil property maps that guide both fertilization and irrigation strategies[45]. Farmers can also overlay AI-generated soil recommendations onto AR interfaces, potentially allowing for interactive, location-specific management adjustments.

In addition, UAVs enable ongoing soil monitoring, detecting early signs of degradation that could affect yields[46]. By combining AR-driven soil analysis with AI decision tools, farmers could make more informed crop selection decisions, leading to improved productivity and better resource use.

5.4 AR for UAV navigation and gesture-controlled interfaces

Effective navigation and flight optimization are essential for UAV-assisted farming, as challenges including terrain obstacles, wind conditions and in-field adjustments must be managed for reliable deployment[47]. An AR-enhanced UAV manipulation system improved path planning and obstacle avoidance, enabling farmers to set flight paths interactively while minimizing collision risks during pest scouting and crop disease mapping[48]. By overlaying live environmental data onto AR interfaces, UAVs could adjust routes rapidly and responsively to ensure coverage and safety.

Gesture-driven AR interfaces further support hands-free drone control, reducing cognitive load during complex field inspections[49]. In challenging terrain, farmers can use AR gesture recognition to control UAVs without relying on standard hand-held controllers, improving usability in dynamic farm environments[48].

6 Augmented reality and AI image recognition

6.1 AR as an interface for AI decision support and diagnostics

In the AR-centered conceptual model, AI serves as the analytic engine that processes visual and sensor data collected through the AR interface, whether from drones, IoT sensors, or on-site field scanning. Once a problem area is identified, AR enables farmers to relay visual input to an edge AI module for analysis, making the AI layer a vital diagnostic and decision-making node in the system. When the AI confirms an issue, the system delivers context-specific recommendations, whether for manual intervention or robotic action, back to the farmer via an AR interface, through smart glasses as visual overlays, a smartphone screen, or audio prompts, providing step-by-step guidance tailored to the specific crop, condition and location.

This section reviews how AI models integrated with AR interfaces enhance decision support across a range of use cases, including disease diagnosis, image recognition, phenotyping, irrigation planning, crop quality assessment and anomaly detection. Without AR, these AI outputs remain abstract and disconnected from the operational context, limiting their practical utility for time-sensitive, location-specific tasks.

6.2 AI decision support in smart farming

In terms of AI decision support, one key application is with phenomics models, which analyze vast amounts of genetic, phenotypic and environmental data to predict crop performance under different conditions[50]. This enables breeders to select optimal cultivars with high precision. AR interfaces could further improve this process by overlaying plant trait assessments, allowing farmers to visually compare and select the best crops for their specific environments.

AI-controlled irrigation systems are also transforming water management by using synchronous data to optimize irrigation schedules. These models can analyze soil moisture, evapotranspiration and weather forecasts to predict crop water requirements, helping farmers apply water more efficiently. The integration of high-resolution land data assimilation systems and crop growth models such as AquaCrop has shown water savings between 10% and 40%, without compromising crop yields[51]. These AI-based irrigation scheduling systems adjust watering thresholds dynamically throughout the crop cycle, preventing over-irrigation and water stress. AR interfaces could improve these systems by visualizing live irrigation data, allowing farmers to interactively monitor soil moisture levels, project water needs and make irrigation adjustments.
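
As a much-simplified illustration of the threshold-based scheduling principle behind crop models such as AquaCrop (which is far more detailed), the sketch below runs a daily soil-water balance and triggers irrigation when plant-available water falls below a set fraction of field capacity. Every value here is a placeholder.

```python
# Toy daily soil-water balance: storage gains rainfall, loses crop
# evapotranspiration, and is refilled when it crosses the irrigation trigger.
FIELD_CAPACITY = 100.0      # mm of plant-available water (assumed)
TRIGGER = 0.5               # irrigate below 50% of capacity (assumed)

storage = FIELD_CAPACITY
rain = [0, 0, 12, 0, 0, 0, 0, 5, 0, 0]            # mm/day (illustrative)
et_crop = [6, 7, 6, 8, 7, 6, 7, 8, 7, 6]          # crop evapotranspiration, mm/day

for day, (p, et) in enumerate(zip(rain, et_crop), start=1):
    storage = min(storage + p - et, FIELD_CAPACITY)
    if storage < TRIGGER * FIELD_CAPACITY:
        applied = FIELD_CAPACITY - storage        # refill to field capacity
        storage = FIELD_CAPACITY
        print(f"day {day}: irrigate {applied:.0f} mm")
```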

The integration of blockchain and AI technologies is also improving agricultural traceability, ensuring secure, tamper-proof records of crop production and distribution. Blockchain-backed systems eliminate the risks of centralized databases by creating a decentralized, transparent ledger, strengthening data security and traceability efficiency[52]. AI enhances blockchain-based traceability by automating data interpretation, uncovering inefficiencies and supporting quality assurance and sustainability metrics[53]. Additionally, smart contracts and encryption protocols protect farm data while enabling real-time transaction verification, reducing fraud and ensuring regulatory compliance[54].

AR could complement these technologies by making traceability data accessible at the point of use. For example, farmers and supply chain managers could scan QR codes placed on produce, equipment, or storage units to visualize AI-generated information on input history, crop quality and compliance metrics through AR interfaces. In logistics, AR-assisted inventory systems could overlay blockchain-verified updates on storage conditions, transport history and product status directly onto physical assets. By linking AR visualization with blockchain-AI systems, these tools could improve operational transparency, strengthen consumer trust and support data-driven decision-making across the agricultural supply chain.

6.3 AI-powered image recognition for AR-based diagnostics

AI-based image recognition is transforming pest and disease detection in agriculture, enabling mobile-compatible diagnostics that form the foundation for AR-integrated systems[55]. Fig.4 presents an example of an AI-powered image recognition model applied to plant disease detection using photos captured by the authors. The images illustrate how color segmentation and edge detection techniques can be used to highlight potential problem areas in crops. Green bounding boxes with corresponding labels indicate suspected plant health issues, including fungal infections, nutrient deficiencies and pest-related damage. This example serves to demonstrate how AI-based models can assist in the identification of disease symptoms.

CNNs have demonstrated high accuracy in crop pathology classification[56]. For example, NASNet Mobile and EfficientNetB0 were shown to achieve over 98% precision in detecting black gram diseases[57]. Combining CNNs with mobile vision transformers enhances both local and global feature extraction, making AI-powered AR diagnostics more adaptable to unpredictable field conditions[58]. Similarly, domain adaptation models, such as Wasserstein-based unsupervised domain adaptation, improve model generalization by reducing discrepancies between controlled data sets and real-world agricultural environments[58]. Another study demonstrated the effectiveness of DL-based image recognition for plant disease detection using the YOLOv4 CNN model[59]. Their study applied object detection to classify leaf diseases, achieving high accuracy in identifying diseased regions in crops. Also, data sets incorporating images across various growth stages, backgrounds and lighting conditions have improved classification accuracy[60]. Together, these advances highlight the growing maturity of AI-based image recognition as a foundation for field-ready diagnostics.

Building on this technical foundation, recent studies have moved beyond static classification to develop real-time, user-based AR applications. One study developed a YOLOv5-based AR system for plant disease classification, refining semantic segmentation to detect individual leaf abnormalities rather than whole plant structures[61]. This CNN-based AR model improved disease severity assessment, achieving 70% to 90% accuracy, demonstrating its potential for AR in smart farming.
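
The sketch below shows the detection step such systems build on, using the public ultralytics/yolov5 torch.hub entry point. The stock COCO-trained weights and the file name are stand-ins: the cited work fine-tunes on leaf-disease imagery and renders the returned boxes as AR overlays.

```python
# Sketch: run a pretrained YOLOv5 model and print the box geometry an AR
# layer would anchor its overlays to. Not the fine-tuned model from [61].
import torch

model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)
results = model("leaf_photo.jpg")                 # hypothetical field photo

# Each detection row: x1, y1, x2, y2, confidence, class index.
for *box, conf, cls in results.xyxy[0].tolist():
    print([round(v, 1) for v in box], f"conf={conf:.2f}", model.names[int(cls)])
```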

Another example is the modified pyramidal convolutional shuffle binary attention residual network, an AR-powered cassava disease detection system that integrates DL with AR overlays for diagnostics and fertilizer recommendations[62]. This system improved feature extraction, while advanced Harris hawk optimization refined model parameters, increasing computational efficiency. Trained on 286 cassava leaf images, the system achieved 99.0% accuracy, surpassing CNN and transformer-based models in classification performance. Additionally, its AR interface provided disease visualizations and delivered disease-specific fertilizer recommendations, achieving a precision of 99.0% and a recall rate of 97.6%. This illustrates the potential of AR not only to detect disease but also to deliver agronomic advice directly in the field.

Object detection and multimodal imaging approaches further refine AR diagnostics by integrating RGB, infrared and thermal imaging for enhanced crop analytics. YOLOv8-based object detection has demonstrated high accuracy in the early-stage disease identification of leaf blight in taro crops, outperforming current models[63]. Meanwhile, deep augmented learning frameworks incorporating RGB, infrared and hyperspectral imaging significantly improve banana leaf disease classification[64]. Transformer-based architectures further optimize feature generalization, improving accuracy in distinguishing early and late-stage infections[65]. These advances indicate that multispectral overlays in AR interfaces could help surface subtle physiologic changes in crops, enabling more spatially precise interventions. However, realizing this potential in embedded agricultural systems will require translating these models into lightweight, computationally efficient formats compatible with edge AI devices.

In addition to disease detection, AR tools are also being evaluated to improve monitoring and accuracy in areas with insect infestations. An AR-integrated smart glasses system for rice planthopper detection combined high-resolution imaging with DL-based classification[66]. Using the Cascade-RCNN-PH model, enhanced with adaptive sample matching and a squeeze-and-excitation feature pyramid network, the system improved small-target detection, achieving a recall of 83.4% and a precision of 83.6%. In the same study, AR imaging via a mobile application reduced manual labor by 50% while maintaining high detection accuracy in heavily infested fields.

Beyond detection, AR systems are also being used to support decision-making by visualizing insect development stages and guiding appropriate interventions. One such system, designed for organic farming, integrated ML and computer vision to support live analysis of insect presence in the field[67]. Its CNN model, trained on the IP102_V1.1 data set, a comprehensive benchmark for agricultural insect classification, achieved 90% accuracy in recognizing caterpillars, flea beetles and whiteflies, while also analyzing leaf damage patterns to refine detection accuracy. Using a smartphone-based AR interface, the system overlaid 3D insect models, developmental stages and severity metrics. Based on the classified insect type and stage, the interface then displayed predefined organic treatment options such as neem extract application, crop rotation and the use of beneficial insects.

6.4 AI yield estimation, crop quality and anomaly detection

Beyond pest and disease management, AI and AR are increasingly being used to monitor crop development, estimate yields and detect anomalies throughout the growing cycle. For example, AR-IA facilitates real-time tomato yield estimation and quality assessment[68]. Built using Unity and ARKit, the system optimizes image capture to minimize redundancy. Trained on 2083 RGB images from UAV-collected and Kaggle data sets, AR-IA improves yield prediction accuracy through AR overlays that display projected yield estimates, supporting preharvest decision-making.

Advancements in AR-powered fruit classification can improve harvesting accuracy by integrating DL-based image processing. An AR-assisted ripeness detection system for greenhouse-grown strawberries was developed, incorporating YOLOv7 for automated fruit classification[69]. Trained on 8000 images, the model correctly identified ripe fruit with 89% overall accuracy and balanced precision and recall at 92%, making it more effective than previous fruit detection methods. Integrated with Microsoft HoloLens 2, the system enabled live AR visualization of ripeness levels. Applying AR-powered fruit assessment to more crops could help farmers harvest at the right time, improving efficiency and reducing losses after harvest.

AI anomaly detection is improving real-time plant health monitoring and disease recognition. For example, diffusion-based models such as CropDetDiff refine disease detection by analyzing plant features at different levels of detail, which could make AR-assisted diagnostics more effective, especially when data are limited[70]. AI is also improving fruit and crop classification: EfficientNet-based models accurately assess apple quality based on color, shape and texture, reducing the need for manual sorting[71]. Similarly, multimodal fusion techniques that combine RGB and depth imaging improve tea shoot detection by enhancing image contrast in low-light environments[72]. Early yield estimation models using YOLOv8 and YOLOv5 have demonstrated strong correlations between detected and actual flower counts, improving harvest planning and labor management[73].

Advancements in segmentation and object detection models could further enhance AR-assisted agricultural monitoring. Vision transformers combined with Mask R-CNN effectively detect and segment small green citrus fruits in dense orchards, overcoming occlusion and background noise challenges[74]. FLTrans-Net, a transformer-based feature learning model for wheat head detection, demonstrated superior performance in identifying small, overlapping wheat spikes under challenging field conditions, achieving a mean average precision of 96.1% on the GWHD-2021 data set while maintaining lightweight efficiency[75]. Another example is AgriDeep-Net, a feature-fusion DL model that improves fine-grained classification of visually similar plant species, ensuring more accurate plant identification[76]. Self-supervised learning methods, such as channel randomization, improve anomaly detection by training AI models to detect subtle variations in plant color, outperforming current data augmentation techniques[77]. These innovations have the potential to improve the ability of AR to detect physiologic stress, disease progression and crop quality changes. As these models are designed for edge computing, they can be used in embedded systems, allowing on-site crop monitoring without relying on cloud computing or high computational demands.

7 Augmented reality for precision agriculture through edge computing

7.1 AR interfaces enhanced by edge computing for precision farming

In the conceptual model described above, edge computing acts as the local processing backbone for AR interfaces. By supporting AI diagnostics and decision-making directly on-site, edge computing could enable AR systems to deliver important data and field-level guidance, even in areas lacking stable cloud access. This is particularly important in rural or connectivity-limited environments, where latency and bandwidth constraints can hinder responsiveness.

Recent studies have investigated how integrating edge computing with AR enables key functions such as spatial mapping, collaborative diagnostics and secure data handling in precision farming. Across these applications, edge computing enhances the responsiveness, energy efficiency and scalability of AR-powered embedded systems.

7.2 Spatial mapping, task offloading and secure farm data transmission

An edge-cloud coordination platform integrating simultaneous localization and mapping (SLAM) with the robot operating system was developed to enhance AR-assisted disease tracking, soil health monitoring and crop stress visualization. The SLAM image analyzer dynamically selects the most efficient SLAM model, reducing power consumption by 30% and achieving a latency of 50 ms, enabling rapid collaboration among farmers, agronomists and researchers[78].

Alongside spatial mapping, optimized AI-driven task offloading further improves AR-based crop diagnostics and automation. By balancing computing loads between local AR devices and edge servers, a hybrid particle swarm optimization–genetic algorithm model could reduce AR task execution latency, streamlining pest monitoring, disease identification and irrigation control[79]. Similarly, a quality of service-aware AR task offloading model has been shown to improve execution efficiency by 92%, ensuring fast and reliable processing for farm monitoring[80].
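
At its core, such offloading reduces to comparing estimated latencies. The sketch below makes that decision for a single AR inference task under an illustrative device, uplink and server model; all parameters are assumptions, not figures from the cited studies.

```python
# Sketch: run an AR inference task locally or ship it to an edge server,
# choosing whichever has the lower estimated latency.
from dataclasses import dataclass

@dataclass
class Task:
    cycles: float        # CPU cycles required (e.g., one detection inference)
    data_bits: float     # input size to transmit if offloaded

def local_latency(task: Task, f_local_hz: float) -> float:
    return task.cycles / f_local_hz

def edge_latency(task: Task, uplink_bps: float, f_edge_hz: float) -> float:
    return task.data_bits / uplink_bps + task.cycles / f_edge_hz

task = Task(cycles=2e9, data_bits=8e6)            # one frame, one inference (assumed)
t_local = local_latency(task, f_local_hz=1.5e9)   # AR headset CPU (assumed)
t_edge = edge_latency(task, uplink_bps=20e6, f_edge_hz=10e9)  # rural link + server
print(f"local: {t_local:.2f} s, edge: {t_edge:.2f} s ->",
      "offload" if t_edge < t_local else "run locally")
```

Metaheuristics like the hybrid particle swarm-genetic model extend this comparison to many concurrent tasks and devices, but the per-task trade-off is the same.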

A major challenge in AR-assisted farming is transmitting data in areas with poor connectivity. A recent study proposed a deep learning and Lagrange optimization-based model to enhance the reliability and efficiency of IoT communication in smart agriculture[81]. By optimizing transmission distance and reducing interference in environments with overlapping wireless signals, the model significantly improved energy efficiency and data throughput, which are critical for ensuring stable AR-based monitoring in rural settings.

The security of IoT-integrated AR farming networks is another major consideration, especially as AR devices handle large volumes of sensitive farm data. A lightweight authentication protocol using extended Chebyshev chaotic maps and physical unclonable functions reduces computational costs by 50%, making secure AR data exchange feasible for IoT-connected agricultural devices[82]. This ensures data integrity and cybersecurity in automated precision farming networks, reducing the risk of unauthorized access and system vulnerabilities.
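
The algebraic property such protocols exploit is the Chebyshev semigroup relation T_r(T_s(x)) = T_s(T_r(x)) = T_rs(x), which lets two parties derive a shared secret from exchanged values. The floating-point demo below illustrates only this idea; real protocols use large integer degrees over finite fields and incorporate physical unclonable function responses, both omitted here.

```python
# Demo of the Chebyshev semigroup property underlying chaotic-map key agreement.
import math

def chebyshev(n: int, x: float) -> float:
    """T_n(x) = cos(n * arccos(x)) for x in [-1, 1]."""
    return math.cos(n * math.acos(x))

x = 0.37                                          # public parameter (assumed)
r, s = 6, 11                                      # the two parties' private keys
shared_a = chebyshev(r, chebyshev(s, x))          # A applies r to B's public value
shared_b = chebyshev(s, chebyshev(r, x))          # B applies s to A's public value
print(f"{shared_a:.12f} == {shared_b:.12f}")      # both equal T_66(x)
```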

7.3 Edge computing for AR imaging and remote farm management

Edge computing is also advancing high-resolution AR imaging for AI-assisted crop health analysis. A generative AI-powered super-resolution model in multiaccess edge computing environments was developed to improve object detection accuracy by reconstructing high-resolution images from low-resolution inputs, enabling detailed analysis of crop health, disease patterns and pest infestations[83]. Processing these images at the edge reduces power consumption, improving the efficiency of edge-based AR diagnostics.

Beyond imaging, edge computing is improving AR farm monitoring and diagnostics in remote areas. Long-range wireless area networks-based task offloading has significantly enhanced AR-enabled field diagnostics by distributing computationally intensive tasks across multitier edge computing nodes, achieving a latency reduction from ~2 to just ~0.3 s[84]. This could allow for real-time crop health monitoring, soil moisture assessment and environmental stress detection, making low-power AR interfaces practical for rural precision agriculture where cloud access may be limited.

Improving energy efficiency in large-scale AR-powered farm inspections is key to advancing edge computing. A hybrid Monte Carlo tree search-based AR task offloading model, which integrates YOLOv7 for object recognition and structure from motion for 3D mapping, has significantly reduced energy consumption to 1.29 MJ per inspection, while maintaining fast response times of 24 ms[85]. This reduction means UAVs and mobile AR platforms could operate longer without frequent recharging, making AI-assisted monitoring more viable for extended field inspections.

To further enhance AR-based agricultural diagnostics, DL models must be optimized for deployment in edge environments. RTR_Lite_MobileNetV2, a low-power CNN designed for edge computing, achieves 99.9% accuracy while reducing model size by 53.8%, making it ideal for deployment on IoT-enabled AR farming tools such as Raspberry Pi-based systems[86]. Integrating these lightweight models supports scalable, battery-conscious deployment of AI-assisted diagnostics, allowing farmers in connectivity-limited regions to benefit from high-performance AR applications without relying on cloud infrastructure.
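
As a rough analogue (not RTR_Lite_MobileNetV2 itself), the snippet below loads the quantized MobileNetV2 that ships with torchvision, illustrating the size-for-accuracy trade-off such edge deployments rely on.

```python
# Sketch: quantized MobileNetV2 inference, the kind of lightweight CNN suited
# to Raspberry Pi-class AR farming tools. Weights are stock ImageNet, not a
# crop-disease model.
import torch
from torchvision.models.quantization import mobilenet_v2

model = mobilenet_v2(weights="DEFAULT", quantize=True).eval()

frame = torch.rand(1, 3, 224, 224)                # stand-in for a camera frame
with torch.inference_mode():
    logits = model(frame)
print("top class index:", logits.argmax(dim=1).item())
```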

8 Discussion

8.1 How this review advances beyond prior work

This review positions AR not merely as a visualization tool but as the connective interface that enables embedded AI, IoT, UAV and robotics technologies to operate as a coherent, field-responsive smart farming ecosystem. This study grounds its AR-centered model in empirical findings, drawing on recent literature that demonstrates the role of AR across embedded technologies. By reviewing each domain separately and identifying the demonstrated utility of AR within each, we believe we have constructed a credible argument for their unification under a single AR-centered interface. This integrative architecture is not speculative but represents a feasible convergence of independently validated subsystems, grounded in the presented literature.

Recent secondary literature emphasizes the growing importance of AI and AR for precision agriculture. Image-based phenotyping technologies, increasingly deployed via smartphone platforms, are transforming crop diagnostics and trait analysis in field environments[87]. This reinforces the central argument of our review that AR can serve as a spatial interface to render AI-derived insights actionable at the farm level. The role of bioinspired algorithms, including genetic algorithms, particle swarm optimization, and ant colony optimization, in optimizing key agricultural processes such as pest detection, irrigation scheduling, and machinery path planning has also been explored[88]. Although this study does not focus on AR, it highlights the importance of intelligent algorithms in guiding decision-making. Our review extends this line of research by positioning AR as the interface layer through which these algorithmic outputs can be delivered in real time and rendered actionable under field operations.

A recent survey of AR and VR in agriculture highlighted that while AR adoption is growing, it has been insufficiently examined in terms of low-cost, mobile-based deployment[89]. The reported usability barriers and technical limitations support our emphasis on developing smartphone-based AR solutions for smallholder farmers—an option not sufficiently addressed in most technical literature but vital for global scalability. Our contribution builds on this by proposing embedded system architectures based on AR that balance performance with accessibility. Specifically, we reviewed recent AR implementations across various domains to identify system configurations, interface strategies, and hardware platforms that support cost-effective deployment in resource-limited agricultural settings.

AR has been shown to dominate extended reality applications in the agricultural sector, especially for decision-making tasks in real-world contexts[90]. The need for more field-validated, ergonomic systems highlights a disconnect between experimental tools and practical deployment. This supports our call for participatory design and user-centric development in future AR interfaces.

An analysis of extended reality trends in agriculture confirms the emergence of AR as the leading extended reality modality in precision farming[91]. Interoperability and data privacy are identified as top challenges—issues we also foreground in our limitations and future research sections. Although that review surveys extended reality broadly, our perspective differentiates itself by specifically framing AR as the interface layer that integrates and operationalizes intelligent embedded systems at scale.

While most existing reviews focus on AR capabilities in isolation, one study emphasizes that the true utility of AR emerges only when coupled with IoT, AI, and GPS-based technologies[9]. This study describes how AR allows farmers to interpret complex environmental data through overlays on crops or machinery, aligning with our argument that AR closes the loop between sensing, computation, and human action. We extend this by showing how AR could be integrated across edge computing, UAVs, and robotic systems to enable intelligent farming practices.

8.2 Research gaps in AR-embedded agricultural systems

Despite recent progress, significant technical and infrastructural challenges continue to limit the widespread adoption of AR in precision agriculture. Computational efficiency, data processing and scalability remain significant concerns, particularly in rural areas with limited digital infrastructure. The high computational demands of AI-powered AR models require efficient task offloading strategies to edge and cloud computing networks to reduce latency and power consumption[79]. AI-driven image recognition models used for disease detection and crop monitoring often struggle with domain adaptation, limiting their accuracy in real-world farming conditions compared to controlled data sets[58]. Coordination across AR platforms, IoT sensor networks and farm machinery also remains a key issue, as inconsistent hardware capabilities and a lack of standardization hinder seamless integration[20]. In addition, security vulnerabilities in wireless IoT-enabled AR networks pose risks for real-time data exchange, requiring lightweight authentication protocols to ensure safe and reliable communication[82].

Although both IoT and AR have demonstrated potential, a lack of scalable, low-latency integration pipelines that enable data processing across diverse farm environments persists[92]. Existing systems often suffer from inconsistent data exchange formats and insufficient support for visualization across different sensor types, environmental conditions and AR device platforms, particularly when dealing with thermal imaging, soil moisture data or multisource sensor fusion.

Most AR-enabled AI decision support tools rely on complex ML models whose outputs are difficult for farmers to interpret. The lack of transparency in how these AI models generate recommendations, often referred to as black-box or opaque reasoning, can undermine farmer trust, highlighting the need for built-in interpretability to explain diagnostic outputs[93]. Also, explainable AI techniques are largely absent from AR interfaces, and farmers currently lack transparent justifications for algorithmic recommendations.

The scarcity of diverse, field-representative data sets hampers model generalization when AI systems are applied to new regions, climates or crop types, limiting scalability. The absence of comprehensive, publicly available data sets also restricts benchmarking and validation, making it difficult to assess system performance across contexts[94]. Region-specific data sets dominate current development efforts, limiting cross-domain AI training and weakening system performance in unfamiliar or diverse agricultural zones.

Although UAVs are increasingly used in combination with AR for field diagnostics, existing systems are typically designed for single-drone operation. Current UAV-AR systems lack well-developed, synchronized multidrone capabilities, restricting their effectiveness in large, complex farm networks[95]. Most current implementations also fail to exploit swarm intelligence or decentralized communication protocols, leaving drone coordination largely dependent on centralized planning or operator input.
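To make the idea of decentralized coordination more concrete, the sketch below implements a greedy, market-style allocation in which each drone bids its travel distance for unscanned field cells. It is a single-process illustration under simplifying assumptions; a real swarm would exchange bids over a communication mesh rather than a shared loop.

```python
import math

def auction_allocate(drones, cells):
    """Greedy single-item auction: each unassigned cell goes to the drone
    whose last committed waypoint yields the cheapest bid (travel distance).

    `drones` maps drone id -> (x, y); `cells` is a list of (x, y) waypoints.
    """
    pos = dict(drones)                     # mutable copy of drone positions
    plan = {d: [] for d in drones}
    for cell in cells:
        # Every drone "bids" the distance from its last committed waypoint.
        winner = min(pos, key=lambda d: math.dist(pos[d], cell))
        plan[winner].append(cell)
        pos[winner] = cell                 # winner's next bid starts here
    return plan

plan = auction_allocate({"uav1": (0, 0), "uav2": (100, 0)},
                        [(10, 5), (90, 10), (50, 50), (15, 20)])
for uav, route in plan.items():
    print(uav, "->", route)
```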

Another significant research gap is the limited number of studies examining AR integration across multiple embedded systems simultaneously. Although there is growing literature on AR paired individually with technologies such as IoT, UAVs, robotics or AI, few studies explore how AR can interface with several of these systems in a coordinated manner. Most existing systems focus on isolated real-time visualizations or monitoring tasks, often lacking automation or feedback loops that connect sensing, actuation and decision support. This fragmentation hinders AR from fulfilling its potential as a central, integrative interface.

Additional research gaps include real-time processing limitations in AR-assisted autonomous machinery. Many AR interfaces in robotics and tractors rely on static sensor configurations, which restrict adaptability to changing terrain or equipment behavior[96]. Without dynamic, real-time edge AI support, current systems struggle with latency in environments where ML predictions must be updated immediately.

Gesture-based AR control systems also face major limitations in field conditions. Environmental variability, including lighting changes, background clutter and operator movement inconsistencies, disrupts the reliability of gesture tracking and spatial inputs[97]. As a result, their deployment remains limited to highly controlled settings, undermining their potential for hands-free control in real farm environments.
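As one illustration of how gesture input might be stabilized against such variability, the sketch below applies exponential smoothing with a simple jump-rejection rule to a tracked 2D hand keypoint. The smoothing factor and rejection threshold are illustrative values that would need tuning per device and environment.

```python
import numpy as np

class KeypointSmoother:
    """Exponential smoothing with outlier rejection for one 2D keypoint.

    `alpha` trades responsiveness for stability; `max_jump_px` drops
    detections that leap implausibly far in one frame (tracking glitches).
    """
    def __init__(self, alpha: float = 0.4, max_jump_px: float = 80.0):
        self.alpha = alpha
        self.max_jump_px = max_jump_px
        self.state = None  # last smoothed (x, y)

    def update(self, measured_xy):
        pt = np.asarray(measured_xy, dtype=float)
        if self.state is None:
            self.state = pt
        elif np.linalg.norm(pt - self.state) > self.max_jump_px:
            pass  # reject the glitch; hold the previous estimate
        else:
            self.state = self.alpha * pt + (1 - self.alpha) * self.state
        return tuple(self.state)

s = KeypointSmoother()
for raw in [(100, 100), (104, 98), (400, 50), (108, 101)]:  # third frame glitches
    print(s.update(raw))
```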

Finally, disease detection using UAVs remains constrained by the limited predictive capabilities of current AR-overlay models. Most implementations visualize raw or basic detection outputs without offering real-time, AI-guided recommendations or early-stage indicators based on plant stress biomarkers[98]. DL-based systems lack sufficient integration with AR overlays to enable proactive decisions during UAV surveillance missions.

8.3 Limitations of the present review

Although this review offers a structured synthesis of AR integration within embedded agricultural systems, several limitations must be acknowledged. First, the scope of the review was intentionally focused on conceptual and architectural frameworks rather than empirical performance evaluations or field trials. As a result, the practical effectiveness of specific AR solutions in diverse agricultural contexts, particularly in smallholder and resource-constrained settings, could not be fully assessed. In addition, the conceptual model proposed in this review, though grounded in current research, has not been field-tested and may require adaptation to specific regional, technological or infrastructural contexts before large-scale deployment.

Second, although this work draws on a carefully selected set of peer-reviewed sources, it may omit relevant studies published in non-indexed regional journals or in languages other than English. This introduces a potential bias toward research emerging from technologically advanced contexts, which may not fully capture the realities of AR deployment in semi-subsistence and small-scale agricultural systems.

Third, the literature reviewed primarily spans developments in AR, AI and IoT between 2020 and early 2025. Given the rapid pace of technological change in smart farming, some recent innovations may have been excluded due to publication or indexing delays. As a result, certain observations may have limited long-term generalizability.

8.4 Future research directions of AR in embedded systems

Building on the identified research gaps and technological limitations, this section outlines key future directions to advance the integration of AR within embedded agricultural systems. First, improving real-time integration between AR and IoT systems remains a foundational challenge. Future studies should develop universal data exchange models and low-latency pipelines that support seamless sensor visualization across heterogeneous farm environments and hardware systems. Enhancing AR visualization of irrigation and fertigation data by incorporating thermal imaging, soil moisture readings and environmental forecasting is another promising research direction.

To improve automation in smart farm machinery, researchers should focus on reducing latency in AR-guided tractor and robotic interfaces by creating adaptive AI models that account for terrain and machinery variability. In parallel, gesture-based AR control systems require refinement for reliable deployment in outdoor conditions, particularly regarding lighting inconsistencies and sensor noise. Expanding robotic harvesting capabilities to include real-time ripeness detection, damage assessment and cooperative human-robot interaction will be essential for advancing autonomous farm operations.

In aerial contexts, future research should advance UAV-based AR automation by applying reinforcement learning and adaptive AI for flight path and treatment optimization. Further work on UAV systems should explore swarm coordination frameworks and collaborative AR visualization. Additionally, early-stage crop stress prediction using UAVs with hyperspectral and multispectral imaging should be paired with AR overlays for more precise, pre-symptomatic diagnosis, as sketched below.
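As a simple example of the kind of spectral stress layer such overlays could render, the following sketch derives a normalized difference vegetation index (NDVI) mask from UAV near-infrared (NIR) and red bands; low NDVI flags potential stress. The threshold and the synthetic bands are illustrative only.

```python
import numpy as np

def ndvi_stress_mask(nir: np.ndarray, red: np.ndarray,
                     stress_threshold: float = 0.4) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red); low values flag potential stress.

    The 0.4 cutoff is illustrative; operational thresholds depend on
    crop, growth stage and sensor calibration.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)  # avoid divide-by-zero
    return ndvi < stress_threshold   # boolean mask for the AR overlay to color

rng = np.random.default_rng(1)
nir = rng.uniform(0.2, 0.9, (4, 4))   # stand-in for a calibrated NIR band
red = rng.uniform(0.1, 0.5, (4, 4))   # stand-in for the red band
print(ndvi_stress_mask(nir, red).astype(int))
```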

To enhance model generalization and performance across regions, future efforts should establish global, standardized agricultural data sets for training and validating AI-AR systems across diverse crops, climates and geographies. These systems could also benefit from further research into ultra-low-power AI models and neuromorphic computing, enabling on-the-spot inference on embedded AR devices with minimal energy use.
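The basic mechanism behind such low-power models can be illustrated with symmetric post-training int8 quantization, sketched below in plain NumPy. Real deployments would rely on a toolchain such as TensorFlow Lite rather than this hand-rolled version; the weight matrix here is synthetic.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor int8 quantization: w ≈ scale * q, q in [-127, 127]."""
    scale = float(np.max(np.abs(weights))) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(float) * scale

rng = np.random.default_rng(0)
w = rng.normal(0, 0.1, (256, 128)).astype(np.float32)   # stand-in weight matrix
q, scale = quantize_int8(w)

err = np.abs(dequantize(q, scale) - w).mean()
print(f"int8 uses {q.nbytes} bytes vs {w.nbytes} for float32; "
      f"mean abs error {err:.5f}")   # 4x smaller at a small accuracy cost
```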

As farms become increasingly data-driven, multitier edge computing architectures must be optimized using adaptive task scheduling across AR wearables, UAVs and ground systems. Simultaneously, researchers should improve data security through federated learning and lightweight encryption protocols that maintain privacy across AR-IoT platforms.
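A minimal federated averaging (FedAvg) round, sketched below with a toy linear model, shows how farm-local models can be combined without raw field data ever leaving the farm. The synthetic data, model and hyperparameters are illustrative.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=20):
    """One client's gradient-descent steps on private data (never shared)."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)   # least-squares gradient
        w -= lr * grad
    return w

def fedavg_round(w_global, clients):
    """Server averages client weights, weighted by local data set size."""
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    local_ws = [local_update(w_global, X, y) for X, y in clients]
    return np.average(local_ws, axis=0, weights=sizes)

rng = np.random.default_rng(0)
true_w = np.array([1.5, -2.0])
clients = []
for _ in range(3):                               # three farms, each with private data
    X = rng.normal(size=(40, 2))
    clients.append((X, X @ true_w + rng.normal(0, 0.05, 40)))

w = np.zeros(2)
for _ in range(10):                              # ten communication rounds
    w = fedavg_round(w, clients)
print("recovered weights:", np.round(w, 2))      # approaches [1.5, -2.0]
```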

Another future research priority is the interoperable integration of AR, IoT, UAVs, robotics and AI into unified frameworks that allow seamless data flows and system-wide coordination. Supporting this integration, AI-enhanced AR interfaces should be developed to facilitate responsive interaction between autonomous machinery, drones and sensor networks.

The conceptual model proposed in this review illustrates one possible pathway for realizing this type of unified, AR-centered smart farming system. Future research should focus on validating the model through simulation studies, field deployments and participatory trials that measure both technical feasibility and user interaction. Key priorities include designing middleware that enables communication between system components, evaluating model responsiveness under varying connectivity conditions and studying how farmers interact with AR interfaces when switching between manual and autonomous control modes.
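To indicate what such middleware might look like at its simplest, the sketch below routes messages between a sensor adapter, an AI service and an AR client through an in-process publish/subscribe bus. A field deployment would use a broker protocol such as MQTT instead; all topic names and thresholds here are illustrative.

```python
from collections import defaultdict
from typing import Callable

class FarmBus:
    """Minimal in-process publish/subscribe middleware.

    Stands in for a broker (e.g., MQTT) to show how AR, AI and sensor
    components stay decoupled: each knows only topic names.
    """
    def __init__(self):
        self._subs: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subs[topic]:
            handler(message)

bus = FarmBus()

# AI service: turns raw readings into alerts on a second topic.
bus.subscribe("sensors/soil", lambda m: m["moisture"] < 0.15 and
              bus.publish("alerts/irrigation",
                          {"plot": m["plot"], "action": "irrigate"}))

# AR client: renders whatever arrives on the alert topic.
bus.subscribe("alerts/irrigation",
              lambda m: print(f"AR overlay: plot {m['plot']} -> {m['action']}"))

bus.publish("sensors/soil", {"plot": "A3", "moisture": 0.12})  # triggers the chain
```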

To ensure equitable access, future work must prioritize the design of smartphone-based AR tools that reduce reliance on costly headsets or proprietary hardware, particularly for use in smallholder and resource-constrained contexts. Additionally, the field would benefit from participatory design research involving farmers, agronomists and technicians to align AR system development with real-world agricultural workflows. Building on this agenda, we are currently implementing and evaluating the proposed AR-centered model in smallholder rice farms in Meta, Colombia.

9 Conclusions

This review addressed the core research question of how AR visualization and overlays can serve as a central interface for integrating and enabling the functional deployment of diverse embedded technologies in data-driven precision agriculture. It answers this question through four key contributions. First, it introduces a conceptual model to illustrate how AR unifies anomaly detection, diagnostics and responsive action through an AR-centered architecture integrating IoT, UAVs, edge AI and robotic systems. Second, it offers a systems-level synthesis of the role of AR across these technologies, demonstrating how AR supports visualization, interaction and decision-making through real-world applications and embedded technical frameworks. Third, it situates these contributions within the broader literature to emphasize that, while research has often examined components in isolation, AR has the potential to provide an integrative interface for precision agriculture. Finally, it identifies research gaps and future directions that reinforce the value of this work by pinpointing the key challenges that must be addressed to fully implement AR-centric smart farming systems.

This review underscores the importance of accessible, scalable and farmer-centered AR innovations, particularly in resource-limited contexts. By highlighting low-cost deployment strategies, such as smartphone-based AR interfaces, lightweight edge-AI models and modular system design, it helps bridge the gap between emerging technologies and their practical implementation in actual agricultural environments.

Overall, this review endeavors to contribute to the broader knowledge base by proposing a structured roadmap for developing integrated AR systems, grounded in the demonstrated feasibility of AR within individual embedded technologies. The identified gaps and proposed directions point to high-impact opportunities for interdisciplinary collaboration, such as AR-IoT interoperability, lightweight edge AI and explainable AI interfaces. Policymakers could draw on these findings to inform regulatory and funding frameworks that promote inclusive, secure and scalable AR innovations in agriculture. Practitioners, including agronomists and technology developers, could use the conceptual model and literature synthesis provided to assess the readiness and adaptability of AR solutions.

References

[1]

Miftahushudur T, Sahin H M, Grieve B, Yin H. A survey of methods for addressing imbalance data problems in agriculture applications. Remote Sensing, 2025, 17(3): 454

[2]

Louta M, Banti K, Karampelia I. Emerging technologies for sustainable agriculture: the power of humans and the way ahead. IEEE Access: Practical Innovations, Open Solutions, 2024, 12: 98492–98529

[3]

Kumar V, Sharma K V, Kedam N, Patel A, Kate T R, Rathnayake U. A comprehensive review on smart and sustainable agriculture using IoT technologies. Smart Agricultural Technology, 2024, 8: 100487

[4]

Mana A A, Allouhi A, Hamrani A, Rehman S, el Jamaoui I, Jayachandran K. Sustainable AI-based production agriculture: exploring AI applications and implications in agricultural practices. Smart Agricultural Technology, 2024, 7: 100416

[5]

Javaid M, Haleem A, Singh R P, Suman R. Enhancing smart farming through the applications of Agriculture 4.0 technologies. International Journal of Intelligent Networks, 2022, 3: 150–164

[6]

Zheng M, Lillis D, Campbell A G. Current state of the art and future directions: augmented reality data visualization to support decision-making. Visual Informatics, 2024, 8(2): 80–105

[7]

Sara G, Todde G, Pinna D, Caria M. Investigating the intention to use augmented reality technologies in agriculture: will smart glasses be part of the digital farming revolution. Computers and Electronics in Agriculture, 2024, 224: 109252

[8]

Phupattanasilp P, Tong S R. Augmented reality in the integrative internet of things (AR-IoT): application for precision farming. Sustainability, 2019, 11(9): 2658

[9]

Hurst W, Mendoza F R, Tekinerdogan B. Augmented reality in precision farming: concepts and applications. Smart Cities, 2021, 4(4): 1454–1468

[10]

Jafar A, Bibi N, Naqvi R A, Sadeghi-Niaraki A, Jeong D. Revolutionizing agriculture with artificial intelligence: plant disease detection methods, applications, and their limitations. Frontiers in Plant Science, 2024, 15: 1356260

[11]

Upadhyay A, Chandel N S, Singh K P, Chakraborty S K, Nandede B M, Kumar M, Subeesh A, Upendar K, Salem A, Elbeltagi A. Deep learning and computer vision in plant disease detection: a comprehensive review of techniques, models, and trends in precision agriculture. Artificial Intelligence Review, 2025, 58(3): 92

[12]

Poonia A, Garg T, Mishra O, Batra E. Agriculture 4.0 - Integrated Smart Irrigation System. In: 2024 15th International Conference on Computing Communication and Networking Technologies (ICCCNT), Kamand, India. IEEE, 2024, 1–8

[13]

Khalifeh A F, Alqammaz A, Khasawneh A M, Abualigah L, Darabkh K A, Zinonos Z. An environmental remote sensing and prediction model for an IoT smart irrigation system based on an enhanced wind-driven optimization algorithm. Computers & Electrical Engineering, 2025, 122: 109889

[14]

Bryceson K P, Leigh S, Sarwar S, Grøndahl L. Affluent Effluent: visualizing the invisible during the development of an algal bloom using systems dynamics modelling and augmented reality technology. Environmental Modelling & Software, 2022, 147: 105253

[15]

Baek J, Kanampiu M W. A strategic sensor placement for a smart farm water sprinkler system: a computational model. In: 2021 23rd International Conference on Advanced Communication Technology (ICACT), PyeongChang, South Korea. IEEE, 2021, 53–57

[16]

Sankarasubramanian P. Enhancing precision in agriculture: a smart predictive model for optimal sensor selection through IoT integration. Smart Agricultural Technology, 2025, 10: 100749

[17]

Salimath M, Kaliannan N, Prabhakar V, Iyyakutty R, Jeyabaskaran K J. IoT and sensor technologies: increased water and nutrient savings and profit in Banana cv. Grand Nain (AAA) production. Scientia Horticulturae, 2025, 341: 113982

[18]

Ueda S, Ye X. A smartphone-based method for assessing tomato nutrient status through trichome density measurement. IEEE Access: Practical Innovations, Open Solutions, 2024, 12: 171304–171327

[19]

Ahmed S, Marwat S N K, Ben Brahim G, Khan W U, Khan S, Al-Fuqaha A, Koziel S. IoT based intelligent pest management system for precision agriculture. Scientific Reports, 2024, 14(1): 31917

[20]

Kang C, Mu X, Novaski Seffrin A, Di Gioia F, He L. A recursive segmentation model for bok choy growth monitoring with Internet of Things (IoT) technology in controlled environment agriculture. Computers and Electronics in Agriculture, 2025, 230: 109866

[21]

Contreras-Castillo J, Guerrero-Ibañez J A, Santana-Mancilla P C, Anido-Rifón L. SAgric-IoT: an IoT-based platform and deep learning for greenhouse monitoring. Applied Sciences, 2023, 13(3): 1961

[22]

Saini A K, Yadav A K, Dhiraj. A comprehensive review on technological breakthroughs in precision agriculture: IoT and emerging data analytics. European Journal of Agronomy, 2025, 163: 127440

[23]

Bayih A Z, Morales J, Assabie Y, de By R A. Utilization of internet of things and wireless sensor networks for sustainable smallholder agriculture. Sensors, 2022, 22(9): 3273

[24]

Abidin Z, Falah R, Setyawan R A, Wardana F C. Wireless sensor network using nRF24L01+ for precision agriculture. Bulletin of Electrical Engineering and Informatics, 2025, 14(2): 1003–1013

[25]

Maitra A, Damle M. Revolutionizing Plant Health Management with Technological Digital Transformation to Enhance Disease Control & Fortifying Plant Resilience. In: 2024 3rd International Conference for Innovation in Technology (INOCON), Bangalore, India. IEEE, 2024, 1–8

[26]

Ghanem S I, Zaalouk M, Elbanna A. Improving Crop Recommendations with Augmented Reality and Sensor Data Analysis. In: 2024 Intelligent Methods, Systems, and Applications (IMSA), Giza, Egypt. IEEE, 2024, 462–466

[27]

Lachhwani R, Aslekar A. Augmented Reality in Agriculture. In: 2022 International Interdisciplinary Humanitarian Conference for Sustainability (IIHC), Bengaluru, India. IEEE, 2022, 150–153

[28]

Lohan S K, Prakash C, Lohan N, Kansal S, Karkee M. State-of-the-art in real-time virtual interfaces for tractors and farm machines: a systematic review. Computers and Electronics in Agriculture, 2025, 231: 109947

[29]

Nwobodo O J, Wereszczyński K, Kuaban G S, Skurowski P, Cyran K A. An adaptation of fitts’ law for performance evaluation and optimization of augmented reality (AR) interfaces. IEEE Access: Practical Innovations, Open Solutions, 2024, 12: 169614–169627

[30]

Nazarova E, Sautenkov O, Altamirano Cabrera M, Tirado J, Serpiva V, Rakhmatulin V, Tsetserukou D. CobotAR: interaction with robots using omnidirectionally projected image and DNN-based gesture recognition. arXiv Preprint, 2021, arXiv:2110.10571

[31]

Gunturu S, Munir A, Ullah H, Welch S, Flippo D. A spatial AI-based agricultural robotic platform for wheat detection and collision avoidance. AI, 2022, 3(3): 719–738

[32]

Mucchiani C, Chatziparaschis D, Karydis K. Augmented-reality enabled crop monitoring with robot assistance. arXiv Preprint, 2024, arXiv:2411.03483

[33]

Tamura S, Buayai P, Mao X. BerryScope: AR Strawberry Picking Aid. In: 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Sydney, Australia. IEEE, 2023, 118–121

[34]

Li H, Gao F, Zuo G. Research on the agricultural machinery path tracking method based on deep reinforcement learning. Scientific Programming, 2022, 2022(1): 6385972

[35]

Sitompul T A, Wallmyr M. Using Augmented Reality to Improve Productivity and Safety for Heavy Machinery Operators: State of the Art. In: Proceedings of the 17th International Conference on Virtual-Reality Continuum and Its Applications in Industry, Brisbane, QLD, Australia. ACM, 2019, 1–9

[36]

Virk H. The role of drones in monitoring and managing large-scale horticultural operations. International Journal of Horticulture and Food Science, 2023, 5(2): 87–91

[37]

Neupane K, Baysal-Gurel F. Automatic identification and monitoring of plant diseases using unmanned aerial vehicles: a review. Remote Sensing, 2021, 13(19): 3841

[38]

Rao J S, Choragudi K, Bansod S, Paidipalli V V S C, Singh S K, Pal P A I. AR Enabling on Embedded Systems for Agricultural Drones. In: 2022 International Conference on Futuristic Technologies (INCOFT), Belgaum, India. IEEE, 2022, 1–4

[39]

Carlson B, Wang C, Han Q. RAREST: Emulation of Augmented Reality Assisted Multi-UAV-UGV Systems. In: Proceedings of the First Workshop on Metaverse Systems and Applications, Helsinki, Finland. ACM, 2023, 21–26

[40]

Potena C, Khanna R, Nieto J, Siegwart R, Nardi D, Pretto A. AgriColMap: aerial-ground collaborative 3D mapping for precision farming. IEEE Robotics and Automation Letters, 2019, 4(2): 1085–1092

[41]

Zhou J, Xu Y, Gu X, Chen T, Sun Q, Zhang S, Pan Y. High-precision mapping of soil organic matter based on UAV imagery using machine learning algorithms. Drones, 2023, 7(5): 290

[42]

Pereira G W, Valente D S M, de Queiroz D M, Santos N T, Fernandes-Filho E I. Soil mapping for precision agriculture using support vector machines combined with inverse distance weighting. Precision Agriculture, 2022, 23(4): 1189–1204

[43]

Kishor I, Goyal D, Khushboo K, Gupta K, Mamodiya U, Tiwari K. An Intelligent Farming Revolution System Based on IoT, AI & Augmented Reality Drone Technology. In: Proceedings of the 5th International Conference on Information Management & Machine Intelligence, Jaipur, India. ACM, 2023, 1–7

[44]

Ponnusamy V, Natarajan S. Precision agriculture using advanced technology of IoT, unmanned aerial vehicle, augmented reality, and machine learning. In: Gupta D, Hugo C. De Albuquerque V, Khanna A, Mehta P L, eds. Smart Sensors for Industrial Internet of Things. Cham: Springer International Publishing, 2021, 207–229

[45]

Huuskonen J, Oksanen T. Soil sampling with drones and augmented reality in precision agriculture. Computers and Electronics in Agriculture, 2018, 154: 25–35

[46]

Hossen M A, Diwakar P K, Ragi S. Total nitrogen estimation in agricultural soils via aerial multispectral imaging and LIBS. Scientific Reports, 2021, 11(1): 12693

[47]

Haidar Ahmad A, Zahwe O, Nasser A, Clement B. Path planning for unmanned aerial vehicles in dynamic environments: a novel approach using improved A* and grey wolf optimizer. World Electric Vehicle Journal, 2024, 15(11): 531

[48]

Mourtzis D, Angelopoulos J, Panopoulos N. Unmanned Aerial Vehicle (UAV) manipulation assisted by Augmented Reality (AR): the case of a drone. IFAC-PapersOnLine, 2022, 55(10): 983–988

[49]

Konstantoudakis K, Christaki K, Tsiakmakis D, Sainidis D, Albanis G, Dimou A, Daras P. Drone control in AR: an intuitive system for single-handed gesture control, drone tracking, and contextualized camera feed visualization in augmented reality. Drones, 2022, 6(2): 43

[50]

Negus K L, Li X, Welch S M, Yu J. The role of artificial intelligence in crop improvement. In: Sparks D L, ed. Advances in Agronomy. Academic Press, 2024, 1–66

[51]

Zhao H, Di L, Guo L, Zhang C, Lin L. An automated data-driven irrigation scheduling approach using model simulated soil moisture and evapotranspiration. Sustainability, 2023, 15(17): 12908

[52]

Yao Q, Zhang H. Improving agricultural product traceability using blockchain. Sensors, 2022, 22(9): 3388

[53]

Chen H Y, Sharma K, Sharma C, Sharma S. Integrating explainable artificial intelligence and blockchain to smart agriculture: research prospects for decision making and improved security. Smart Agricultural Technology, 2023, 6: 100350

[54]

Ma L, Gao F, Li X. Harnessing Blockchain for Agricultural Product Traceability. In: 2024 4th International Conference on Computer Science and Blockchain (CCSB), Shenzhen, China. IEEE, 2024, 507–514

[55]

Dolatabadian A, Neik T X, Danilevicz M F, Upadhyaya S R, Batley J, Edwards D. Image-based crop disease detection using machine learning. Plant Pathology, 2025, 74(1): 18–38

[56]

Chauhan S, Kumar R, Kumar B. Harmonizing Precision Agriculture: Augmented Insights into Plant Disease Detection Using Deep Learning. In: 2024 International Conference on Integrated Circuits, Communication, and Computing Systems (ICIC3S), Una, India. IEEE, 2024, 1–6

[57]

Thangaraj R, Vadivel B, Sadesh S, Kumar S M, Karuppusamy S, Prakash P. Identification of Black Gram Plant Leaf Diseases Using Deep Learning Models. In: Proceedings of the 3rd International Conference on Optimization Techniques in the Field of Engineering (ICOFE 2024), 2024

[58]

Tunio M H, Li J P, Zeng X, Ahmed A, Shah S A, Shaikh H U, Ali Mallah G, Yahya I A. Advancing plant disease classification: a robust and generalized approach with transformer-fused convolution and Wasserstein domain adaptation. Computers and Electronics in Agriculture, 2024, 227: 109574

[59]

Aldakheel E A, Zakariah M, Alabdalall A H. Detection and identification of plant leaf diseases using YOLOv4. Frontiers in Plant Science, 2024, 15: 1355941

[60]

Joseph D S, Pawar P M, Chakradeo K. Real-time plant disease dataset development and detection of plant disease using deep learning. IEEE Access: Practical Innovations, Open Solutions, 2024, 12: 16310–16333

[61]

Roopa D, Bose S. Leaf’s Diseases and Its Characteristics Visualization Using Augmented Reality. In: 2022 1st International Conference on Computational Science and Technology (ICCST), Chennai, India. IEEE, 2022, 1019–1024

[62]

Prashanth J S, Moparthi N R, Krishna G B, Krishna Prasad A V, Sravankumar B, Rao P R. MPCSAR-AHH: a hybrid deep learning model for real-time detection of cassava leaf diseases and fertilizer recommendation. Computers & Electrical Engineering, 2024, 119: 109628

[63]

Nwaneto C B, Yinka-Banjo C, Ugot O. An object detection solution for early detection of taro leaf blight disease in the West African sub-region. Franklin Open, 2025, 10: 100197

[64]

Rehman A, Abunadi I, Alamri F S, Ali H, Bahaj S A, Saba T. An intelligent deep augmented model for detection of banana leaves diseases. Microscopy Research and Technique, 2025, 88(1): 53–64

[65]

Kanaga Priya P, Vinu M M S, Jeevitha S V, Rathina Kumar N, Kanimozhi N, Jayachitra S. Augmented Insights: a Qualified Evaluation of Deep Learning Representations for Enhancing Banana Leaf Spot Disease Detection. In: 2024 International Conference on Social and Sustainable Innovations in Technology and Engineering (SASI-ITE), Tadepalligudem, India. IEEE, 2024, 96–101

[66]

Sheng H, Yao Q, Luo J, Liu Y, Chen X, Ye Z, Zhao T, Ling H, Tang J, Liu S. Automatic detection and counting of planthoppers on white flat plate images captured by AR glasses for planthopper field survey. Computers and Electronics in Agriculture, 2024, 218: 108639

[67]

Mahenthiran N, Sittampalam H, Yogarajah S, Jeyarajah S, Chandrasiri S, Kugathasan A. Smart Pest management: an augmented reality-based approach for an organic cultivation. In: 2021 2nd International Informatics and Software Engineering Conference (IISEC), Ankara, Turkey. IEEE, 2021, 1–6

[68]

Balaji Prabhu B V, Shashank R, Shreyas B, Jois Narsipura O S. ARIA: augmented reality and artificial intelligence enabled mobile application for yield and grade prediction of tomato crops. Procedia Computer Science, 2024, 235: 2693–2702

[69]

Chai J J K, Xu J L, O’Sullivan C. Real-time detection of strawberry ripeness using augmented reality and deep learning. Sensors, 2023, 23(17): 7639

[70]

Kong J, Hua C, Jin X, Guo N, Peng L. An effective object detector via diffused graphic large selective kernel with one-to-few labelling strategy for small-scaled crop diseases detection. Crop Protection, 2024, 182: 106705

[71]

Iyoubi E M, El Boq R, Izikki K, Tetouani S, Cherkaoui O, Soulhi A. Revolutionizing smart agriculture: enhancing apple quality with machine learning. Data and Metadata, 2024, 3: 592

[72]

Wu Z, Jiang Y, Li X, Chung K L. Enhancing Precision Agriculture: yolov8 for Accurate Corn Disease and Pest Detection. In: 2024 IEEE 7th International Conference on Electronic Information and Communication Technology (ICEICT), Xi’an, China. IEEE, 2024, 980–984

[73]

Moreira G, Neves dos Santos F, Cunha M. Grapevine inflorescence segmentation and flower estimation based on Computer Vision techniques for early yield assessment. Smart Agricultural Technology, 2025, 10: 100690

[74]

El Akrouchi M, Mhada M, Bayad M, Hawkesford M J, Gérard B. AI-based framework for early detection and segmentation of green Citrus fruits in orchards. Smart Agricultural Technology, 2025, 10: 100834

[75]

Yousafzai S N, Nasir I M, Tehsin S, Fitriyani N L, Syafrudin M. FLTrans-Net: transformer-based feature learning network for wheat head detection. Computers and Electronics in Agriculture, 2025, 229: 109706

[76]

Joshi R C, Burget R, Dutta M K. AgriDeep-net: an advanced deep feature fusion-based technique for enhanced fine-grain image analytics in precision agriculture. Ecological Informatics, 2025, 86: 103069

[77]

Choi T, Would O, Salazar-Gomez A, Liu X, Cielniak G. Channel randomisation: self-supervised representation learning for reliable visual anomaly detection in speciality crops. Computers and Electronics in Agriculture, 2024, 226: 109416

[78]

Sonkoly B, Nagy B G, Dóka J, Kecskés-Solymosi Z, Czentye J, Formanek B, Jocha D, Gerő B P. An edge cloud based coordination platform for multi-user AR applications. Journal of Network and Systems Management, 2024, 32(2): 40

[79]

Nwogbaga N E, Latip R, Affendey L S, Rahiman A R A. Attribute reduction based scheduling algorithm with enhanced hybrid genetic algorithm and particle swarm optimization for optimal device selection. Journal of Cloud Computing, 2022, 11(1): 15

[80]

Hao J, Chen Y, Gan J. Delay-guaranteed mobile augmented reality task offloading in edge-assisted environment. Ad Hoc Networks, 2024, 161: 103539

[81]

Alturif G, Saleh W, El-Bary A A, Osman R A. Towards efficient IoT communication for smart agriculture: a deep learning framework. PLoS One, 2024, 19(11): e0311601

[82]

Kwon D, Park Y. Design of secure and efficient authentication protocol for edge computing-based augmented reality environments. Electronics, 2024, 13(3): 551

[83]

Na M, Lee J. Generative AI-enabled energy-efficient mobile augmented reality in multi-access edge computing. Applied Sciences, 2024, 14(18): 8419

[84]

Amzil A, Hanini M, Zaaloul A. Modeling and analysis of LoRa-enabled task offloading in edge computing for enhanced battery life in wearable devices. Cluster Computing, 2025, 28(3): 201

[85]

Soundararaj A J, Sathianesan G W. Task offloading scheme in mobile augmented reality using hybrid Monte Carlo tree search (HMCTS). Alexandria Engineering Journal, 2024, 108: 611–625

[86]

Duhan S, Gulia P, Gill N S, Narwal E. RTR_Lite_MobileNetV2: a lightweight and efficient model for plant disease detection and classification. Current Plant Biology, 2025, 42: 100459

[87]

Maraveas C. Image analysis artificial intelligence technologies for plant phenotyping: current state of the art. AgriEngineering, 2024, 6(3): 3375–3407

[88]

Maraveas C, Asteris P G, Arvanitis K G, Bartzanas T, Loukatos D. Application of bio and nature-inspired algorithms in agricultural engineering. Archives of Computational Methods in Engineering, 2023, 30(3): 1979–2012

[89]

de Oliveira M E, Corrêa C G. Virtual Reality and Augmented Reality Applications in Agriculture: a Literature Review. In: 2020 22nd Symposium on Virtual and Augmented Reality (SVR), Porto de Galinhas, Brazil. IEEE, 2020, 1–9

[90]

Anastasiou E, Balafoutis A T, Fountas S. Applications of extended reality (XR) in agriculture, livestock farming, and aquaculture: a review. Smart Agricultural Technology, 2023, 3: 100105

[91]

Bigonah M, Jamshidi F, Pant A, Poudel S, Reddy Nallapareddy S, Charmchian Langroudi A, Marghitu D. A systematic review of extended reality (XR) technologies in agriculture and related sectors (2022–2024). IEEE Access: Practical Innovations, Open Solutions, 2025, 13: 49721–49734

[92]

Kalatzis N, Marianos N, Chatzipapadopoulos F. IoT and Data Interoperability in Agriculture: a Case Study on the Gaiasense™ Smart Farming Solution. In: Global IoT Summit (GIoTS), Aarhus, Denmark. IEEE, 2019, 1–6

[93]

Cartolano A, Cuzzocrea A, Pilato G. Analyzing and assessing explainable AI models for smart agriculture environments. Multimedia Tools and Applications, 2024, 83(12): 37225–37246

[94]

Heider N, Gunreben L, Zürner S, Schieck M. A survey of datasets for computer vision in agriculture. arXiv Preprint, 2025, arXiv:2502.16950

[95]

Peladarinos N, Piromalis D, Cheimaras V, Tserepas E, Munteanu R A, Papageorgas P. Enhancing smart agriculture by implementing digital twins: a comprehensive review. Sensors, 2023, 23(16): 7128

[96]

Etezadi H, Eshkabilov S. A comprehensive overview of control algorithms, sensors, actuators, and communication tools of autonomous all-terrain vehicles in agriculture. Agriculture, 2024, 14(2): 163

[97]

Chakraborty B K, Sarma D, Bhuyan M K, MacDorman K F. Review of constraints on vision-based gesture recognition for human–computer interaction. IET Computer Vision, 2018, 12(1): 3–15

[98]

Reyes-Hung L, Soto I, Zamorano-Illanes R, Adasme P, Ijaz M, Azurdia C, Gutierrez S. Crop Stress Detection with Multispectral Imaging Using AI. In: 2023 South American Conference on Visible Light Communications (SACVLC), Santiago, Chile. IEEE, 2023, 59–64

RIGHTS & PERMISSIONS

The Author(s) 2025. Published by Higher Education Press. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0)
