AgriKD: Cross-Architecture Knowledge Distillation for Efficient Leaf Disease Classification
arXiv cs.CV / 5/5/2026
Key Points
- AgriKD is a cross-architecture knowledge distillation framework that transfers knowledge from a Vision Transformer (ViT) teacher to a lightweight convolutional (CNN) student for leaf disease classification on edge devices.
- The method narrows the representational gap between Transformer and CNN by applying distillation objectives at three levels — output, feature, and relational — so the student preserves the teacher's Transformer-style global representations.
- Experiments across multiple leaf disease datasets show the distilled student closely matches the teacher's accuracy while being dramatically more efficient: roughly 172× fewer parameters, 47.57× less compute, and 18–22× lower latency.
- The optimized model is exported to multiple deployment formats (ONNX, TFLite Float16, TensorRT FP16) with consistent predictions and negligible accuracy loss.
- Real-world tests on NVIDIA Jetson edge hardware and a mobile application demonstrate reliable real-time inference, supporting practical deployment for AI-enabled agriculture in resource-constrained settings.
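The three distillation levels named above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation — the temperature `T`, the projection matrix `proj`, and the cosine-similarity relation are standard choices assumed for illustration (soft-label KL from Hinton-style KD, feature MSE after a learned projection, and matching of batch-wise similarity matrices):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def output_kd_loss(teacher_logits, student_logits, T=4.0):
    """Output level: KL divergence between softened class distributions,
    scaled by T^2 as is conventional in logit distillation."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T ** 2) * np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean()

def feature_kd_loss(teacher_feat, student_feat, proj):
    """Feature level: MSE after projecting the CNN student's features into
    the ViT teacher's embedding dimension (proj is a hypothetical learned
    projection matrix bridging the two architectures)."""
    return np.mean((teacher_feat - student_feat @ proj) ** 2)

def relational_kd_loss(teacher_feats, student_feats):
    """Relational level: match pairwise cosine-similarity matrices over a
    batch, so the student preserves the teacher's sample-to-sample structure."""
    def rel(f):
        f = f / np.linalg.norm(f, axis=1, keepdims=True)
        return f @ f.T
    return np.mean((rel(teacher_feats) - rel(student_feats)) ** 2)
```

A training loop would combine the three terms with a supervised cross-entropy loss, weighted by hyperparameters; the weights used by AgriKD are not specified in this summary.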