Retail AI Series · Blog 2 of 5

Retail Shelf Monitoring: Real-Time Out-of-Stock Detection
with Edge AI Cameras

Why traditional inventory systems miss most shelf gaps, how computer vision solves it on-device in under 100ms, and what hardware you need to deploy a working OOS alert system today — without a cloud subscription.

$984B
Estimated annual revenue lost globally to retail out-of-stocks
43%
Of OOS events caused by phantom inventory — items ERP thinks are on the shelf
<100ms
End-to-end alert latency with edge AI inference vs. 2–8s via cloud
By CamThink Team · March 2026 · ~2,100 words · 10 min read

The Business Cost of Out-of-Stock Events

An empty shelf is not just a missed sale. When a shopper reaches for a product and finds nothing there, the ripple effect extends well beyond that single unit. Research consistently shows that roughly 30% of shoppers leave the store without buying anything when they encounter an OOS item, while around 25% switch to a competitor's brand on the spot — and some never come back. A single high-velocity SKU going out of stock for three hours on a busy Saturday afternoon is a concrete, calculable revenue event.

Scale that dynamic across hundreds of SKUs, dozens of store locations, and 52 weeks of peak trading, and you reach IHL Group's estimate of $984 billion in annual global out-of-stock losses. For a mid-sized grocery chain running on 2–3% net margins, even a 10% improvement in OOS recovery translates to millions in recovered revenue annually.

⏱️
The Detection Lag Problem

Most stores conduct physical shelf walks 1–3 times per day. A gap that opens at 10:15 AM may not be discovered until 1:00 PM — nearly 3 hours of stockout during peak foot traffic. Real-time AI detection collapses that window to seconds. The difference between a 3-hour stockout and a 12-minute stockout on a popular beverage SKU is a meaningful fraction of that day's category revenue.

Why Traditional Inventory Systems Miss Shelf Gaps

Here is the counterintuitive reality of retail OOS: most stockouts happen when your inventory system says stock is available. This is phantom inventory — and it is the core structural reason why ERP and WMS platforms alone cannot solve the out-of-stock problem, no matter how sophisticated their forecasting becomes.

Enterprise inventory systems track the flow of units through receiving, storage, and checkout. What they cannot observe is physical shelf position. A product that rolled behind a fixture, was abandoned in the wrong aisle, or is sitting in an unopened case in the stockroom is counted as available stock — right up until the moment an associate physically looks at the shelf and discovers otherwise.

What the ERP / WMS sees
System says: 12 units available
  • 48 units received at dock — recorded
  • 36 units scanned at checkout — recorded
  • Net on-hand inventory: 12 units
✓ Fully stocked per system
What the shelf actually looks like
Shopper sees: empty shelf face
  • 6 units displaced behind shelf fixture
  • 3 units abandoned in a different aisle
  • 3 units in unopened case, stockroom floor
✗ Zero visible facings for the shopper

Industry research attributes 20–43% of all OOS events to phantom inventory — gaps the system has no mechanism to detect because it lacks physical visibility of the shelf. The table below shows which root causes of OOS are structurally invisible to enterprise systems and directly detectable by an AI camera.

OOS Root Cause | Share of OOS Events | Visible to ERP/WMS? | Detectable by AI Camera?
Phantom inventory (displaced or inaccessible stock) | 20–43% | No | Yes — real time
Demand surge (depletion faster than forecast) | ~25% | No — hours lag | Yes — immediately
Replenishment delay (stock in back room, not stocked) | ~20% | Partial | Yes — triggers alert
Forecasting / ordering error | ~14% | Post-facto only | Supports detection

An edge AI shelf camera gives you visibility into 65–88% of OOS root causes that enterprise systems structurally cannot see. That coverage, applied to high-velocity SKUs, is where the ROI of shelf monitoring hardware is concentrated.

How Computer Vision Detects Out-of-Stock Items

A real-time OOS detection system runs a five-stage pipeline entirely on the edge camera — from image capture to MQTT alert publish — completing in under 100 milliseconds. No cloud hop, no round-trip latency, no per-image API cost.

OOS Detection Pipeline · On-Device · End to End
01 · Capture
📷
Image Capture
1–5 FPS · scheduled or motion-triggered · consistent shelf framing
~0 ms
02 · Infer
🧠
On-Device Inference
YOLOv8 INT8 · Neural-ART NPU · products + gaps detected simultaneously
~40 ms
03 · Analyze
📐
Gap Analysis
Fill-level % · gap area vs. threshold · confidence scoring per slot
~10 ms
04 · Alert
🚨
Alert Trigger
Threshold exceeded · MQTT publish · JSON payload with shelf ID + fill % + snapshot URL
~10 ms
05 · Act
📲
Notification
Associate mobile alert · Home Assistant · ERP webhook · replenishment task
~30 ms
Total end-to-end latency < 100 ms on-device
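Compressed into code, the five stages above amount to a short on-device loop. The sketch below is illustrative Python, not NE301 firmware — capture_frame and run_yolo are stand-in stubs, and box geometry is simplified to per-slot widths:

```python
import json

ALERT_THRESHOLD_PCT = 30  # fill % below which an alert fires (default from the text)

def capture_frame():
    """Stage 01 stub: scheduled or motion-triggered shelf image capture."""
    return "frame"

def run_yolo(frame):
    """Stage 02 stub: on-device inference returning detected boxes.
    Each box is (class_name, confidence, x_offset, width) in shelf pixels."""
    return [("product_present", 0.91, 0, 120),
            ("shelf_gap", 0.94, 120, 360)]

def fill_percentage(detections, shelf_width=480):
    """Stage 03: ratio of product-occupied width to monitored shelf width."""
    product_w = sum(w for cls, _, _, w in detections if cls == "product_present")
    return round(100 * product_w / shelf_width)

def make_alert(fill_pct, detections):
    """Stage 04: structured JSON payload (simplified version of the real schema)."""
    return json.dumps({
        "event": "shelf_gap_detected",
        "fill_pct": fill_pct,
        "detections": [{"class": c, "confidence": conf}
                       for c, conf, _, _ in detections if c == "shelf_gap"],
    })

def pipeline_tick():
    frame = capture_frame()
    detections = run_yolo(frame)
    pct = fill_percentage(detections)
    if pct < ALERT_THRESHOLD_PCT:
        return make_alert(pct, detections)  # stage 05: hand off to MQTT publish
    return None

alert = pipeline_tick()
```

With the stub detections above (120 px of product on a 480 px shelf span), the fill level works out to 25%, which is below the 30% default and so produces an alert payload.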

YOLO-Based Shelf Detection: Two Classes, One Model

The detection model is trained on two primary classes: product_present (a visible product facing in a shelf slot) and shelf_gap (an empty or near-empty slot). Post-processing calculates the fill-level percentage — the ratio of product-occupied area to total monitored shelf width within the camera's field of view.

When fill percentage drops below a configured threshold — typically 30%, adjustable per SKU category — the system fires an OOS alert. Fast-moving beverages may be configured to alert at 20% fill; slow-moving specialty items at 50%. The threshold is the primary tuning parameter between first deployment and production accuracy.
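The per-category threshold logic can be expressed as a small lookup table. The 30% default and the beverage/specialty values come from the text above; the category keys and function name are illustrative, not NE301 configuration fields:

```python
# Fill % below which an alert fires. 30% is the default named in the text;
# the category keys themselves are hypothetical examples.
DEFAULT_THRESHOLD = 30
CATEGORY_THRESHOLDS = {
    "beverages_fast_moving": 20,   # restocked constantly; tolerate lower fill
    "specialty_slow_moving": 50,   # restocked rarely; alert earlier
}

def should_alert(fill_pct: float, category: str) -> bool:
    """True when the observed fill level is below the category's threshold."""
    return fill_pct < CATEGORY_THRESHOLDS.get(category, DEFAULT_THRESHOLD)
```

A 25% fill triggers an alert for a default-threshold SKU but not for a fast-moving beverage configured at 20% — which is exactly the tuning lever the paragraph above describes.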

Why On-Device Inference Is Essential for OOS Alerting

Dimension | Cloud Processing | Edge AI (NE301)
Alert latency | 2–8 seconds — network + API queue | < 100ms — all on-device
Bandwidth per camera | 2–10 Mbps — continuous video stream | < 1 MB/day — JSON alerts only
Works without internet | No — outage stops all monitoring | Yes — local inference always on
Shopper data privacy | Risk — video transmitted offsite | On-premises — video never leaves the store
Ongoing cost model | Per-image fees accumulate 24/7 | Hardware only — no per-inference fee
The Latency Gap Has Real Operational Consequences

A shelf that empties at 10:47 AM during pre-lunch traffic triggers an edge alert at 10:47:00. A cloud-dependent system working through a backlogged queue may surface the same alert at 10:47:06 — or at 10:49:30 during peak API load. Across hundreds of SKUs and stores, that latency difference adds up to minutes of unnecessary OOS time every day, compounding into measurable lost revenue.

Hardware Requirements for OOS Detection

Camera Placement: FOV, Resolution, and Lighting

Camera placement quality is the most common reason detection models underperform. A great model trained on poorly framed images will consistently miss detections that a mediocre model would catch with clean, consistent framing. The fundamentals:

  • Mounting height: 100–130cm (mid-shelf, eye level) — the same perspective a shopper has when reaching for a product. This is the angle your model was trained on.
  • Distance from shelf face: 50–90cm for the 88° lens. Closer increases per-SKU resolution; farther increases section coverage per camera.
  • Lens selection: 88° covers approximately 90–120cm of shelf width at 80cm distance — typically one to two bays. Use 137° for wide end-cap displays; 51° for premium sections requiring high per-product resolution.
  • Lighting stability: Avoid positions where overhead spotlights cast time-varying shadows across the shelf face. LED shelf strip lighting dramatically improves model consistency across shifts — eliminating the single most common cause of false positives in live deployments.
CamThink NeoEyes NE301
Recommended · Per-Shelf OOS Sensor Node
Processor: STM32N6 · Cortex-M55
AI Performance: 0.6 TOPS Neural-ART NPU
Inference Speed: 25 FPS YOLOv8 on-device
Field of View: 51° / 88° / 137° selectable
Connectivity: Wi-Fi · PoE · LTE Cat.1
Durability: IP67 · –20°C to +50°C
Why the NE301 fits OOS detection precisely: The Neural-ART NPU runs quantized INT8 YOLOv8 at 25 FPS — well above the 1–5 FPS needed for shelf monitoring — with headroom to simultaneously run shelf_gap and product_present detection in a single pass. The selectable FOV lens system covers every shelf configuration from a single hardware SKU. The PoE variant provides always-on 24/7 monitoring from one cable per camera. And the open-source AI Tool Stack gives you full ownership of your model, your training data, and your inference pipeline — with no cloud subscription required at any stage of the workflow.

Multi-Camera Setups: NG4500 as Store Edge Server

For deployments covering 20+ shelf cameras per store, or when running OOS detection and planogram compliance simultaneously, the CamThink NG4500 (NVIDIA Jetson Orin NX/Nano, 20–100 TOPS) serves as a store-level inference hub. NE301 cameras handle first-pass per-shelf detection; the NG4500 runs heavier analytics, aggregates data across the full camera fleet, and manages the ERP integration layer for the entire store. The two-tier architecture is detailed in our planogram compliance guide.

Building an OOS Alert System: NE301 + MQTT + Home Assistant

The following six steps walk through a complete OOS detection system from hardware setup to live store associate alerts. This is the same architecture demonstrated in CamThink's published Hackster.io project.

1
Mount NE301 and connect to store Wi-Fi

Install at shelf level with 88° lens. Power via USB-C bank (prototype) or PoE switch (production). Hold button 2–3s to activate the built-in AP → connect → navigate to 192.168.10.10 → configure your store Wi-Fi SSID. Verify live image framing covers the target shelf section without obstruction.

~30 min
2
Collect training images via CamThink AI Tool Stack

Install AI Tool Stack (Docker). Bind the NE301 as a data source. Capture 200–400 shelf images across fill states — fully stocked, 75%, 50%, 25%, empty — at multiple times of day to capture lighting variation. Dataset diversity determines your model's accuracy ceiling.

2–4 hours
3
Annotate, train, and quantize

Label two classes — shelf_gap and product_present. Use AI Tool Stack's Auto Annotation after manually labeling 20–30 images. Train YOLOv8n at imgsz=256. One-click export to INT8 NE301 model package. Training time: 15–90 minutes depending on GPU availability.

3–5 hours total
4
Deploy model to NE301

NE301 Web UI → Application Management → Upload model package → Apply. Device reboots into new model (~30s). Open Live View to confirm bounding boxes appear correctly on the shelf feed. Adjust mount angle if needed.

~15 min
5
Configure MQTT output and alert thresholds

NE301 Web UI → Settings → MQTT. Set broker IP, port 1883, uplink topic. Configure fill-percentage alert threshold (default 30%). The NE301 publishes a structured JSON event when a gap is detected — shelf ID, fill %, confidence score, and a snapshot URL included.

~30 min
6
Create Home Assistant automation for associate alerts

Subscribe to the NE301 MQTT topic. Build an automation that pushes a mobile notification — with the shelf snapshot attached — when a shelf_gap event arrives below your threshold. Alert reaches the associate's phone in under a second from camera detection.

~1 hour
JSON · MQTT NE301 shelf_gap alert payload
{
  "device_id":   "NE301_A3F2C1",
  "shelf_id":    "aisle-3-bay-2-left",
  "timestamp":   "2025-03-01T09:47:12Z",
  "event":       "shelf_gap_detected",
  "detections": [
    {
      "class":       "shelf_gap",
      "confidence":  0.94,
      "fill_pct":    14  // 14% fill — below 30% alert threshold
    }
  ],
  "image_url":   "http://192.168.1.45/snap/20250301_094712.jpg"
}
YAML · Home Assistant OOS alert → mobile push with shelf snapshot
alias: "Shelf OOS Alert — Aisle 3"
trigger:
  - platform: mqtt
    topic: "shelf/aisle-3-bay-2-left/events"
condition:
  - condition: template
    value_template: "{{ trigger.payload_json.detections[0].fill_pct | int < 30 }}"
action:
  - service: notify.mobile_app_store_associate
    data:
      title:   "⚠️ Low Stock — Aisle 3, Bay 2"
      message: "{{ trigger.payload_json.detections[0].fill_pct }}% fill. Please restock."
      data:
        image:   "{{ trigger.payload_json.image_url }}"
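Teams not running Home Assistant can consume the same topic with a small Python subscriber. A sketch using paho-mqtt — the broker address and notification hook are assumptions, and only the payload-parsing helper depends on the NE301 JSON schema shown above:

```python
import json

ALERT_THRESHOLD_PCT = 30

def handle_payload(raw: bytes):
    """Parse an NE301 shelf_gap event and decide whether to page an associate.
    Returns a human-readable message, or None when the shelf is above threshold."""
    event = json.loads(raw)
    fill = event["detections"][0]["fill_pct"]
    if fill >= ALERT_THRESHOLD_PCT:
        return None
    return (f"Low stock on {event['shelf_id']}: {fill}% fill "
            f"(snapshot: {event['image_url']})")

if __name__ == "__main__":
    # Live-broker wiring; requires `pip install paho-mqtt` and a reachable
    # broker — the address below is an assumption, not a default.
    # (paho-mqtt 1.x style; 2.x needs a CallbackAPIVersion argument.)
    import paho.mqtt.client as mqtt

    def on_message(client, userdata, msg):
        alert = handle_payload(msg.payload)
        if alert:
            print(alert)  # replace with your notification call

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("192.168.1.10", 1883)
    client.subscribe("shelf/aisle-3-bay-2-left/events")
    client.loop_forever()
```

The parsing helper mirrors the Home Assistant automation's condition: it acts only when fill_pct drops below the 30% threshold.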
🔗
Live Reference · Hackster.io
Industrial Edge AI: Smart Warehouse Monitoring with CamThink NE301

CamThink's published Hackster project demonstrates the complete NE301 + AI Tool Stack + Home Assistant + MQTT architecture in a real inventory monitoring context — the same workflow that applies directly to retail shelf OOS detection.

View full project on Hackster.io
🛠️
Need every command and config file?

Our full technical build guide covers Docker setup, annotation workflow, YOLOv8 training flags, quantization, model deployment, and complete MQTT configuration — with all code ready to copy. Read: How to Build a Retail Shelf Monitoring System (Step-by-Step) →

Integration: Connecting OOS Alerts to ERP and Inventory Systems

Home Assistant is ideal for prototyping and small store deployments. For enterprise integration — routing OOS events to SAP, Oracle, NetSuite, or commercial task management platforms — the NE301's MQTT output feeds into a Node-RED or middleware layer that translates structured JSON events into HTTP REST calls to your existing systems stack.

📷
NE301
On-device inference · MQTT output
MQTT
🔀
MQTT Broker
Mosquitto · AWS IoT · HiveMQ
subscribe
⚙️
Node-RED / HA
Filter · transform · route
HTTP REST
🏢
ERP / WMS
SAP · Oracle · NetSuite · Custom

The middleware layer handles three tasks: event deduplication (suppressing repeat alerts within a cool-down window so associates aren't flooded), payload transformation (mapping NE301 JSON to ERP-expected schemas), and routing logic (directing alerts to the right store, team, or replenishment workflow based on shelf ID and product category).
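Of the three middleware tasks, deduplication is the easiest to get wrong. A minimal cool-down sketch keyed by shelf ID — the 10-minute window and class name are illustrative choices, not Node-RED or NE301 defaults:

```python
import time

class AlertDeduplicator:
    """Suppress repeat alerts for the same shelf within a cool-down window,
    so a gap that stays open doesn't page the associate on every frame."""

    def __init__(self, cooldown_s: float = 600.0, clock=time.monotonic):
        self.cooldown_s = cooldown_s
        self.clock = clock            # injectable for testing
        self._last_sent: dict[str, float] = {}

    def should_forward(self, shelf_id: str) -> bool:
        now = self.clock()
        last = self._last_sent.get(shelf_id)
        if last is not None and now - last < self.cooldown_s:
            return False              # still inside the cool-down: suppress
        self._last_sent[shelf_id] = now
        return True
```

The first alert for a shelf passes through; repeats inside the window are dropped, and a different shelf ID is tracked independently — matching the per-shelf routing described above.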

🔒
No Vendor Lock-In on the Data Side

The NE301 outputs standard MQTT with a documented JSON schema. No proprietary cloud platform, no per-alert fee, no vendor dependency on the data pipeline. Route alerts to Home Assistant, SAP, a custom operations dashboard, or all three simultaneously — the integration is entirely yours to own and modify.

Real-World Benchmarks

Retailers who have deployed AI shelf monitoring report consistent improvement across three measurable dimensions. These figures reflect outcomes documented across computer vision OOS detection deployments in live retail environments.

Reported Outcomes · AI Shelf Monitoring Deployments
95–99%
OOS Detection Accuracy vs. 60–70% for manual shelf audits
20–30%
Reduction in Out-of-Stock Events within 90 days of deployment
4–8×
Faster alert-to-restock cycle vs. scheduled manual walkthrough

Schnuck Markets, a U.S. regional grocery chain, reported measurable same-store sales recovery on monitored SKUs after deploying computer vision shelf monitoring, attributing the improvement primarily to faster detection response. The gap between when a stockout occurred and when an associate was notified collapsed from several hours under manual monitoring to a matter of minutes. That faster response, applied to high-velocity SKUs during peak traffic hours, translated directly into recovered basket completions.

For buyers building the ROI case: a single NE301 unit at $199.90 monitoring two or three fast-moving SKUs that recover $50/week in previously lost sales pays for itself in four weeks. On high-velocity beverage or snack categories with ten or more SKUs per section, the payback is typically measured in days, not months.
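That payback claim is simple division; the sketch below just makes the arithmetic explicit, using the article's illustrative figures (the ten-SKU case assumes $50/week recovered per SKU, which is an assumption, not a benchmark):

```python
def payback_weeks(hardware_cost: float, recovered_per_week: float) -> float:
    """Weeks until recovered sales cover the hardware cost."""
    return hardware_cost / recovered_per_week

# One NE301 at $199.90 recovering $50/week: roughly 4 weeks to break even.
single_sku = payback_weeks(199.90, 50.0)

# Ten fast-moving SKUs, each assumed to recover $50/week: under 3 days.
busy_section_days = payback_weeks(199.90, 10 * 50.0) * 7
```

The same two-line calculation is the core of any ROI model for this hardware: everything else (SKU velocity, margin, traffic pattern) only changes the recovered-per-week input.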

Frequently Asked Questions

How does AI detect out-of-stock items on retail shelves?
An edge AI camera captures shelf images at regular intervals (1–5 FPS). A YOLO-based object detection model identifies products and empty areas within each frame. When detected gaps exceed a configured fill-percentage threshold, the system publishes a structured MQTT alert — entirely on-device, completing in under 100ms, with no cloud connectivity required.
What is phantom inventory, and why can only AI cameras solve it?
Phantom inventory occurs when an ERP system records stock as available, but the shelf is physically empty — because items are displaced behind fixtures, abandoned in the wrong aisle, or sitting in an unopened back-room case. ERP systems track units, not shelf positions. They have no mechanism to see physical shelf state. An AI camera sees exactly what the shopper sees — detecting gaps regardless of what any inventory system believes is there. This structural gap is why phantom inventory causes 20–43% of OOS events that traditional systems cannot detect.
What hardware do I need to deploy real-time OOS detection?
At minimum: a shelf-level camera with an NPU capable of running YOLOv8-class models locally. The CamThink NE301 (STM32N6, 0.6 TOPS Neural-ART NPU, 25 FPS YOLOv8, from $199.90) is the purpose-built solution for per-shelf monitoring. For stores with 20+ cameras or needing multi-model inference (OOS + planogram compliance simultaneously), add a CamThink NG4500 as a store-level edge server.
Can I build this without prior AI or machine learning experience?
Yes. The CamThink AI Tool Stack provides a browser-based UI for every step — data collection, annotation, training, quantization, and deployment — without requiring you to write any model training code. The entire pipeline from raw images to a deployed, running model on the NE301 typically takes 4–8 hours for first-time builders. See our step-by-step build guide for the full walkthrough.
For Retailers & Solution Evaluators

Build a Real-Time Out-of-Stock Detection System with NE301

On-device YOLOv8. Selectable FOV. MQTT alerts in under 100ms. No cloud subscription, no per-inference fee, no vendor lock-in.
