
The Shadow Problem: Why Vision-AI Robot Mowers Struggle at 3:00 PM

Hard shadow edges from low-angle sunlight fool camera-based obstacle detection into seeing phantom barriers. Here is the science, the scope, and the scheduling workaround.

By the Robot Lawn Mower Editorial Team | In-Depth Research & Verified Owner Data
Definition: Vision-AI (Camera-Based Navigation)

An obstacle detection and navigation system that uses one or more cameras combined with machine learning computer vision models to identify objects, terrain features, and boundaries. Unlike LiDAR (which measures distances with laser pulses) or ultrasonic sensors (which use sound waves), vision-AI relies on visible light — making it vulnerable to lighting conditions that change how the scene appears to the camera.

What Causes the Shadow Problem

Camera-based object detection works by analyzing pixel contrast, edges, and patterns in the camera image and matching them to trained categories: "obstacle," "grass," "path," "boundary." The AI model has been trained on thousands of images of real obstacles (shoes, toys, rocks, branches).

A sharp shadow edge on a bright, sunlit lawn creates an extreme contrast pattern — a dark rectangle or line with a hard edge, set against bright green grass. To the AI model, this looks remarkably similar to a physical object lying on the grass. The model assigns a high probability to "obstacle" and the mower stops or reroutes.
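This failure mode can be sketched numerically. The toy example below (a simplification, not any manufacturer's actual pipeline) compares the intensity jump across a hard shadow edge with the jump across a dark object's silhouette; to a simple edge detector, the two are nearly indistinguishable:

```python
import numpy as np

# 1-D brightness profiles across a sunlit lawn (0 = black, 255 = white).
sunlit_grass  = np.full(20, 200.0)                                        # uniform bright turf
hard_shadow   = np.concatenate([np.full(10, 200.0), np.full(10, 40.0)])   # fence shadow edge
dark_obstacle = np.concatenate([np.full(10, 200.0), np.full(10, 35.0)])   # dark shoe on grass

def max_edge_strength(profile):
    """Largest adjacent-pixel intensity jump: a crude stand-in for an edge detector."""
    return float(np.max(np.abs(np.diff(profile))))

print(max_edge_strength(sunlit_grass))   # 0.0 -> nothing to detect
print(max_edge_strength(hard_shadow))    # 160.0
print(max_edge_strength(dark_obstacle))  # 165.0 -> nearly the same signature
```

Real detectors use learned features rather than raw gradients, but those features are built on the same contrast signal, which is why a crisp shadow boundary can score as an "obstacle."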

When Is the Problem Worst?

Condition                    | Shadow Severity               | Vision-AI Impact
Midday (11 AM – 1 PM)        | Low (short shadows)           | Minimal: best mowing window
Morning (8 – 10 AM)          | Moderate (long but soft)      | Low: shadows are diffuse
Afternoon (2 – 4 PM)         | High (long, sharp edges)      | Significant: most false positives
Late afternoon (4 – 6 PM)    | Very high (extreme contrast)  | Severe: mower may abort the session
Overcast / cloudy            | None (diffuse light)          | None: ideal mowing conditions
Dappled shade (under trees)  | Variable (complex patterns)   | Moderate: rapid light/dark transitions confuse the AI
Full shade                   | None                          | Low, but camera exposure may struggle in deep shade

Workaround: Schedule Around Shadows

The most reliable fix is to schedule mowing during low-shadow periods:

  1. Primary window: 9:00 AM – 1:00 PM. Sun is high, shadows are short.
  2. Secondary window: Overcast days. Most robot mower apps support weather-aware scheduling — enable "mow on cloudy days" if available.
  3. Avoid: 2:00 PM – sunset during sunny days (worst shadow conditions).
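The "sun is high" rule can be made concrete with the standard solar-elevation approximation (declination plus hour angle). This sketch picks the hours when shadows stay short; the 40° minimum elevation is a hypothetical threshold, not a manufacturer specification:

```python
import math

def solar_elevation_deg(latitude_deg, day_of_year, solar_hour):
    """Approximate sun elevation (degrees) at a latitude, date, and local solar time."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))  # declination
    hour_angle = 15.0 * (solar_hour - 12.0)                                     # deg from solar noon
    lat, dec, ha = map(math.radians, (latitude_deg, decl, hour_angle))
    sin_elev = math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(ha)
    return math.degrees(math.asin(sin_elev))

def good_mowing_hours(latitude_deg, day_of_year, min_elevation=40.0):
    """Whole hours (solar time) when the sun is high enough that shadows stay short."""
    return [h for h in range(6, 21)
            if solar_elevation_deg(latitude_deg, day_of_year, h) >= min_elevation]

# Example: mid-June (day 172) at 48° N.
print(good_mowing_hours(48.0, 172))  # [9, 10, 11, 12, 13, 14, 15]
```

At 48° N in mid-June this yields roughly 9 AM to 3 PM solar time; note that at higher latitudes or outside midsummer, the window shrinks, which is why afternoon problems appear earlier in spring and autumn.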

Other Mitigation Strategies

  • Reduce obstacle detection sensitivity — Some apps allow you to lower the vision-AI sensitivity from "high" to "medium" or "low." This makes the mower less reactive to shadows but also less reactive to real obstacles. Use with caution if children or pets are present.
  • Keep firmware updated — Manufacturers continuously improve shadow recognition in their AI models. Always install the latest firmware (after waiting 1–2 weeks for community testing).
  • Remove shadow-casting objects near boundaries — If a specific object (garden ornament, temporary structure) casts a problematic shadow, consider relocating it.
  • Map "no-go zones" in problem areas — If one section consistently triggers false positives at a specific time, create a scheduled no-go zone for that time period (if supported by the app).
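If your app does not support time-scheduled no-go zones natively, the same effect can be approximated externally, for example with a home-automation script that pauses the mower or activates a zone during the problem hours. A minimal sketch of the time check (the 2:00–6:00 PM window is an assumption chosen to match the worst rows of the table above):

```python
from datetime import time

# Hypothetical shadow-prone window for one problem area of the lawn.
SHADOW_START = time(14, 0)   # 2:00 PM
SHADOW_END   = time(18, 0)   # 6:00 PM

def shadow_zone_active(now):
    """True while the problem area should be treated as a no-go zone."""
    return SHADOW_START <= now <= SHADOW_END

print(shadow_zone_active(time(15, 30)))  # True: skip the zone mid-afternoon
print(shadow_zone_active(time(10, 0)))   # False: the morning pass is fine
```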

LiDAR vs. Vision-AI in Shadows

LiDAR (Light Detection and Ranging) uses infrared laser pulses to measure distances. Because it does not "see" in the visible spectrum, it is unaffected by shadows. Robot mowers with LiDAR-based obstacle detection (like certain Husqvarna NERA models) do not experience the shadow problem. However, LiDAR has its own blind spots: it cannot reliably detect transparent obstacles (glass) and may miss very flat objects (a hose lying on the grass). For a deeper comparison, see our LiDAR vs. Vision-AI navigation guide.
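The reason LiDAR ignores shadows is that it measures time of flight rather than brightness: distance comes from how long a laser pulse takes to bounce back, a quantity shadows cannot alter. The conversion is just:

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance_m(round_trip_seconds):
    """Distance to a target from a laser pulse's round-trip time of flight."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after ~66.7 nanoseconds indicates a target about 10 m away.
print(round(tof_distance_m(66.7e-9), 2))  # 10.0
```

A shadowed patch of grass returns the pulse exactly as a sunlit one does, so no phantom obstacle appears.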

Will AI Solve This Eventually?

Yes, with caveats. Computer vision models improve as they are trained on more edge-case data. Tesla's Autopilot, which faces similar shadow-on-road challenges, has improved significantly over multiple years of data collection. Robot mower manufacturers with large installed bases (Husqvarna, Mammotion) are actively collecting vision data from real-world deployments to retrain their models.

Expect meaningful improvements over 2–3 firmware update cycles. But the fundamental physics of camera-based perception — reliance on reflected visible light — means extreme lighting conditions will always be harder for cameras than for active sensors like LiDAR or ultrasonic.

Frequently Asked Questions

Why does the Mammotion Luba 2 stop or return to dock on sunny afternoons?

The Luba 2 uses a combination of RTK-GPS and vision-AI (camera) for navigation and obstacle detection. In the afternoon (roughly 2:00–5:00 PM depending on latitude and season), the sun is low enough to cast long, sharp-edged shadows from fences, trees, and buildings. The vision system may interpret these high-contrast shadow edges as physical obstacles, causing the mower to stop, reroute, or return to dock prematurely.

Do all camera-based robot mowers have the shadow problem?

To varying degrees, yes. Any robot mower that uses a camera for obstacle detection — Mammotion Luba 2, Ecovacs Goat G1, Dreame Roboticmower A1 — can be affected by extreme lighting conditions. LiDAR-based systems (like some Husqvarna models) are less affected by visual contrast but have their own limitations (transparent objects, very low-profile obstacles).

Can I work around the shadow problem with scheduling?

Yes — scheduling is the most practical workaround. Set mowing hours to avoid the worst shadow periods: mow in the morning (8:00 AM – 12:00 PM) when shadows are shorter and less defined, or during overcast conditions when shadows are soft. Avoid scheduling during the 2–3 hours before sunset, when shadow contrast is highest.

Will firmware updates eventually fix the shadow problem?

Likely, over time. Vision-AI systems improve through machine learning — as manufacturers collect more training data of shadow conditions, the obstacle detection models get better at distinguishing shadows from real obstacles. Mammotion has already improved shadow handling in several firmware updates since the Luba 2 launch. The problem will diminish but may never be fully eliminated in challenging lighting.

Do shadows affect RTK-GPS positioning?

No. RTK-GPS positioning is based on satellite signals, not visual information. Shadows do not affect GPS accuracy. The issue arises specifically with the vision-AI component used for obstacle detection — not the positioning system. On a lawn with no obstacles in the shadow zone, the Luba 2's RTK positioning works fine regardless of lighting.