Industry

AI, Retail

Client

DataSentics, an Eviden business

ShelfInspector

Two smartphone screen mockups on a light grey background displaying the ShelfInspector logo and app interface.

Executive summary

ShelfInspector is a DataSentics product that analyses retail shelves from photos to check placement, pricing, out-of-stocks, and planogram compliance. My task was to design the demo app experience: clear enough for client demos, simple enough to follow, and faithful to the computer-vision results. There were real constraints: no direct access to end-users, shifting requirements, and technical outputs that weren’t easy to read. I adapted by working closely with AI engineers, developers, and product stakeholders, iterating quickly in Figma and validating through internal reviews and demo walkthroughs. The outcome is a mobile-first, demo-ready flow that turns raw detections into a usable interface.

Why it needed a demo

We had a powerful CV model, but not a clear journey. Requirements lived in Slack threads and ad-hoc notes. The results (boxes, scores, errors) were technically correct but hard to interpret. The goal was a cohesive, client-friendly demo that tells the story from capture to analysis without overwhelming people.

Designing the flow

I mapped the flow in small, understandable steps: use-case → planogram → camera → upload → analysis → results. I kept copy short, defaults sensible, and transitions clear. I iterated quickly in Figma, shared prototypes, and refined wording and states after each internal review.
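To make the flow concrete, here is a minimal TypeScript sketch of how those steps could be modelled as an ordered sequence. The identifiers are illustrative only and don’t come from the actual app.

```typescript
// Illustrative only: step names mirror the journey above;
// none of these identifiers come from the real ShelfInspector code.
type DemoStep =
  | "use-case"
  | "planogram"
  | "camera"
  | "upload"
  | "analysis"
  | "results";

const DEMO_FLOW: DemoStep[] = [
  "use-case", "planogram", "camera", "upload", "analysis", "results",
];

// Returns the next step in the demo, or null once the results are shown.
function nextStep(current: DemoStep): DemoStep | null {
  const i = DEMO_FLOW.indexOf(current);
  return i >= 0 && i < DEMO_FLOW.length - 1 ? DEMO_FLOW[i + 1] : null;
}
```

Modelling the journey as a single linear sequence kept the demo predictable: every screen had exactly one sensible next step.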

Turning CV into a readable UI

Raw detections aren’t self-explanatory. I defined a simple, well-recognized visual language to make results scannable: green for correct items, red for wrong ones, arrows for misplaced items, and concise labels for missing SKUs. I presented the Analyzed Photo next to the Reference/Planogram so people immediately understand what the model saw versus what it expected.
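As a rough illustration of that visual language, the sketch below maps each detection outcome to one on-screen treatment. The type and field names are hypothetical, not taken from the model’s real output schema.

```typescript
// Hypothetical mapping of detection outcomes to UI treatments;
// statuses and fields are illustrative, not the real model schema.
type DetectionStatus = "correct" | "wrong" | "misplaced" | "missing";

interface VisualTreatment {
  color: "green" | "red" | "grey";
  marker: "box" | "arrow" | "label";
  note: string;
}

const VISUAL_LANGUAGE: Record<DetectionStatus, VisualTreatment> = {
  correct:   { color: "green", marker: "box",   note: "Item matches the planogram" },
  wrong:     { color: "red",   marker: "box",   note: "Wrong product in this slot" },
  misplaced: { color: "red",   marker: "arrow", note: "Right product, wrong position" },
  missing:   { color: "grey",  marker: "label", note: "Expected SKU not found on shelf" },
};
```

Keeping the mapping this small meant anyone watching a demo could read a result screen without explanation.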

Adapting without end-users

We couldn’t run research with merchandisers or store staff, so I adapted: heuristic reviews, best-practice patterns, and prototype walkthroughs with AI, product, and development stakeholders.

Outcome

We delivered a polished, demo-ready app that shows the model’s value without technical overhead. The flow is mobile-first and scalable to desktop. The UI is component-based and consistent, which made handover smoother for developers.

Still a foundation

Even though I’m no longer working on the project, my work on ShelfInspector laid the groundwork for how the product could be demoed and understood. We turned complex AI outputs into a clear, mobile-first interface and created a flow stakeholders could confidently present to clients.