
STORESIGHT INTRODUCES
Share of Shelf
It's time to stop counting. Own a real-time, always-on Share of Shelf engine. Accurate, AI-powered, at scale.


From Manual to Always-On.
Say goodbye to outdated "snapshot" assessments that require costly, manual counting of facings. Storesight's Share of Shelf metric is calculated from a continuous live stream of shelf photos taken by real shoppers and scaled with AI.
No more small sample sizes, either. Get unmatched scale and coverage with our targeted 80% ACV, spanning 480 categories and 5,000 brands.

Real and Actionable.
Planogram data is sourced from real, unbiased, everyday shoppers. Filter your Share of Shelf by facings or linear space, brand or manufacturer, and more.
Get deeper analysis by viewing your share from heatmap and diamond "strikezone" lenses.
Share of Shelf Resources
FAQs
How do you actually measure space?
The base unit is the individual photo or planogram. We measure the percentage of image pixels that belong to a brand or product on the shelf, so the output is a percentage of shelf pixels rather than traditional inches, feet, or centimeters.
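As a simplified illustration of the pixel-based idea (a hypothetical sketch, not Storesight's production pipeline, whose model and labels are not published here), share of shelf for a single photo can be computed from a per-pixel brand mask: count the pixels assigned to each brand and divide by the total labeled shelf pixels.

```python
# Illustrative sketch only: pixel-based share of shelf from a per-pixel
# label mask (one brand label per shelf pixel). The mask and brand names
# below are hypothetical examples.
from collections import Counter

def share_of_shelf(label_mask, brand):
    """Percentage of labeled shelf pixels assigned to `brand`."""
    counts = Counter(label for row in label_mask for label in row)
    total = sum(counts.values())
    return 100.0 * counts[brand] / total if total else 0.0

# Toy 2x4 "mask": each cell is the brand detected at that pixel.
mask = [
    ["brand_a", "brand_a", "brand_b", "brand_b"],
    ["brand_a", "brand_b", "brand_b", "brand_b"],
]
print(share_of_shelf(mask, "brand_a"))  # 37.5
```

In practice the mask would come from an image-segmentation model run over each shelf photo; the arithmetic on top of it is this simple ratio.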
How do you define the category?
Share of Shelf is calculated within a defined category context (e.g., pasta sauce), where the metric compares a brand's presence to the total available shelf positions within that category. Categories can also be analyzed in different scopes, including product category segments (e.g., organic vs. non-organic yogurt), retailer/store location, and other reporting cuts.
Can we look at Share of Shelf by our preferred sub-category or sub-categories?
Yes. Share of Shelf can be applied at different scopes, including sub-segments within a broader category (for example, organic vs. non-organic yogurt). During onboarding, we align the category/sub-category definitions to your reporting needs and how you want to view the shelf.
What’s the diamond score and how is it calculated?
Diamond Score measures strike-zone positioning: how well products are placed in the "sweet spot" of the shelf, the central viewing area that shoppers naturally focus on. Products positioned in this central "diamond" zone score higher than those at the edges.
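To make the "diamond" shape concrete, here is a hedged, hypothetical sketch (Storesight's actual scoring formula is not published here): weight each facing by its Manhattan-style distance from the photo's center, which produces diamond-shaped contours of equal score around the middle of the shelf.

```python
# Illustrative sketch: scores a facing by how close its center sits to
# the middle of the shelf photo. The weighting scheme is a hypothetical
# example, not Storesight's published formula.

def strike_zone_score(x, y, width, height):
    """Return a score in 0..1, where 1.0 is the exact image center."""
    # Normalized distance from center on each axis (0 = center, 1 = edge).
    dx = abs(x - width / 2) / (width / 2)
    dy = abs(y - height / 2) / (height / 2)
    # Summing the axis distances gives diamond-shaped contours around
    # the center; clamp so corner placements bottom out at 0.
    return max(0.0, 1.0 - (dx + dy) / 2)

# A facing dead-center scores 1.0; a corner facing scores 0.0.
print(strike_zone_score(500, 300, 1000, 600))  # 1.0
print(strike_zone_score(0, 0, 1000, 600))      # 0.0
```

Averaging such a score over all of a brand's facings in a photo would yield a single per-brand positioning metric.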
Is this metric comparable to the Share of Shelf I may already have today with another firm?
Yes and no. Yes: It addresses the same business question: How much space do I have on the shelf? No: The method is more automated and less manual than traditional approaches. It’s built on AI-based image analysis at scale rather than purely manual measurements, and it’s designed to reflect the real shelf conditions in stores (not just planogram intent).
How do you ensure data quality?
The team continuously tests with fresh, real-world photos, and quality improves rapidly when tuning is focused on specific categories. There is a deliberate 90-day onboarding and tuning period for every new client to get their categories and reference sets into "top tier" condition. As with any real-world system, accuracy depends on consistent photo capture and robust AI that can handle variability such as lighting, obstructions, rotated products, collapsed packaging, and store-to-store differences.
What’s your accuracy rate for brand recognition and facing counts?
Our goal is 95%+ brand identification accuracy, and we’re already in that range, with continued improvements in edge cases. For detecting facings, accuracy is ~99%. It will continue to improve over time.
How often do you update for new products and packaging changes?
Packaging and assortment change frequently (roughly quarterly), so the 90-day onboarding is the initial intensive phase, followed by ongoing updates. The system is designed to adapt to a dynamic marketplace where packaging refreshes and new products are introduced frequently.



