Consumer goods companies spend a lot of money to deploy supply chain applications that will help them operate more efficiently, improve service levels to retailers, and ultimately, deliver the ideal brand experience.
But the fact is, none of those investments deliver on their potential unless the data they’re using is accurate. Accurate data comes from well-crafted, consistent processes, executed correctly every time.
Consistent product data greases the wheels of commerce. As retail goes omni-channel, supply chains grow more complex and consumer expectations skyrocket, data about products - cases, inner packs, and consumer units - has to stay consistent for supply chains to flow smoothly and for marketing to do its work guiding customers along the path to purchase.
But that can be tough as consumer goods manufacturers work to keep up with the accelerating pace of change. Complete and up-to-date product information is difficult and expensive to compile. It’s even more onerous to maintain and distribute once it’s created, since data is generated by many different departments, often with their own systems – R&D, Regulatory, Engineering, Operations, Finance, Marketing, Packaging and Sales.
And newly created product data doesn’t stay accurate for long: Any of these departments may adjust specifications and data during development - and even after product introduction - to serve their various agendas: regulatory compliance, promotions, reformulations, and so on.
The outcomes of product data inconsistencies show up often, in many places:
- The product is still in development, but account teams need images, dimensions and ingredients data right now to support sales efforts. Six months later, packaging and ingredients data have been tweaked, but no one has circled back to those retailers to make sure they get the updates into their own systems, stalling movement to the shelf.
- The product management team heralds the first run of the newly redesigned packaging. But the shorter product height translates into smaller case sizes. When the DC staff builds the pallets, they’re a lot shorter than the load optimization software anticipated. So they end up under-filling the truck, making transportation spend less efficient.
- The distributor receives the first shipment of the new product, timed to hit stores just as the big ad campaign launches. But the bar code on the inner pack isn’t consistent with the one in the item record, causing the whole pallet to be set aside until the discrepancy can be worked out.
- Store associates carefully follow the new planograms for the shelf re-set. But the four facings of the newly repackaged item won’t fit in their allotted space. An on-the-fly decision to narrow the facings to three means the brand loses valuable exposure.
Driving Inconsistency Out of Product Data
The increase in the scale, velocity and importance of product data is outstripping manufacturers’ capacity to create, maintain and distribute it. Some retailers have even set up their own product data operations in an attempt to make up for deficits in the data they get from manufacturers, duplicating efforts and creating inconsistencies in data. Inconsistencies cause a ripple effect in supply chain and store operations, driving down revenue and negatively impacting the customer experience.
Andrew White, research VP at Gartner, described it this way in Master data! Master data! My supply chain for master data!, CSCMP Quarterly, Q2, 2013: “Supply chain performance is dependent on consistent definitions of customers, products, items, locations, and other master data objects. When data is poorly governed and inconsistent, supply chains become less competitive because more time and money is spent on managing information between systems and trading partners, and less is available for innovation. Good data leads to efficient supply chains, allowing resources to be spent on innovation rather than on coping with problems. Master data has always been necessary, but the importance of its consistency in supply chains is growing.”
Working with GS1, a group of leading consumer goods and retail companies came up with a best practice process to improve data accuracy called the Data Quality Framework. They identified five important foundational attributes of a consumer product as it is set up in their systems: GTIN, U.P.C., Brand, Net Weight, and Unit of Measure. They also created best practices, including the need to use standard and consistent processes to take measurements of production samples and communicate updates back to their trade customers through GS1’s Global Data Synchronization Network.
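Of the five foundational attributes, the GTIN and U.P.C. are the easiest to verify automatically, because GS1 identifiers carry a mod-10 check digit. A minimal sketch in Python of that check - the weighting scheme follows the GS1 General Specifications, while the function names are purely illustrative:

```python
def gtin_check_digit(body: str) -> int:
    """GS1 mod-10 check digit: weights alternate 3, 1 starting
    from the digit immediately left of the check position."""
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(body)))
    return (10 - total % 10) % 10

def is_valid_gtin(gtin: str) -> bool:
    """Validate a GTIN-8, GTIN-12 (U.P.C.), GTIN-13, or GTIN-14."""
    if not gtin.isdigit() or len(gtin) not in (8, 12, 13, 14):
        return False
    return gtin_check_digit(gtin[:-1]) == int(gtin[-1])
```

A check like this catches every single-digit typo and most adjacent-digit transpositions at data entry - far cheaper than discovering the mismatch when a pallet is set aside at the distributor.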
An effective process for capturing, maintaining and distributing product data includes:
- A centralized facility and staff, either outsourced or in-house, to avoid duplication of effort and variances in process and accuracy.
- Infrastructure and precision tools to capture the dimensions, angles, formats and views required for all uses, both operational and promotional.
- A set of refined, documented, repeatable and GS1-compliant processes. Steps include confirming dimensions and weights, capturing images and code-keying these attributes to deliver centralized views and consistent feeds to meet the needs of all constituents. Consistency is critical; processes lose their validity if there are variances in technique.
- Expertise and continuous training to keep pace with evolving requirements.
- Scale, to accommodate surges in demand.
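The dimension- and weight-confirmation step above can be sketched as a simple audit pass. This is a hypothetical illustration rather than a GS1-published procedure: the attribute names and the flat 5% tolerance are assumptions, and actual GS1 measurement tolerances differ by attribute.

```python
# Hypothetical sketch of a measurement-verification step: compare dimensions
# and weight captured from a production sample against the values on file.
TOLERANCE = 0.05  # assumed 5% allowance; real tolerances vary by attribute

def audit_record(on_file: dict, measured: dict) -> list:
    """Return the attributes whose measured values drift outside
    tolerance of the values on file."""
    issues = []
    for attr in ("depth_cm", "width_cm", "height_cm", "net_weight_g"):
        expected, actual = on_file[attr], measured[attr]
        if abs(actual - expected) > TOLERANCE * expected:
            issues.append(f"{attr}: on file {expected}, measured {actual}")
    return issues
```

Running a pass like this on every production sample - and routing any flagged attribute back through the same documented process - is what keeps the centralized record and the trade-partner feeds in step.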
Some manufacturers have attempted to create an internal department charged with all of these tasks. However, they often encounter challenges in feeding product data back into a centralized system that can support all constituencies and all uses.
Many continue to struggle. According to Aberdeen’s Reap What You Sow: Better Product Data Leads to Better Product Sales (June, 2013), “Managing product data has become one of the thorniest issues facing organizations — a high priority project, with wide-ranging implications, that is becoming more complex and costly.”
Recognizing the challenges, a growing number of consumer goods companies are outsourcing the creation, maintenance and distribution of their critical product data. Outsourcing provides access to experts who can deliver consistent, repeatable, industry-standard processes complete with multiple quality checks to ensure accuracy. These providers employ the latest technology and can supply all needed formats, attributes and images for the broad array of both supply chain and marketing functions. A third-party provider also scales quickly when volume surges, so manufacturers can be assured access to complete, accurate data at all times.
Retail is undergoing a profound paradigm shift in the way consumers shop: They expect extensive product information and quick access to the products they choose. Highly accurate product data is an essential requirement to deliver on these expectations. For a growing number of consumer goods companies, outsourcing data quality and measurement services has become the most effective path to get there.
Isabel DuPont is the Vice President Content Production at Gladson.