(This post is part one of a three part series.)
Many PIM professionals will tell you the quickest way to solve your product data quality issues is to implement a Product Information Management (PIM) tool. This seems like sage advice, as PIM plays a critical role in aggregating and disseminating product data, both to internal uses like websites and print and to external destinations like retailers and search engines. These tools are the linchpin for moving data from your manufacturing systems to your customer-facing systems.
However, that is only half the advice these professionals should be giving you. Installing another data tool in a web of data tools will only compound your existing product data issues. The same way you cannot cure the flu by getting the flu, you cannot fix product data quality issues with another tool. Fixing product data quality issues requires a strategy. The quickest way to make your product data quality issues worse is to try implementing a PIM tool without a strategy.
Your Product Data Strategy
A product data strategy is more than a goalpost or a checkbox to say you’ve achieved good product data. Normalizing your product data aggregation process involves many pieces that remain in play throughout the lifecycle of your product data ecosystem. These pieces must be nurtured to grow your data into an asset your company can leverage instead of an expense on a budget line to manage.
Here are a few of the things you need to get started:
- An Enterprise Product Data Dictionary
- An Enterprise Product Data Flow Diagram
- An Enterprise Workflow Diagram
- An Accounting of your Existing Product Data Ecosystem
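To make the first item on that list concrete, here is a minimal sketch of what a product data dictionary might capture for each attribute: a definition, the single source-of-record system, and the destination systems that consume it. The attribute and system names ("ERP", "PLM", "ECOM") are illustrative assumptions, not prescriptions from any particular PIM vendor.

```python
# Hypothetical product data dictionary sketch.
# System names ("ERP", "PIM", "ECOM") are illustrative assumptions.
data_dictionary = {
    "net_weight": {
        "definition": "Shipping weight of the sellable unit, in kilograms",
        "source_system": "ERP",
        "destination_systems": ["PIM", "ECOM"],
    },
    "marketing_copy": {
        "definition": "Customer-facing long description",
        "source_system": "PIM",
        "destination_systems": ["ECOM"],
    },
}

def attributes_sourced_from(system):
    """List every attribute a given system is the source of record for."""
    return [name for name, entry in data_dictionary.items()
            if entry["source_system"] == system]

print(attributes_sourced_from("ERP"))  # → ['net_weight']
```

Even a simple structure like this makes the later questions ("where is this data coming from and going to?") answerable before a PIM implementation begins.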
The importance of this type of documentation cannot be overstated for one simple reason: implementing a PIM tool will expose every other bad product data collection and storage practice in and between every system that touches this data throughout your entire ecosystem. Looking at a PIM implementation in a vacuum is dangerous, as aggregating all your bad product data practices into a single nexus makes that bad data your book of record. If you needed a PIM tool to clean up your product data in the first place, shouldn’t you know where your data is coming from and going to before you start?
Understanding your Product Data Ecosystem
The first step in any PIM implementation is the discovery sessions that map out your data inputs and outputs. The most common thread in these sessions is a nearly universal acknowledgment among everyone involved that the product data they create today has problems. The most common complaints are that systems are poor or that other teams do not take data quality seriously. In most discovery sessions these turn out to be the least of the issues causing data quality problems.
The biggest cause of product data issues is, unsurprisingly, the lack of a marketing tool like a PIM to store consistent data. Users resorting to passing spreadsheets by email is a point of failure in data quality that cannot be overstated. The second leading cause of poor product data quality, however, is bad integrations between systems. These integrations are where band-aids start being applied to product data. When data stops flowing between systems, another band-aid is applied to restart the flow with no understanding of the root cause in the source or destination system.
This may seem far-fetched, but it happens on a regular basis. The implications are enormous: source of record systems have their data manipulated in transit to feed a book of record system, invalidating the status of the source system as a source of record. The real source is now the integration that performed the transformation, which has big implications for the validity and timeliness of the data being transferred.
The other common failing in enterprise ecosystems occurs when the same data point has multiple sources. It often happens that a specific source is only valid for a specific time frame, especially when preparing data for 1WorldSync data feeds. However, feeding the same data point in a book of record system from several different systems is a glaring red flag for a product data issue. A single system should serve as the source of record for a single attribute in a destination system, or your business will continually fight product data quality issues.
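The multiple-sources red flag described above is easy to detect once integrations are documented. The sketch below assumes a hypothetical list of feeds, each recorded as a (source system, attribute, destination system) tuple, and flags any attribute that arrives at one destination from more than one source; the system and attribute names are invented for illustration.

```python
from collections import defaultdict

# Hypothetical integration inventory: (source_system, attribute, destination_system).
feeds = [
    ("ERP", "net_weight", "PIM"),
    ("PLM", "net_weight", "PIM"),   # second source for the same attribute: red flag
    ("PIM", "marketing_copy", "ECOM"),
]

def find_multi_source_attributes(feeds):
    """Map (attribute, destination) to its sorted sources, keeping only conflicts."""
    sources = defaultdict(set)
    for src, attr, dest in feeds:
        sources[(attr, dest)].add(src)
    return {key: sorted(srcs) for key, srcs in sources.items() if len(srcs) > 1}

print(find_multi_source_attributes(feeds))
# → {('net_weight', 'PIM'): ['ERP', 'PLM']}
```

Running a check like this against your accounting of the existing ecosystem turns the "one source of record per attribute" rule from a policy statement into something you can verify.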
Lastly, human apathy cannot be overstated as a cause of product data quality issues. Data entry resources will put as little effort as possible into any data collection task, doing nothing more than they are told is required to complete it. This apathy is systemic and cultural: nobody decides to be apathetic about collecting good product data. They do so because they are told to move faster within systems that do not enforce good product data practices. No tool will fix this on its own, although a good workflow-driven system is essential to solving the overall issue.
Once your company understands its product data ecosystem, it is much easier to adapt to the change a PIM implementation requires. An accurate data dictionary that properly defines each attribute, its source system, and any destination systems for that attribute allows you to plan a better data flow. Knowing the timing of that data flow and arranging systems in a linear flow rather than a web also makes managing your product data quality within a PIM tool easier.
Click here to read Product Data Quality Issues Cannot be Solved by a Tool - PART 2.