As a company grows, so does its product catalog. A Product Information Management (PIM) system is a smart way to organize and consolidate product data, designed to be your company's single source of truth for all product information.
Aware partners with inRiver, one of the premier PIM providers. inRiver PIM enables product marketers to collaborate on creating perfect product information and provides easy-to-use tools for controlling every stage of the PIM process. With inRiver's tools you will see faster time-to-market for all your products while maintaining higher product information quality. This article details some factors you'll want to consider before beginning your inRiver PIM implementation.
Speak the Same Language
Each organization uses specific terminology to define its product structure, and inRiver likewise ships with a base model that carries its own set of names. Since data modeling relies heavily on data mapping, it is important to begin any project by agreeing on common terminology so that everyone is comparing apples to apples. inRiver's base model uses three terms to describe its core marketing model:
- Product – Products include information shared by a number of individual items and act as category placeholders.
- Item – This is the lowest level in the PIM and typically defines an individual unit identified by a unique ID, SKU, or UPC. One or more items belong to a product.
- Resource – Resources include any supporting attributes and usually consist of images, manuals, videos, etc. They exist at both the product and item level.
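The three terms above can be pictured as a simple hierarchy. The sketch below is purely illustrative; inRiver stores entities in its own configurable schema, and the class and field names here are assumptions for discussion, not the product's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class Resource:
    # Supporting asset attached at the product or item level
    kind: str  # e.g. "image", "manual", "video"
    url: str

@dataclass
class Item:
    # Lowest level in the PIM: one sellable unit with a unique identifier
    sku: str
    name: str
    resources: list = field(default_factory=list)

@dataclass
class Product:
    # Category placeholder holding information shared by its items
    name: str
    items: list = field(default_factory=list)
    resources: list = field(default_factory=list)

# One or more items belong to a product; resources attach at either level
boot = Product(name="Hiking Boot")
boot.items.append(Item(sku="HB-1001-BRN-42", name="Hiking Boot, Brown, 42"))
boot.resources.append(Resource(kind="image", url="https://example.com/boot.jpg"))
```

Agreeing on which of your internal terms maps to "product" versus "item" before modeling begins avoids painful remapping later.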
How Well Do You Know Your Data?
A great way to get a PIM implementation off to a sound start is to have a good understanding of your data. As companies grow and systems evolve, it's not unusual to find some information stored in a manner that, although not ideal, got the job done. It is helpful to identify any data inconsistencies early and to accept that corrections may need to be made in order to develop a sound data model in the PIM. Maybe it's time to convert all those string fields that only contain numeric data to integers. Recognize that preparing for a PIM implementation may require a bit of house cleaning.
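That house cleaning can often be automated. A minimal sketch of the string-to-integer conversion mentioned above, assuming a hypothetical legacy export with a `weight_g` column (the field name is illustrative): values that are purely numeric are converted, and anything else is flagged for review rather than silently loaded.

```python
def clean_numeric_field(value):
    """Convert string fields that only contain numeric data to integers.

    Returns (cleaned_value, ok). Non-numeric values are returned
    unchanged and flagged so they can be reviewed before the PIM
    data model is finalized. Hypothetical helper, not an inRiver API.
    """
    text = str(value).strip()
    if text.isdigit():
        return int(text), True
    return value, False

# Sweep a legacy export and surface inconsistencies up front
legacy_rows = [{"weight_g": "1200"}, {"weight_g": "1.2 kg"}, {"weight_g": " 850 "}]
issues = []
for row in legacy_rows:
    cleaned, ok = clean_numeric_field(row["weight_g"])
    if ok:
        row["weight_g"] = cleaned
    else:
        issues.append(row["weight_g"])
```

Running a sweep like this before modeling begins tells you how much cleanup the legacy data actually needs.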
Where Does Your Data Live Today?
The reason for implementing a PIM is to create a single source of truth for all your product information. Prior to implementation it is common to see aspects of product data residing in several systems. Once all these systems have been identified it is possible the PIM may replace one or more of them. Of the systems that remain (ERP, PLM, DAM), one or more inbound integration points may be required. Developing and sharing your data roadmap will help clarify the work that needs to be done.
Supplying the PIM with Data
What are your expectations for importing information into the PIM? At the highest level, data import is either manual or scripted. Although scripted is the option most clients prefer, the development effort is usually client-specific and affects scope and budget. A scripted import may only be a one-time process, and overall complexity depends largely on where the data is coming from and how easily it can be extracted and mapped. Depending on the size of your product catalog, a manual entry strategy may prove to be the most cost-effective option.
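The extract-and-map step is where most of the scripted-import effort goes. A sketch under stated assumptions: the source column names (`ItemNo`, `Descr`, `Brand`) and target field names are hypothetical, and an actual inRiver load would go through its own import tools or API rather than this code.

```python
import csv
import io

# Map legacy export columns onto the PIM's field names
FIELD_MAP = {"ItemNo": "sku", "Descr": "name", "Brand": "brand"}

def map_rows(csv_text, field_map=FIELD_MAP):
    """Read a legacy CSV export and remap each row's columns."""
    reader = csv.DictReader(io.StringIO(csv_text))
    mapped = []
    for row in reader:
        mapped.append({pim: row[src].strip() for src, pim in field_map.items()})
    return mapped

source = "ItemNo,Descr,Brand\n1001,Hiking Boot, Trailhead\n"
records = map_rows(source)
```

How hard this step is in practice depends on how cleanly the source system can be exported, which is why the source of the data drives so much of the scripting estimate.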
Enriching PIM Data
A blessing of the inRiver PIM is that it is a highly configurable product. It ships with a base configuration, but many options are available to deliver a setup tailored to your organization's needs. Below are a few things to consider when configuring the PIM.
- Field Level Data – What fields are mandatory or require validation? Are fields grouped into logical categories?
- Cross Sell/Upsell – What types of relationships exist across products?
- Completeness – Do rules need to be applied so that products or items are not considered complete until specific conditions have been met?
- Translation – Are your products represented in more than one language?
- Listeners – Do internal or external systematic actions need to be taken once defined events are triggered?
- Roles – What type of access levels are required to support the business?
- Specifications – Do your products include specifications? Are they standardized or vary by product type?
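Several of the considerations above combine in a completeness rule. inRiver has its own completeness configuration; the sketch below only illustrates the kind of conditions involved, with made-up field names, a hypothetical required-language pair, and an image rule chosen for this example.

```python
REQUIRED_FIELDS = ["sku", "name", "description"]

def completeness(item, required=REQUIRED_FIELDS, languages=("en", "de")):
    """Return (score, complete) for an item dict against example rules."""
    checks = {f: bool(item.get(f)) for f in required}
    # Translation rule: the name must exist in every supported language
    translations = item.get("translations", {})
    checks["translated"] = all(lang in translations for lang in languages)
    # Resource rule: at least one image must be attached
    checks["has_image"] = any(
        r.get("kind") == "image" for r in item.get("resources", [])
    )
    score = sum(checks.values()) / len(checks)
    return score, all(checks.values())

item = {
    "sku": "HB-1001",
    "name": "Hiking Boot",
    "description": "Waterproof leather boot",
    "translations": {"en": "Hiking Boot", "de": "Wanderstiefel"},
    "resources": [{"kind": "image", "url": "boot.jpg"}],
}
score, complete = completeness(item)
```

Defining these rules early gives the business a shared, measurable definition of "done" for every product.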
Publishing PIM Data
The core purpose of the PIM is to provide consistent data to all your external systems from a single source of truth. Since each external system is different, sending information to it must be assessed individually. Publishing capabilities within inRiver allow for a great deal of flexibility when creating the data structure required to communicate with outbound systems. The level of effort required for each interface depends on a number of factors. Below are a few to consider:
- Add/Delete – What is the expectation for managing product status in each destination system? Can destination system logic handle when products are added or deleted from the PIM?
- API Options – What types of delivery options are available?
- Data Transformations – Can the destination system consume data directly out of the PIM? If transformations are required, do they need to occur on publish?
- Data Restrictions – Is there information within a product that is restricted and not intended to be publicly available?
- Publishing Rules – What publishing rules are required to meet business needs? Does information need to be sent near real-time or are there blackout periods when information cannot be sent? What level of product completeness is required to publish?
- Destination Systems – How many destination systems are currently planned for?
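The transformation and restriction questions above often meet in a single publish step. A minimal sketch, assuming an illustrative record shape: restricted fields (here `cost_price` and `supplier_id`, both invented for this example) are stripped, and one field is transformed into the format a hypothetical destination system expects. This is not inRiver's publish mechanism, just the shape of the work.

```python
RESTRICTED = {"cost_price", "supplier_id"}

def publish_payload(record, restricted=RESTRICTED):
    """Build an outbound payload: drop restricted fields, transform the rest."""
    payload = {k: v for k, v in record.items() if k not in restricted}
    # Example transformation: this destination expects price in cents
    if "price" in payload:
        payload["price_cents"] = int(round(payload.pop("price") * 100))
    return payload

record = {"sku": "HB-1001", "price": 129.99, "cost_price": 61.50, "supplier_id": "S-9"}
payload = publish_payload(record)
```

Whether transformations like this run on publish or inside the destination system is exactly the scoping question each interface needs answered.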
There are a wide variety of factors that may impact the scope and success of a PIM implementation project. As with any project, going into it with an open mind and a flexible budget is ideal. If I can leave you with one piece of advice, it is to do your homework and understand your data before engaging a PIM implementation partner. If there are data challenges to address, knowing about them up front will save budget.