As a business intelligence consulting company, we pride ourselves on delivering on our projects as well as providing high-quality content to our readers.

Webinar Q&A: The Evolution of Operational MDM

Posted by Bryson Dunn on Oct 9, 2018 1:37:51 PM

Thank you to Bryson Dunn for presenting on the Evolution of Operational MDM! In case you missed the live presentation, you can view the replay here, and read on for the webinar Q&A:

Q: Can you give insights around survivorship challenges encountered?

A: At Datasource we believe it is beneficial to think of survivorship design in two contexts: at the attribute level (for example, selecting the best first name) and at the entity level (for example, selecting which record survives in a downstream application). Good survivorship rules align with the way the business needs to consume and manage master data and with how data flows between applications. Survivorship decisions can significantly impact business operations and should be carefully considered – extra attention and thought should be given to scenarios where values and/or records “lose”. Incorrect survivorship decisions can be interpreted by the business as data loss or corruption, which can be incredibly detrimental to the perceived trustworthiness of the MDM solution. This directly impacts business operations and can jeopardize the long-term success and adoption of MDM in an organization.
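To make the attribute-level case concrete, here is a minimal sketch of a "best first name" survivorship rule. The source names, trust priorities, and field names are hypothetical, purely for illustration – real rules should come from the business as described above:

```python
from datetime import date

# Hypothetical source trust ranking (higher = more trusted); these source
# names and priorities are illustrative, not a recommendation.
SOURCE_PRIORITY = {"crm": 3, "erp": 2, "web_signup": 1}

def survive_attribute(records, attribute):
    """Attribute-level survivorship: prefer the most trusted source,
    breaking ties with the most recently updated record."""
    candidates = [r for r in records if r.get(attribute)]
    if not candidates:
        return None
    best = max(
        candidates,
        key=lambda r: (SOURCE_PRIORITY.get(r["source"], 0), r["updated"]),
    )
    return best[attribute]

records = [
    {"source": "web_signup", "first_name": "Bob", "updated": date(2018, 9, 1)},
    {"source": "crm", "first_name": "Robert", "updated": date(2018, 3, 15)},
    {"source": "erp", "first_name": "Rob", "updated": date(2018, 8, 2)},
]
print(survive_attribute(records, "first_name"))  # -> Robert (CRM wins on trust)
```

Note how the rule makes the "losers" explicit: "Bob" and "Rob" are discarded, which is exactly the kind of outcome that needs to be socialized with the business before go-live.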

Q: Is a separate database system recommended as a best practice for master data, rather than keeping it as part of an operational system?

A: Yes. Some organizations attempt to master their data in monolithic applications, but this approach is inflexible and limited in a number of ways. We typically recommend an architecture where a standalone MDM data store (“hub”) is created for the purposes of consolidating data across siloed/disparate applications, supporting data stewardship, and acting as a data broker to downstream applications. Some refer to this as a coexistence or centralized MDM architecture. There are a number of MDM solutions in the marketplace that support this, and it is also possible to build your own.
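At its core, the hub consolidates records from siloed applications while retaining cross-references back to each source system. The sketch below shows that idea with a deliberately naive match rule (normalized email); the source names, IDs, and match logic are all assumptions for illustration:

```python
from collections import defaultdict

def consolidate(source_records):
    """Hub-style consolidation sketch: group records from disparate
    applications by a match key and keep the cross-reference back to
    each contributing source record."""
    groups = defaultdict(list)
    for rec in source_records:
        # Naive match rule for illustration; real hubs use far richer
        # matching (fuzzy names, addresses, configurable rules, etc.).
        key = rec["email"].strip().lower()
        groups[key].append(rec)
    return [
        {
            "match_key": key,
            "source_ids": [(m["source"], m["id"]) for m in members],
        }
        for key, members in groups.items()
    ]

records = [
    {"source": "crm", "id": "C-1", "email": "Jane@Example.com"},
    {"source": "erp", "id": "E-9", "email": "jane@example.com "},
]
hub = consolidate(records)
print(len(hub))  # -> 1 golden record spanning both source entries
```

The cross-reference (`source_ids`) is what lets the hub act as a broker: downstream applications can be told which source records roll up to each mastered entity.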

Q: Can you recommend any good books that discuss the topic of Master Data Management?

A: Yes! Dalton Servo, a respected and well-known MDM practitioner and former Datasource consultant, co-authored the fantastic read “Multi-Domain Master Data Management: Advanced MDM and Data Governance in Practice”.

Q: In your experience, what is the most effective way to control data flow?

A: There are a number of approaches, and as with most (if not all) MDM design decisions, business process needs to inform the decision in your environment. This is especially true when data created or managed in one application needs to be synchronized with another application to support various capabilities and processes (for example, customer service and order to cash). A few common patterns include limiting data flow to active records only, to records of a particular type, or by applying other filters based on the data being mastered. Another interesting approach is to skip a large initial data load and trickle-feed the MDM solution. This eliminates the need to process a large backlog of data steward tasks during go-live, but there are a number of limitations to this approach that require careful consideration.
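An outbound filter of the kind described above can be as simple as a predicate applied before publishing. The status and type values below are hypothetical examples, not a prescribed model:

```python
def should_publish(record):
    """Illustrative outbound filter: only synchronize active customer
    records to downstream applications."""
    return record.get("status") == "ACTIVE" and record.get("type") == "CUSTOMER"

batch = [
    {"id": 1, "status": "ACTIVE", "type": "CUSTOMER"},
    {"id": 2, "status": "INACTIVE", "type": "CUSTOMER"},
    {"id": 3, "status": "ACTIVE", "type": "PROSPECT"},
]
outbound = [r for r in batch if should_publish(r)]
print([r["id"] for r in outbound])  # -> [1]
```

Keeping the rule in one well-named predicate makes it easy to adjust as the business refines which records each downstream application should receive.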

Q: We are considering developing a reusable Master Data view. Do you have any recommendations on how to approach this?

A: This is a very common and extremely valuable capability of an MDM solution. As discussed in the webinar, it is important to consider the inclusion of transactional and analytical data that is typically not stored or managed in an MDM solution. Master data forms the foundation of these views, and providing insight into key metrics, relationships, and interactions associated with master entities can really take this pattern to the next level. On the technical side, consider developing embeddable UI components (from scratch, or perhaps using a reporting tool) in conjunction with a canonical web services layer.
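The pattern of layering transactional metrics on top of master data can be sketched as a simple payload builder behind the canonical services layer. The field names and metrics here are hypothetical, chosen only to show the shape of such a view:

```python
def build_master_view(master, transactions):
    """Hypothetical "customer 360" payload: master attributes form the
    foundation, enriched with summary metrics from transactional data
    that typically lives outside the MDM hub."""
    customer_orders = [t for t in transactions if t["customer_id"] == master["id"]]
    return {
        "id": master["id"],
        "name": master["name"],
        "order_count": len(customer_orders),
        "lifetime_value": sum(t["amount"] for t in customer_orders),
    }

master = {"id": "M-100", "name": "Jane Smith"}
transactions = [
    {"customer_id": "M-100", "amount": 250.0},
    {"customer_id": "M-100", "amount": 75.0},
    {"customer_id": "M-200", "amount": 40.0},
]
view = build_master_view(master, transactions)
print(view["order_count"], view["lifetime_value"])  # -> 2 325.0
```

In practice the transactional inputs would come from the relevant operational or analytical stores rather than an in-memory list, but the assembly step is the same: master data anchors the view, and everything else is joined to it.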


Topics: Master Data Management

Written by Bryson Dunn