
Master Data Hubs - Trillium

Hubs, Master Data Management, October 2008, Trillium

Here at Trillium Software we are seeing a few interesting trends in the market today.

There is consistent growth in all types of MDM approaches and philosophies, including registries, repositories, and transaction hubs. Each approach to managing master data has its merits. The real issue is matching the MDM approach to a company’s organizational maturity and the business value required of the data at that stage of its life. An important aspect for any company to consider when it adopts technology is whether the desired solution is flexible enough to meet immediate needs and able to scale and transform over time to meet future needs. Flexibility and connectivity to source applications are key here.

A realization is resonating throughout the market. Companies are coming to understand that a profound issue skirts the edges of discussions about which MDM approach or philosophy is best suited for them. That realization relates to the nature of the data that will populate the MDM implementation itself. MDM efforts are expensive, time-consuming, and labor-intensive. Companies want to understand what data they have to begin with, and the overall quality of the data that will be migrated to the MDM system.

We are also witnessing the birth of a new MDM driver: an awareness of “unaccounted-for” risk and the dangers it represents. Due to the current economic climate, investors and regulators are demanding greater transparency and visibility into the financial disclosure process for large companies, especially those in the financial services sector. Unaccounted-for risk has felled Wall Street titans such as Lehman Brothers and Bear Stearns. For the companies that have survived, financial institutions and insurance companies alike, there is a swelling aversion to risk that is spurring these enterprises to embrace the concept of “data hubs.”

Financial services and insurance enterprises are sizeable companies with patchworks of systems distributed all over the country. These systems are true data silos, with no automated process to reconcile and validate data related to risk exposure. So these firms employ teams of MBAs who sift through reams of system data, manually relating risk factors. The result is that differences in data standards, formats, accuracy, naming conventions, and completeness between source systems and applications require human interpretation of what the data means, hampering accurate reporting and business intelligence.
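
To make the reconciliation problem concrete, here is a purely illustrative Python sketch (every name and figure is invented) of how the same counterparty can appear under different naming conventions in two silos, and how even a simple normalization rule lets the records be matched and the exposures rolled up without an analyst eyeballing the names:

```python
# Hypothetical records: the same counterparty captured under different
# naming conventions in two silo systems (all data invented).
lending_system = {"name": "Acme Corp.", "exposure_usd": 1_200_000}
trading_system = {"name": "ACME CORPORATION", "exposure_usd": 800_000}

def normalize(record):
    """Reduce a record to a comparable key: upper-case the name, strip
    punctuation, and drop common corporate suffixes."""
    name = record["name"].upper().replace(".", "").replace(",", "")
    # Check longer suffixes first so " CORPORATION" wins over " CORP".
    for suffix in (" INCORPORATED", " CORPORATION", " CORP", " INC", " CO"):
        if name.endswith(suffix):
            name = name[: -len(suffix)]
            break
    return name.strip()

# Once both records map to the same key, exposure can be aggregated
# automatically instead of by manual interpretation.
total_exposure = 0
if normalize(lending_system) == normalize(trading_system):
    total_exposure = lending_system["exposure_usd"] + trading_system["exposure_usd"]
```

Real matching engines use far more sophisticated standardization and fuzzy matching, but the principle is the same: agree on a canonical form before comparing.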

By deploying data hubs, companies could establish a central repository to manage core master data – all information would be published to or updated from the hub. That, however, is the challenge. For a successful data hub implementation to happen reliably, all source systems must be modified to publish to and be updated from the data hub—a cumbersome, expensive, time-consuming, and daunting task.
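
The publish/update cycle behind a data hub can be sketched in a few lines. This is not any particular product’s API, just an illustrative in-memory model of the pattern: source systems publish changes into the hub, the hub merges them into a golden record, and subscribed systems receive the result:

```python
class MasterDataHub:
    """Toy model of a central master data hub (illustrative only)."""

    def __init__(self):
        self.golden = {}       # master_id -> consolidated "golden" record
        self.subscribers = []  # callbacks standing in for source systems

    def subscribe(self, callback):
        """Register a source system to be updated from the hub."""
        self.subscribers.append(callback)

    def publish(self, master_id, fields):
        """Merge a (possibly partial) update from a source system into
        the golden record, then push the result to every subscriber."""
        record = self.golden.setdefault(master_id, {})
        record.update(fields)
        for notify in self.subscribers:
            notify(master_id, dict(record))

# Example: a CRM keeps a local mirror that the hub updates.
crm_mirror = {}
hub = MasterDataHub()
hub.subscribe(lambda mid, rec: crm_mirror.__setitem__(mid, rec))

hub.publish("C-001", {"name": "Acme", "rating": "BBB"})  # from one source
hub.publish("C-001", {"rating": "BB+"})                  # update from another
```

The hard part in practice is exactly what the paragraph above notes: every real source system has to be modified to participate in this cycle.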

Rather than jump head-first into the MDM process, many companies we speak with are choosing to get their toes wet by first tackling a huge issue: data quality. Companies that strive to dig into the real details of the data driving their business and risk-rating models, key performance indicators, and decision-making seek provably correct data based on verified facts. Master Data Management systems can improve data consolidation and make viewing customer and product information easier, but if the underlying data is inconsistent, incomplete, inaccurate, or error-prone, all business decisions based on that data are at risk. Widely held beliefs that data and information are correct foster false confidence in decision-making, and the intended results of those decisions are never achieved. As a result, these companies are looking closely at the current state of their data (using best-of-breed automated profiling and investigation tools) and at ways to improve it. These strategies often begin at the project level, with a charter to scale enterprise-wide within specific time frames.
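
As a simplified, hypothetical sketch of what automated profiling reports at the column level, the following computes the kind of measurable facts such tools produce: completeness, distinct counts, and how many values fail a validity rule (the sample data and rule are invented):

```python
def profile(rows, column, is_valid=lambda v: True):
    """Compute basic quality metrics for one column: completeness,
    distinct count, and how many present values fail a validity rule."""
    values = [row.get(column) for row in rows]
    present = [v for v in values if v not in (None, "")]
    return {
        "completeness": len(present) / len(values),
        "distinct": len(set(present)),
        "invalid": sum(1 for v in present if not is_valid(v)),
    }

# Invented sample: three customer records with a patchy ZIP column.
customers = [
    {"name": "A. Smith", "zip": "02139"},
    {"name": "B. Jones", "zip": ""},
    {"name": "C. Lee",   "zip": "2139"},
]
zip_report = profile(customers, "zip",
                     is_valid=lambda v: len(v) == 5 and v.isdigit())
```

Commercial profiling tools add pattern analysis, cross-column dependencies, and trend tracking, but the spirit is the same: hard numbers about the data before it is migrated.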

This ties into what we think is the biggest challenge facing companies moving forward with MDM: data governance. Implementation and integration issues can be resolved from a technical perspective, but data governance is a business strategy: a best-fit process, unique to each enterprise, that optimizes the data over its useful life. The actual data governance plan will vary from one company to another based on its maturity, individual needs, goals, and internal competencies. The demands of data governance will swiftly confront any company exploring the benefits of MDM, because the investment to create master data is so intense that companies acknowledge they need processes to improve and maintain accurate data over time. Clearly, to be successful, data governance initiatives need to involve people beyond IT, including the business users of the data and executive management. Involving users cross-functionally requires a cultural paradigm shift in many organizations.

admin @ November 24, 2008
