
Tuesday, November 24, 2009

Master Data Management: Building a Foundation for Success


Evan Levy wrote a nice white paper, provided by Informatica, called Master Data Management: Building a Foundation for Success (free registration required to download the white paper). In it, he discusses the challenges of implementing master data management. He notes that in his book Customer Data Integration: Reaching a Single Version of the Truth, co-written with Jill Dyché, his partner and co-founder of Baseline Consulting, MDM is defined as: “The set of disciplines and methods to ensure the currency, meaning, and quality of a company’s reference data that is shared across various systems and organizations.” Below is a summary of the white paper:

Depending on the maturity and size of an IT organization, there may be a sizable collection of infrastructure services associated with managing data and systems. In his experience working with firms on new MDM programs, there are typically five technical functions that are core to MDM processing. Frequently, these capabilities are already part of the IT infrastructure. The five functions are:

1. Data cleansing and correction

Data cleansing is fairly common within most IT organizations, particularly when data is highly prone to data entry errors. Most data quality environments have been set up to focus on the cleansing and correction of well-understood data values. Customer name and address values are frequently the starting point for many companies seeking to clean up their “bad data.”

2. Metadata

Metadata isn’t limited to identifying the individual data element names and their definitions. It also identifies the data values and the origin of that data (its lineage). But metadata isn’t MDM. Metadata focuses on the descriptive detail about data.

The most visible metadata features that are useful in an MDM environment include:
- Terminology or data element names—for instance, the data element name is “ItemColor”
- Data values—for instance, the acceptable values are “red, green, or blue”
- Value representation—for instance, CC0000, 00CC00, and 0066CC
- Lineage details—for instance, “Created: 01/12/2009. System of origin: Order System”
- Data definitions—for instance, “the color of the item”

The use of existing metadata structures can dramatically simplify other systems’ abilities to adopt and use MDM because they may already recognize and work with the existing metadata.
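The metadata features listed above can be pictured as a simple shared record. The sketch below is purely illustrative (the `MetadataRecord` structure and field names are my own invention, not from the white paper); it shows how a downstream system could validate values against shared metadata for the “ItemColor” example:

```python
# Illustrative sketch (not from the white paper): a minimal metadata
# record covering the features listed above for the "ItemColor" element.
from dataclasses import dataclass


@dataclass
class MetadataRecord:
    element_name: str       # terminology / data element name
    definition: str         # data definition
    allowed_values: list    # acceptable data values
    representations: dict   # value -> stored representation
    lineage: str            # lineage details


item_color = MetadataRecord(
    element_name="ItemColor",
    definition="the color of the item",
    allowed_values=["red", "green", "blue"],
    representations={"red": "CC0000", "green": "00CC00", "blue": "0066CC"},
    lineage="Created: 01/12/2009. System of origin: Order System",
)


def is_valid(record: MetadataRecord, value: str) -> bool:
    """A consuming system checks a value against the shared metadata."""
    return value in record.allowed_values


print(is_valid(item_color, "red"))     # a value the metadata accepts
print(is_valid(item_color, "purple"))  # a value it rejects
```

Because systems that already recognize this metadata can reuse the same record, adopting MDM becomes a matter of pointing them at a shared definition rather than redefining “ItemColor” in every application.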

3. Security and access services

One of the lesser-known functions within MDM is the need to manage access to individual reference data elements. MDM systems typically manipulate data using CRUD processing—to create, read, update, and delete—so centralizing the permissions to access and change key data is an important part of maintaining master data. MDM can support a very detailed or granular level of security access to the individual reference data elements.

Because of the granular level of data access that MDM affords—CRUD processing against individual reference values—an MDM system should avoid having its own siloed proprietary security solution. Instead it needs to interface to existing security services that an IT department relies on to manage application access.
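The point about avoiding a siloed security solution can be sketched in code. In this hypothetical example (the `SecurityService` interface and all names are assumptions, not a real product API), the MDM layer delegates its per-element CRUD permission checks to an existing enterprise security service:

```python
# Hypothetical sketch: an MDM store that delegates granular CRUD
# permission checks to an existing enterprise security service instead
# of maintaining its own siloed permission store.

class SecurityService:
    """Stand-in for an existing enterprise access-control service."""

    def __init__(self, grants):
        # grants: {(user, element): set of allowed CRUD operations}
        self._grants = grants

    def is_allowed(self, user, element, operation):
        return operation in self._grants.get((user, element), set())


class MasterDataStore:
    def __init__(self, security):
        self._security = security
        self._data = {}

    def update(self, user, element, value):
        # The permission check is per reference data element (granular CRUD),
        # but the decision itself comes from the shared security service.
        if not self._security.is_allowed(user, element, "update"):
            raise PermissionError(f"{user} may not update {element}")
        self._data[element] = value


security = SecurityService({("alice", "ItemColor"): {"read", "update"}})
store = MasterDataStore(security)
store.update("alice", "ItemColor", "red")     # allowed by the service
# store.update("bob", "ItemColor", "green")   # would raise PermissionError
```

The design choice here mirrors the white paper’s advice: the MDM system owns the data, but the IT department’s existing security services own the access decisions.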

4. Data migration

Data migration technologies alleviate the need to develop specialized code to extract and transfer data between different systems. Companies have invested heavily in ETL tools and application messaging technologies such as enterprise service bus (ESB) because of the need to move large volumes of data between their various application systems. The challenge in moving data between systems exists because the data is rarely stored in a simple, intuitive structure.

Regardless of the specific tools and technologies used in transporting data into or out of an MDM system, there are two basic ways of moving data: large volume bulk data (loads or extracts) and individual transactions. To load the MDM server initially with data from an application system requires bulk data loading capabilities. ETL tools are very well equipped to handle the application data extraction and MDM system loading.
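The two movement styles can be contrasted with a minimal sketch (function and variable names are invented for illustration; a real implementation would sit on top of an ETL tool or messaging layer):

```python
# Illustrative sketch of the two basic ways of moving data into an MDM
# store: a bulk load for the initial population (the job an ETL tool
# handles) and a per-transaction update for ongoing synchronization.

def bulk_load(mdm_store: dict, records: list) -> int:
    """Initial load: insert many extracted records in one pass."""
    for key, value in records:
        mdm_store[key] = value
    return len(records)


def apply_transaction(mdm_store: dict, key: str, value: str) -> None:
    """Ongoing change: apply one individual transaction as it arrives."""
    mdm_store[key] = value


store = {}
loaded = bulk_load(store, [("SKU-1", "red"), ("SKU-2", "green")])
apply_transaction(store, "SKU-1", "blue")  # a later single-record change
```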

5. Identity resolution

Most MDM solutions also benefit from the ability to uniquely identify and manage individual reference details. For simple subject areas (like color or size), managing and tracking the various reference values is very straightforward. This simplicity comes from the fact that the values are easy to differentiate; there’s not much logic involved in determining whether the value matches “red,” “green,” or “blue.” Complexity occurs when the subject being mastered has more variables, and variables that can be vague—such as a person. For these complex subject areas, mastering the reference value requires more sophisticated analysis of the numerous attributes associated with the individual reference value, such as name or address.
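The contrast between exact matching on simple values and fuzzy matching on vague attributes can be sketched as follows. This is a toy illustration only: `difflib` is a stand-in for the far more sophisticated matching engines real MDM products use, and the threshold and record fields are assumptions:

```python
# Illustrative sketch of identity resolution for a complex subject area
# (a person): "red"/"green" can be matched exactly, but names and
# addresses need fuzzy comparison across several vague attributes.
from difflib import SequenceMatcher


def similarity(a: str, b: str) -> float:
    """Rough string similarity in [0, 1] (case-insensitive)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def is_same_person(rec_a: dict, rec_b: dict, threshold: float = 0.8) -> bool:
    # Combine evidence from multiple attributes instead of one exact key.
    name_score = similarity(rec_a["name"], rec_b["name"])
    addr_score = similarity(rec_a["address"], rec_b["address"])
    return (name_score + addr_score) / 2 >= threshold


a = {"name": "John A. Smith", "address": "12 Main Street, Springfield"}
b = {"name": "Jon Smith", "address": "12 Main St, Springfield"}
print(is_same_person(a, b))
```

Even this toy version shows why identity resolution is harder than mastering colors: two records can refer to the same person while agreeing on no single attribute exactly.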

The Inventory for an MDM Foundation

Here are some techniques for you to use to assess how to move forward on your own MDM development effort.

1. Data Cleansing and Correction
- Identify data cleansing tools that are already implemented within your company on other projects; pay special attention to applications that are likely to link to the MDM system
- Determine the data cleansing rules for the master data subject area and whether they have been documented
- Establish how you will interface your MDM system to the data cleansing tool (e.g., Web services or APIs)
- Contact the product vendor to determine which MDM product(s) it may already support

2. Metadata
- Review existing data warehouse metadata to discover whether there is metadata content that may apply to the master data subject area in question
- Determine whether there are metadata standards in place that apply to the given master data
- Find out if any of the key application systems use existing metadata standards
- If your company has a data management team, see if there is already a sanctioned process for identifying and developing new metadata content

3. Security and Access Services
- Identify the level of security requirements that need to apply to core master data
- Talk to the application architecture group to determine if there are application-level security standards in place
- Investigate whether your company has already defined a set of security and access Web services that may be leveraged for MDM

4. Data Migration
- Identify the bulk data migration standards that are in place—for example, how is Informatica® technology used in implementations other than data warehouse or BI implementations?
- Determine the current mechanism for application-to-application connectivity: Is it EAI? ESB? Point to point?
- Clarify which application systems can accept reconciled and corrected data
- Determine which legacy systems can only support bulk data extract versus transaction access

5. Identity Resolution
- Understand if your company already has identity resolution software in place to identify the existence of multiple customers, products, or other subject area items; this type of capability is most likely to exist in a customer-facing financial system
- Investigate whether the existing identity resolution system is configurable with new or custom rules
- Determine how the identity resolution technology interfaces to other systems (e.g., embedded code? API? Web services?)

6. MDM Functional Requirements
- Identify a handful of high-profile systems that depend on the timely sharing and synchronization of reference data; if you have a BI environment, we recommend looking at the predominant source systems as well as the most-used data marts
- Decide which reference data is shared and synchronized between multiple systems; the need for operational integration typically reflects higher business value
- Determine the individual system’s functional requirements for its reference data (e.g., create/read/update or data quality/correction)
- Categorize the individual data elements by subject area to identify the specific subject areas requiring mastering. It’s only practical to implement MDM one subject area at a time.

His conclusion: "When it comes to the question of build versus buy, for MDM the answer is as complex as your company’s IT infrastructure, functional requirements, and business needs. The considerations are multifaceted and require an intimate knowledge of what you can do today versus what you’ll need to accomplish tomorrow. When it comes to integrating master data, one thing is clear: The first step to ROI is avoiding unnecessary expenses. Investigate and use the capabilities that already exist within your IT infrastructure, enabling you to make an informed decision on launching MDM the right way."

Evan Levy is an expert in MDM, and this white paper gives a good explanation of how to build an MDM foundation. For those interested in the subject, he and Jill Dyché wrote a good book called Customer Data Integration: Reaching a Single Version of the Truth.

Tuesday, December 9, 2008

BI: Is One Version of the Truth Still Out There?


I read an article called BI: Is One Version of the Truth Still Out There?, in CRM Buyer, written by David Hatch, vice president and principal analyst of Aberdeen Group's business intelligence practice. He talks about a common problem in organizations: they spend months and endure significant costs to obtain the reporting and analysis capabilities that BI (business intelligence) technology promises, only to find that different "versions of the truth" still exist, with no definitive way of determining which one is real or accurate.

He answered "Yes" to the question "Is 'One Version of the Truth' achievable?", based on research from Aberdeen Group.

Managers are questioning their level of trust in corporate data: Do the reports, charts and analytic tools in use today represent "the truth?" During August and September 2008, Aberdeen Group surveyed over 200 professionals from 155 companies and interviewed a diverse range of senior executives and operational management professionals working in different industries and geographies. Aberdeen published the research study, called One Version of the Truth 2.0: Are Your Decisions Based on Reality?.

He said that the majority of problems arise at the data source and integration levels, which explains why master data management (MDM) and data warehousing technologies and services are at the forefront of technology investment today, and also emphasized the importance of "data stewardship."

He recommended some actions:
- Start with end-user information requirements
- Build a working group or committee
- Focus on understanding data relationships to the end application
- Establish a formal data stewardship role
- Apply integration techniques to all data types from internal and external sources
- Security is not a job
- Develop and manage milestones.

In my opinion, the main challenge for companies in obtaining a single version of the truth is to implement data management well, which includes the concepts of data governance, data integration, data quality, and master data management.

Thursday, December 4, 2008

Enterprise Information Management


Lyndsay Wise published two articles last month about Enterprise Information Management in Dashboard Insight. In the first article, she provides an overview of information management, and in the second article, she looks at the Business Objects/SAP view of EIM.

She said that whether it is called enterprise information management, data management, or information management, the general understanding is that managing information across the organization includes the concepts of data governance, data integration, data quality, and master data management.

The Basics Of Information Management


According to her, Enterprise Information Management (EIM) is gaining momentum. Organizations are hard pressed to find ways to adequately manage their data without affecting production systems and operational processes.

Enterprise information management takes MDM and other data related initiatives to the next level by enabling organizations to manage their data across sources, ensure a level of quality at each stage of the integration process, and enable organizations to govern the various processes that are associated with the various data points.

With a unified view of data, organizations no longer see separate views resulting from individual business units, but see how those views relate to the overall picture of performance as well.

Data quality adds the final touch to the mix of integration and the beginnings of MDM. To maintain these initiatives, information constantly needs to be validated to ensure that valid and accurate data is entered into and moved across the various operational systems and multiple data stores.

These approaches to managing data across the organization will continue to converge. Organizations will begin to look at data management as an overall solution that includes integration and data quality initiatives as an extension of current MDM-related projects.

Organizations are clearly starting to move towards EIM. Part of this means not only adopting an EIM solution but adopting a data governance framework which includes developing a process for managing data and having a committee of stakeholders that help define taxonomy, hierarchy, etc. and managing these processes across the organization.

The concept of a holistic approach to data management enables organizations to develop end to end solutions that take into account disparate business units and how the data processed within those units translates into valuable information at different touch points across the organization.

I think that, with the complexity of companies nowadays, information management is increasingly necessary to enable companies to manage their data effectively, transforming data into valuable information for making better decisions.

The Role of Data Quality With The Business Objects/SAP EIM Platform


About the Business Objects approach, she said that it combines their data services for data integration and data quality. This removes the barrier of cleaning and integrating data separately and creates a single environment. The solution works by identifying how a change in the source system will affect the various processes, and gives users the ability to look at calculations and identify where numbers come from, which supports confident decision making based on data as well as compliance and audit requirements.

Wednesday, October 15, 2008

The Business Value of Master Data Management


Tomorrow, October 16th at 3 PM ET, a live Web broadcast presentation entitled The Business Value of Master Data Management will take place, provided by DM Review and hosted by Eric Kavanagh with Jim Ericson, in its program called DM Radio.



According to DM Review: "Achieving the coveted 360-degree view of the customer -- or even of a product, line of business or other entity -- is more possible than ever these days, thanks in large part to the maturation of Master Data Management. With advances in automated data quality and matching software, coupled with ever-faster data delivery mechanisms, this relatively new discipline is taking the information management industry by storm.

Unlike traditional, tightly coupled information systems, MDM solutions use a loose coupling of enterprise applications and a master data hub to deliver near real-time master records about customers, products, locations and other dimensions. These MDM solutions help improve the efficiency of enterprise systems by managing the maintenance of master records. This improves overall data quality, because master data records need not be maintained multiple times throughout the enterprise.

Tune into this episode of DM Radio to learn how MDM solutions are helping organizations align their information systems with business goals and strategies. We'll talk to Jill Dyche of Baseline Consulting, Darren Peirce of Kalido, Judy Ko of Informatica, and Anurag Wadehra of Siperian.

Attendees will learn:
- How MDM can yield significant business value
- The basics of Customer Data Integration
- The fundamentals of Product Information Management
- Why data quality must be “baked in” as opposed to “bolted on”
- Trends in MDM design and deployment."

In the DM Review website, you can register for this live Web broadcast.

You also can check out the DM Radio archives to hear previous programs with a variety of other issues.

Master Data Management is one of the most important issues for companies nowadays. According to Gartner: "The truth is that achieving a single view across the enterprise is key to running your business. The effective management and governance of master data is both an opportunity and challenge for many large enterprises. MDM poses unique challenges and requires new relationships between business and IT in areas such as workflow, governance, stewardship and data integration. To be successful, organizations must understand the role of information governance and the impact MDM has on their applications portfolio and information infrastructure. Organizations use MDM to accelerate enterprise agility, promote operational efficiency, achieve competitive differentiation, support enterprise transparency, make SOA work more effectively, and ensure corporate compliance."

Wednesday, August 27, 2008

TDWI's Technology Poster about Master Data Management


The Data Warehousing Institute (TDWI) published a Technology Poster about Master Data Management, designed by Philip Russom, senior manager of TDWI Research. According to TDWI, the Master Data Management poster sorts out the complex layers of the MDM stack, illustrating how people, practices, and software automation are coordinated in a mature MDM implementation.

This poster will help explain:
- The tools, technologies, and techniques that go into the MDM technology stack.
- How the tech stack adjusts to practices like operational MDM and analytic MDM.
- How the MDM technology stack is influenced by pre-existing systems, architectural approaches, growth over time, and build-versus-buy decisions.

This kind of poster is very interesting, because it can be used to explain how MDM works in two ways: first as an overview, and then by detailing each layer of the process.


Philip Russom tells how he got the idea for the poster:
"On a break during the TDWI World Conference in Chicago this May, I left the hotel and walked up the street to the Museum of Contemporary Art. As soon as I entered, I saw mobiles by the great American sculptor Alexander Calder. My head was spinning from the technical presentations I’d seen at the conference, and it struck me that Calder mobiles resemble the way we draw technology stacks and system architectures. Both are compartmentalized, yet the parts are connected and interactive. Both move and evolve slowly as the winds of change brush them.

Later, when I needed a metaphor for the many pieces that master data management connects and coordinates, I naturally thought of Calder’s elegant and organic mobiles.

As you look at the poster, try to imagine the pieces in motion like a Calder mobile, with the operational, analytic, and enterprise practices spinning, and the balance shifting from collaboration to implementation and back again.

My thanks to Deirdre Hoffman for translating my ideas into the graphic images of this poster."

You can request a free print copy (US and Canada Only) or download a PDF version, in the TDWI website (registration required).

The following companies are sponsors of this technology poster: Baseline Consulting, BizGui Inc, EasyAsk, Exeros, MicroStrategy, Syncsort, Talend.

Tuesday, August 26, 2008

The Relationship Between Master Data Management and Data Quality


I read a good article called The Relationship Between Master Data Management and Data Quality, written by Dan Power, in DM Review, where he discusses the importance of a strong data quality approach when adopting an MDM initiative.

He touched on an interesting point when he said: "Keep in mind that data quality, like many other things in life, is only noticed when it’s missing." He also said: "When people start saying “Well, I don’t know if that’s necessary, our data is pretty good, I’m sure,” gently agree with them and suggest looking at the data with a data quality tool just to confirm how good it is."

When consulting for a company, you should approach the data quality issue carefully, so as not to create embarrassment, because in most companies people believe their data quality is good. An effective data quality approach, using good data quality tools, is an important success factor when adopting an MDM initiative.

Friday, August 22, 2008

Understanding Master Data Management and Service-Oriented Architecture


"How do you understand MDM from an architectural perspective - not just the products that implement MDM, but how the pieces fit together?". That is the Dan Wolfson's question, based on that successful MDM implementations involve more than just the right technology.

Dan Wolfson gave a podcast interview to SearchDataManagement last month, in which you can:

- Learn more about the relationship between MDM and SOA and what value MDM systems can provide SOAs.
- Find out what many organizations reported about their MDM experiences, which inspired Wolfson and his colleagues to write the book.
- Hear what a "reference architecture" is and its relevance to MDM deployments.
- Learn two reasons why many organizations don't seem to be ready for the MDM technology being offered today.

For those interested in MDM and SOA working together, Dan Wolfson co-authored a book with his IBM colleagues Allen Dreibelbis, Eberhard Hechler, Ivan Milman, Martin Oberhofer, and Paul van Run, called: Enterprise Master Data Management: An SOA Approach to Managing Core Information

Wednesday, April 23, 2008

Articles about Master Data Management

Recently, I have read several interesting articles related to the importance of Master Data Management for companies. Some of them are:


Last week, DM Review published, in its online edition, an article entitled The Intrinsic Value of Master Data Management, by Lyndsay Wise. In the article, she describes the benefits to IT and the business of building an MDM solution in a corporation.


Last month, Scott Lee wrote, in the online edition of BI Review, about his experience implementing MDM at the company where he works. In the article, called Master Data Management is a Program, he explains the adopted strategy, arguing that MDM is not a single project but a continuous program.


TDWI published last month an article by Todd Goldman, Best Practices in MDM: Cross-System Data Analysis, in which he writes about cross-system source-data discovery and data mapping, which he considers important steps for a successful MDM implementation.