Executive Summary: Enterprise Metadata Framework
Published in TDAN.com, April 1, 2005
Over the past few years, I've had the opportunity to discuss enterprise metadata with a wide variety of audiences, and much of that conversation is captured in this "Best Practices" implementation framework. The model has evolved over the years as our program has. Of course, this summary can only be a few pages long, so the treatment here is necessarily a tad shallow, but you should be able to get the basics from the diagram and the description that follows. Figure 1 presents the framework; the sections below describe each part of it.
Figure 1: Metadata's Best Practice Framework
Dimensions of Importance
Around the outer rim of the framework is a collection of activities, dimensions, or aspects of a metadata implementation. Some of these elements are one-time activities (hiring solid leadership, in theory) while others are ongoing battles (quality). Reviewing these dimensions is a daily activity that should be taken seriously by every project or program that deploys enterprise metadata. The list below provides a basic description of each dimension.
Developing a strategy must occur at the very beginning of the project, and it begins with building a business case that supports the investment in metadata technologies. A business case is a tool that forces the business to look ahead, allocate resources, focus on key points, and prepare for problems and opportunities. The business case consists of various activities that should be addressed.
Beyond the business case, the Metadata Services Group should deliver an annual plan that includes a yearly summary, major accomplishments, and next year's plans and objectives. Additionally, the annual plan should include architecture reviews and supporting metrics from the prior year, focusing specifically on the content and usage of the metadata information. Like the business case, the annual plan can be used to support budgeting requests, educate the organization, guide the project, motivate the players, and provide ongoing feedback on the progression of metadata.
Architecture is the basic building block or raw material of the metadata environment. Organizations must review, document, and deploy data, functional, technical, and application architectures. Data architecture focuses on data quality, data management, content, usage, modeling, storage, and standards. Technical architecture reviews the hardware, software, and vendors, while functional architecture reviews the business processes. The final architecture is the application architecture, which works as a conduit between the functional and technical specifications. Architectures define the rules of the game within the corporate environment, and these rules can make or break your implementation. For example, suppose your organization is reviewing XML, and questions keep cropping up about vocabularies and how they should be implemented in order to ensure reuse. The metadata team must be invited to help define vocabulary standards and schema standards and to provide reusable products that help the organization move to an SOA environment.
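The idea of a governed vocabulary can be made concrete in a few lines. The element names and the checker below are purely illustrative assumptions, not part of the article's framework; they simply sketch how a metadata team might verify that an XML document uses only approved vocabulary terms before publication.

```python
import xml.etree.ElementTree as ET

# Hypothetical approved vocabulary published by an enterprise metadata team.
APPROVED_ELEMENTS = {"customer", "account", "serviceOrder", "billingAddress"}

def unapproved_elements(xml_text: str) -> set:
    """Return element names in the document that are not in the vocabulary.

    The root element is treated as the document wrapper and is not checked.
    """
    root = ET.fromstring(xml_text)
    used = {elem.tag for elem in root.iter()}
    return used - APPROVED_ELEMENTS - {root.tag}

doc = "<order><customer/><acctNumber/></order>"
print(unapproved_elements(doc))  # → {'acctNumber'}
```

A check like this, run as part of schema review, turns the vocabulary standard from a document on a shelf into an enforceable rule.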
While there are many products in the world of metadata, the fundamental one is the repository. The repository defines how the metadata will be stored, presented, and integrated into the corporate environment. The repository design will emerge from the various artifacts already produced, including the user requirements, the research from the business case, and the architectures described in the previous paragraph. The repository should be based on solid design principles and subjected to usability studies. As described in the DMReview article (February 2005), the repository environment is not just the repository itself but an entire collection of applications.
Knowledge management, information architecture, content management, search engine technology, and portalization are just a few of the evolutionary benefits of implementing metadata at the enterprise level. The metadata product line serves as the foundation from which processes and services can be built.
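As a minimal sketch of what a repository stores (the field names here are illustrative assumptions, not the article's design), each entry ties an asset to its definition, steward, and source system, and a simple index makes it searchable:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MetadataRecord:
    # Illustrative fields; a real enterprise repository carries many more.
    asset_name: str
    definition: str
    steward: str
    source_system: str

class Repository:
    """A toy in-memory metadata store keyed by asset name."""

    def __init__(self) -> None:
        self._records: dict = {}

    def register(self, rec: MetadataRecord) -> None:
        # Case-insensitive key so consumers find assets however they spell them.
        self._records[rec.asset_name.lower()] = rec

    def lookup(self, name: str) -> Optional[MetadataRecord]:
        return self._records.get(name.lower())

repo = Repository()
repo.register(MetadataRecord("CustomerID", "Unique customer key", "J. Doe", "CRM"))
print(repo.lookup("customerid").steward)  # → J. Doe
```

The real value comes from the applications layered on top of such a store: search, browsing, impact analysis, and the content management processes described next.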
While the repository may get the majority of the attention in our environment, true value extends beyond broker-type functionality. Data must be collected and loaded from the producer as well as utilized by the consumer. Quality cannot be a random event; it can only be delivered by well-defined procedures. Consistency is the key, and we want our support systems to enable a steady delivery of information and services from the Metadata Services Group. Many people forget that the repository collection requires operations, application support, and content management in order to run effectively. Ideally, we should move our processes to the online environment, which enables the support group to be more efficient and effective in the long run.
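Well-defined quality procedures can be expressed as automated checks that run on every load rather than as occasional spot checks. The completeness rule below is a hypothetical example, assuming records shaped as simple dictionaries; the required fields are assumptions for illustration.

```python
# Hypothetical rule: every repository record must populate these fields.
REQUIRED_FIELDS = ("asset_name", "definition", "steward")

def completeness(records: list) -> float:
    """Fraction of records with all required fields non-empty."""
    if not records:
        return 1.0  # an empty batch has nothing incomplete
    ok = sum(
        1 for r in records
        if all(r.get(field) for field in REQUIRED_FIELDS)
    )
    return ok / len(records)

batch = [
    {"asset_name": "CustomerID", "definition": "Unique key", "steward": "J. Doe"},
    {"asset_name": "OrderDate", "definition": "", "steward": "A. Smith"},
]
print(completeness(batch))  # → 0.5
```

Publishing a metric like this with every load makes quality a measured, repeatable outcome instead of a random event.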
The natural progression of any organization is to move beyond the product and into value-add services. Services are what make the difference between doing the job and creating a "Metadata" cause. Metadata is more than products and procedures; metadata is a philosophy that must be supported by a service-oriented staff and value offering. The Metadata Services Group (notice the middle name) can offer many types of "Best Practices," including vocabulary management, domain standardization, term inventories, and ontology development. Value-add services can also be added to the portfolio, such as PDF conversion, metatag standardization, subscriptions, reservations, inventory management, XML vocabulary management, and records management methodologies.
Services are not always physical; many services are more Subject Matter Expertise (SME) oriented. How do you ensure that content can be located within the corporate search engine? How do you manage assets within the technical communities? How do you enable reuse in a non-development-oriented organization? SOX, SOA, and EAI continue to raise the question: how enterprise-ready is your metadata team?
One of the most difficult things to recognize is when you become successful. Managers will continuously push you, the business will change, and the environment will evolve. I wish I could tell you that if you do a, b, and c, then you will be successful. But the reality is that each company is different. Many people believe that what we have done is overkill, while others think we haven't gone far enough. Still others say that we don't do metadata at all, but rather some form of enterprise resource management. In the end, we have discovered that metadata is not delivered by deadlines or project plans; it is delivered each and every day by the Metadata Services Group. When thinking about the success of your final solution, the following are important elements to consider: metadata maturity, metadata's brand, level of integration, subject matter expertise, and definable value-based ROI.
Marketing, Branding, and Selling
When we bring up the topic of branding to information technology professionals, the usual response is to "tune" us out. Marketing is something that is done for the business and is not typically associated with Information Technology (IT) organizations. The problem with this thinking is that it's not 1980 anymore, and organizations have many more choices with outsourcing, overseas sourcing, and automation. Marketing and branding are about managing the perceptions of value. I am not talking about lying or manipulation; I am saying that perceptions are powerful, and I would rather have a perception that was managed than a random one. At the core of a branding strategy is the foundation of products, services, and expertise. You cannot, in good faith, begin to build a marketing program until you have delivered in these areas.
The second step is ensuring that you have a consistent message. You can't be perfect at everything, and more importantly, you wouldn't want to be. How many messages should you have? Probably more than one but fewer than four. Your choices are almost endless: quality, knowledge, technical ability, responsiveness, speed, reliability, etc. Now the hard part: suppose you choose quality and hang your metadata shingle on that single attribute. The challenge is that everything you do must reflect this quality decision. Take a look at your web site, collaboration site, documentation, process definitions, and service procedures; are they simple and effective, and do they scream quality? Quality must be everywhere, not just in the core metadata. Quality metadata is simply the price of entry into the world of metadata services.
Only when you have the foundation of products and services in place, and have decided on your core message, is it possible to work with the various vehicles that deliver that message. The options available to you include training classes, newsletters, educational opportunities, web promotion, etc.
Evolution of Value
Does our Metadata Services Group of 1999 resemble the one in 2005? Of course not, and neither will your team's. People have been known to say, "you're either getting better or you're getting worse." Alternatively, if the other guy is getting better faster than you are, then you're getting worse (Peters, 2000). Value must be evolutionary, or the value is fleeting. Remember, neither the corporation nor the technology remains the same for very long. Even if you have the very best implementation today, tomorrow it will be, at best, average. How does evolution occur inside a technology program? The easy answer is keeping your eyes open: open to the possibilities of applying metadata to other areas within the enterprise. The very best news is that every organization needs to manage information, needs to organize and apply information architecture principles, and needs someone to help move its business processes to the online environment. The practice of continually evolving to meet business requirements and objectives may be the most important "Best Practice" of them all.
Is implementing enterprise metadata as easy as 1-2-3? Of course not; metadata requires an enormous commitment, but the payoff can be huge. The beauty of this framework is that it really doesn't matter whether you are a two-person shop or have a dozen talented individuals. The framework is simply a guide, not a step-by-step process that should be deployed with blinders on. Think of the framework as a box of Cracker Jack where you can dip in anywhere and find a morsel to chew on. And every now and then, you can pull out a valuable prize that really takes your metadata implementation to the next level.
Dr. Todd Stephens is the Technical Director of Collaboration and Online Services for the AT&T Corporation. Todd has served as technical director since 1999 and is responsible for setting the corporate strategy and architecture for the development and implementation of the Enterprise Metadata Repositories (knowledge stores), online ordering for internal procurement, and the integration of Web 2.0 technologies utilizing SharePoint. For the past 24 years, Todd has worked in the Information Technology field, including leadership positions at BellSouth, Coca-Cola, Georgia-Pacific, and Cingular Wireless.
Todd holds degrees in Mathematics and Computer Science from Columbus State University, an MBA degree from Georgia State University, and a Ph.D. in Information Systems from Nova Southeastern University. The majority of his research is focused on Metadata Reuse, Repository Design, Enabling Trust within the Internet, Usability and Repository Frameworks. In addition, Todd has co-authored various books on Service Oriented Architectures, Open Source, Virtual Environments, and Integrating Web 2.0 technologies.
Todd can be reached at email@example.com