Agile Data Design – November 2012

I am currently developing a five-year architecture roadmap for data and information management at the company I work for (a global Fortune 500 manufacturer). As I go through the process of defining the roadmap and getting buy-in from our various stakeholders, I’m reminded yet again of how collaboration is a central tenet in the practice of data management (just as it has become central to application development since the advent of Agile). Data management, at its core, consists of engaging business and IT stakeholders in a continuing discussion of business and application data needs and requirements – of what is known or assumed to be true about the data we have, of the gaps that exist between the data we have and the data we need, and of how the data we have can be effectively used and reused to drive innovation and business value. The purpose of data management is not to manage data per se (I don’t even see most of the data that drives our company), but to manage the expectations and requirements of data stakeholders.

As I discussed in an earlier column (What is a Data Model?), even the data management artifacts that we create (for example, conceptual and logical data models) exist primarily to guide and direct these conversations. As I’ve said before, I think we often misunderstand the Zachman Framework: We look at the Framework and see nothing but the artifacts, and we think of the Framework in terms of producing the artifacts. But that’s not really what the process of architecture is about! At each level of architecture, a two-way conversation (a dialog) is occurring between stakeholders in the architectural process. The architect, for example, talks to the clients about their needs and requirements. The clients, in turn, talk with the architect about what is needed to get their house built. The architect talks to the contractor about how the clients’ requirements can be realized; the contractor talks to the architect to get a better understanding of what those requirements are. The artifacts that are produced (architectural drawings, contractor’s plans, shop specifications, etc.) are the records of these conversations.

Which brings us back to the data architecture roadmap. The purpose of the roadmap is to define a scope, a vision, and a set of requirements for data and information management (based on a collection of stakeholder interviews), and then develop a 3-5 year program to achieve these desired goals. Developing the roadmap involves four sets of conversations:

  • Conversations with the Enterprise Architecture group at my company (and with my manager) to initiate the project, develop the framework for the roadmap (scope, vision, etc.), and prepare for the stakeholder interviews.
  • Interviews with high-level business and IT stakeholders to develop the set of data and information requirements that will be addressed in the roadmap.
  • Conversations with my manager and other IT resource managers to develop a realistic five-year schedule of projects and deliverables to achieve the goals of the roadmap.
  • A final set of conversations with the Enterprise Architecture group to get the roadmap approved.

The scope and vision for our data architecture roadmap are centered on Enterprise Information Management (EIM), defined as “a set of disciplines, technologies and best practices to manage information as an enterprise asset.”1 The vision for EIM at our company embraces the following goals:

  • To create a semantic layer of common business data definitions across our entire company (this could include a high-level enterprise data model).
  • To define an effective data governance process to manage data at all levels of our company (one in which data content would be governed by the business units, with oversight at the enterprise level).
  • To create a set of shared data services that enable quick and easy reuse of data to support enterprise information processes (this would include a set of master/reference databases and canonical XML schema definitions; a small sketch of this idea follows the list).
  • To move toward the adoption of industry-standard best practices for data and information management (as codified, for example, in DAMA International’s Data Management Body of Knowledge).
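
To make the shared-services goal a bit more concrete, here is a minimal sketch of the idea behind a canonical, shareable data definition. Everything in it is a hypothetical illustration rather than one of our actual schemas: the CustomerMaster record, its field names, and the XML element names are all assumptions invented for this example. The point is only that a single agreed-upon business definition can be rendered in one canonical form that any consuming application or shared data service can reuse.

```python
# A minimal, hypothetical sketch: one agreed-upon "customer" definition,
# serialized into a canonical XML form that consuming applications could share.
# The record name, fields, and element names are illustrative assumptions only.
from dataclasses import dataclass, fields
from xml.etree import ElementTree as ET


@dataclass
class CustomerMaster:
    """Hypothetical canonical definition of a customer master record."""
    customer_id: str   # globally unique key, governed at the enterprise level
    legal_name: str    # legal entity name, maintained by the owning business unit
    country_code: str  # ISO 3166-1 alpha-2 country code
    duns_number: str   # D-U-N-S identifier used for cross-system matching


def to_canonical_xml(record: CustomerMaster) -> str:
    """Render a master record in the (hypothetical) canonical XML layout."""
    root = ET.Element("CustomerMaster")
    for f in fields(record):
        ET.SubElement(root, f.name).text = getattr(record, f.name)
    return ET.tostring(root, encoding="unicode")


if __name__ == "__main__":
    cust = CustomerMaster("C-000123", "Acme Industries Ltd.", "US", "150483782")
    print(to_canonical_xml(cust))
```

In practice the canonical definitions would live in governed XSD files and master/reference databases rather than in application code, but the underlying idea is the same: one definition, agreed on once and governed in one place, reused everywhere.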

I’m looking forward to the ensuing stakeholder conversations that will help determine the final form of our data architecture roadmap. My initial goal will be to identify our stakeholders’ data and information pain points and arenas of opportunity, as well as to document which subsets of our enterprise data are managed by each of our business units. Subsequent steps will involve identifying sets of master or reference data that can (and should) be shared across the company and that therefore should fall under the auspices of data governance.

From these many conversations, our path forward will be discovered!

I’d like to make this a dialogue, so please feel free to email your questions, comments and concerns to me at Larry_Burns@comcast.net. Thanks for reading!

Reference:

  1. For more information on EIM, see the EIM Institute’s website.


Larry Burns

Larry Burns has worked in IT for more than 40 years as a data architect, database developer, DBA, data modeler, application developer, consultant, and teacher. He holds a B.S. in Mathematics from the University of Washington, and a Master’s degree in Software Engineering from Seattle University. He most recently worked for a global Fortune 200 company as a Data and BI Architect and Data Engineer (i.e., data modeler). He contributed material on Database Development and Database Operations Management to the first edition of DAMA International’s Data Management Body of Knowledge (DAMA-DMBOK) and is a former instructor and advisor in the certificate program for Data Resource Management at the University of Washington in Seattle. He has written numerous articles for TDAN.com and DMReview.com and is the author of Building the Agile Database (Technics Publications LLC, 2011), Growing Business Intelligence (Technics Publications LLC, 2016), and Data Model Storytelling (Technics Publications LLC, 2021).
