Guest Blog from Niels Lademark Heegaard - Is Your DG/EA/BPM Approach Mature and How Can You Tell?

It’s a pleasure to feature a guest post from Niels Lademark Heegaard, a talented friend and former colleague from my early days at Platon, the first consultancy I worked with. Niels has an exceptional ability to make complex ideas clear and practical - something I’ve admired since we first worked together.

In this blog, Niels explores how to define, measure, and apply maturity in Data Governance and Enterprise Architecture.

Yet another AI-free blog post (except for correcting my spulling and the cover image).

This time around I muse about Data Governance and Enterprise Architecture maturity. How do you answer the age-old question: How mature are your Data Governance efforts?

Claimer: With a quick search-and-replace you can use this for EA, MDM, BPM, AI, etc.

Maturity…

Maturity is a term frequently thrown around in the context of both Data Governance and Enterprise Architecture (and a host of other disciplines). Statements such as “This company is immature” or “The Data Governance model is immature” are common, but little is said about how this label is defined.

What is maturity anyhow?

So, what is maturity? How do you measure it, and what can you use it for?

Definition: Maturity is the extent of your capabilities within an area.

You need a lot of capabilities to be successful in Data Governance, Enterprise Architecture, Master Data Management, Business Process Management, and so on. Since the disciplines differ, the capabilities do as well; however, there is a surprising amount of overlap between fields.

Know your measures

Whenever I do a maturity assessment, I categorize these capabilities and subdivide them. One of the maturity dimensions common to all the mentioned fields is “Governance,” which I’ll use in my example below.

Governance

  1. Strategic recognition

  2. Funding

  3. Organization

  4. Assigned resources

Organization

  1. Placement in the organization

  2. Size and skills

  3. Team education and training

  4. Influence

Business alignment

  1. Business Drivers/Priority management

  2. Business Process alignment

  3. Ownership

  4. Co-development

Methodology

  1. Framework/Policies

  2. Methodologies/Guidelines

  3. Project alignment

  4. Budget process alignment

Technology support

  1. Requirements for IT Application Design

  2. Master Data Archtecuture

  3. Dedicated tooling

  4. Tool coverage/Tool adoption

Change Management

  1. IT project embedding

  2. Continuous discipline development

  3. Education (outside the team)

  4. Communication (outside the team)

Mind you, there are other potential subtopics (Governance could, e.g., include “Audit”). I usually end up with 6-10 different dimensions, but I’ll go with the four under “Governance.”
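If it helps to see the shape of the model, here is a minimal sketch of the structure above as a plain data structure (Python, purely illustrative; adapt the dimensions and subtopics to your own model):

```python
# A maturity model as a plain mapping of dimensions to their subtopics.
# Illustrative only: use the 6-10 dimensions that fit your organization.
maturity_model = {
    "Governance": ["Strategic recognition", "Funding",
                   "Organization", "Assigned resources"],
    "Organization": ["Placement in the organization", "Size and skills",
                     "Team education and training", "Influence"],
    "Business alignment": ["Business Drivers/Priority management",
                           "Business Process alignment",
                           "Ownership", "Co-development"],
    "Methodology": ["Framework/Policies", "Methodologies/Guidelines",
                    "Project alignment", "Budget process alignment"],
    "Technology support": ["Requirements for IT Application Design",
                           "Master Data Architecture", "Dedicated tooling",
                           "Tool coverage/Tool adoption"],
    "Change Management": ["IT project embedding",
                          "Continuous discipline development",
                          "Education (outside the team)",
                          "Communication (outside the team)"],
}
```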

Six dimensions and a lot of questions

Just to avoid (or add to) the confusion: “Governance” in this context is about how your enterprise’s management approaches your particular discipline, e.g., BPM, EA, etc. It is not about the execution of Data Governance, but about the conditions that are set for performing Data Governance in the first place.

There is a point to this: Always explain what the overarching dimensions cover. It will also help you define the sub-topics.

Be consistent, get it right, and get value

I usually measure each sub-discipline individually on a score from one to five. The snag is: how do you decide, consistently, whether the score is a two or a three? And is it really that important to get it exactly right?

The answer to the latter question is resounding: Yes, it is very important!

I’ll explain why later, but I can already hint that getting it right will enhance the value of the maturity assessment immensely.

To assign the scores exactly right you need to define them. This is hard work, which is why I’ll limit myself to two examples, namely “Strategic recognition” and “Funding”:

Explaining step by step

Strategic recognition

Level 1 (a.k.a. What??) Data governance is not recognized as a value-adding discipline in the company. The term is largely unknown, and the benefits are unrecognized (stop wasting my time).

Level 2 (a.k.a. Why?) Data governance is familiar to those who experience low data quality directly, but not as a trans-organizational effort. There is a departmental acknowledgment of the need, and a few understand the discipline and perceive the benefit of a collective effort. One or two projects may occasionally take Data Governance into account (why is this necessary??).

Level 3 (a.k.a. OK…) The term data governance is generally familiar and is recognized at the SVP level in some divisions. It is somewhat seen as an enabler of better execution within a division or department. There is local knowledge of what needs to be done, but also some resistance. Some business plans and most projects take Data Governance into account (too much bureaucracy… will do it if I have to).

Level 4 (a.k.a. Yes!) The term data governance is familiar and is recognized at the CxO level. It is seen as a prerequisite for better execution both within and across departments. There is an understanding of what Data Governance entails and acceptance of the actions that need to be taken. All business plans and projects take Data Governance into account or will need explicit permission not to (OK, will go do).

Level 5 (a.k.a. Of course!!) The term data governance is familiar and is recognized at the CxO and executive board level. It is seen as the foundation for business execution both within and across departments. Data governance is understood by everyone, and its consequential actions are seen as beneficial. The business strategy, business plans, and projects take Data Governance into account or are paused until they do (Darn, I forgot… it is top of my list!).

Let’s do funding

Level 1 (a.k.a. Go away (I need to manage these errors!)). Data governance is not funded. It is perceived mostly as an unnecessary distraction that lures some untoward employees away from value-adding work. Enthusiastic front-runners and local advocates can hope that their manager turns a blind eye.

Level 2 (a.k.a. CAPEX not OPEX (We did it... are you still here?)). A data governance initiative, usually in the form of a project, is funded. The resulting recommendations might lead to part-time allocation of some employees (but remember that you also have to deliver on [insert day job]). Funding is based on a yearly allocation in a local department. Project budgets will generally not have funds for taking Data Governance into account.

Level 3 (a.k.a. OPEX (OK, We’ll continue... for now)). A data governance department (or part of a larger department) is funded with a few (too few) people. The funding is given by an SVP or equivalent, subject to unchanged division budgets. The function is centralized and there might or might not be some funded liaison officers in other divisions (with day jobs). There is little in the way of dedicated technology support. Project budgets will sometimes have funds for taking Data Governance into account (especially migration projects).

Level 4 (a.k.a. OPEX (We got it fully covered)). A data governance department (that exists in its own right) is fully funded (usually slightly understaffed). The funding is given by a CxO but as a minor expense it is rarely listed in the annual report. There are funded liaison officers in most divisions. There might be some dedicated technological support (specialist tools and/or one-stop shops for all things DG). Project budgets have funds for taking Data Governance into account.

Level 5 (a.k.a. OPEX and CAPEX (No, we don’t... )). A usually slightly understaffed data governance department is fully funded (not a copy-paste error 😊). The funding is given by a CxO and is listed in the annual report as a strategic effort. There are funded liaison officers in all divisions. There is dedicated technological support (specialist tools and one-stop shops for all things DG). Project budgets have funds for taking Data Governance into account. There are ongoing project(s) to develop DG further.

Benefits

I stated that it is essential to define, and by extension measure, maturity precisely and consistently. With 24 questions, or twice that, it is a lot of work; however, there are three good reasons to do it.

The first reason: If you wish to report on the progress of your Data Governance efforts, you will want to score the same parameters next year, consistently and with the same measuring stick. On a related note, I only give a score if ALL the parameters for that score are met. Your mileage may vary.
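As a sketch of that all-or-nothing rule (my own illustration, not a standard algorithm): the achieved level is the highest level whose checklist is fully met, earned bottom-up.

```python
def achieved_level(criteria_met: dict[int, list[bool]]) -> int:
    """Return the highest maturity level whose criteria are ALL met.

    criteria_met maps each level (1-5) to its checklist results.
    A level only counts if every level below it is also fully met,
    so a single unmet criterion caps the score.
    """
    level = 0
    for lvl in sorted(criteria_met):
        if all(criteria_met[lvl]):
            level = lvl
        else:
            break
    return level

# Example: levels 1 and 2 are fully met, but one level 3 criterion is not.
print(achieved_level({1: [True, True], 2: [True, True], 3: [True, False]}))  # -> 2
```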

The second reason: This maturity assessment is an action plan in disguise. It tells you what to improve if you wish to take your capabilities up a notch. On another related note (and this is important), level five is NOT necessarily where you want to be. It depends on your organization’s needs. Those needs might not call for a level five capability.

The third reason: You can also set the goals using the same scoring (in the same sessions). Where do you want to be? Voilà, you just made half a gap analysis. List what it would take to close the gap, and you’ve made a full gap analysis.
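A hypothetical sketch of that half-then-full gap analysis (the scores below are invented): the gap per dimension is simply target minus current, and the action list is whatever closes each non-zero gap.

```python
# Invented example scores per dimension, on the one-to-five scale.
current = {"Governance": 2, "Organization": 2, "Business alignment": 1,
           "Methodology": 3, "Technology support": 2, "Change Management": 1}
target  = {"Governance": 3, "Organization": 3, "Business alignment": 3,
           "Methodology": 4, "Technology support": 3, "Change Management": 3}

# Target minus current per dimension; closing each non-zero gap
# is the action plan in disguise.
gaps = {dim: target[dim] - current[dim] for dim in current}
print(gaps)
```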

Do it together and initiate change, do it alone and stay alone

A word of caution: do not attempt to measure maturity without discussing the scores with relevant stakeholders. You could base the discussion on a pain-point analysis, workshops, or other forms of dialogue. Be prepared to explain the nuances. Once the hand-wringing is done, ask: Where do you want to be?

Show them what they have got

Final note: this is easy to communicate. Both the current and the target maturity can be shown in one easy-to-understand diagram. This is appealing to most people (or appalling, depending on the gap). Use a spiderweb diagram with the main topics as the spokes. Plot the as-is and the... optimistic ambition.
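For illustration, a minimal matplotlib sketch of such a spiderweb (radar) diagram; the dimensions are the six from above and the scores are invented:

```python
import matplotlib.pyplot as plt
import numpy as np

# The six example dimensions from above, with invented scores (1-5).
dimensions = ["Governance", "Organization", "Business alignment",
              "Methodology", "Technology support", "Change Management"]
as_is  = [2, 2, 1, 3, 2, 1]
target = [3, 3, 3, 4, 3, 3]

# One spoke per dimension; repeat the first value to close the polygon.
angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
angles += angles[:1]
as_is_closed = as_is + as_is[:1]
target_closed = target + target[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(angles, as_is_closed, label="As-is")
ax.fill(angles, as_is_closed, alpha=0.1)
ax.plot(angles, target_closed, label="Target")
ax.fill(angles, target_closed, alpha=0.1)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions)
ax.set_ylim(0, 5)
ax.legend(loc="upper right")
plt.show()
```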


Niels started his career as a master of agriculture but soon realized his mistake and changed to the IT industry. He started working with data governance in 1997, before the term was coined. In the summer of 1997 he became master data manager, responsible for collecting and reporting all research and science communication done at the University of Copenhagen, from papers to museum exhibitions, in one unambiguous format.

After a tenure at the Danish State Railways as information and enterprise architect, he joined a dedicated information management consultancy, and later Deloitte by merger. His project tally as an information management consultant ended at 28. Currently, he works as the enterprise architect at a small company that calculates electric grid capacity across Scandinavia.
