How to build a Roadmap – Define End State

An earlier post (How to Build a Roadmap) discussed the specific steps required to develop a well-thought-out road map. That method identified specific actions using an overall pattern ALL roadmaps should follow. The steps required to complete this work are:

1) Develop a clear and unambiguous understanding of the current state

2) Define the desired end state

3) Conduct a Gap Analysis exercise

4) Prioritize the findings from the Gap Analysis exercise into a series of gap closure strategies

5) Discover the optimum sequence of actions (recognizing predecessor – successor relationships)

6) Develop and Publish the Road Map

I have already discussed a way to quickly complete step one (1), Define Current State. This post will discuss how to craft a suitable desired end state definition so we can use the results from the Current State work and begin our gap analysis. Our intent is to identify the difference (delta) between where we are and what we aspire to become. I know this seems obvious (and maybe a little redundant), but this baseline is critical to identifying what needs to be accomplished to meet the challenge.

The reality of organizational dynamics and politics (we are human after all) can distort the picture we are seeking here and truly obscure the findings. I think this happens in our quest to preserve the preferred “optics”, and it is especially so when trying to define our desired end state. The business will have a set of strategic goals and objectives that may not align with the views of the individuals we are collaborating with to discover what the tactical interpretation of this end state really means. We are seeking a quick, structured way to define a desired end state that can be reviewed and approved by all stakeholders when this activity gets underway. The tactical realization of the strategy (and objectives) is usually delegated (and rightly so) to front-line management. The real challenge is eliciting, compiling, and gaining agreement on what this desired end state means to each of the stakeholders. This is not an easy exercise, and it demands a mastery of communication and facilitation skills that many of us are not comfortable with or do not exercise on a regular basis. A clear understanding of the complex interactions within any organization (and their unintended consequences) is critical to defining the desired end state.

Define Desired End State

There are a couple of ways to do this. One interesting approach I have seen is to use the Galbraith Star Model as an organizational design framework. The model is developed within this framework to understand what design policies and guidelines will be needed to align organizational decision making and behavior. The Star Model includes the following five categories:

  • Strategy: Determine direction through goals, objectives, values and mission. It defines the criteria for selecting an organizational structure (for example functional or balanced Matrix). The strategy defines the ways of making the best trade-off between alternatives.
  • Structure: Determines the location of decision making power. Structure policies can be subdivided into: specialization: type and number of job specialties; shape: the span of control at each level in the hierarchy; distribution of power: the level of centralization versus decentralization; departmentalization: the basis to form departments (function, product, process, market or geography).
  • Processes: The flow of information and decision processes across the proposed organization’s structure. Processes can be either vertical through planning and budgeting, or horizontal through lateral relationships (matrix).
  • Reward Systems: Influence the motivation of organization members to align employee goals with the organization’s objectives.
  • People and Policies: Influence and define employees’ mindsets and skills through recruitment, promotion, rotation, training and development.

The preferred sequence in this design process is as follows:

  • strategy;
  • structure;
  • key processes;
  • key people;
  • roles and responsibilities;
  • information systems (supporting and ancillary);
  • performance measures and rewards;
  • training and development; and
  • career paths.
Strategy Model

A typical design sequence starts with an understanding of the strategy as defined. This in turn drives the organizational structure. Processes are based on the organization’s structure, and structure and processes further refine reward systems and policy. Beginning with Strategy, we uncover a shared set of goals (and related objectives) to define the desired end state, organized around the following categories (see the sketch after this list for one way to capture them):

  • People/Organization considers the human side of Information Management, looking at how people are measured, motivated and supported in related activities.  Those organizations that motivate staff to think about information as a strategic asset tend to extract more value from their systems and overcome shortcomings in other categories.
  • Policy considers the message to staff from leadership.  The assessment considers whether staff is required to administer and maintain information assets appropriately and whether there are consequences for inappropriate behaviors.  Without good policies and executive support it is difficult to promote good practices even with the right supporting tools.
  • Process and Practice considers whether the organization has adopted standardized approaches to Information Management.  Even with the right tools, measurement approaches and policies, information assets cannot be sustained unless processes are consistently implemented.  Poor processes result in inconsistent data and a lack of trust by stakeholders.
  • Technology covers the tools provided to staff to properly meet their Information Management duties.  While technology on its own cannot fill gaps in the information resources, a lack of technological support makes it impractical to establish good practices.
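As a simple illustration (my own sketch, not part of the Star Model or MIKE 2.0; the goal statements are hypothetical), candidate end-state goals can be captured under each of these categories so they can be reviewed with stakeholders and later tested against the SMART criteria described next:

```python
# Hypothetical candidate end-state goals keyed by assessment category.
# The categories follow the list above; the statements are placeholders.
end_state_goals = {
    "People/Organization": [
        "Staff are measured and motivated to treat information as a strategic asset",
    ],
    "Policy": [
        "Leadership publishes and enforces information stewardship policies",
    ],
    "Process and Practice": [
        "Standardized Information Management processes are adopted consistently",
    ],
    "Technology": [
        "Staff have the tools needed to meet their Information Management duties",
    ],
}

# A quick count per category helps confirm coverage before stakeholder review.
for category, goals in end_state_goals.items():
    print(f"{category}: {len(goals)} candidate goal(s)")
```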

Goal Setting

Goal setting is a process of determining what the stakeholders’ goals are, working towards them, and measuring progress to plan. A generally accepted process for setting goals uses the SMART acronym (Specific, Measurable, Achievable, Realistic, and Time bound). Each of these attributes as it relates to the goal setting exercise is described below, followed by a small sketch of how a candidate goal might be checked against them.

  • Specific: A specific goal has a much greater chance of being accomplished than a general goal. Don’t “boil the ocean”; remain as focused as possible. Provide enough detail so that there is little or no confusion as to what exactly the stakeholder should be doing.
  • Measurable: Goals should be measurable so we can measure progress to plan as it occurs. A measurable goal has an outcome that can be assessed either on a sliding scale (1-10), or as a hit or miss, success or failure. Without measurement, it is impossible to sustain and manage the other aspects of the framework.
  • Achievable: An achievable goal has an outcome that is realistic given the organization’s capability to deliver with the necessary resources and time. Goal achievement may be more of a “stretch” if the outcome is more difficult to begin with. Is what we are asking of the organization possible?
  • Realistic: Start small and remain sharply focused on what the organization can and will do, and let the stakeholders experience the joys of meeting their goals. Gradually increase the intensity of the goal after having a discussion with the stakeholders to redefine it. Is our goal realistic given the budget and timing constraints? If not, then we might want to redefine the goal.
  • Time Bound: Set a timeframe for the goal: for next quarter, in six months, by one year. Setting an end point for the goal gives the stakeholders a clear target to achieve. Planning follow-up should occur within a six-month period (best practice) but may occur within a one-year period, or sooner, based on progress to plan.
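As an illustration of how these criteria can be applied, here is a small sketch (my own, in Python; the goal, fields, and checks are hypothetical and not part of any formal method) that records a candidate goal and flags the SMART attributes it does not yet satisfy:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Goal:
    """A candidate end-state goal captured during stakeholder workshops."""
    statement: str          # Specific: what exactly should be accomplished
    measure: str            # Measurable: how progress to plan will be assessed
    owner: str              # Achievable: who is accountable for delivery
    budget_approved: bool   # Realistic: fits budget and timing constraints
    target_date: date       # Time bound: when the outcome is expected

def smart_issues(goal: Goal) -> list[str]:
    """Return the SMART criteria the goal does not yet satisfy."""
    issues = []
    if len(goal.statement.split()) < 5:
        issues.append("Specific: statement is too vague to act on")
    if not goal.measure:
        issues.append("Measurable: no measure of progress defined")
    if not goal.owner:
        issues.append("Achievable: no accountable owner identified")
    if not goal.budget_approved:
        issues.append("Realistic: budget/timing constraints not confirmed")
    if goal.target_date <= date.today():
        issues.append("Time bound: target date is not in the future")
    return issues

# Hypothetical example goal used only to exercise the checks above.
goal = Goal(
    statement="Adopt standardized metadata practices across all business units",
    measure="Percentage of systems with a documented data steward",
    owner="Director of Information Management",
    budget_approved=True,
    target_date=date(2026, 6, 30),
)
print(smart_issues(goal) or "Goal passes the quick SMART checks")
```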

Defining the desired end state is accomplished through a set of questions used to draw participants into the process and meet our SMART objectives. This set of questions is compiled, evaluated, and presented in a way that is easy to understand. Our goal here is to help everyone participating in the work immediately grasp where the true gaps or shortcomings exist, and why, when we get to step three (3), the gap analysis phase. This holds true whether we are evaluating Information Strategy, our readiness to embrace an SOA initiative, or a new business initiative. We can complete the design process using a variety of tools and techniques. I have used IDEF, BPMN, and other process management methods and tools (including RASIC charts describing roles and responsibilities, for example). Whatever tools you elect to use, they should effectively communicate intent and be used to validate changes with the stakeholders who must be engaged in this process.

Now this is where many of us come up short. Where do I find the questions to help drive SMART goals? How do I make sure they are relevant? What is this engine I need to compile the results? And how do I quickly compile the results dynamically and publish them for comment every time I need to?

One of the answers for me came a few years ago when I first saw the MIKE 2.0 quick assessment engine for Information Maturity. The Information Maturity (IM) Quick Scan is the MIKE 2.0 tool used to assess current and desired Information Maturity levels within an organization. This survey instrument is broad in scope and is intended to assess enterprise capabilities as opposed to focusing on a single subject area. Although this instrument focuses on Information Maturity, I quickly realized I had been doing something similar for years across many other domains. The real value here is in the open source resource you can use to kick-start your own efforts. I think it is also a good idea to become familiar with the benchmarks and process classification framework the American Productivity and Quality Center (APQC) has made available for a variety of industries. The APQC is a terrific source for discovering measures and quantifiable metrics useful for meeting the need for specific, measurable objectives to support the end state definition.

How it Works

The questions in the quick scan are organized around six (6) key groups in this domain: Organization, Policy, Technology, Compliance, Measurement, and Process/Practice. The results are tabulated based on responses (in the case of the MIKE 2.0 template) ranging from zero (0 – Never) to five (5 – Always). Of course you can customize the responses; the real point here is that we want to quantify the responses received. The engine component takes the results, builds a summary, and produces accompanying tabs where radar graph plots present the Framework, Topic, Lookup, # Questions, Total Score, Average Score, and Optimum within each grouping. The MS Word document template then links to this worksheet and grabs the values and radar charts produced to assemble the final document. If all this sounds confusing, please grab the templates and try them for yourself.
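To make the tabulation concrete, here is a minimal sketch (my own re-creation in Python, not the actual MIKE 2.0 spreadsheet engine; the responses shown are made up) of how raw responses roll up into the question count, total score, average score, and optimum for each grouping:

```python
from statistics import mean

# Hypothetical survey responses (0 = Never ... 5 = Always) keyed by grouping.
responses = {
    "People/Organization": [3, 4, 2, 3],
    "Policy":              [1, 2, 2],
    "Process/Practice":    [2, 3, 1, 2, 2],
    "Technology":          [4, 4, 3],
    "Compliance":          [2, 1],
    "Measurement":         [1, 2, 1],
}

MAX_RESPONSE = 5  # "Always"

# Build the per-grouping summary the radar charts would be drawn from.
summary = {}
for group, scores in responses.items():
    summary[group] = {
        "# Questions":   len(scores),
        "Total Score":   sum(scores),
        "Average Score": round(mean(scores), 2),
        "Optimum":       len(scores) * MAX_RESPONSE,
    }

for group, stats in summary.items():
    print(group, stats)
# The average score per grouping is what gets plotted against the optimum.
```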

Define Desired End State Model

Out of the box, the groupings (and related sub-topics) are organized around the following perspectives:

  • Compliance
  • Measurement
  • People/organization
  • Policy
  • Process/Practice
  • Technology

Each of these perspectives is summarized and combined into a MS Word document to present to the stakeholders. The best part of this tool is that it can be used periodically to augment quantitative measures (captured in a dashboard, for example) to assess progress to plan and the improvement realized over time. Quantifying improvement quickly is vital to continued adoption of change. Communicating the results to stakeholders in a quick, easy-to-understand format they are already familiar with is just as important, using the same consistent, repeatable tool we used to define the current state.

Results

I think you can see this is a valuable way to reduce complexity and to gather, compile, and present a fairly comprehensive view of the desired end state of the domain in question. Armed with this view we can now proceed to step three (3) and begin to conduct the Gap Analysis exercise. The difference (delta) between these two (the current and the desired end state) becomes the basis for our road map development. I hope this has answered many of the questions about step two (2), Define End State. This is not the only way to do this, but it has become the most consistent and repeatable method I’m aware of for defining a desired end state quickly in my practice. Understanding the gaps between the current and the desired end state across the business, information, application, and technical architecture makes development of a robust solution delivery road map possible.
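To show what that delta looks like in practice, here is a minimal sketch (my own illustration with hypothetical scores) of how the difference between current and desired end state scores per perspective can be computed and sorted so the largest gaps surface first for step three (3):

```python
# Hypothetical average scores per perspective, produced by running the same
# assessment twice: once for the current state, once for the desired end state.
current_state = {
    "People/Organization": 2.8, "Policy": 1.7, "Process/Practice": 2.0,
    "Technology": 3.7, "Compliance": 1.5, "Measurement": 1.3,
}
desired_end_state = {
    "People/Organization": 4.0, "Policy": 3.5, "Process/Practice": 4.0,
    "Technology": 4.0, "Compliance": 3.0, "Measurement": 3.5,
}

# Compute the delta per perspective; the largest gaps are a natural starting
# point for prioritizing gap closure strategies in the road map.
gaps = {
    perspective: round(desired_end_state[perspective] - score, 2)
    for perspective, score in current_state.items()
}
for perspective, delta in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{perspective}: gap of {delta}")
```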


5 thoughts on “How to build a Roadmap – Define End State”

  1. Intriguing approach to roadmap transition planning. I like the clear thinking and the inclusion of structure, as power plays a big part in the transition, and spelling it out, along with your assumptions, will increase the chances for success.

  2. Hi James,

    Thanks so much for taking the time to blog this series, you’re totally correct when you say there’s a real lack of this kind of material ‘out there’. So much wheel re-invention going on!

    I’m studying your material very closely, and am trying to grasp the connection between asking quantitative questions (MIKE 2 style) and definition of SMART goals… are we talking about questions along the lines of ‘how important is XYZ to you’, ‘how strongly do you agree with the statement ABC’ etc? In other words, questions that ‘draw out’ people’s priorities, concerns, and so on?

    I can easily see the value of the quantitative questions when objectively assessing a current state – but wondering about drawing out future state. Do you see a place for subjective, ‘open’ questions in all this? How do we allow for input, ideas and information outside the scope of what was pre-conceived when drafting the questionnaire?

    Thanks!

    MIke

    • Mike:
      Sure, I see a place for eliciting subjective, ‘open’ questions in all this… and the end-game (IMHO) always resolves somehow into how to use this very valuable insight to get to the quants… the actionable activity and measures we use to evaluate progress to plan. This is why we practice a craft (art?) and not a precise science. The real trick to this is helping others take subjective thoughts (emotions, aspirations) and translate them into something we can really use to achieve our goals. There is always a place for ideas and information outside the scope of any pre-conceived notions of what a perfect, academically sound world looks like. As you already seem to recognize, “no plan survives contact with the enemy”, so we need to remain flexible and accommodate what we encounter. And then help all of us to include and quantify what is important. Sorry to come across as a heartless quant… just trying to translate the subjective into something we can use to make our hopes and dreams a reality. Thanks for listening and good luck with your efforts.

      PS: stay tuned, I’m going to dive pretty deep into NLP and text engineering to try to illustrate how parsing and processing subjective, unstructured text (or voice-to-text) can be used to support and enrich the approach we are discussing today. Hope this doesn’t bore you; I think this is a really interesting topic with applications everywhere… and I mean everywhere we look.

      -jdp (Parnitzke)

      • Thanks James. This makes sense. Kind of a 3 stage process?

        1. Open/qualitative questions to pull in the raw material (through a BMM lens.. ‘goal’ and ‘strategy’?)
        2. Creating concrete targets (BMM ‘objectives’?) by re-framing (1) through a set of closed/quantitative questions
        3. Using (2) to inform individual projects/initiatives (BMM ‘tactics’ ?) on our roadmap.

        Recognising that real life is never this neat of course, and that there is no substitute for knowledge and experience!

        No need to apologise for emphasising quantitative stuff – in my opinion not enough time is spent on this side… meaning that beyond the standard numbers (budgets and headcount) often ‘success’ is measured by perception and opinion only. Not to say the latter isn’t important but as you’ve pointed out defensibility and transparency is key to long term success!

        Looking forward to the upcoming blogs

        Thanks
        Mike

