An earlier post (How to Build a Roadmap) discussed the specific steps required to develop a well-thought-out road map. That method identified specific actions using an overall pattern all roadmaps should follow. The steps required to complete this work:
1) Define the current state
2) Define the desired end state
3) Conduct the gap analysis
I have already discussed a way to quickly complete step one (1), Define Current State. This post discusses how to craft a suitable desired end state definition so we can use the results of the current state work and begin our gap analysis. Our intent is to identify the difference (delta) between where we are and what we aspire to become. I know this seems obvious (and maybe a little redundant), but this baseline is critical to identifying what needs to be accomplished to meet the challenge.
The reality of organizational dynamics and politics (we are human, after all) can distort the picture we are seeking here and truly obscure the findings. I think this happens in our quest to preserve the preferred “optics”, and it is especially so when trying to define our desired end state. The business will have a set of strategic goals and objectives that may not align with the views of the individuals we are collaborating with to discover what the tactical interpretation of this end state really means. We are seeking a quick, structured way to define a desired end state that can be reviewed and approved by all stakeholders when this activity gets underway. The tactical realization of the strategy (and its objectives) is usually delegated (and rightly so) to front-line management. The real challenge is eliciting, compiling, and gaining agreement on what this desired end state means to each of the stakeholders. This is not an easy exercise; it demands a mastery of communication and facilitation skills many are not comfortable with or have not exercised on a regular basis. A clear understanding of the complex interactions within any organization (and their unintended consequences) is critical to a clear understanding of the desired end state.
Define Desired End State
There are a couple of ways to do this. One interesting approach I have seen is to use the Galbraith Star Model as an organizational design framework. The model is developed within this framework to understand what design policies and guidelines will be needed to align organizational decision making and behavior. The Star Model includes the following five categories:
- Strategy: Determines direction through goals, objectives, values, and mission. It defines the criteria for selecting an organizational structure (for example, functional or balanced matrix) and the ways of making the best trade-offs between alternatives.
- Structure: Determines the location of decision-making power. Structure policies can be subdivided into:
  - specialization: the type and number of job specialties;
  - shape: the span of control at each level in the hierarchy;
  - distribution of power: the level of centralization versus decentralization;
  - departmentalization: the basis for forming departments (function, product, process, market, or geography).
- Processes: The flow of information and decision processes across the proposed organization’s structure. Processes can be either vertical, through planning and budgeting, or horizontal, through lateral (matrix) relationships.
- Reward Systems: Influence the motivation of organization members to align employee goals with the organization’s objectives.
- People and Policies: Influence and define employees’ mindsets and skills through recruitment, promotion, rotation, training, and development.
The preferred sequence for this design process is the following:
- key processes;
- key people;
- roles and responsibilities;
- information systems (supporting and ancillary);
- performance measures and rewards;
- training and development; and
- career paths.
A typical design sequence starts with an understanding of the strategy as defined. This in turn drives the organizational structure. Processes are based on the organization’s structure, and structure and processes in turn refine reward systems and policy. Beginning with strategy, we uncover a shared set of goals (and related objectives) to define the desired end state, organized around the following categories:
- People/Organization considers the human side of Information Management, looking at how people are measured, motivated and supported in related activities. Those organizations that motivate staff to think about information as a strategic asset tend to extract more value from their systems and overcome shortcomings in other categories.
- Policy considers the message to staff from leadership. The assessment considers whether staff is required to administer and maintain information assets appropriately and whether there are consequences for inappropriate behaviors. Without good policies and executive support it is difficult to promote good practices even with the right supporting tools.
- Process and Practice considers whether the organization has adopted standardized approaches to Information Management. Even with the right tools, measurement approaches and policies, information assets cannot be sustained unless processes are consistently implemented. Poor processes result in inconsistent data and a lack of trust by stakeholders.
- Technology covers the tools provided to staff to properly meet their Information Management duties. While technology on its own cannot fill gaps in the information resources, a lack of technological support makes it impractical to establish good practices.
Goal setting is a process of determining what the stakeholders’ goals are, working towards them, and measuring progress to plan. A generally accepted process for setting goals uses the SMART acronym (Specific, Measurable, Achievable, Realistic, and Time-bound). Each of these attributes is described below.
- Specific: A specific goal has a much greater chance of being accomplished than a general one. Don’t “boil the ocean”; remain as focused as possible. Provide enough detail so that there is little or no confusion about what exactly the stakeholder should be doing.
- Measurable: Goals should be measurable so we can track progress to plan as it occurs. A measurable goal has an outcome that can be assessed either on a sliding scale (1-10) or as a hit or miss, success or failure. Without measurement, it is impossible to sustain and manage the other aspects of the framework.
- Achievable: An achievable goal has an outcome that is realistic given the organization’s capability to deliver with the necessary resources and time. Goal achievement may be more of a “stretch” if the outcome is difficult to begin with. Is what we are asking of the organization possible?
- Realistic: Start small, stay sharply focused on what the organization can and will do, and let the stakeholders experience the joys of meeting their goals. Gradually increase the intensity of the goal after a discussion with the stakeholders to redefine it. Is our goal realistic given the budget and timing constraints? If not, we may want to redefine the goal.
- Time-bound: Set a timeframe for the goal: next quarter, six months, one year. Setting an end point for the goal gives the stakeholders a clear target to achieve. Planned follow-up should occur within six months (best practice), but may occur within a year, or sooner based on progress to plan.
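The SMART attributes above can be captured as a small record so each one is explicit and checkable before a goal enters the end state definition. A minimal sketch in Python; the field names and validation rule are my own illustration, not part of any standard template:

```python
# Illustrative SMART goal record; field names are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class SmartGoal:
    specific: str        # what exactly should be done
    measure: str         # how progress to plan is assessed
    achievable: bool     # judged feasible with available resources
    realistic: bool      # fits budget and timing constraints
    due: date            # time-bound: an explicit end point

    def is_well_formed(self) -> bool:
        # A goal qualifies only when every SMART attribute is satisfied
        # and the end point lies in the future.
        return (bool(self.specific) and bool(self.measure)
                and self.achievable and self.realistic
                and self.due > date.today())

goal = SmartGoal(
    specific="Standardize customer master data entry",
    measure="quick-scan average score of 4+ for Process/Practice",
    achievable=True,
    realistic=True,
    due=date(2030, 6, 30),
)
```

A check like `goal.is_well_formed()` gives facilitators a quick way to reject goals that are still vague ("improve data quality") before they reach stakeholder review.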
Defining the desired end state is accomplished through a set of questions used to draw participants into the process to meet our SMART objectives. This set of questions is compiled, evaluated, and presented in a way that is easy to understand. Our goal here is to help everyone participating in the work to immediately grasp where the true gaps or shortcomings exist, and why, when we get to step three (3), the gap analysis phase. This is true whether we are evaluating Information Strategy, assessing our readiness to embrace an SOA initiative, or launching a new business initiative. We can complete the design process by using a variety of tools and techniques. I have used IDEF, BPMN, and other process management methods and tools (including RASIC charts describing roles and responsibilities, for example). Whatever tools you elect to use, they should effectively communicate intent and be used to validate changes with the stakeholders, who must be engaged in this process.
Now this is where many of us come up short. Where do I find the questions to help drive SMART goals? How do I make sure they are relevant? What is this engine I need to compile the results? And how do I quickly compile the results dynamically and publish them for comment every time I need to?
One answer for me came a few years ago when I first saw the MIKE 2.0 quick assessment engine for Information Maturity. The Information Maturity (IM) Quick Scan is the MIKE 2.0 tool used to assess current and desired Information Maturity levels within an organization. This survey instrument is broad in scope and is intended to assess enterprise capabilities as opposed to focusing on a single subject area. Although this instrument focuses on Information Maturity, I quickly realized I had been doing something similar for years across many other domains. The real value here is in the open source resource you can use to kick-start your own efforts. I think it is also a good idea to become familiar with the benchmarks and process classification framework the American Productivity and Quality Center (APQC) has made available for a variety of industries. The APQC is a terrific source for discovering measures and quantifiable metrics useful for meeting the need for specific, measurable objectives to support the end state definition.
How it Works
The questions in the quick scan are organized around six (6) key groups in this domain: Organization, Policy, Technology, Compliance, Measurement, and Process/Practice. The results are tabulated based on responses (in the case of the MIKE 2.0 template) ranging from zero (0, Never) to five (5, Always). Of course you can customize the responses; the real point is that we want to quantify the responses received. The engine component takes the results, builds a summary, and produces accompanying tabs where radar plots present the Framework, Topic, Lookup, # Questions, Total Score, Average Score, and Optimum within each grouping. The MS Word document template then links to this worksheet and grabs the values and radar charts produced to assemble the final document. If all this sounds confusing, grab the templates and try them for yourself.
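The tabulation the engine performs can be sketched in a few lines of Python. This is a simplified illustration of the summary step only, assuming the 0-5 response scale above; the actual MIKE 2.0 engine lives in a spreadsheet, and the sample responses here are hypothetical:

```python
# Sketch of the quick-scan tabulation: each response is a (group, score)
# pair scored 0 (Never) to 5 (Always), summarized per group.
from collections import defaultdict

def summarize(responses):
    """Build per-group totals, averages, and the optimum (all-5) score."""
    by_group = defaultdict(list)
    for group, score in responses:
        if not 0 <= score <= 5:
            raise ValueError(f"score out of range: {score}")
        by_group[group].append(score)
    summary = {}
    for group, scores in by_group.items():
        summary[group] = {
            "num_questions": len(scores),
            "total_score": sum(scores),
            "average_score": round(sum(scores) / len(scores), 2),
            "optimum": 5 * len(scores),  # every answer "Always"
        }
    return summary

# Hypothetical responses for two of the six groups.
sample = [("Policy", 3), ("Policy", 4), ("Technology", 2), ("Technology", 5)]
result = summarize(sample)
```

The per-group average and optimum are exactly the values a radar chart needs on each axis, which is why the spreadsheet engine can feed its charts straight from this summary.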
Out of the box, the groupings (and related sub-topics) are organized around these perspectives.
Each of these perspectives is summarized and combined into an MS Word document to present to the stakeholders. The best part of this tool is that it can be used periodically to augment quantitative measures (captured in a dashboard, for example) to assess progress to plan and the improvement realized over time. Quantifying improvement quickly is vital to continued adoption of change. Communicating the results to stakeholders in a quick, easy-to-understand format they are already familiar with is just as important, using the same consistent, repeatable tool we used to define the current state.
I think you can see this is a valuable way to reduce complexity and to gather, compile, and present a fairly comprehensive view of the desired end state of the domain in question. Armed with this view we can now proceed to step three (3) and begin the Gap Analysis exercise. The difference (delta) between the two (current and desired end state) becomes the basis for our road map development. I hope this has answered many of the questions about step two (2), Define End State. This is not the only way to do this, but it has become the most consistent and repeatable method I’m aware of to define a desired end state quickly in my practice. Understanding the gaps between the current and the desired end state across the business, information, application, and technical architecture makes development of a robust solution delivery road map possible.
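Because both assessments are run with the same quantified instrument, the delta that feeds the gap analysis falls out of simple arithmetic on the two sets of average scores. A sketch, using hypothetical scores on the 0-5 quick-scan scale and category names drawn from the groupings above:

```python
# Gap (delta) between current and desired average scores per category.
# Scores are hypothetical; the 0-5 scale matches the quick-scan template.
current = {"Policy": 2.1, "Technology": 3.4, "Process/Practice": 1.8}
desired = {"Policy": 4.0, "Technology": 4.0, "Process/Practice": 3.5}

gaps = {cat: round(desired[cat] - current[cat], 2) for cat in desired}

# The largest gaps become the first candidates for the road map.
for cat, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{cat}: gap {gap}")
```

Sorting by gap size gives a first-cut prioritization to bring into the gap analysis discussion, though stakeholder weighting will usually adjust that order.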