How to Build a Roadmap – Gap Analysis Update

I have received a number of requests about the tools and methods used to complete the gap analysis steps from earlier posts in the series How to Build a Roadmap. In this series I discussed the specific steps required to develop a well thought out road map, where one of the key tasks was conducting a gap analysis exercise. Understanding the limits of a medium like this, I have posted this update to explore the questions in a little more detail. I believe it will be extremely useful to anyone building a meaningful road map. The internet is full of simply awful templates and tools, ranging from the downright silly to the dangerously simplistic, with no attempt to quantify results. Even more distressing is the lack of understanding of how to use and leverage the best data sources you already have – the professionals within your own organization. Save yourself some time and read on.

Recall that the road map development process identified specific actions using an overall pattern ALL road maps should follow. The steps required to complete this work:

  1. Develop a clear and unambiguous understanding of the current state
  2. Define the desired end state
  3. Conduct a Gap Analysis exercise
  4. Prioritize the findings from the Gap Analysis exercise into a series of gap closure strategies
  5. Discover the optimum sequence of actions (recognizing predecessor – successor relationships)
  6. Develop and Publish the Road Map

The Gap Analysis step discussed how to develop a robust analysis to find any significant shortcomings between the current and desired end states. We use these findings to begin developing strategy alternatives (and related initiatives) to address what has been uncovered. Our intent is to identify and quantify the difference (delta) between where we are and what we aspire to become. This exercise is critical to finding what needs to be accomplished. The gap analysis leads to a well-organized set of alternatives and practical strategies we can use to complete the remaining work. You can review the full post here.

Gap Analysis

The goal? A quick and structured way to define actionable activities to be reviewed and approved by all stakeholders. We would like to focus on the important actions requiring attention. This includes identifying the set of related organizational, functional, process, and technology initiatives needed. The gap closure recommendations give a clear line of sight back to what needs to be accomplished to close the “delta” or gaps uncovered in the analysis.

What is needed, then, is a consistent, repeatable way to evaluate quickly where an organization is, where it wants to go, and the level of effort needed to accomplish its goals with some precision. In short, the delta between current and desired state is uncovered, quantified, and ready for a meaningful road map effort based on factual findings supported by real evidence captured in the field. Performing a successful gap analysis begins with defining what you are analyzing, which could be processes, products, a region, or an entire organization. Even at the overall organizational level, knowing what aspect you are analyzing is crucial to understanding the intent and findings of the effort. Quickly focusing at the desired level of detail means we can now:

  • Know where to go; what really needs attention
  • Pinpoint opportunity…quickly.
  • Uncover what is preventing or holding back an important initiative
  • Know what to do – and in what suggested order

This is where problem solving using some quick management diagnostic tools can be applied across the variety of challenges met when developing a road map. Using these tools to perform the gap analysis delivers quick, distinctive results and provides the key data and actionable insight needed to develop a meaningful road map. This method (and the tools) can be used to:

  • evaluate capability using a generally accepted maturity model specific to the business,
  • focus on a specific subject area or domain; Master Data Management or Business Intelligence are two examples where known, proven practice can be used as a starting point to support the findings compiled,
  • assess the readiness of an important program or evaluate why it is in trouble,
  • evaluate and uncover root cause issues with a struggling project,
  • detect and measure what requires immediate attention by uncovering weaknesses where proven practice has not been followed or adopted.

Quick Management Diagnostic Tools
The tool set I use follows the same general pattern and structure; only the content or values differ based on how focused the effort is and what is needed to complete the work successfully. The questions, responses, and data points gathered and compiled are usually organized in a structured taxonomy of topics. See the earlier post (Define End State) for more on this. The key is using the same engine to tabulate values based on responses that can range from zero (0 – Never or No) to five (5 – Always or Yes). Of course you can customize the responses; in fact I have done this with a Program Readiness Assessment and a Big Data Analytics tool. The real point is to quantify the responses received. The engine component takes the results, builds a summary, and produces accompanying tabs where radar graph plots present the Framework, Topic, Lookup, # Questions, Current State Scores, Desired End State Scores, and common statistical results within each grouping. The tool can be extended to include MS Word document templates which link to the findings worksheet and grab the values and charts produced to assemble a draft document ready for further editing and interpretation. If all this sounds confusing, a couple of examples may be helpful.
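Before turning to the examples, here is a minimal sketch in Python of the kind of tabulation engine described above. The topics, scores, and output format are hypothetical; the real tool adds the radar plots and per-grouping statistics on top of summaries like these.

```python
from statistics import mean

# One record per survey question: (topic, current score, desired score).
# Scores use the 0-5 scale described above; the topics are hypothetical.
responses = [
    ("Data Governance", 1, 4),
    ("Data Governance", 2, 4),
    ("Data Quality",    0, 3),
    ("Data Quality",    1, 3),
    ("Data Operations", 3, 4),
]

def summarize(responses):
    """Group responses by topic and tabulate current, desired, and delta."""
    by_topic = {}
    for topic, current, desired in responses:
        by_topic.setdefault(topic, []).append((current, desired))
    summary = {}
    for topic, pairs in by_topic.items():
        current_avg = mean(c for c, _ in pairs)
        desired_avg = mean(d for _, d in pairs)
        summary[topic] = {
            "questions": len(pairs),
            "current": current_avg,
            "desired": desired_avg,
            "delta": desired_avg - current_avg,
        }
    return summary

for topic, stats in summarize(responses).items():
    print(topic, stats)
```

The same pattern scales from a handful of questions to the full question libraries discussed below; only the content changes, never the engine.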

Using the Data Maturity Model (CMMI) to Evaluate Capability
The Data Maturity Model (DMM) was developed using the principles and structure of CMMI Institute’s Capability Maturity Model Integration (CMMI)—a proven approach to performance improvement and the gold standard for software and systems development for more than 20 years. The DMM model helps organizations become more proficient in managing critical data assets to improve operations, enable analytics and gain competitive advantage.

Using this body of knowledge and a library of source questions we can elicit current state and desired end state responses using a simple survey. This can be conducted online, in workshops, or in traditional interviews as needed. The responses are compiled and grouped to evaluate the gap closure opportunities for an organization wishing to improve its data management practices by identifying and taking action to address the shortcomings or weaknesses identified. The framework and topic structure of the 142 questions is organized to match the DMM model.

[Figure: DMM_Topics]

Looking closer we find the nine (9) questions used to elicit responses related to Business Glossaries within the Data Governance topic.

1) Is there a policy mandating use and reference to the business glossary?
2) How are organization-wide business terms, definitions, and corresponding metadata created, approved, verified, and managed?
3) Is the business glossary promulgated and made accessible to all stakeholders?
4) Are business terms referenced as the first step in the design of application data stores and repositories?
5) Does the organization perform cross-referencing and mapping of business-specific terms (synonyms, business unit glossaries, logical attributes, physical data elements, etc.) to standardized business terms?
6) How is the organization’s business glossary enhanced and maintained to reflect changes and additions?
7) What role does data governance perform in creating, approving, managing, and updating business terms?
8) Is a compliance process implemented to make sure that business units and projects are correctly applying business terms?
9) Does the organization use a defined process for stakeholders to give feedback about business terms?

Responses are expected to include one or more of the following values describing current state practice and what the respondent believes is a desired end state. These can simply be placed on a scale where the following values are recorded for both current and desired outcomes.

Responses
0 – Never or No
1 – Awareness
2 – Occasionally
3 – Often
4 – Usually
5 – Always or Yes

In this example, note how the relatively simple response scale can be mapped directly onto the scoring descriptions and perspective the DMM follows.

0 – No evidence of processes performed or unknown response.

1 – Performed: Processes are performed ad hoc, primarily at the project level. Processes are typically not applied across business areas. Process discipline is primarily reactive; for example, data quality processes emphasize repair over prevention. Foundational improvements may exist, but improvements are not yet extended within the organization or maintained. Goal: Data is managed as a requirement for the implementation of projects.

2 – Managed: Processes are planned and executed in accordance with policy; employ skilled people with adequate resources to produce controlled outputs; involve relevant stakeholders; and are monitored, controlled, and evaluated for adherence to the defined process. Goal: There is awareness of the importance of managing data as a critical infrastructure asset.

3 – Defined: A set of standard processes is employed and consistently followed. Processes to meet specific needs are tailored from the set of standard processes according to the organization’s guidelines. Goal: Data is treated at the organizational level as critical for successful mission performance.

4 – Measured: Process metrics have been defined and are used for data management. These include management of variance, prediction, and analysis using statistical and other quantitative techniques. Process performance is managed across the life of the process. Goal: Data is treated as a source of competitive advantage.

5 – Optimized: Process performance is optimized through applying Level 4 analysis to identify improvement opportunities. Best practices are shared with peers and industry. Goal: Data is critical for survival in a dynamic and competitive market.
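Because the response scale maps one-for-one onto the DMM levels, the lookup inside a scoring engine can be a simple table. A sketch, with level names and goals paraphrased from the descriptions above:

```python
# DMM maturity levels keyed by the 0-5 response score used in the survey.
DMM_LEVELS = {
    0: ("No Evidence", "No evidence of processes performed, or unknown response."),
    1: ("Performed", "Data is managed as a requirement for project implementation."),
    2: ("Managed", "Awareness of data as a critical infrastructure asset."),
    3: ("Defined", "Data treated organizationally as critical to mission performance."),
    4: ("Measured", "Data treated as a source of competitive advantage."),
    5: ("Optimized", "Data is critical for survival in a dynamic, competitive market."),
}

def describe(score: int) -> str:
    name, goal = DMM_LEVELS[score]
    return f"Level {score} ({name}): {goal}"

print(describe(3))
```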

The key here is capturing both current state (what is being performed now) and the desired end state capability using this tool. The difference or delta between the two values now becomes a data set we can explore with analytic tools to reveal where the greatest challenges are. In this example the clear gaps (represented with orange and red visual cues) show where we should focus our immediate attention and call for further investigation. Yellow shaded topics are less urgent. Green shaded topics don’t need the same rigor when addressing the actions needed in the road map developed in later stages.
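The color cues themselves can come from a simple thresholding rule over the delta; the cut-off values below are illustrative assumptions, not part of the DMM:

```python
def gap_color(current: float, desired: float) -> str:
    """Bucket a topic's gap into a visual cue. Thresholds are illustrative."""
    delta = desired - current
    if delta >= 3:
        return "red"     # severe gap: investigate immediately
    if delta >= 2:
        return "orange"  # significant gap: focus attention here
    if delta >= 1:
        return "yellow"  # moderate gap: less urgent
    return "green"       # minor or no gap: lighter rigor in the road map

print(gap_color(current=1.2, desired=4.0))  # -> red
```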

[Figure: DMM_Focus]

Specific Subject Area – Master Data Management Assessment
In this example we extend and focus on Master Data Management using the same principles and structure of CMMI Institute’s Capability Maturity Model Integration (CMMI), adding proven practice from the Master Data Management domain. Note the framework and topic structure is far more focused to match the MDM model framework. And the library of survey questions used here (225 questions) is far more detailed and now very much focused on Master Data Management.

[Figure: MDM_Topics]

Using the same scoring engine we capture both current state (what is being performed now) and the desired end state capability. The difference or delta between the two values again becomes a data set we can explore with analytic tools to reveal where the greatest challenges are. The clear gaps (represented with orange and red visual cues) pop off the page when the size of the distance between desired and current practice is measured. Now there is a good idea of what needs to be addressed in the road map developed in later stages.

[Figure: MDM_Focus]

This is a quick way to summarize our findings and give valuable clues and direction for further investigation. We can then focus on specific problem areas using detailed schedules based on the field work to date. Based on the gaps uncovered at the higher level summary (Current vs. Desired End State), further investigation should be performed by a professional with deep subject matter expertise and intimate knowledge of generally accepted proven practice. Using the same data set we can now begin to use interactive exploration tools to uncover significant patterns and reveal further insight.
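As a sketch of that exploration, the same findings load cleanly into pandas, where sorting and grouping by delta surfaces the largest gaps first (the column names and values here are hypothetical):

```python
import pandas as pd

# Hypothetical gap analysis findings, one row per topic.
df = pd.DataFrame({
    "framework": ["Governance", "Governance", "Quality", "Operations"],
    "topic": ["Business Glossary", "Policies", "Profiling", "Lifecycle"],
    "current": [1.1, 2.0, 0.8, 3.1],
    "desired": [4.0, 3.5, 3.8, 4.0],
})
df["delta"] = df["desired"] - df["current"]

# Largest gaps first, then rolled up by framework area.
print(df.sort_values("delta", ascending=False))
print(df.groupby("framework")["delta"].mean())
```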

[Figure: MDM_Explore_2]

 

Results
I hope this has helped readers who have asked about how to develop and use gap analysis tools to find quickly which significant delta items (the differences between current and desired states) demand further attention. I think you can see this is a valuable way to quickly gather and compile field work and capture a fairly comprehensive view of the gaps uncovered between the current and desired end state of the subject in question. This method and set of tools can be used on a variety of management challenges across businesses both big and small. Armed with this information we can now go ahead to step four (4) and begin to prioritize the findings from the Gap Analysis exercise into a series of gap closure strategies.

This is an invaluable way to assemble and discover the best sequence of actions (recognizing predecessor – successor relationships) as we move to developing the road map. The difference (delta) between the two (current and desired end state) is the basis for our road map. I hope this has answered many of the questions about step three (3), Conduct a Gap Analysis exercise. This is not the only way to do this, but it has become the most consistent and repeatable method I’m aware of to perform a gap analysis quickly in my practice.


If you enjoyed this post, please share it with anyone who may benefit from reading it. And don’t forget to click the follow button to be sure you don’t miss future posts. I am planning on compiling all the materials and tools used in this series in one place, but am still unsure what form and content would be best for your professional use. Please take a few minutes and let me know what form and format you would find most valuable.

Suggested content for premium subscribers:

  • Topic Area Models (for use with Mind Jet – see https://www.mindjet.com/ for more)
  • Master Data Management Gap Analysis Assessment
  • Data Maturity Management Capability Assessment
  • Analytic Practice Gap Analysis Assessment
  • Big Data Analytic Gap Analysis Assessment
  • Program Gap Analysis Assessment
  • Program Readiness Assessment
  • Project Gap Analysis Assessment
  • Enterprise Analytic Mind Map
  • Reference Library with Supporting Documents

Eight powerful secrets for retaining delighted clients
(What they don’t teach you in business school)

Over the years I have come to believe there are a few simple secrets to my consulting success with organizations both large and small. These secrets are not taught in business schools. And it seems the larger firms have stopped investing in helping younger professionals learn the craft and soft skills needed. Many of these tips are just common sense and represent proven practice. You can always choose to ignore one or more of them and expect a client experience that falls short of expectations. I think carefully about each one of them now in every engagement. They have served me well. I think you will find them invaluable in your professional life as well.

1) Help them listen to themselves
The golden rule of any client communication is to listen. Once you are done listening, repeat or paraphrase what you have heard. Helping a client hear what they’ve just said is invaluable. Not only does it ensure you haven’t misinterpreted anything, hearing their thoughts explained by someone else often highlights potential issues. It is always better for the client to recognize problems themselves instead of having you point them out.

2) Never ignore or reject a bad idea
When the client says ‘my idea is to …’, avoid the instinct to point out why the hilariously awful suggestion won’t work. Instead, listen, take notes, and say something like ‘I will take that into consideration’. When you return with your much better ideas, they probably won’t mention it again. If they do, just say it didn’t quite work. Clients are usually happy that you have considered their ideas and that you are truly receptive to them. No matter how bad.

3) You can have it cheap, fast or good. Pick any two.
Explain the two out of three rule. Clients will always want you to produce distinctive, high quality work in less than a week for next-to-no money. But they sometimes forget that these things come at a price, and their reluctance to pay your rates or pressure to work faster can make you question your own reasoning. The most demanding won’t even understand why it’s not possible. Help them to understand by explaining the ‘two out of three’ rule:

Good Distinctive Quality + Fast = Expensive
You will defer every other unrelated job, cancel all unnecessary tasks, and put in ungodly hours just to get the job done. But don’t expect it to be cheap.

Good Distinctive Quality + Cheap = Slow
We will do a great job for a discounted price, but you will need to be patient until we have a free moment from pressing and better paying clients.

Fast + Cheap = Inferior Quality
Expect an inferior job delivered on time. You truly get what you pay for, and in my opinion this is the least favorable choice of the three. In most cases I decline or disengage rather than commit to something that may damage the relationship. See secret 5 about never presenting ideas you don’t believe in – and who can believe in junk?

To summarize: You can have it cheap, fast or good. Pick any two and meet everyone’s expectations.

4) Don’t make delivery promises straight away
Clients want immediate delivery date commitments. As much as you would like to conclude meetings with a firm ‘yes I can’, always check with your team first, or just ask for some time to ensure you have thought about the commitment carefully. Not only does this show you take deadlines seriously, the next time the same client rings with an urgent request, you can buy time before committing. It is remarkable how many must-have-it-yesterday emergencies fix themselves or simply melt away within hours of the initial request. A commitment is a promise. When you break a promise, no matter how small it may seem to you, it can damage the client relationship and your reputation (brand).

5) Never present ideas you don’t believe in (no junk please!)
It’s tempting when preparing solution options to add one more into the mix. I guess we sometimes think that including a not-so-great idea highlights the effort we put into the preferred option and will make our other suggestions look even stronger. But what if the client picks the wrong option? Then you are stuck producing your own dumb idea, one you simply don’t believe in. If your main ideas are good enough, that’s all you should need. If the client hates them, you can always fall back on the rejects or junk if needed.

6) Don’t assume you’ll find ‘it’ alone
You worked on presenting your ideas alone with little time for collaboration. It looks great to you – what a genius you are. And then the client sees it and says, ‘I’ll know it when I see it, and this is not what I expected’. These are the words you never want to hear as a creative, hard-working professional. You have not found ‘it’. If a client doesn’t like your ideas but can’t explain why, never assume that you can hit the mark next time. The client not knowing what they want is your problem, so spend more time with them exploring other work they like. This helps you get inside their heads and closer to finding the elusive ‘it’.

7) Who actually has final approval?
The final seal of approval may not come from the person you deal with every day. Managers have Directors, Directors have VPs, and VPs have a C-level they report to. So always uncover exactly who the ultimate authority is before proceeding with your ideas. Most people hate showing rough sketches to their boss, meaning you need to flesh out one or two elements in advance (see secret number 6) with your client. Ensuring your work avoids a last minute thumbs-down is worth the time and effort.

8) If all else fails, raise your rates
Most clients are a joy to work with. Hopefully these secrets will help you deal with the most challenging parts of the creative process and leave you with a delighted client. But sometimes you just know deep down that someone is going to be impossible to work with. And you will know this quickly. If simply turning down the work is not an option or would create a poor perception with the client, raise your rates by 20%. At least then you can console yourself with cash during another long weekend of last minute revisions, rework, and unnecessary stress. And what about that time forever lost to your family and loved ones? This is a trap; the higher rate is almost never worth the cost to your nervous system. Manage your time wisely; it truly “is never found again”.


Big Data Analytics – Unlock Breakthrough Results: (Step 1)

You’ve made the big data investment. You believe Nucleus Research when it says that an investment in analytics returns a whopping thirteen (13) dollars for every one (1) dollar spent. Now it’s time to realize the value. This series of posts provides a detailed set of steps you can take to unlock this value in a number of ways. As a simple use case, I’m going to address the perplexing management challenge of platform and tool optimization across the analytic community, using it as an example to illustrate each step. This post addresses the first of nine practical steps to take. Although lengthy, please stick with me; I think you will find this valuable. I’m going to use a proven approach for solving platform and tool optimization in the same manner that proven practice suggests every analytic decision be made. In this case I will leverage the CRISP-DM method (there are others I have used, like SEMMA from SAS) to put business understanding front and center at the beginning of this example.

Yes, I will be eating my own dog food now (this is why a cute puppy is included in a technical post and not the Hadoop elephant) and getting a real taste of what proven practice should look like across the analytic community. Recall the nine steps summarized in a prior post.

1) Gather current state analytics portfolio, interview stakeholders, and compile findings.
2) Determine the analytic operating models in use.
3) Refine Critical Analytic Capabilities as defined to meet site specific needs.
4) Weight Critical Analytic Capability according to each operating model in use.
5) Gather user profiles and simple population counts for each form of use.
6) Gather platform characteristics profiles.
7) Develop platform and tool signatures.
8) Gather data points and align with the findings.
9) Assemble findings and prepare a decision model for platform and tooling optimization.

Using the CRISP-DM method as a guideline, we find that each of the nine steps corresponds to a CRISP-DM phase, as illustrated in the following diagram.

[Figure: CRISP_StepAlignment]

Note there is some overlap between understanding the business and the data. The models we will be preparing use a combination of working papers, logical models, databases, and the Decision Model and Notation (DMN) standard from the OMG to wrap everything together. In this example the output product is less about deploying or embedding an analytic decision and more about taking action based on the results of this work.

Step One – Gather Current State Portfolio
In this first step we are going to gather a deep understanding of what already exists within the enterprise and learn how the work effort is organized. Each examination should include at a minimum:

  • Organization (including its primary and supporting processes)
  • Significant Data Sources
  • Analytic Environments
  • Analytic Tools
  • Underlying technologies in use

The goal is to gather the current state analytics portfolio, interview stakeholders, and document our findings. In brief, this will become an integral part of the working papers we can build on in the steps that follow. This is an important piece of the puzzle we are solving for. Do not even think about proceeding until this is complete. The following diagram (click to enlarge) illustrates the dependencies between accomplishing this field work and each component of the solution.


Unlocking Breakthrough Results – Dependency Diagram

Organization
If form follows function, this is where we begin to uncover the underlying analytic processes and how the business is organized. Understanding the business by evaluating the organization will provide invaluable clues to uncover what operating models are in use.  For example, if there is a business unit organized outside of IT and reporting to the business stakeholder, you will most likely have a decentralized analytics model in addition to the centralized provisioning most analytic communities already have in place.

Start with the organization charts, but do not stop there. I recommend you get a little closer to reality in the interview process to really understand what is occurring in the community. By examining the underlying processes this will become clear. For example, what is the analytic community really doing? Do they use a standard method (CRISP-DM) or something else? An effective way to uncover this beyond the simple organization charts (which are never up-to-date and notorious for mislabeling what people are actually doing) is using a generally accepted model (like CRISP-DM) to organize the stakeholder interviews. This means we can truly understand what is typically performed, by whom, using what processes to accomplish the work – and where boundary conditions exist or, in the worst case, are undefined. An example is in order. Using the CRISP-DM model we see there are several clear activities that typically occur across all analytic communities. This set of processes is summarized in the following diagram (click to enlarge).

[Figure: CRISP_DM_MindMap]

Gathering the analytic inventory and organizing the interviews now becomes an exercise in knowing what to look for using this process model. For example, diving a little deeper we can now explore how modeling is performed during our interviews guided by a generally accepted method. We can structure questions around the how, who, and what is performed for each expected process or supporting activity. Following up on this line of questioning should normally lead to samples of the significant assets which are collected and managed within an analytic inventory. Let’s just start with the modeling effort and a few directed questions.

  • Which organization is responsible for the design, development, testing, and deployment of the models?
  • How do you select which modeling techniques to use? Where are the assumptions used captured?
  • How do you build the models?
  • Where do I find the following information about each model?
    •     Parameter, Variable Pooling Settings
    •     Model Descriptions
    •     Objectives
    •     Authoritative Knowledge Sources Used
    •     Business rules
    •     Anticipated processes used
    •     Expected Events
    •     Information Sources
    •     Data sets used
    •     Description of any Implementation Components needed
    •     A Summary of Organizations Impacted
    •     Description of any Analytic Insight and Effort needed
  • Are anticipated reporting requirements identified?
  • How is model testing designed and performed?
  • Is a regular assessment of the model performed to recognize decay?

When you hear the uncomfortable silence and eyes point to the floor, you have just uncovered a meaningful challenge. Most organizations I have consulted into DO NOT have an analytic inventory, much less the metadata repository (or even a simple information catalog) I would expect to support a consistent, repeatable process. This is a key finding for another kind of work effort that is outside the scope of this discussion. All we are doing here is trying to understand what is being used to produce and deploy information products within the analytic community. And is form really following function as the organization charts have tried to depict? Really?
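If no inventory exists, even a minimal record seeded during the interviews is a start. Here is a sketch of one possible inventory entry, with fields drawn from the question list above (the schema and names are my own, not a standard):

```python
from dataclasses import dataclass, field

@dataclass
class ModelRecord:
    """One entry in a minimal analytic inventory, seeded from interviews."""
    name: str
    description: str = ""
    objectives: list = field(default_factory=list)
    owning_organization: str = ""
    knowledge_sources: list = field(default_factory=list)
    business_rules: list = field(default_factory=list)
    data_sets_used: list = field(default_factory=list)
    organizations_impacted: list = field(default_factory=list)
    reporting_requirements: list = field(default_factory=list)
    last_decay_assessment: str = ""  # date of the last model performance review

churn = ModelRecord(
    name="customer_churn_v2",
    description="Churn propensity model used by Marketing",
    owning_organization="Marketing Analytics",
    data_sets_used=["crm.accounts", "billing.invoices"],
)
print(churn.name, churn.data_sets_used)
```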

An important note: we are not in a process improvement effort; not just yet. Our objective is focused on platform and tool optimization across the analytic community. Believing form really does follow function, it should be clear after this step which platforms and tools are enabling (or inhibiting) an effective response to this important and urgent problem across the organization.

Significant Data Sources
The next activity in this step is to gain a deeper understanding of what data is needed to meet the new demands and business opportunities made possible with big data. Let’s begin with understanding how the raw materials or data stores can be categorized. Data may come from any number of sources, including one or more of the following:

  • Structured data (from tables, records)
  • Demographic data
  • Times series data
  • Web log data
  • Geospatial data
  • Clickstream data from websites
  • Real-time event data
  • Internal text data (i.e. from e-mails, call center notes, claims, etc.)
  • External social media text data

If you are lucky there will be an enterprise data model or someone in enterprise architecture who can point to the major data sources and where the system of record resides. These are most likely organized by subject area (Customer, Account, Location, etc.) and almost always include schema-on-write structures. Although the focus is big data, it is still important to recognize that the vast majority of data collected originates in transactional systems (e.g. Point of Sale). Look for curated data sets and information catalogs (better yet, an up-to-date metadata repository like Adaptive or Alation) to accelerate this task if present.

Data in and of itself is not very useful until it is converted or processed into useful information.  So here is a useful way to think about how this is viewed or characterized in general. The flow of information across applications and the analytic community from sources external to the organization can take on many forms. Major data sources can be grouped into three (3) major categories:

  • Structured Information,
  • Semi-Structured Information and
  • Unstructured Information.

While modeling techniques for structured information have been around for some time, semi-structured and unstructured information formats are growing in importance. Unstructured data presents a more challenging effort. Many believe up to 80% of the information in a typical organization is unstructured, so this must be an important area of focus as part of an overall information management strategy. It is an area, however, where the accepted best practices are not nearly as well-defined. Data standards provide an important mechanism for structuring information. Controlled vocabularies are also helpful (if available) to focus the use of standards to reduce complexity and improve reusability. When we get to modeling platform characteristics and signatures in the later steps, the output of this work will become increasingly valuable.
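One simple way to keep this categorization honest during the field work is to tag each source as it is inventoried; the example mapping below is an illustrative assumption you would adjust per site:

```python
# Tag each inventoried source with one of the three major categories.
SOURCE_CATEGORIES = {
    "point_of_sale_transactions": "structured",
    "demographic_reference": "structured",
    "web_logs": "semi-structured",
    "clickstream_events": "semi-structured",
    "call_center_notes": "unstructured",
    "social_media_text": "unstructured",
}

unstructured = [s for s, c in SOURCE_CATEGORIES.items() if c == "unstructured"]
print(f"{len(unstructured)} of {len(SOURCE_CATEGORIES)} sources are unstructured")
```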

Analytic Landscape
I have grouped the analytic environments, tools, and underlying technologies together in this step because they are usually the easiest data points to gather and compile.

  • Environments
    Environments are usually described as platforms and can take several different forms. For example, you can group these according to intended use as follows:
    – Enterprise Data Warehouse
    – Specialized Data Marts
    – Hadoop (Big Data)
    – Operational Data Stores
    – Special Purpose Appliances (MPP)
    – Online Analytical Processor (OLAP)
    – Data Visualization and Discovery
    – Data Science (Advanced Platforms such as the SAS Data Grid)
    – NLP and Text Engineering
    – Desktop (Individual Contributor; yes think how pervasive Excel and Access are)
  • Analytic Tools
    Gathering and compiling tools is a little more interesting. There is such a wide variety of tools designed to meet several different needs, and significant overlap in delivered functions exists among them. One way to approach this is to group by intended use. Try using the INFORMS taxonomy, for example, to group the analytic tools you find. Their work identified three hierarchical but sometimes overlapping groupings for analytics categories: descriptive, predictive, and prescriptive analytics. These three groups are hierarchical and can be viewed in terms of the level of analytics maturity of the organization. Recognize there are three types of data analysis:

    • Descriptive (some have split Diagnostic into its own category)
    • Predictive (forecasting)
    • Prescriptive (optimization and simulation)

This simple classification scheme can be extended to include lower level nodes and improved granularity if needed. The following diagram illustrates a graphical depiction of the simple taxonomy developed by INFORMS and widely adopted by most industry leaders as well as academic institutions.

[Figure: INFORMS_Taxonomy]

Source: INFORMS (Institute for Operations Research and Management Science)

Even though these three groupings of analytics are hierarchical in complexity and sophistication, they are not clearly separable in practice. That is, the analytics community may be using tools to support descriptive analytics (e.g. dashboards, standard reporting) while at the same time using other tools for predictive and even prescriptive analytics capability in a somewhat piecemeal fashion. And don’t forget to include the supporting tools, which may include metadata functions, modeling notation, and collaborative workspaces for use within the analytic community.
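For the inventory itself, the INFORMS grouping can be captured as a simple lookup; the tool-to-category assignments below are illustrative assumptions, not an authoritative mapping:

```python
# Illustrative grouping of inventoried tools by INFORMS analytics category.
TOOL_CATEGORIES = {
    "descriptive": ["dashboards", "standard reporting", "OLAP browsers"],
    "predictive": ["forecasting workbenches", "statistical modeling tools"],
    "prescriptive": ["optimization solvers", "simulation engines"],
}

def categorize(tool: str) -> str:
    """Return the analytics category for a tool, flagging unknowns for follow-up."""
    for category, tools in TOOL_CATEGORIES.items():
        if tool in tools:
            return category
    return "uncategorized"  # follow up in the stakeholder interviews

print(categorize("simulation engines"))  # -> prescriptive
```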

  • Underlying technologies in use
    Technologies in use can be described and grouped as follows (this is just a simple example and is not intended to be an exhaustive compilation).

    • Relational Databases
    • MPP Databases
    • NoSQL databases
      • Key-value stores
      • Document store
      • Graph
      • Object database
      • Tabular
      • Tuple store, Triple/quad store (RDF) database
      • Multi-Value
      • Multi-model database
    • Semi and Unstructured Data Handlers
    • ETL or ELT Tools
    • Data Synchronization
    • Data Integration – Access and Delivery

Putting It All Together
Now that we have compiled the important information needed, where do we put it for the later stages of the work effort? In an organization of any size this can be quite a challenge, due to the sheer size and number of critical facets we will need later, the number of data points, and the need to re-purpose and leverage this information in a number of views and perspectives.

Here is what has worked for me. First use a mind or concept map (Mind Jet, for example) to organize and store URIs to the underlying assets. Structure, flexibility, and the ability to export and consume data from a wide variety of sources are a real plus. The following diagram illustrates an example template I use to organize an effort like this. Note the icons (notepad, paperclip, and MS-Office); even at this high level they point to the wide variety of content gathered and compiled in the fieldwork (including interview notes and observations).


Enterprise Analytics – Mind Map Example

For larger organizations without an existing Project Portfolio Management (PPM) tool or a metadata repository that supports customizations (extensions, flexible data structures), it is sometimes best to augment the maps with a logical and physical database populated with the values already collected and organized in specific nodes of the map. A partial fragment of a logical model would look something like this, where some sample values are captured in the yellow notes.


Logical Model Fragment
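The same fragment can be expressed directly in code while the database is being stood up. The entities and fields below are a hypothetical reading of the model fragment, shaped by the five inventory categories listed at the start of this step:

```python
from dataclasses import dataclass, field

@dataclass
class Platform:
    """An analytic environment, e.g. EDW, data mart, or Hadoop cluster."""
    name: str
    intended_use: str
    technologies: list = field(default_factory=list)

@dataclass
class AnalyticTool:
    name: str
    informs_category: str  # descriptive / predictive / prescriptive
    platforms: list = field(default_factory=list)

@dataclass
class DataSource:
    name: str
    category: str  # structured / semi-structured / unstructured
    system_of_record: bool = False

edw = Platform("Enterprise Data Warehouse", "managed reporting",
               technologies=["MPP database", "ELT tools"])
print(edw)
```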

Armed with the current state analytics landscape (processes and portfolio), stakeholders’ contributions, and the findings compiled, we are now ready to move on to the real work at hand. In step two (2) we will use this information to determine the analytics operating models in use, supported by the facts.

If you enjoyed this post, please share it with anyone who may benefit from reading it. And don’t forget to click the follow button to be sure you don’t miss future posts. I am planning on compiling all the materials and tools used in this series in one place, but am still unsure what form and content would be best for your professional use. Please take a few minutes and let me know what form and format you would find most valuable.

Suggested content for premium subscribers: 
  • Big Data Analytics – Unlock Breakthrough Results: (Step 1)
  • CRISP-DM Mind Map (for use with Mind Jet, see https://www.mindjet.com/ for more)
  • UML for dependency diagrams. Use with yUML (see http://yuml.me/)
  • Enterprise Analytics Mind Map (for use with Mind Jet)
  • Logical Data Model (DDL; use with your favorite tool)
  • Analytics Taxonomy, Glossary (MS-Office)
  • Reference Library with Supporting Documents

Big Data Analytics – Nine Easy Steps to Unlock Breakthrough Results

An earlier post addressed one of the more perplexing challenges in managing an analytic community of any size: the irresistible urge to cling to what everyone else seems to be doing without thinking carefully about what is needed, not just wanted. This has become more important and urgent with the breathtaking speed of Big Data adoption in the analytic community. Older management styles and obsolete thinking have created needless friction between the business and their supporting IT organizations. Unlocking breakthrough results requires a deep understanding of why this friction is occurring and what can be done to reduce this unnecessary effort so everyone can get back to the work at hand.

There are two very real and conflicting views that we need to balance carefully. The first, driven by the business, is concerned with just getting the job done and lends itself to an environment where tools (and even methods) proliferate rapidly. In most cases this results in overlapping, redundant, and expensive functionality. Less concerned with solving problems once, the analytic community is characterized by many independent efforts where significant intellectual property (analytic insight) is not captured and is inadvertently placed at risk.

The second view, in contrast, is driven by the supporting IT organization charged with managing and delivering supporting services across a technology portfolio that values efficiency and effectiveness. The ruthless pursuit of eliminating redundancy, leveraging the benefits of standardization, and optimizing investment drives this behavior. So this is where the friction is introduced. Until you understand this dynamic, be prepared to experience organizational behavior that seems puzzling and downright silly at times. Questions like these (yes, they are real) never seem to be resolved.

– Why do we need another data visualization tool when we already have five in the portfolio?
– Why can’t we just settle on one NoSQL alternative?
– Is the data lake really a place to worry about data redundancy?
– Should we use the same Data Quality tools and principles in our Big Data environment?

What to Do
So I’m going to share a method to help resolve this challenge and focus on what is important, so you can expend your nervous system solving problems rather than creating them. Armed with a true understanding of the organizational dynamics, it is now a good time to revisit the first principle that form follows function to help resolve and mitigate what is an important and urgent problem. For more on this important principle see Big Data Analytics and Cheap Suits.

This method knits together several key components and tools to craft an approach that you may find useful. The goal is to organize and focus the right resources to ensure successful Big Data Analytic programs meet expectations. Because of the amount of content delivered, I will break this down into several posts, each building on the other, to keep the relative size and readability manageable. This approach seemed to work with the earlier series Modeling the MDM Blueprint and How to Build a Roadmap, so I think I will stick with this method for now.

The Method
First let’s see what the approach looks like independent of any specific tools or methods. It includes nine (9) steps which can be performed concurrently by both business and technical professionals working together to arrive at the suggested platform and tooling optimization decisions. Each of the nine (9) steps in this method will be accompanied by a suggested tool or method to help you prepare your findings in a meaningful way. Most of these tools are quite simple; some will be a little more sophisticated. This represents a starting point on your journey and can be extended in any number of ways to create more refined uses and to re-purpose the data and facts collected in this effort. The important point is that all steps are designed to organize and focus the right resources to ensure successful Big Data Analytic programs meet expectations. Executed properly, you will find a seemingly effortless way to help:

– Reduce unnecessary effort
– Capture, manage, and operationally use analytic insight
– Uncover inefficient tools and processes and take action to remedy
– Tie results directly to business goals constrained by scope and objectives

So presented here is a simplified method to follow to compile an important body of work, supported by facts and data to arrive at any number of important decisions in your analytics portfolio.

1) Gather current state analytic portfolio, interview stakeholders, and document findings
2) Determine the analytic operating model in use (will have more than one, most likely)
3) Refine Critical Analytic Capabilities as defined to meet site specific needs
4) Weight Critical Analytic Capability according to each operating model in use
5) Gather user profiles and simple population counts for each form of use
6) Gather platform characteristics profiles
7) Develop platform and tool signatures
8) Gather data points and align with the findings
9) Assemble findings and prepare a decision model for platform and tooling optimization

The following diagram illustrates the method graphically (click to enlarge).

[Figure: MethodSummary]

In a follow-up post I will dive into each step, starting with gathering the current state analytic portfolio, interviewing stakeholders, and documenting your findings. Along the way I will provide examples and tools you can use to help make your decisions and unlock breakthrough results. Stay tuned…

How to build a Roadmap – Gap Analysis

An earlier post in this series, How to Build a Roadmap, discussed the specific steps required to develop a well thought out road map. This roadmap identified specific actions using an overall pattern ALL roadmaps should follow. The steps required to complete this work:

  1. Develop a clear and unambiguous understanding of the current state
  2. Define the desired end state
  3. Conduct a Gap Analysis exercise
  4. Prioritize the findings from the Gap Analysis exercise into a series of gap closure strategies
  5. Discover the optimum sequence of actions (recognizing predecessor – successor relationships)
  6. Develop and Publish the Road Map

This post discusses how to develop a robust gap analysis to identify any significant shortcomings between the current and desired end states. We use these findings to begin developing strategy alternatives (and related initiatives) to address what has been uncovered. Our intent is to identify the difference (delta) from where we are to what we aspire to become. This exercise is critical to identify what needs to be accomplished. The gap analysis leads to a well-organized set of alternatives and viable strategies we can use to complete the remaining work.

Gap Analysis

Click to enlarge

We are seeking a quick and structured way to define actionable activities to be reviewed and approved by all stakeholders. This includes identifying a set of related organizational, functional, process, and technology initiatives needed (I will address the architectural imperatives in another post). The gap closure recommendations provide a clear line of sight back to what needs to be accomplished to close the “delta” or gaps uncovered in the analysis.

The gaps discovered can be grouped across three broad categories, each including specific actionable activities and management initiatives related to:

  • Building organizational capability,
  • Driving organizational commitment, and
  • Right-Fitting the solution to ensure we do not try to build a system whose complexity exceeds the organization’s capability to deliver

While every organization’s “reach should always exceed its grasp,” we should also understand the need to introduce this new discipline in a measured and orderly manner. There are many good (if a little too academic for me) books and writers who have addressed this topic well. What is different about my approach is my need to identify specific actionable activities we can undertake in a consistent way to change into what we aspire to be.

Gap Analysis

There are a couple of ways to do this. Remember, all three dimensions referred to above should be considered in this analysis. A helpful outline should include questions related to each of the three aspects of our work. Note this is a generalized example and represents three distinct entry points to evaluate. Further exploration into any of these three is usually needed when developing the detailed planning products in later phases. Using this approach we can quickly reveal where significant gaps exist and explore further as needed. So let us focus on the key themes early to guide our work moving forward.

Organizational Capability

Is the organization ready to embrace the initiative?

  • Have we identified baseline adoption of fundamental management capability across both business and technical communities that can be leveraged to drive the effort?
  • Is there executive consensus? and
  • Do plans exist to execute immediate actions to address missing operational capability (gaps)?

Is there evidence of detailed planning to stage and execute the transformation?

  • Have we consciously chosen maturity jumps in a measured and controlled manner?
  • Do we understand the expected change in process consistency, discipline, and complexity may require some deep cultural shifts to occur? and
  • Have we clearly articulated the associated operational impacts to the stakeholders?

Are capability-based plans included or needed at this time?

  • Have we accounted for internal and external bandwidth in capability and core competency?
  • Have we factored in enough time to stabilize the management foundation and professional community?
  • Are there critical dependencies with other ongoing programs?
  • Is there an effort underway to secure the participation of critical roles and key personnel?

Organizational Commitment

Is there evidence that marketing of the compelling vision is occurring in an organized manner?

  • Is there an effort to quantify and repeatedly communicate the value to the organization?
  • Has the “what’s in it for me” messaging for critical stakeholders been developed?
  • Are goals and objectives communicated in a consistent, repeatable manner?

Is there a need to proactively Manage Stakeholder Buy-In?

  • Have we created opportunities for stakeholder involvement?
  • Do we need to design quantitative usage metrics? and
  • Are there incentives to align and reward desired behavior?

Do we need to develop and enhance change leadership?

  • Develop managers’ communication skills,
  • Develop expectation and capacity management skills,
  • Assign dedicated transition management resources to the effort.

Are strong Governance and Oversight roles accepted and adopted as a critical success factor?

  • Is there active executive sponsor involvement?
  • Have we defined performance outcomes to direct and track success? and
  • Are line and staff managers accountable for progress to plan?

Are goals and objectives communicated in a consistent, repeatable manner?

  • Do we need to institute a comprehensive and open communication plan that publishes program information to the organization in a consistent manner?

Right-Fitting the Technical Solution

Has the operating model been defined?

  • Ensure that introducing and adopting new processes is aligned to business intent,
  • Balance the trade-offs between structure and process,
  • Formally assign decision rights,
  • Have the new roles been defined where needed? and
  • Can we leverage reuse of existing assets where possible?

What about developing necessary Management and Business User Skills? 

  • Enhance domain specific skills,
  • Improve decision management, and
  • Adopt and refine fundamental technical and business skills related to operations

Are there significant gaps in the current architecture or environment that would prevent successful delivery? 

  • Information, Application, and Technical architecture can support the desired end state
  • Required ITIL (Information Technology Infrastructure Library) Service and Support practices exist, and
  • Complexity of the solution will not exceed the organization’s technical ability to deliver

The gap closure strategy should include specific recommendations for building organizational capability and driving commitment to this effort. In addition we should ensure to right-fit a technical solution the organization can grow into as it achieves widespread adoption across the enterprise.  The approach is carefully weighed to align the three perspectives to ensure our gap closure strategy recommendations are not used to build a system whose complexity exceeds the organization’s capability to deliver.

A typical gap analysis sequence starts with an understanding of the strategy as defined. This in turn drives the organizational structure. Processes are based on the organization’s structure. Structure and processes further refine reward systems and policy. Beginning with strategy, we uncover gaps where a shared set of goals (and related objectives) may not align with the desired end state. This gap analysis can be organized around the following categories:

  • People/Organization considers the human side of Information Management, looking at how people are measured, motivated, and supported in related activities. Organizations that motivate staff to think about information as a strategic asset tend to extract more value from their systems and overcome shortcomings in other categories.
  • Policy considers the message to staff from leadership. The assessment considers whether staff are required to administer and maintain information assets appropriately and whether there are consequences for inappropriate behavior. Without good policies and executive support it is difficult to promote good practices, even with the right supporting tools.
  • Process and Practice considers whether the organization has adopted standardized approaches to Information Management. Even with the right tools, measurement approaches, and policies, information assets cannot be sustained unless processes are consistently implemented. Poor processes result in inconsistent data and a lack of trust by stakeholders.
  • Technology covers the tools that are provided to staff to properly meet their Information Management duties. While technology on its own cannot fill gaps in the information resources, a lack of technological support makes it impractical to establish good practices.

How it Works – An Example

The questions in the quick scan tool used in a prior post to define our end state were organized around six (6) key groups: Organization, Policy, Technology, Compliance, Measurement, and Process/Practice. This is a quick way to summarize our findings and provide valuable clues and direction for further investigation. We can then focus on specific subject areas using detailed schedules based on the field work to date. Based on the subject gaps uncovered at the higher level summary (Current vs. Desired End State), further investigation should be performed by a professional with deep subject matter expertise and intimate knowledge of generally accepted best practices. In fact it is best to use prepared schedules in the early field work (if possible) to begin gathering and compiling the facts needed during the interview process, to mitigate “churn” and unnecessary rework.

For example, in the Process/Practice area we can use the Information Technology Infrastructure Library (ITIL) to uncover any significant gaps in the Service and Support delivery functions needed to support the defined end state.  Detailed schedules can be compiled and the organization’s current state evaluated against this Library and other best practices to ensure the necessary process and practices are in place to enable the proposed end state solution.

ITIL Service Delivery

  • Service Level Management
  • Capacity Management
  • Availability Management
  • Service Continuity Management
  • Financial Management

ITIL Service Support

  • Incident Management
  • Problem Management
  • Configuration Management
  • Change Management
  • Release Management

The following fragment illustrates an example schedule related to Service Continuity Management. Using this schedule we capture our findings, suggested initiatives or projects, expected deliverables, level of effort, and relative priority of the gap identified. This is a quick way to summarize our findings and prepare for the next step (4 – Prioritize the findings from the Gap Analysis exercise into a series of gap closure strategies).
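The schedule rows can also be captured as structured data so the prioritization in step four works from a consistent record; the field names mirror the schedule columns described above, and the values are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class GapScheduleEntry:
    """One row in a gap closure schedule, ready for step-four prioritization."""
    area: str              # e.g. an ITIL Service Delivery or Support function
    finding: str
    suggested_initiative: str
    expected_deliverables: str
    level_of_effort: str   # e.g. low / medium / high
    priority: int          # 1 = most urgent

entry = GapScheduleEntry(
    area="Service Continuity Management",
    finding="No documented recovery plan for the analytics platform",
    suggested_initiative="Develop and test a service continuity plan",
    expected_deliverables="Approved continuity plan; annual test results",
    level_of_effort="medium",
    priority=1,
)
print(entry.area, "->", entry.suggested_initiative)
```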

Click to enlarge

In another example, this small fragment from a Master Data Management (MDM) related gap schedule addresses specific Data Profiling activities expected within the context of a Party or Customer supporting function. What is clear from this schedule is that no evidence of profiling has been found. This is a significant gap in the MDM domain. We should have some idea of the relative quality of the data sourced into our platform and be able to keep our customers informed of the level of confidence they should expect based on this analysis. This represents a clear gap and should be addressed in the roadmap we will develop in later stages.
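Closing a profiling gap like this does not require exotic tooling to get started. Here is a minimal completeness and uniqueness profile of a hypothetical Party table, sketched with pandas (the column names are assumptions):

```python
import pandas as pd

# Hypothetical Party records sourced into the MDM platform.
party = pd.DataFrame({
    "party_id": [1001, 1002, 1003, 1003],
    "name": ["Acme Corp", "Jane Smith", None, "Widget LLC"],
    "tax_id": ["12-345", None, None, "98-765"],
})

# Completeness: share of non-null values per column.
print(party.notna().mean())

# Uniqueness: duplicate keys undermine confidence in a master data platform.
print("duplicate party_id rows:", party["party_id"].duplicated().sum())
```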

Click to enlarge

Results

I think you can see this is a valuable way to quickly gather and compile field work and capture a fairly comprehensive view of the gaps uncovered between the current and desired end states of the domain in question. Armed with this information we can now proceed to step four (4) and begin to prioritize the findings from the Gap Analysis exercise into a series of gap closure strategies.

This is an invaluable way to assemble and discover the optimum sequence of actions (recognizing predecessor – successor relationships) as we move to developing the road map. The difference (delta) between the two (current and desired end state) is the basis for our road map. I hope this has answered many of the questions about step three (3), Conduct a Gap Analysis exercise. This is not the only way to do this, but it has become the most consistent and repeatable method I’m aware of to perform a gap analysis quickly in my practice.

How to build a Roadmap – Define End State

An earlier post (How to Build a Roadmap) discussed the specific steps required to develop a well thought out road map. This method identified specific actions using an overall pattern ALL roadmaps should follow. The steps required to complete this work:

1) Develop a clear and unambiguous understanding of the current state

2) Define the desired end state

3) Conduct a Gap Analysis exercise

4) Prioritize the findings from the Gap Analysis exercise into a series of gap closure strategies

5) Discover the optimum sequence of actions (recognizing predecessor – successor relationships)

6) Develop and Publish the Road Map

I have discussed a way to quickly complete step one (1), Define Current State. This post will discuss how to craft a suitable desired End State definition so we can use the results from the Current State work and begin our gap analysis. Our intent is to identify the difference (delta) from where we are to what we aspire to become. I know this seems obvious (and maybe a little redundant). This baseline is critical to identify what needs to be accomplished to meet the challenge.

The reality of organizational dynamics and politics (we are human after all) can distort the reality we are seeking here and truly obscure the findings. I think this happens in our quest to preserve the preferred “optics”. This is especially so when trying to define our desired end state. The business will have a set of strategic goals and objectives that may not align with the individuals we are collaborating with to discover what the tactical interpretation of this end state really means. We are seeking a quick, structured way to define a desired end state that can be reviewed and approved by all stakeholders when this activity gets underway.  The tactical realization of the strategy (and objectives) is usually delegated (and rightly so) to front line management. The real challenge is eliciting, compiling, and gaining agreement on what this desired end state means to each of the stakeholders. This is not an easy exercise, and it demands a mastery of communication and facilitation skills many are not comfortable with or have not exercised on a regular basis.  A clear understanding of the complex interactions within any organization (and their unintended consequences) is critical to an accurate definition of the desired end state.

Define Desired End State

There are a couple of ways to do this. One interesting approach I have seen is to use the Galbraith Star Model as an organizational design framework. The model is developed within this framework to understand what design policies and guidelines will be needed to align organizational decision making and behavior. The Star model includes the following five categories:

  • Strategy: Determine direction through goals, objectives, values and mission. It defines the criteria for selecting an organizational structure (for example functional or balanced Matrix). The strategy defines the ways of making the best trade-off between alternatives.
  • Structure: Determines the location of decision making power. Structure policies can be subdivided into: specialization: type and number of job specialties; shape: the span of control at each level in the hierarchy; distribution of power: the level of centralization versus decentralization; departmentalization: the basis to form departments (function, product, process, market or geography).
  • Processes: The flow of information and decision processes across the proposed organization’s structure. Processes can be either vertical through planning and budgeting, or horizontal through lateral relationships (matrix).
  • Reward Systems: Influence the motivation of organization members to align employee goals with the organization’s objectives.
  • People and Policies: Influence and define employees’ mindsets and skills through recruitment, promotion, rotation, training and development.

The preferred sequence in this design process follows this order:

  • strategy;
  • structure;
  • key processes;
  • key people;
  • roles and responsibilities;
  • information systems (supporting and ancillary);
  • performance measures and rewards;
  • training and development; and
  • career paths.
Strategy Model (figure)

A typical design sequence starts with an understanding of the strategy as defined. This in turn drives the organizational structure. Processes are based on the organization’s structure. Structure and processes further refine reward systems and policy. Beginning with Strategy, we uncover a shared set of goals (and related objectives) to define the desired end state, organized around the following categories:

  • People/Organization considers the human side of Information Management, looking at how people are measured, motivated and supported in related activities.  Those organizations that motivate staff to think about information as a strategic asset tend to extract more value from their systems and overcome shortcomings in other categories.
  • Policy considers the message to staff from leadership.  The assessment considers whether staff is required to administer and maintain information assets appropriately and whether there are consequences for inappropriate behaviors.  Without good policies and executive support it is difficult to promote good practices even with the right supporting tools.
  • Process and Practice considers whether the organization has adopted standardized approaches to Information Management.  Even with the right tools, measurement approaches and policies, information assets cannot be sustained unless processes are consistently implemented.  Poor processes result in inconsistent data and a lack of trust by stakeholders.
  • Technology covers the tools provided to staff to properly meet their Information Management duties.  While technology on its own cannot fill gaps in the information resources, a lack of technological support makes it impractical to establish good practices.

Goal Setting

Goal setting is a process of determining what the stakeholders’ goals are, working towards them, and measuring progress to plan. A generally accepted process for setting goals uses the SMART acronym (Specific, Measurable, Achievable, Realistic, and Time Bound). Each of these attributes is described below, followed by a small sketch of how a goal can be captured in this form.

  • Specific: A specific goal has a much greater chance of being accomplished than a general goal.  Don’t “boil the ocean”; try to remain as focused as possible. Provide enough detail so that there is little or no confusion as to what exactly the stakeholder should be doing.
  • Measurable: Goals should be measurable so we can track progress to plan as it occurs. A measurable goal has an outcome that can be assessed either on a sliding scale (1-10), or as a hit or miss, success or failure. Without measurement, it is impossible to sustain and manage the other aspects of the framework.
  • Achievable: An achievable goal has an outcome that is realistic given the organization’s capability to deliver with the necessary resources and time. Goal achievement may be more of a “stretch” if the outcome is difficult to begin with. Is what we are asking of the organization possible?
  • Realistic: Start small and remain sharply focused on what the organization can and will do, and let the stakeholders experience the joys of meeting their goals.  Gradually increase the intensity of the goal after having a discussion with the stakeholders to redefine it.  Is our goal realistic given the budget and timing constraints?  If not, then we might want to redefine the goal.
  • Time Bound: Set a timeframe for the goal: for next quarter, in six months, by one year. Setting an end point for the goal gives the stakeholders a clear target to achieve.  Planned follow-up should occur within a 6-month period (best practice) but may occur within a one-year period, or earlier based on progress to plan.
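
Here is the sketch promised above: a minimal, hypothetical way to capture a goal’s SMART attributes as a structured record so completeness can be checked before the goal is accepted. The field names and example values are assumptions for illustration only.

```python
from dataclasses import dataclass, fields
from datetime import date

@dataclass
class SmartGoal:
    """A goal captured against the five SMART attributes."""
    specific: str      # what exactly will be done, and by whom
    measurable: str    # the metric or pass/fail test used to assess the outcome
    achievable: str    # why this is possible given resources and capability
    realistic: str     # fit with budget and timing constraints
    time_bound: date   # the target date for the outcome

    def is_complete(self) -> bool:
        # A goal is only SMART when every attribute has been filled in.
        return all(getattr(self, f.name) for f in fields(self))

# Hypothetical example goal.
goal = SmartGoal(
    specific="Publish a data quality scorecard for all Party/Customer feeds",
    measurable="Scorecard covers 100% of source feeds with profiled metrics",
    achievable="Two analysts and existing profiling tooling are available",
    realistic="Fits within the current quarter's budget",
    time_bound=date(2014, 6, 30),
)
print(goal.is_complete())  # True
```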

Defining the desired end state is accomplished through a set of questions used to draw participants into the process to meet our SMART objectives.  This set of questions is compiled, evaluated, and presented in a way that is easy to understand. Our goal here is to help everyone participating in the work to immediately grasp where the true gaps or shortcomings exist, and why, when we reach step three (3), the gap analysis phase.  This is true whether we are evaluating Information Strategy, our readiness to embrace a SOA initiative, or the launch of a new business initiative. We can complete the design process by using a variety of tools and techniques. I have used IDEF, BPMN, and other process management methods and tools (including RASIC charts describing roles and responsibilities, for example). Whatever tools you elect to use, they should effectively communicate intent and be used to validate changes with the stakeholders who must be engaged in this process.

Now this is where many of us come up short.  Where do I find the questions to help drive SMART goals? How do I make sure they are relevant? What is this engine I need to compile the results? And how do I quickly compile the results dynamically and publish them for comment every time I need to?

One of the answers for me came a few years ago when I first saw the MIKE 2.0 quick assessment engine for Information Maturity. The Information Maturity (IM) Quick Scan is the MIKE 2.0 tool used to assess current and desired Information Maturity levels within an organization. This survey instrument is broad in scope and is intended to assess enterprise capabilities as opposed to focusing on a single subject area. Although this instrument focuses on Information Maturity I realized quickly I had been doing something similar for years across many other domains. The real value here is in the open source resource you can use to kick start your own efforts.  I think it is also a good idea to become familiar with the benchmarks and process classification framework the American Productivity and Quality Center (APQC) has made available for a variety of industries. The APQC is a terrific source for discovering measures and quantifiable metrics useful for meeting the need for specific, measurable objectives to support the end state definition.

How it Works

The questions in the quick scan are organized around six (6) key groups in this domain: Organization, Policy, Technology, Compliance, Measurement, and Process/Practice.  The results are tabulated based on responses (in the case of the MIKE 2.0 template) ranging from zero (0 – Never) to five (5 – Always).  Of course you can customize the responses; the real point is that we want to quantify the responses received.  The engine component takes the results, builds a summary, and produces accompanying tabs where radar graph plots present the Framework, Topic, Lookup, # Questions, Total Score, Average Score, and Optimum within each grouping.  The MS Word document template then links to this worksheet and grabs the values and radar charts produced to assemble the final document. If all this sounds confusing, please grab the templates and try them for yourself.
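
For readers who want to see the mechanics outside of Excel, here is a minimal sketch of the tabulation the engine performs. This is my own illustration, not the MIKE 2.0 implementation; the groups and questions shown are placeholders.

```python
from collections import defaultdict

OPTIMUM = 5  # the "Always" end of the 0-5 response scale

# Each response is (group, question, score); scores run 0 (Never) to 5 (Always).
responses = [
    ("Policy", "Are information policies documented?", 3),
    ("Policy", "Are policy violations followed up?", 1),
    ("Measurement", "Is data quality measured routinely?", 2),
]

def summarize(responses):
    groups = defaultdict(list)
    for group, _question, score in responses:
        groups[group].append(score)
    # Per group: question count, total, average, and the gap to the optimum.
    return {
        g: {
            "questions": len(s),
            "total": sum(s),
            "average": sum(s) / len(s),
            "gap_to_optimum": OPTIMUM - sum(s) / len(s),
        }
        for g, s in groups.items()
    }

for group, stats in summarize(responses).items():
    print(group, stats)
```

The per-group averages are what the radar charts plot against the optimum, making the largest deltas visible at a glance.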

Define Desired End State Model (figure)

The groupings (and related sub-topics) are organized out of the box around the following perspectives:

  • Compliance
  • Measurement
  • People/Organization
  • Policy
  • Process/Practice
  • Technology

Each of these perspectives is summarized and combined into an MS Word document to present to the stakeholders.  The best part of this tool is that it can be used periodically to augment quantitative measures (captured in a dashboard, for example) to assess progress to plan and the improvement realized over time. Quantifying improvement quickly is vital to continued adoption of change. Communicating the results to stakeholders in a quick, easy-to-understand format they are already familiar with is just as important, using the same consistent, repeatable tool we used to define the current state.

Results

I think you can see this is a valuable way to reduce complexity and gather, compile, and present a fairly comprehensive view of the desired end state of the domain in question. Armed with this view we can now proceed to step three (3) and begin to conduct the Gap Analysis exercise. The difference (delta) between these two (current and desired end state) becomes the basis for our road map development.  I hope this has answered many of the questions about step two (2), Define End State. This is not the only way to do this, but it has become the most consistent and repeatable method I’m aware of to define a desired end state quickly in my practice.  Understanding the gaps between the current and the desired end state across the business, information, application, and technical architecture makes development of a robust solution delivery road map possible.

How to build a Roadmap – Define Current State

Introduction

In an earlier post (How to Build a Roadmap) I discussed the specific steps required to develop a defensible, well thought out road map that identifies specific actions using an overall pattern all roadmaps should follow, using the same six steps listed in the previous section.

I have received a lot of questions about step one (1), which is understandable given the lack of detail about just how to quickly gather real, quantifiable objectives and potential functional gaps. In the interest of simplicity, and in an attempt to keep the length of the prior post manageable, specific details about how to do this were omitted.

This post will provide a little more exposition and insight into one method I have found useful in practice. Done well, it can provide an honest and objective look in the mirror to understand where we truly are as an organization and, in some cases, face the uncomfortable truth about where we need to improve.  The reality of organizational dynamics and politics (we are human after all) can distort the reality we are seeking here and truly obscure the findings. I think this happens in our quest to preserve the preferred “optics” without an objective and shared method all stakeholders are aware of and approve before embarking down this path. In the worst case, if left in the hands of outside consultants alone or in the hands of an unskilled practitioner, we risk creating more harm than good before even starting. This is why I will present a quick, structured way to gather and evaluate current state that can be reviewed and approved by all stakeholders before the activity even gets underway.  Our objective is to develop a clear and unambiguous understanding of the current state. We should have a formal, well understood way to gather and evaluate the results.

Define Current State

First, we need a shared, coherent set of relevant, quantifiable questions we can use to draw participants into the process.  This set of questions should be able to be compiled, evaluated, and presented in a way that is easy to understand. Everyone participating in the work should be able to immediately grasp where the true gaps or shortcomings exist and why this is occurring.  This is true whether we are evaluating Information Strategy, our readiness to embrace a SOA initiative, or the launch of a new business initiative. So, we need just a few key components:

  • A pool or set of relevant questions that can be answered quickly by the participants, with results that can be quantified
  • An engine to compile the results
  • A quick way to compile and summarize the results for distribution to the stakeholders

Now this is where many of us come up short.  Where do I find the questions, and how do I make sure they are relevant? What is this engine I need to compile the results? And how do I quickly compile the results dynamically and publish them for comment every time I need to? One of the answers for me came a few years ago when I first saw the MIKE 2.0 quick assessment engine for Information Maturity.

The Information Maturity (IM) Quick Scan is the MIKE 2.0 tool used to assess current and desired Information Maturity levels within an organization. This survey instrument is broad in scope and is intended to assess enterprise capabilities as opposed to focusing on a single subject area.  Although this instrument focuses on Information Maturity I realized quickly I had been doing something similar for years across many other domains. The real value here is in the open source resource you can use to kick start your own efforts.

So what does this do?

I’m going to focus on the MIKE 2.0 tools here because they are readily available to you. The MS Office templates you need can be found at http://mike2.openmethodology.org/wiki/QuickScan_MS_Office_survey.

Extending these templates into other subject areas is pretty simple once you understand how they work. The basic premise remains the same; it really is just a matter of injecting your own subject matter expertise and organizing the results in a way that makes sense to you and your organization.  So here is what you will find there.

First a set of questions organized around the following categories:

  • People/Organization considers the human side of Information Management, looking at how people are measured, motivated and supported in related activities.  Those organizations that motivate staff to think about information as a strategic asset tend to extract more value from their systems and overcome shortcomings in other categories.
  • Policy considers the message to staff from leadership.  The assessment considers whether staff are required to administer and maintain information assets appropriately and whether there are consequences for inappropriate behaviors.  Without good policies and executive support it is difficult to promote good practices even with the right supporting tools.
  • Technology covers the tools that are provided to staff to properly meet their Information Management duties.  While technology on its own cannot fill gaps in the information resources, a lack of technological support makes it impractical to establish good practices.
  • Compliance surveys the external Information Management obligations of the organization.  A low compliance score indicates that the organization is relying on luck rather than good practice to avoid regulatory and legal issues.
  • Measurement looks at how the organization identifies information issues and analyzes its data.  Without measurement, it is impossible to sustainably manage the other aspects of the framework.
  • Process and Practice considers whether the organization has adopted standardized approaches to Information Management.  Even with the right tools, measurement approaches and policies, information assets cannot be sustained unless processes are consistently implemented.  Poor processes result in inconsistent data and a lack of trust by stakeholders.

The templates include an engine to compile the results and an MS Word document template to render and present the results. Because it is based on MS Office, the Assessment_Questions.xlsx, Assessment_Engine.xlsx, and Assessment_Report.docx files are linked (not by relative paths; the links are hardcoded to find the files in the c:\assessment folder, so keep them there). You open and score Assessment_Questions first; the Assessment_Engine then picks up these values and creates a nice tabbed interface and charts across all six subject areas. The Word document picks this up in turn and creates the customized report.
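
If the hardcoded c:\assessment path is a problem in your environment, one workaround is to read the scored questions workbook directly and tabulate the results yourself. Here is a minimal sketch using openpyxl; the sheet name and column layout are assumptions you would adjust to match the template you downloaded.

```python
from collections import defaultdict
from openpyxl import load_workbook

# Assumed layout: one row per question, group in column A, score (0-5) in column B.
wb = load_workbook("Assessment_Questions.xlsx", data_only=True)
ws = wb["Questions"]  # hypothetical sheet name; check the actual workbook

scores = defaultdict(list)
for group, score in ws.iter_rows(min_row=2, max_col=2, values_only=True):
    if group is not None and score is not None:
        scores[group].append(score)

for group, values in scores.items():
    avg = sum(values) / len(values)
    print(f"{group}: {len(values)} questions, average {avg:.1f} of 5")
```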

You can extend this basic model to include your own relevant questions in other domains (for example, ESB or SOA related, or Business Intelligence).  We are going to stick with the Information Maturity quick scan for now. Note I have extended a similar model to include SOA Readiness, BI/DW, and Business Strategy Assessments.

How it Works

The questions in the quick scan are organized around six (6) key groups in this domain: Organization, Policy, Technology, Compliance, Measurement, and Process/Practice.  The results are tabulated based on responses (in the case of the MIKE 2.0 template) ranging from zero (0 – Never) to five (5 – Always).  Of course you can customize the responses; the real point is that we want to quantify the responses received.

The engine component takes the results, builds a summary, and produces accompanying tabs where radar graph plots present the Framework, Topic, Lookup, # Questions, Total Score, Average Score, and Optimum within each grouping.  The MS Word document template then links to this worksheet and grabs the values and radar charts produced to assemble the final document. If all this sounds confusing, please grab the templates and try them for yourself.
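
As an aside, if you want the same radar view outside of Excel, a few lines of matplotlib will plot the per-group averages against the optimum. The group names match the quick scan; the scores below are purely illustrative.

```python
import numpy as np
import matplotlib.pyplot as plt

groups = ["Organization", "Policy", "Technology",
          "Compliance", "Measurement", "Process/Practice"]
averages = [3.2, 1.8, 2.5, 4.0, 1.2, 2.9]  # illustrative 0-5 averages
optimum = [5.0] * len(groups)

# Radar charts are polar plots; repeat the first point to close each polygon.
angles = np.linspace(0, 2 * np.pi, len(groups), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, averages + averages[:1], label="Current")
ax.plot(angles, optimum + optimum[:1], linestyle="--", label="Optimum")
ax.set_xticks(angles[:-1])
ax.set_xticklabels(groups)
ax.set_ylim(0, 5)
ax.legend(loc="lower right")
plt.show()
```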

Define Current State Model (figure)

Each of these six (6) perspectives is then summarized and combined into an MS Word document to present to the stakeholders.

Results

I think you can see this is a valuable way to reduce complexity and gather, compile, and present a fairly comprehensive view of the current state of the domain in question (in this case Information Management Maturity). Armed with this quantified information we can now proceed to step three (3) and conduct a Gap Analysis exercise based on our understanding of the desired end state. The delta between these two (current and desired end state) becomes the basis for our road map development.  I hope this has answered many of the questions about step one (1), Define Current State. This is not the only way to do this, but it has become the most consistent and repeatable method I’m aware of to quickly define current state in my practice.