With new technologies, it's common to do a readiness assessment to make sure that you're ready for go-live. But what exactly are the measures organizations can use to quantify how ready they are, and what are the risks? That's what I want to talk about here today.
When organizations embark on their digital transformations, they inevitably reach a point in the implementation where they have to start thinking about go-live. Oftentimes, organizations will conduct a go/no-go assessment, a kind of red/yellow/green evaluation of where things stand and what the risks are. That's typically a very qualitative exercise, usually based on gut feel, without many measures or metrics behind it. What organizations should be doing instead is looking at tangible measures that quantify how ready the organization is.
It's very rare that an organization will be 100% ready, but we can at least quantify how close to 100% it is and, for the areas where we're not at 100%, what the risks are and whether we're comfortable with them. Quantifying these measures is very important for ensuring that the project team is comfortable and ready to move forward to go-live, but also so that your executive steering committee and executive team can be comfortable as well.
Now, the reason this is so important, in addition to getting that alignment, understanding, and quantification of risk and readiness, is that operational disruption is a damaging and costly thing that affects too many organizations. Organizations go live with new technology and then spend exponentially more fixing problems after go-live than they spent on the original implementation, and those costs are oftentimes hidden and unexpected. That's why identifying and quantifying the right measures for go-live readiness is so important.
The first and probably easiest thing to measure leading up to go-live is user acceptance testing. Once you've gone through the technical testing and moved on to user acceptance testing, you want to measure what percentage of scenarios have successfully passed the user acceptance test process.
We're not expecting 100% conformity and 100% validation, with every process working perfectly and every stakeholder involved in the testing process giving a full thumbs up, but we at least want some sort of measure that tells us how close we are.
Now, the caveat is that this measure depends heavily on having the right denominator. The percentage of test scenarios that have successfully passed user acceptance testing is only meaningful if you have the right scenarios in place. One problem organizations often have is that they don't have the right process and test scenarios, or they don't have enough of them.
In other words, maybe the scenarios only cover 40 or 50 percent of core operations, and there's a whole host of other scenarios that haven't been tested or measured, so organizations falsely believe they are ready for go-live when they haven't considered those exceptions or other business processes.
So, before you start relying on this measure of user acceptance testing, you want to make sure that you have the right scenarios and that you are capturing all of the end-to-end processes that will be important to running your business after go-live.
Another aspect of user acceptance testing that can be measured is requirements traceability. Early in the project, perhaps going all the way back to the evaluation and selection phase where you first evaluated and selected the technology, you presumably had a set of business requirements you wanted the transformation to address. At some point later in the project, you go back and measure what percentage of those business requirements have been met. Ideally, you've met close to 100% of your must-have, or critical, business requirements. Your lower- and medium-priority percentages may not be quite as high, but you at least have a good understanding of what percentage you've accomplished, and you can then look at the results and decide whether you're comfortable with that measure. That's another way to measure the results of user acceptance testing throughout the project.
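The two measures above are simple ratios, and a rough sketch may make them concrete. Everything here is illustrative: the scenario names, requirements, and pass/fail results are made up for the example.

```python
# Illustrative sketch of the UAT pass rate and requirements traceability
# measures described above; all data below is hypothetical.

# UAT pass rate: share of test scenarios that passed. This assumes the
# scenario list itself covers your end-to-end processes (the "denominator").
uat_scenarios = {"order-to-cash": True, "procure-to-pay": True, "ship-confirm": False}
uat_pass_rate = sum(uat_scenarios.values()) / len(uat_scenarios)

# Requirements traceability: percentage of business requirements met,
# broken out by priority tier (must-have vs. medium vs. low).
requirements = [
    ("must-have", True), ("must-have", True), ("must-have", False),
    ("medium", True), ("low", False),
]
by_priority = {}
for priority, met in requirements:
    met_count, total = by_priority.get(priority, (0, 0))
    by_priority[priority] = (met_count + met, total + 1)

print(f"UAT pass rate: {uat_pass_rate:.0%}")
for priority, (met_count, total) in by_priority.items():
    print(f"{priority}: {met_count}/{total} requirements met ({met_count/total:.0%})")
```

The point of the per-priority breakdown is the one made in the text: must-haves should be near 100% before go-live, while lower priorities can tolerate a gap you consciously accept.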
Another measure that can be very useful in identifying and mitigating risk is how clean and ready your data is. When you think about the showstoppers that can really disrupt a go-live, oftentimes what happens is you have technology that's been fully tested and vetted: the configuration works, the customizations work, the transactional workflows work. But then you do the mass load of all your legacy data into the system and it doesn't work, partly because the data isn't clean.
The technology might be working perfectly, but if the data is not clean, we end up with bad transactions and inaccurate information. You really need to understand and measure how clean the data is. For example, you might find that you have only cleansed and validated 10% of the entire data set you're going to migrate. That's a high-risk data migration, because there's 90% you haven't tested; you have no idea if that data is ready, and chances are it's probably not. On the other hand, you might find that you have validated, tested, and cleansed 90% of the data, in which case you're not at 100% but you're pretty close. Maybe you're comfortable with that and can live with the risk. If not, that tells you that you need to go back and do additional work to get those measures where they need to be before go-live.
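As a minimal sketch of this data-readiness measure, you can run legacy records through basic validation rules and report what share passes. The record fields and the "required fields populated" rule below are assumptions for illustration; real migrations would use the validation rules of the target system.

```python
# Hypothetical sketch: what share of legacy records pass basic validation
# before migration? Field names and the cleanliness rule are illustrative.

legacy_customers = [
    {"id": 1, "email": "a@example.com", "country": "US"},
    {"id": 2, "email": "", "country": "US"},             # missing email
    {"id": 3, "email": "c@example.com", "country": ""},  # missing country
]

def is_clean(record):
    """Treat a record as 'clean' if every required field is populated."""
    return all(record.get(field) for field in ("id", "email", "country"))

clean_share = sum(is_clean(r) for r in legacy_customers) / len(legacy_customers)
print(f"Cleansed/validated share of data set: {clean_share:.0%}")
```

A 10% result here is the high-risk scenario the text describes; a 90% result is the one you might consciously accept.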
Another area that's relatively easy to measure is user adoption and change readiness. There's a whole host of ways that we can measure people's readiness and key stakeholders’ readiness prior to go-live.
When you go through the training process and have trained end users on how to use the technology and how their new business processes will work, you can actually measure the level of compliance and understanding each person has within the organization. Inevitably, some parts of the organization will have higher percentages or more favorable results than others, but at least we've quantified it.
From there, we can make educated decisions on whether we're ready or whether we need some sort of remediation, such as refresher training, to ensure that people are ready before we go live. Again, the important thing is to make sure that you have the right denominator in this measurement. In other words, we need the right training materials and training scenarios to ensure that we're measuring the right things.
If our training scenarios are flawed or incomplete, they may lead us to falsely believe that we're ready for go-live when in fact we're not, because there are other training processes and scenarios we should have had up front. That's the hard part: you need the qualitative understanding and comfort that you're measuring the right population. Once you do, you can start to rely on those metrics to determine whether you're ready from a digital adoption perspective.
The next measure that's important to determining go-live readiness is contingency planning. In other words, do we have contingency plans for different parts of the business, and what percentage of our operations have some sort of contingency plan if things run into trouble at go-live?
For example, you might identify shipping as a very important part of your go-live, something that cannot fail when you go live. You might look at the shipping process and ask: how ready is that part of the organization? What is the contingency plan if the software doesn't work or the system creates problems or operational disruptions? What's our fallback plan, and what percentage of our operations have that sort of fallback plan?
The higher that percentage, the more comfortable executives will be moving forward and taking the risk; the lower it is, the more concerned they might be. Making sure that you have a solid understanding of what the contingency plans are, as well as what percentage of operations have one, is another way to measure readiness.
All of the measures we've talked about so far feed into a risk assessment scorecard, which provides clarity for your executive team to make an educated decision on whether the organization is ready for go-live. This is really important because go-live is a decision that should ultimately be made by the executive team: they should be fully aware of the risks and of the quantitative metrics we've talked about here, and they should be the ones to decide whether they're comfortable with the risks and trade-offs that go along with that decision.
No organization is ever 100% ready, but executives need to be comfortable, given their organization's risk tolerance and whatever time pressures it may be under, moving forward with the digital transformation. They need to make sure that their strategic objectives and priorities are considered as well, and they ultimately need to own that decision.
The project team can provide the scorecard and their recommendations, but ultimately it should be the executive team that makes the final decision, so that risk assessment scorecard is a really important way to do this.
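A risk assessment scorecard like the one described can be sketched as a simple table mapping each measure onto the red/yellow/green scale mentioned earlier. The measures, scores, and thresholds below are all assumptions for illustration; each organization would set its own based on its risk tolerance.

```python
# Minimal readiness-scorecard sketch combining the measures discussed above.
# Scores and the green/yellow thresholds are hypothetical, not a standard.

scorecard = {
    "UAT pass rate": 0.92,
    "Must-have requirements met": 0.97,
    "Data cleansed/validated": 0.80,
    "Training completion": 0.85,
    "Operations with contingency plans": 0.60,
}

def status(score, green=0.90, yellow=0.75):
    """Map a readiness ratio onto the go/no-go color scale."""
    if score >= green:
        return "GREEN"
    if score >= yellow:
        return "YELLOW"
    return "RED"

for measure, score in scorecard.items():
    print(f"{measure:36s} {score:6.0%}  {status(score)}")
```

The executive team then reads this as the text suggests: no row has to be green, but every red or yellow row is a risk they explicitly accept or send back for remediation.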
In each of these areas, this risk assessment is used from start to finish to identify risks along the way so you can start to mitigate them, and it's a framework that might give you ideas for additional measures and metrics to ensure that you're ready for go-live, or at the very least that you're measuring your readiness.
If you are looking to strategize an upcoming transformation or are selecting an ERP system, we would love to give you some insights. Please contact me for more information: firstname.lastname@example.org
Be sure to download the newly released 2023 Digital Transformation Report to garner additional industry insight and project best practices.