
Why technology business cases are so awfully tricky

Dawid Naude
11 min read · Apr 14, 2023


“What are the typical productivity benefits companies get from rolling out this system?” This is a question I’m asked at least once a month.

What they’re hoping I’ll say is “it depends, but typically 10–15% productivity is a reasonable indicator”. In this post, I’ll explain why, when I’m asked this question, I now pretend I have an urgent call and slowly sidle away, instead of getting into a heated debate as I used to.

The timing of the question is also telling: does it come when they’re building a clear business case, or well into implementation, when they’re just after reassurance that they’ve made the right decision?

Firstly, there is absolutely no ‘typical’ anything in a technology project. Even if two companies seem like they’re doing the same thing, they will be fundamentally different. It’s like asking “what are the typical user benefits of all-purpose flour?”, and the response would be an awkward “well it kind of depends what you do with it”.

They’d like to know how much better users will do their job, “better” meaning faster: shorter average handling time, higher first-touch resolution, more sales, fewer mistakes. More money in the door, fewer expenses out the door.

What I’ve learnt is that user behaviour is very difficult to predict. There’s no certainty that a new sales tool will have any impact on sales, as results still come down to how disciplined, thoughtful and skilled the sales team is. No tool will ever replace that. A contact centre team may already be very skilled at handling calls across 5 systems, and simplifying them to 1 might deliver only a marginal improvement, for reasons I’ll explain here.

One area where the benefit is much more certain is technology cost: if you’re replacing 5 systems with 1, the associated licences, infrastructure and support should sum to a dollar figure. I’ve also observed that platform stability is more certain. The new platform has fewer severe issues arising from back-end and infrastructure operations, as these are typically now handled by a cloud provider with system stability built into its service level agreement and value proposition.
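To make that concrete, here’s a minimal sketch of the consolidation arithmetic. Every figure is hypothetical; the real numbers should come from contracts, infrastructure bills and support rosters, not estimates.

```python
# Hypothetical annual run costs (licences + infrastructure + support)
# for each of the five systems being retired.
retiring_systems = [260_000, 180_000, 150_000, 90_000, 70_000]

# Hypothetical annual run cost of the consolidated platform, including
# any new costs (e.g. integration support) the old stack didn't have.
replacement_cost = 420_000

annual_saving = sum(retiring_systems) - replacement_cost
print(f"Annual technology saving: ${annual_saving:,}")  # $330,000
```

Unlike productivity, this number survives scrutiny because each input is contractual rather than behavioural.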

What should also be highlighted more frequently in business cases is the impact on business agility, meaning how responsive the technology is to changes in how the business operates. It’s not uncommon for a technology stack to determine the speed of a business. You may want to launch a new product that is designed and approved in 3 months but then takes 9 months to be reflected in the billing, ERP, CRM, HR and marketing systems. Your new tech stack may significantly reduce this time. That saving is very hard to put a dollar figure on, but I’d argue it can be done with enough thought. It’s just hard to write “we’ll be able to do things quicker in the future” in a business case, because the CFO will rightly ask “well, what does that mean in $?”
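As a hedged sketch of how that thinking could work, where every number is an assumption to be pressure-tested: value the faster stack as the margin earned during the months of time-to-market it gives back.

```python
# Illustrative only: a new product is designed and approved quickly,
# but the tech stack delays launch. Assume the new stack cuts that
# lead time from 9 months to 4.
months_saved = 9 - 4

# Assumed gross margin the product earns per month once live, and the
# number of such launches per year. Both are assumptions, not facts.
margin_per_month = 200_000
launches_per_year = 2

agility_value = months_saved * margin_per_month * launches_per_year
print(f"Indicative annual agility value: ${agility_value:,}")  # $2,000,000
```

It’s still an estimate, but it gives the CFO a dollar figure to interrogate rather than a vague promise of speed.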

Factors that impact the relationship between a new system and productivity

So now we’ve determined that technology costs, stability and possibly platform agility are good areas to put in a business case. However, one could argue that these costs pale in comparison to the wage bill, so ultimately a company would like to reduce or repurpose headcount and do more with less. Where benefits become much messier is in predicting business behaviour. Here I outline the considerations and biases I’ve observed, to help you avoid overly optimistic business cases.

For illustration, I’ll regularly use the example of a new platform rolled out to a contact centre in the hope that the average handling time of calls will reduce. The system records all interactions with customers, displays the customer’s products, and surfaces helpful knowledge articles to support the phone call.

Exaggeration of the role of the system in the metric — Systems will typically show you customer information and knowledge articles on how to service a request, plus productivity tools to verify a customer’s identity, complete a sale or perform some operations. If a customer interaction has a typical average handling time of 10 minutes, then you really need to dive into why it’s taking 10 minutes before you can have any opinion on a possible improvement. In a recent inquiry into why calls were taking up to an hour after the rollout of a new system, we did some observation and found that the systems accounted for only a few minutes of the hour, which was an improvement on the previous system. The rest of the time was spent interacting with third parties, doing analysis and running a step-by-step diagnosis with the customer. It’s very easy to think that your tool is the centre of your users’ universe, but it’s typically just one small part.

Your transformation needs to be holistic to make a holistic change.
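To see the arithmetic behind that observation, here’s a quick back-of-the-envelope calculation. The minute breakdown is illustrative, based on the kind of call described above; the structure is essentially Amdahl’s law: the system’s share of the call caps the possible improvement.

```python
# Illustrative breakdown of a 60-minute interaction. Only the 'system'
# portion is touched by the new platform; the rest is third parties,
# analysis and step-by-step diagnosis with the customer.
call_minutes = {"system": 4, "third_parties": 20, "analysis": 16, "diagnosis": 20}
total_minutes = sum(call_minutes.values())  # 60

# Even if the new platform halves system time, the overall gain is tiny.
minutes_saved = call_minutes["system"] * 0.5
improvement = minutes_saved / total_minutes
print(f"Overall handling-time improvement: {improvement:.1%}")  # 3.3%
```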

Baseline competence, resourcefulness, muscle memory — If you have 5 applications in a call centre servicing a typical interaction, the agents have usually become extremely good at using them. Watch them work: they bounce between applications across three screens, copy and paste at lightning speed, and while the customer is still explaining their issue, the agent has already completed most of their steps, despite being on 5 systems. They’ve become extremely fast on their existing stack of tools. Moving to a new system means that muscle memory will need time to adapt, and competent usage of the new application may not be as fast as hoped compared to competent usage of the 5 applications before it.

Some benefits may be restricted to ‘ease of learning’ — A highly competent user will power through multiple applications at lightning speed without missing a beat; it’s a wonder to behold. If you roll out a new system, they’ll take a big step back as they establish trust, muscle memory and fluency in it. Once they’re back at full competence, the benefit may not be as big as you think, because they were already so well versed. Where there may be a benefit is in how quickly a user learns the system: your ‘speed to competency’ metric may improve (a tough one to measure), but your ‘speed at competency’ metric might be similar to before. Depending on your attrition, ‘speed to competency’ could be a critical metric, or not.

Engagement layers, the 80/20 rule — Technology transformations typically include consolidating several applications. They’ll also attempt to make life easier for end users by consolidating interactions from the multiple remaining applications into one, for example surfacing ERP information inside a CRM. Showing information from one system in another can be complex, so a seemingly logical 80/20 approach is applied: “let’s put the 20% of ERP information that users need 80% of the time into the CRM”. The thinking is that most of the time you’re only retrieving things like billing information, the most recent invoices, accounts and notes from the ERP, so put those in the CRM, and if users need any additional ERP information, they use the ERP. We’ve saved needing to use the ERP most of the time. Sounds logical, doesn’t it?

My thinking was aligned to this, until I visited a call centre and saw that the agents were still opening the ERP for every single interaction. I asked why, and the answer was that it’s easier to open the ERP every time out of habit than to discover halfway through a call that the information you need isn’t in the CRM. They liked the way the ERP looked and the holistic picture it gave of all the data, even though most of it wasn’t required, and it was easier to always open it than to decide each time whether they needed to.

Not a rule, just something to be cautious of.

A ‘speed limit’ on the call — The speed at which a call is handled might also be limited by how quickly the customer can explain and confirm their query. The first few minutes of the call may be spent getting clear on what the issue is, and the final few minutes making sure the customer is very clear on the resolution. Systems might partially speed up the interaction, but if a customer needs 10 minutes for you to fully understand their issue, then that’s 10 minutes well spent.

Some thoughts on the creation of the business case, metrics and measurement

Creating a business case should be undertaken very carefully; first, consider whether it’s required at all. If something has to be done anyway, like a major upgrade, an unnecessary or rushed business case can cloud an otherwise successful program: several years later a finance team will still be chasing you to explain why you haven’t achieved the benefits in the original business case.

Here are some common traps and considerations.

Exaggerated user impact of the project — The most common issue here is the assumed impact on user behaviour. A new tool might be rolled out assuming every user will be 25% more productive and that headcount can reduce over time without losing any output, and then this doesn’t happen despite a successful rollout. Review the factors that impact the relationship between a new system and productivity, listed previously, for examples of this.

Baseline establishment — The existing systems and processes most likely don’t have accurate capture or reporting, which makes the starting picture very difficult to paint. Going back and revising the baseline of a business case after benefits aren’t realised severely damages confidence in the process and the data; it feels like the data will keep being massaged until the answer is the one people want. If you are committing to baseline data, make sure it’s accurate, even if that means baselining manually over a period.

Unpenalised non-compliance — You may see an uplift in productivity that isn’t reflected on the profit and loss statement, because the improvement turns previously non-compliant activity into compliant activity, and that non-compliance was never penalised. For instance, your regulator may require you to respond to customers and resolve queries within 48 hours; you rarely met this, but were never fined. Now you do meet it, but in the business case this reads as “we weren’t compliant and now we are”, which won’t translate to $, and which invites the query “could we have become compliant another way, without this investment?”

Simultaneous events — You may roll out at a similar time to an expected or unexpected event: a regulatory change that requires a specific script to be read out on the phone, or a pandemic that drives significant call volumes. Any industry subject to frequent regulatory change should apply caution to its forecasts.

Confirmation bias — The company is already behaving as if there is significant value in doing this, and already preparing for the implementation. Now, somewhat after the fact, a business case is being developed to make the process compliant. This leads to hunting for any indicator of benefit, painting a rosy expectation of that benefit, and ignoring any other data that doesn’t support it.

Skin in the game — If I could pick only one, it would be this. Many customers rely on partners to develop a business case while those partners get paid regardless of whether the business case is realised. If you need external help, make sure the parties involved have something at stake during implementation. Be particularly mindful of software vendors who commit to benefits whilst getting paid regardless of whether they’re realised. Also significantly challenge any advised ‘typical’ benefits, as these are often self-reported via simple surveys, without the vendor revealing the underlying survey, sample group or methodology. In short, unless someone doesn’t get paid when benefits aren’t realised, be highly critical of their input.

The tasks change — You might launch your platform at the same time as a larger technology transformation, which may include introducing new digital channels, shifting to a new offshore provider, or releasing a new product. Each of these fundamentally shifts the work your team does, and therefore the underlying baseline. If you’ve introduced new digital channels, your low-value transactions may now be handled digitally, and your calls skewed towards longer enquiry types. In the early days of new digital and self-service channels, they may also be missing functionality that users only discover halfway through a transaction, which then drives a call to the contact centre.

Some ideas for a better way to go about predicting and measuring benefit

As stated earlier, first ask yourself whether you really need a detailed business case to inform a decision. If you have to upgrade a system because it’s end of life, consider simply doing due diligence on the vendors, approach and pricing rather than committing to a business case.

Have you already decided you are doing this initiative, or do you need the business case to make the decision?

Once you’ve decided that a business case is required to make a decision, you have a very big decision to make on the level of certainty. Are you looking for ‘reasonable heuristics’, or are you looking for predictions? Are you looking for “we notice that most agents enter the same information multiple times, so we’ve put a figure of 10% reduction in handling time”, or do you want “we forecast our new average handling time will be 517 seconds”?
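If you do opt for a prediction, at least make the uncertainty explicit. Here’s a minimal sketch, with all inputs being assumptions to be tested by observation, that turns a heuristic into a range rather than a single-point forecast like “517 seconds”:

```python
# Illustrative only: propagate pessimistic and optimistic assumptions
# into a range of new average handling times (AHT).
baseline_aht_seconds = 600

# How much of the call the new system actually touches, and how much
# faster that portion becomes. Both ranges are assumptions.
scenarios = {
    "pessimistic": {"share_touched": 0.10, "speedup": 0.20},
    "optimistic": {"share_touched": 0.30, "speedup": 0.50},
}

for name, s in scenarios.items():
    saving = baseline_aht_seconds * s["share_touched"] * s["speedup"]
    print(f"{name}: new AHT ~ {baseline_aht_seconds - saving:.0f} seconds")
# pessimistic: new AHT ~ 588 seconds
# optimistic: new AHT ~ 510 seconds
```

A range forces the conversation onto the assumptions, which is exactly where the debate belongs.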

If you want to develop a very robust business case to drive a critical decision, there is no way around a team of skilled, dedicated people spending a lot of time observing, measuring and calculating. If you’d like to know how much a system will improve productivity in the call centre, you need to observe where productivity is being lost right now. This is where I’m highly critical of elaborate titles like Principal Enterprise Architect held by people who have developed a business case but never spent one minute watching users work; it’s impossible to know the impact of something new without knowing the reality of the current state.

Look at which systems play a role and what makes up call time. You might expect the data to tell you this without observation, but it will only show a picture of what is captured, limited by the accuracy and completeness of that data. It won’t capture the agent writing in MS Notepad, scribbling on a post-it note, pinging colleagues in Slack for help, or an ERP bug that forces a fresh login every 10 minutes. Only when you watch the process can you get a clear picture of the impact your new system will have. Absolutely do not rely on heuristics and data provided by software vendors without significant critique. If you don’t have these skills in your firm, pay someone to do it for you. Encourage critique and devil’s advocacy; your business case needs to survive brutal scrutiny.

What you might find when you do this is that you’re fixing symptoms, and that the core issue is agent training, the complexity of the products you sell, or operational procedures that drive calls into the contact centre. Also be prepared for the harsh realisation that your system risks taking your users backwards. I’ve observed this when a company assumes that moving people from email and spreadsheets to a CRM will improve productivity, then realises that users can be pretty fast and handy with their scrappy systems. The system instils process, consistency, compliance and reporting, but not efficiency. All critical things that any mature organisation should have, but be realistic in your expectations of the impact on your profit and loss.

Finally, it doesn’t stop with the business case. The team that developed the business case should be involved in the implementation, because the business case falls over if the implementation doesn’t reflect it. You need the stubborn individual who politely but firmly descopes some great ideas because “it doesn’t support the business case”.

Business cases play an important role, but decide whether the process is merely ticking a box on a decision that’s already been made, or whether you’re using it to make the decision. Then get your team right and allocate the right amount of time for it.

An example of a great outcome of spending significant time on a business case is that you choose not to go ahead, and confidently proclaim: “we thought this would make a significant change to our business, we really wanted to do it, but after diving into it our investment wasn’t going to make the impact we had hoped, so we are repurposing that investment”.
