Removing Bias from your Software Selection Process Delivers Results
Digital transformation has evolved to a point where technology can make our lives and jobs easier through software that leverages new ideas to eliminate tedious aspects of our processes. Despite this, software project failure, abandonment, and overrun rates remain high. The best way to manage this is to focus on the common mistakes made by project participants. In general, these mistakes are due to cognitive bias. Here, we define and outline some examples of cognitive bias that affect software evaluation project success.
Avoid these 10 cognitive biases when selecting enterprise software
1 Anchoring bias
Anchoring bias happens when pre-existing information or the first thing you find influences your decision making.
Let’s say, for example, you are looking into a new ERP system for your organization. You see one for what appears to be a reasonable price and a second one for half the cost. You might assume that the cheaper option is inferior based on the first price you saw, even though it may be an ideal match for your company’s requirements. The first price you saw anchored your decision.
2 Availability heuristic
Also known as availability bias, the availability heuristic is a mental shortcut that relies on the examples that come to mind most easily when evaluating a specific topic, concept, method, or decision. In short, your brain confuses ‘easy’ with ‘true’.
“That software probably won’t work for us because last time we used that company, the project failed.”
3 Blind-spot bias
The blind-spot bias means recognizing the impact of biases on others’ judgments while failing to see the effects of biases on one’s own judgments.
“Julie thinks that software won’t work for us because last time we used that company, the project failed, but I have worked with them successfully before, and I know they will nail the project!”
4 Choice-supportive bias
Choice-supportive bias is the preference you give to a choice you have already made, even when that choice is flawed.
For example, if you choose CMS option A instead of CMS option B, you are likely to ignore or downplay option A’s faults while amplifying, or even ascribing new, flaws to option B.
5 Clustering illusion
The clustering illusion is the tendency to see patterns in random events and treat them as meaningful.
Software evaluation projects often involve so few data points that no meaningful pattern can be established, yet we draw conclusions from them anyway.
6 Confirmation bias
Confirmation bias is searching for, interpreting, favouring, and listening only to information that confirms one’s prior beliefs or values.
“I read the vendor’s case study for a successful project with another company, and I firmly believe that this vendor will be perfect for our project.”
7 Conservatism bias
Conservatism bias refers to the tendency to revise one’s beliefs insufficiently when presented with new evidence, favouring old evidence over new evidence.
Let’s say that a number of years ago you had a bad experience with an LMS. The product didn’t meet your company’s needs at the time and sat on the shelf, so the company chose not to re-sign when the contract expired. Fast forward a few years, and you are procuring a new LMS for an entirely different company with entirely different requirements. Objectively, the previous vendor now meets the needs of your new business. But because of your earlier negative experience, you do not revise your beliefs sufficiently and fail to choose the best option for the new company.
8 Information bias
Information bias (also called observation bias or measurement bias) happens when critical information is measured, collected, or interpreted inaccurately, resulting in decisions being made from the wrong information.
If you are procuring software for many teams and stakeholders, it is important that their needs are accurately captured, so there is enough information to make the best decision. This process is often shortened or overlooked, and sometimes the quality of data can affect the quality and outcome of the project.
9 Ostrich effect
The ostrich effect is the tendency to ignore dangerous or negative information that should inform decision making, simply because people would mentally prefer to avoid it.
For example, people often ignore the needs of the IT department because it is seen as just a cost centre. IT teams frequently highlight potential risks and necessary upgrades but are dismissed with “if it ain’t broke, don’t fix it”. Then something breaks, and there is a massive scramble to get it fixed. IT is also often left out of software evaluations until the last mile.
10 Bandwagon effect
Did you make sourdough during the COVID-19 lockdown? If so, you may have succumbed to the bandwagon effect, a phenomenon in which people do something because they see other people doing it. In software evaluation, this may show up as buying whatever your competitors are using, even if it’s not the right product for your business.
How do you avoid these biases, and choose the best software vendors for your business?
If you are aware of these biases and are ready to let them go, Olive is the tool for you. Olive is an end-to-end technology sourcing platform. Olive can take the role of a consultant or enhance the output from your existing technology partners. Olive’s platform simplifies the software evaluation process, getting you to a solution faster than a traditional RFP. It is efficient and free of bias, revolutionizing the process of sourcing technology while delivering the value you would get from an IT consultant.
Everything we do here at Olive is without bias. We make decisions based on facts, not opinions, and pass this down to our customers.
We are the only software evaluation platform that does not charge vendors, so by using Olive you can be confident that you are getting the most accurate evaluation for your organization’s unique needs, and ultimately choosing the right software!
Acknowledgments
Part of the information in this article comes from:
Kent Hendricks, “The availability heuristic: Why your brain confuses ‘easy’ with ‘true’”
Samantha Lee and Drake Baer, “20 cognitive biases that screw up your decisions”
Daniil Pavliuchkov, “Fixing your blind spot: biases in decision making”