17 Mental Models, biases and fallacies that Software Developers should know

Posted on Sep 11, 2021


“Don’t forget about Parkinson’s law,” said the product manager during the last sprint planning. “There’s no silver bullet,” mumbled the graybeard dev from the back. “Yes, sure, we’ll keep Hofstadter’s law in mind,” said the witty dev lead with a smile growing on his face.

Software development (and, to some extent, product development) is a rather new discipline (after all, we’ve been building software for what, 70 or 80 years?), and formal education is not prominent in the field. But we are humans, and highly thoughtful humans we software developers are. We like to discover patterns, name them, and communicate them so we avoid repeating the same mistakes in the future (so we avoid repeating ourselves). These patterns are conceptualized through Mental Models, Laws, Fallacies, and other general aphorisms.

If you’re new to software development, here are the most common ones that every developer will at some point hear, say, or experience.

Note: the models, laws, and fallacies described here are bolded and link to an article expanding on each. There’s also a reference (tl;dr) of all the models discussed at the end of the post.

Software Development is hard

Software is a medium to materialize people’s “ideas” of a solution to a problem. But as humans, we have a very hard time conceptualizing the boundaries of the problem we’re trying to solve, let alone the perfect solution for it. In my experience, all the problems that we try to solve with software should be treated as Wicked Problems: problems that are so difficult, so problematic, that they can’t even be clearly defined. The requirements and the business are always changing, stakeholders come and go, and there’s no concrete definition of the problem. This is why Agile replaced waterfall, and why prototyping is so important. As Fred Brooks, the father of modern Software Engineering, said: Plan to Throw One Away.

In most projects, the first system built is barely usable… Hence plan to throw one away; you will, anyhow.

— Fred Brooks (we’ll see a lot more of Brooks in this post)

Before the project starts: the models involved in estimations

Here’s something you need to know if you’re getting started with software development: estimates never go right. We add a lot of buffer to the “realistic” estimate: if we think it’ll take 4 weeks, we just say 2½ months. We’re not cheating, we’re applying the Ninety-ninety rule:

The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time.

— Tom Cargill (popularized by Jon Bentley)

But good managers know that devs tend to cover themselves and ask for extra time; the usual saying is “it’s better to deliver early on an inflated deadline than to be late on a realistic one”. That’s why we always keep Parkinson’s Law in mind, which states that “work expands so as to fill the time available for its completion”. Give engineers 2 months or 4 months to complete the same task and, in both cases, they will deliver right at the deadline. When you ask why, the usual response will be Hofstadter’s law: a humorous, recursive law that states that projects will always take longer than expected, even when you take into account Hofstadter’s Law.
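The recursive joke in Hofstadter’s law is easier to appreciate when it’s spelled out. Here’s a tongue-in-cheek Python sketch (the fudge factor and the number of “corrections” are made up for illustration, not an actual estimation technique) that turns the 4-week estimate from above into roughly the 2½ months we end up quoting:

```python
def hofstadter_estimate(weeks, fudge=1.6, corrections=2):
    """It always takes longer than you expect, even when you
    take into account Hofstadter's Law."""
    for _ in range(corrections):
        weeks *= fudge  # every "corrected" estimate is still optimistic
    return weeks

print(hofstadter_estimate(4))  # 4 weeks -> ~10 weeks, i.e. the "2.5 months" above
```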

Common pitfalls in the process

Not just developers: all human beings suffer from Optimism bias; we think “it’ll be better this time”. But, as developers, we learn about optimism bias early on and develop a more pessimistic approach (consider it a survival strategy). This pessimistic view makes us always consider the worst case scenario, which requires a thoughtful analysis of the problem in order to identify the worst ways in which things can go wrong.

The worst type of bias that a developer can suffer from is the above-average effect: there are plenty of fun experiments in which, for example, 90% of people rate themselves “above average” at a certain skill, which is, of course, statistically impossible. In a developer, this just translates to stubbornness and arrogance. The counterweight to the above-average effect is Impostor Syndrome, and it’s a big deal in our industry. I know a lot of people who suffer from Impostor Syndrome, and in my eyes, it only seems to be getting worse. I personally suffer tremendously from Impostor Syndrome. Heck, I can anticipate the anxiety I’ll experience as the time to publish this post approaches (EDIT: as anticipated, this post has been sitting as a draft for 4 months now).

There are 2 other biases that constantly manifest in software projects and are problematic. First, the hard-easy effect: we tend to underestimate the complexity of large projects and overestimate the complexity of simple ones. Adding a button to the website? “That’ll be 2 weeks”. Rewriting the entire system in Rust? “Around 6 weeks”. This happens because simple problems are easier to judge and their boundaries are clearer; harder problems are, as we saw before, wicked problems. The second bias we suffer from is Maslow’s law of the hammer: once we’ve mastered a programming language, a framework, a cloud provider, or a particular architecture, we tend to use it over and over again, without asking whether it’s the right tool for the job.

Related to the biases expressed above, we can find the Second-system effect: the tendency of small, elegant, and successful systems to be succeeded by over-engineered, bloated systems, due to inflated expectations and overconfidence.

For me, the Second-system effect manifests in two forms. The first is underestimating the complexity of rewriting an already-functioning piece of software. There are two great examples of this:

  • Netscape’s rewrite: they decided to completely rewrite the code from scratch, it ended up taking 3 years, and it proved fatal for the company.
  • Bloomberg’s 25M lines of Fortran: they’ve been trying to switch to C++ for 15 years without success.

There are many aspects of a rewrite project that we tend to underestimate or simply forget: feature freezes, bugfixes still happening in the current codebase, changing requirements, architectural changes, data migrations, etc. The second manifestation is when the rewrite project is “successful” (i.e., it happens), but the resulting system is more complicated, less stable, or just plain uglier than the previous one. The Wikipedia article on the second-system effect brutally summarizes its cause as:

Inflated expectations and overconfidence.

Stupid business people

We’re not always in the wrong. After all, we just listen to business requirements (which “in theory” describe the correct solution to the underlying problem) and materialize them into working software. But the folks in charge of coming up with these requirements and asking for deadlines also suffer from biases. There are 2 in particular that I want to point out; they might be useful in your next planning/grooming meeting.

First, the Dunning–Kruger effect, which causes people unfamiliar with software to overestimate their ability to understand it. We’ve all heard our managers say something like:

What!? 2 weeks to add a button to the site?

— Mr. Manager

The problem is that they see JUST that one button. But we can see the full picture: design and development time, the A/B testing framework in place, the statistics microservice, the React Native application, QA, load tests, etc.

The second “bias” (not really a bias, but a sarcastic quip) is SMOP (small matter of programming): this is when people discuss things, ideas, and experiments, but forget that someone has to actually sit down and write the code. Like the Dunning–Kruger effect, it reaffirms the idea that people not familiar with software tend to underestimate its complexity.

Reacting to estimation failures

It’s too late, the damage is already done, we won’t make it to the deadline. What have we learned in these past 70 or 80 years to help us in this situation?

First, the well-known Brooks’ Law: adding more people to a delayed project won’t help; on the contrary, it’ll slow it down. Second, another of Brooks’ laws, there’s No Silver Bullet: in this situation, we can assume that there’s no magical solution we can use to make up for the delay. We just need to re-estimate, communicate the delay, and mitigate the damage.

At this point, it’s also useful to reevaluate the project altogether and assess whether it even makes sense to keep working on it. A related bias is the well-known Sunk Cost Fallacy: “we’ve come this far, we shouldn’t stop now”. The sunk cost fallacy is one of my favorites; it happens all the time, even in trivial situations. For example, you’re 50 minutes into a movie you don’t like, but instead of just stopping, you feel you have to finish it, because you’ve already put 50 minutes of your life into it.

The best-known example of the Sunk Cost Fallacy is the Concorde project, the supersonic airplane. The project was originally estimated to cost £70 million, but it ended up costing £1.3 billion. It was such a disaster that “Concorde fallacy” has become a synonym for the Sunk Cost Fallacy (if you want to read more about the Concorde project, check out this thread, with comments from pilots and designers). One of my favorite movies, Dog Day Afternoon, is also a good example of the sunk cost fallacy.

Closing remarks

If you’re new to software development (or IT in general), it’s expected that you’ll focus most of your time on learning hard skills (mastering Python, AWS, React, etc.). But with time, you should try to learn more and more about the discipline itself, the biases and mental models involved: what we call soft skills. They’re technology-agnostic and timeless, and they’ll stay with you during your entire career.

Glossary of models, biases and fallacies used in this post

  • Wicked Problems: A problem that is difficult or impossible to solve because of incomplete, contradictory, and changing requirements that are often difficult to recognize.
  • Plan to Throw One Away: associated with rapid prototyping; if you plan to throw the first version away, developers won’t feel so attached to it.
  • Ninety-ninety rule: Humorous aphorism that states: The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time. Basically, projects are always delayed.
  • Parkinson’s Law: Work expands so as to fill the time available for its completion.
  • Hofstadter’s law: It always takes longer than you expect, even when you take into account Hofstadter’s Law.
  • Optimism bias: a cognitive bias that causes someone to believe that they themselves are less likely to experience a negative event.
  • Worst case scenario: a concept in risk management wherein the planner, in planning for potential disasters, considers the most severe possible outcome that can reasonably be projected to occur in a given situation.
  • Above-average effect (or Illusory superiority): a condition of cognitive bias wherein a person overestimates their own qualities and abilities, in relation to the same qualities and abilities of other people.
  • Impostor Syndrome: a psychological pattern in which an individual doubts their accomplishments or talents and has a persistent internalized fear of being exposed as a “fraud”.
  • Hard-easy effect: a cognitive bias that manifests itself as a tendency to overestimate the probability of one’s success at a task perceived as hard, and to underestimate the likelihood of one’s success at a task perceived as easy.
  • Maslow’s law of the hammer: a cognitive bias that involves an over-reliance on a familiar tool: “It is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.”
  • Second-system effect: the tendency of small, elegant, and successful systems to be succeeded by over-engineered, bloated systems, due to inflated expectations and overconfidence.
  • Dunning–Kruger effect: a cognitive bias in which people with low ability at a task overestimate their ability.
  • SMOP (small matter of programming): a phrase used ironically to indicate that a suggested feature or design change would in fact require a great deal of effort. It points out that although the change is clearly possible, it would be very laborious to actually perform. It often implies that the person proposing the feature underestimates its cost.
  • Brooks’ Law: an observation about software project management according to which “adding manpower to a late software project makes it later”.
  • No Silver Bullet: there is no single development, in either technology or management technique, which by itself promises even one order of magnitude [tenfold] improvement within a decade in productivity, in reliability, in simplicity.
  • Sunk Cost Fallacy: continuing to invest in something because of what has already been invested; “throwing good money after bad” while refusing to “cut one’s losses”.
  • Postel’s law: related more to Software Engineering than to management; it states that software should “be conservative in what it sends and liberal in what it accepts”. Meaning, if you define a function, you should “prepare for the worst” regarding the arguments it accepts, but what it returns (or raises) should be consistent and restricted to what’s documented (see the sketch right after this list).
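
To make that last one concrete, here’s a minimal Python sketch of a Postel-style function (the function name and the accepted formats are made up for illustration): it’s liberal about the argument it accepts, but conservative about what it returns or raises.

```python
from datetime import date

def parse_release_date(value):
    """Liberal input: accepts a date, an ISO-format string, or a Unix timestamp.
    Conservative output: always returns a datetime.date, or raises ValueError."""
    if isinstance(value, date):
        return value
    if isinstance(value, (int, float)):           # treat numbers as Unix timestamps
        return date.fromtimestamp(value)
    if isinstance(value, str):
        return date.fromisoformat(value.strip())  # tolerate stray whitespace
    raise ValueError(f"Can't interpret {value!r} as a date")

# All of these return a plain datetime.date:
parse_release_date("2021-09-11")
parse_release_date(" 2021-09-11 ")
parse_release_date(1631318400)
parse_release_date(date(2021, 9, 11))
```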

This post was automatically migrated from Medium. It still lives at: https://medium.com/@santiagobasulto/17-mental-models-biases-and-fallacies-for-software-developers-42f107bfcb84