I’m sure there’s a name for this fallacy, but I don’t know what it is. Perhaps that name is “fallacy of complacency”. If you know the real name, comments are open.
Here’s an example: there are 10 ways I can get to work from home. On average, it takes me 30 minutes. Today I chose route 8 to get there and it took me 20 minutes. I know I made the right (best?) decision because I saved 10 minutes.
The fallacy arises from not knowing the outcomes of the other nine choices and from basing our standard on past averages. There are a few scenarios here:
- Past average is 30 minutes. However, traffic was light today because of a holiday, so the average of the 10 routes was actually 15 minutes. This means we definitely made a sub-optimal choice, perhaps the worst possible choice.
- Today’s average conforms with the past and the average of all routes was 30 minutes. Our route took us 20 minutes, so we did better than average. However, there was a route that would have taken us 15 minutes, so we didn’t pick the best route.
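The two scenarios can be made concrete with a toy simulation (all numbers are invented for illustration): draw today's times for the 10 routes, then compare the chosen route against both the average and the best. Beating the average says nothing about whether we beat the best.

```python
import random

random.seed(0)  # reproducible illustration

# Hypothetical travel times (minutes) for today's 10 routes.
route_times = [random.uniform(15, 45) for _ in range(10)]

chosen = route_times[7]                        # we happened to pick route 8
average = sum(route_times) / len(route_times)  # today's average across all routes
best = min(route_times)                        # the route we never saw

print(f"chosen:  {chosen:.0f} min")
print(f"average: {average:.0f} min")
print(f"best:    {best:.0f} min")
# The best route is always at least as good as the chosen one;
# a chosen time below the average proves nothing about optimality.
```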
We assume we made the best decision because the outcome exceeded our expectations, even though we don’t know the other possible outcomes.
What’s interesting to me is the effect this has. From that day on, we decide to take that same route to work. Given our complacency with our decision, we lose 5 minutes per trip (assuming the second scenario). At two trips per day, 260 days per year, that’s 43 hours of wasted time per year, more than one full work week. We lose that time because we failed to realize that an outcome meeting or beating our expectations doesn’t mean a better outcome wasn’t available.
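The time-cost arithmetic above checks out, using the numbers from the second scenario (5 minutes lost per trip, two trips per day, 260 working days per year):

```python
minutes_lost_per_trip = 20 - 15  # our route took 20 min; the best route, 15
trips_per_day = 2
work_days_per_year = 260

minutes_lost_per_year = minutes_lost_per_trip * trips_per_day * work_days_per_year
hours_lost_per_year = minutes_lost_per_year / 60

print(hours_lost_per_year)  # ≈ 43.3 hours, more than a 40-hour work week
```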
I see this fallacy in many scenarios, especially politics. Take, for example, the financial crisis. I remember the day the $700 billion bailout was announced, the Dow shot up. People hailed the bailout as the right decision. But an increase in the Dow for one day does not a valid decision make. Who’s to say there wasn’t a possible response to the crisis that would have made the Dow rise more and stay higher, longer?
I especially hear this fallacy used to justify decisions that take a path outside of known principles. “Sure,” somebody will say, “I violated XYZ principle, but look what happened! Surely it was worth it.” Perhaps, perhaps not. Who knows what might have happened had the principle been adhered to? Perhaps a better outcome, no?
I guess what I’m saying here is this: since we can’t see all possible outcomes, a good outcome realized is a shaky foundation for violating known true principles (be they economic, moral, religious, etc.). It seems the riskiest decision is the one that flies in the face of principles one knows to be true but produces an outcome one considers acceptable.
In my opinion, it’s safer to adhere to known truth than to venture out and accept what is likely to be merely good enough, as opposed to best.
I would love to have some discussion about this, so please feel free to comment.