Overfitting
There is an old saying that if a cat jumps on a hot stove once, it will never jump on a cold stove again. The wisdom in this is that we have a tendency to overgeneralize about the world from relatively limited information. What allows humans to navigate stoves safely is that we have a more complete model of the world: we understand stoves at a more fundamental level.
Still, I suspect we have all been burned once and, rather than build a more complete model of the world to navigate it safely, sometimes just decided to write off an activity completely. I have seen people give up too quickly on learning new skills, giving feedback, building new habits, and more.
But more pernicious than overgeneralizing from a single data point is overgeneralizing from multiple data points, because multiple points give us far more confidence that our model is accurate even when it is not.
It starts with someone who is faced with a problem. They talk to some friends about it and find that those friends have experienced something similar. And with just those few data points, they extrapolate a line. Their local problem is now a global one. And global problems can feel too big to fix. This is a deeply disempowering turn of events.
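To make the statistical analogy concrete, here is a minimal sketch (in Python, with entirely made-up numbers) of what this kind of extrapolation looks like: fit a line through a handful of points, then predict far outside the range you actually observed.

```python
import numpy as np

# A handful of observations: a few friends report a similar problem.
# The numbers are invented purely for illustration.
x = np.array([1.0, 2.0, 3.0])   # data points collected
y = np.array([1.0, 2.1, 2.9])   # perceived scale of the problem

# Fit a line through just these three points...
slope, intercept = np.polyfit(x, y, 1)

# ...then extrapolate far beyond anything actually observed.
# The prediction is precise; the confidence behind it is unearned.
print(slope * 100 + intercept)
```

The fit looks clean precisely because there are so few points; the error only becomes visible once you try to act on the extrapolation.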
I will grant that there certainly are global problems. But there are orders of magnitude more local problems, even if many of them look similar. Indeed, until unequivocally proven otherwise, it is much more productive to assume every problem is local and can be solved locally, even if other localities have the same problem.