Then your sample may be unrepresentative in a consistent direction. That sort of error is called “statistical bias.” When your method of learning about the world is biased, learning more may not help. Acquiring more data can even consistently worsen a biased prediction.
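A minimal sketch of the point (all numbers are made up for illustration): suppose the true average adult height is 170 cm, but your sampling method only ever reaches an unusually tall subgroup averaging 185 cm. Drawing more samples shrinks the random error, yet the estimate converges on the wrong value — the systematic error never goes away.

```python
import random

random.seed(0)

TRUE_MEAN = 170.0    # the quantity we actually want to estimate (cm)
BIASED_MEAN = 185.0  # mean of the subgroup our method samples from (cm)

def biased_sample(n):
    """Draw n observations, but only from the over-tall subgroup."""
    return [random.gauss(BIASED_MEAN, 8.0) for _ in range(n)]

for n in (10, 100, 10_000):
    estimate = sum(biased_sample(n)) / n
    # As n grows, the estimate settles ever more tightly around 185,
    # not 170: more data makes us more confident in a wrong answer.
    print(f"n={n:>6}: estimate = {estimate:.1f} cm")
```

With an unbiased sample, averaging more observations would pull the estimate toward 170; with a biased one, it only sharpens the error.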
The idea of cognitive bias in psychology works in an analogous way. A cognitive bias is a systematic error in how we think, as opposed to a random error or one that’s merely caused by our ignorance.
There’s a completely different notion of “rationality” studied by mathematicians, psychologists, and social scientists. Roughly, it’s the idea of doing the best you can with what you’ve got. A rational person, no matter how out of their depth they are, forms the best beliefs they can with the evidence they’ve got. A rational person, no matter how terrible a situation they’re stuck in, makes the best choices they can to improve their odds of success.
For a human, rationality often means becoming more self-aware about your feelings, so you can factor them into your decisions.
Rationality can even be about knowing when not to overthink things.
As Dan Ariely puts it, we’re predictably irrational: we screw up in the same ways, again and again, systematically.