I've noticed a trend among church-goers. We seem to misunderstand or forget the original meanings of words when liberals or secularists hijack them and change the meaning.
Psychology is one of those words. I cringe when I hear people say, "I don't believe in psychology. I believe in turning to God." Really? You don't believe in:
"the science or study of the mind and behavior"