I am an avid viewer of the show, and it's not often that I disagree with something Dr. Phil says, but there is one thing he has said more than once that offends my sensibilities.

You do not put your hands on a woman in anger.

I have a feeling I may be alone in this, but that sentence comes across the wrong way to me every time I hear it, and I'm not even a woman! I'm expressing this in the hope that it will no longer be said, or, better yet, that it will be rephrased to include not being physical with men as well as women.
