In my own experience, I don't want to tell them about my anxiety (though I know I should at some point). In general, you don't want to tell a man not to cry, or tell him to make more money. I think it's degrading that some women think men should be 'men' (and even men themselves sometimes believe this).
I mean, it is essentially degrading to men to think all they 'have' to do is make more money. It's ridiculous, as if their worth is their job or income. (The same goes for women, though.)
Similar to my answer for the "woman" version of the question, I try to avoid thinking "never." And I'd probably end up saying the wrong thing to a guy anyway. :)