In Western culture and society, the gender line has become increasingly blurred, particularly over the last two generations. Where has the archetypal male image gone? What do women think about this? What do the guys think?
Do women find men wearing women's pants attractive? I know of some chemical concerns surrounding a contaminated food supply (BPA and similar endocrine disruptors, etc.).
What exactly do people feel is going on?