Okay, so some Hollywood celebrity says something. Why should we believe someone from Hollywood? Why would anyone consider a "star" (of one sort) an expert on society? In fact, all we are getting is an opinion, and in a free country everyone is entitled to one.
Who said it, and what did she say? I am referring to Cameron Diaz, who said:
I think we have to make our own rules. I don’t think we should live our lives in relationships based off old traditions that don’t suit our world any longer.