So during a tutorial discussion about who should take the initiative in a relationship (the boy, the girl, or either one), a girl in the class explained (to the best of my memory) why she thinks only boys should take the initiative:

I think guys should be made to woo girls, to try to win their hearts, so that they wouldn’t take girls for granted.

Well… perhaps that made sense to some, but it sounded a wee bit off to me. It seemed to imply that girls would never take boys for granted, and that therefore everything hinged on the boy.

Really?