Both men and women can answer this, but I would prefer to hear the latter's argument. Now, I've never seen this kind of thread before, so I apologise if it has come up before. That being said, what are your opinions on the woman's role within society? Should women, as a whole, be more concerned with child-rearing or with pursuing a career - or can they do both with immaculate execution?
Personally, I feel that the woman's place within the home is paramount for any growing family. A child needs its mother, and without her he or she is bound to suffer. It must be the "woman's touch" that leads me to feel this way, as women rear and educate their children far differently from men. They are generally seen as more caring and devoted to their children, willing to risk their lives for them (that is the stereotypical viewpoint, which I'm sure most women embody).
It is equally fair to note that women's roles within society are changing, and a whole wave of new opportunities is opening up for women. Do not get me wrong - I believe that women should take up a career if they choose to and the situation is right. Perhaps they need a job to support their children, or perhaps they don't want children at all; then the situation really changes.
However, is a career more important than motherhood?
I am not one to judge specifically, as I could go on at superfluous length; but for argument's sake I would like to hear others' opinions on the matter.
Thank you.