Throughout history, right up to the present day, women have been treated as inferior to men by society in almost all cultures of the world. Maybe "inferior" is a bit of a harsh word, so feel free to replace it with "unequal."

Why do you think that is?
Has God actually made women inferior, with nature simply taking its course and leaving women less than equal to men in societies? Or has society created such a system?
Any other thoughts?