Feminism [fem-uh-niz-uhm] noun:
1. the doctrine advocating social, political, and all other rights of women
equal to those of men.
2. (sometimes initial capital letter) an organized movement for the
attainment of such rights for women.
(source: dictionary.com)
3. Also defined as "what is absent rather than what is present"
(source: Feminist Approaches 197)