Comment Re:Honest question. (Score 1) 479
To flip things around for a moment, what about all those female-dominated careers? Why aren't we up in arms about the fact that yoga studios, elementary schools, secretarial staff, birthing services, and hospital nursing staffs are overwhelmingly dominated by women? Nobody seems to be losing sleep over the idea that some kind of pervasive gender discrimination discourages men from these careers. Is that because these careers are seen as somehow less worthwhile, and if so, why? Because women do them?
Modern feminism seems consumed with the idea that career success for a woman can only come by pursuing a traditionally male career path. But this seems like an incredibly sexist viewpoint, because it assumes that the only kind of job worthwhile or important for a woman to aspire to is one that a man has traditionally done. If you're not a CEO, a surgeon, or a professor, then you're somehow less worthwhile. But taking care of other people, which is something a lot of female-dominated careers have in common, is incredibly important, and probably contributes as much or more to society than coming up with a better way for Amazon to flood my inbox with special offers.
The other issue is that feminism seems obsessed with the idea that women will be happy only if they can pursue these career paths. But here's a thought. Maybe women opt out of certain career paths in favor of others because those alternatives better fit what they want out of life. Maybe many women, not all of them but a lot of them, find working with kindergartners or being a midwife more rewarding than firing employees, shooting at insurgents, or writing computer code.