I definitely think that in the West, women are the more privileged class. I've collected some reasons to give, although anyone with half a brain ought to already know most of them.
How so? Simple: whenever a woman does anything, she will be praised for it. If she goes and gets a job, say as an engineer, she'll be glorified as a hero who broke the glass ceiling. If she decides to be a housewife, it'll be understood. If she has no aspirations, it's forgiven, unlike with a man, who is expected to always be a wage slave in the rat race.
Take the US Air Force. The military encourages women to become pilots, and has lowered its physical standards to allow women to serve.
Legally, women are equal now, but in practice they have more rights. During a divorce, a woman will almost inevitably end up with custody of the children thanks to biased judges, and can often get alimony. A woman can cry "rape" and be believed, but if a male student says his teacher raped him, he'll be laughed at. Women who commit crimes serve shorter sentences.
Then there are biological factors. Women live longer, for example. A woman can choose to become a single parent; a man cannot.
It has become perfectly acceptable for women to wear men's clothing; in fact, it's often considered cute, or simply routine in business and politics. A man wearing something "feminine" is considered ridiculous.
Women can also approach strangers without being considered dangerous. A woman can walk up to a man and say whatever she wants. She can pat a stranger's child on the head without anyone being suspicious of her intent. Women can safely befriend anyone.
Women can slap men, but when men hit women it's considered a horrible thing. A woman can seek shelter at a battered women's home, but such shelters aren't funded for men. A woman is unlikely to be homeless, because family, or whoever takes her in, is likely to give her a home indefinitely.
Sexually, women are more free. Women are freer to solicit themselves for sex (they simply choose not to). They can more easily choose their relationships: they can be materialistic gold-diggers, trophy wives, or housewives, and their choice not to contribute to the greater society will be respected rather than derided. I don't think lesbians are as hated as gays are, either. And yes, there are many fetishes and sex toys that are nearly off-limits to men.
The list goes on and on...
Obviously there are benefits to being a man, but for brevity I'll skip those, since the benefits clearly go the other way in a modern world where we don't need physical strength to fight sabre-toothed tigers and hunt antelope. Our laws and our norms protect women from having to worry about as much violence as men do, and the female bullies they put up with in school do not usually threaten to beat up other girls. (If they did, the catfight would probably be stopped quickly, before they could do much damage. With men, the fight is expected.)
After that, the only gender-linked violence I know of is abusive husbands (if you don't get along, then why don't you just fucking separate?) and the much-touted (but less than one-in-a-thousand) chance of blind rape. Both are infinitely more preventable, and rape isn't nearly as bad as non-sexual physical abuse. If you shut off your moral inhibitions, you might actually enjoy rape, regardless of how you're pinned, since your body is wired to enjoy sex, but it's hard to imagine ever enjoying being pummeled. If you do get raped, well, it's not the end of the world, and every single person will comfort you. No one will question your femininity for "losing" a fight. Having your boobs pleasurably "tortured" is hardly more degrading than being repeatedly struck and otherwise tortured!
In conclusion, whether you want to be self-reliant or dependent, if you want to be happier and more free, you'd be better off born a woman.
That being the case, why can't we support the MRAs instead of the crazy feminists who demand grants for gender studies so they can rant about patriarchy, rape culture, and wage gaps? Feminism is really about protecting the privileges women have gained rather than about equality. They want the best of both worlds: the privileges women have traditionally enjoyed plus those of men, and as a whole they would rather men not have as many privileges at all.
I find it disgusting that such a political ideology is beyond questioning. You know the saying that the one thing you cannot criticize shows you who is in control? Well, it seems to be Feminism. So many people have recently lost their jobs and social standing for saying it's mostly lies and bullshit. We're so close to kicking religion out of its special, time-honored place, and now Feminism is positioning itself to fill the void.