I’m torn. As a white western male, I look out at three faraway cultures.
The first is a small African village. They eat different food than I do. They perform dances that look strange to me. They have daily rituals I find odd. But even though their customs differ from my own, I respect them and feel no need to impose the "right" way — read: the western way — onto their lives.
The second is a Middle Eastern country. They too have different cultural customs. For example, women are not allowed to speak to men unless spoken to. They must cover themselves head to toe. They are not allowed to partake in any political debate.
The third is another African village. Here too, the culture is different. Women are required to have their clitorises removed.
Why is it that I feel OK imposing what I think is right on the second and third examples, but not on the first?
What if, in the second example, the Middle Eastern country, the women are happy? If happiness is the ultimate pursuit, and they are happy, who am I to say their culture needs to change? Can I say, "Believe me, you’ll be happier after this change"? Can I say, "Be a martyr. All women after you will thank you"?