I don't get it. Why are people always blaming the victims of rape? They say things like "you wore something revealing, you deserved it" or "you should have said no," but those people don't even know what it feels like. They don't understand how horrifying it can be for the body and mind. They're too absorbed in distancing themselves from the victim, and they make the victim feel even worse because of it. People always say it's the victim's fault and that nothing can be done. "Get over it" is something commonly said.

I assume most of those people don't even know a person who was raped. Because if you knew someone, and you cared for them a whole lot, you'd definitely get mad at whoever raped them. Of course, by the time it all happened, the victim wouldn't be able to get much help, since there are more laws protecting the unborn fetus, the fetus that was conceived from rape. A woman can't decide what to do with her body; the man decides. Why is that? Why can't we decide on our own what's best for our bodies? Is it because we're "weak" compared to men? Or is it because too many people are too religious for their own good and forget that we live in a country where we can all be free?

That's something I don't understand. People scream that we are all "god's children" and "god loves you no matter what," but if we try to do anything people don't like, they start throwing things back at us. A man raping a woman is a crime, and the same goes for a woman raping a man, though the latter is rarer than the former. I don't understand this world.
I'll start another thread at another time to discuss what people think about "the bible" and some of its "laws".
Keep in mind that this is a civil discussion, so no flaming of any kind. If any is seen, a mod will be notified via a report and your posts will be cleaned up.