It's true that for centuries, women have lived under the paradigm that they are second-class citizens. In our culture today, many people would point their finger and say that the reason women are treated so poorly is religion, namely…Christianity! Many might even blame Jesus Christ Himself. After all, when you think about it, God refers to Himself with the masculine pronoun "He": "God so loved the world that HE gave His one and only Son…"
Likewise, our entire Christian theology hinges on the belief that God became a MAN, in the person of Jesus Christ. Additionally, Jesus Himself picked 12 men to be His disciples and to start the early church… no women.
There are verses that are misunderstood, taken out of context, and used against Christians, such as:
1 Peter 3:7 "Husbands, in the same way be considerate as you live with your wives, and treat them with respect as the weaker partner…"
1 Timothy 2:12 "I do not permit a woman to teach or to assume authority over a man; she must be quiet."
1 Timothy 2:15 "But women will be saved through childbearing…"
Does Christianity degrade women? Is the Bible a chauvinist/sexist book? What did God REALLY say?
The first lie recorded in human history was one of doubting God's goodness. The devil twisted what God said, and then used it to attack Mankind. Unfortunately, we bought into the lie, and the rest is history. Fast forward to today: our culture still believes the lie. The attack on Truth, the Bible, has never been more malicious. We hear ideas such as, "How can you believe the Bible when it blatantly promotes slavery and degrades women?" or "Look at how much murder, death, and evil Christianity is responsible for." or "The Bible is a homophobic book that threatens to send everyone who doesn't agree with it to hell." With people saying and believing things like this, it seems as though Christianity has its back against the wall. Or does it? Are these statements actually true? Or have people been misinformed in this 140-character social media generation?