What Does the Bible Say About Women?
From the beginning, women have been the backbone of the church. A group of women were the first people identified as Jesus’ financial backers. After his resurrection, Jesus appeared first to women, and he entrusted them to spread the good news. Yet within just a few years, women in the church were relegated from commissioned leaders to passive members. The question is, what does the Bible actually say about women?