There is a new movement sweeping the internet: the push for women's freedom to go topless in public. Its supporters make an interesting point: at one time it was also unacceptable for men to go topless in public. Now that male toplessness is no longer considered "sexual," and men don't have to worry about being sexually objectified for it, shouldn't women have that same right?
Christians should not endorse this movement, but it is becoming a part of our culture. How should we as a people respond? It may well impact our society to the point where our daughters one day wish to go topless at a beach. Any thoughts on how to navigate this?