This randomly popped into my head from my childhood, I searched for it online, and apparently it’s not even that uncommon of a Christian take today.
I didn’t grow up Christian, and I’ve never heard that argument before; it’s just absurd to me. Aren’t we supposed to respect and have love for God’s creatures and the world made for us? How do they even arrive at that conclusion?
The Bible says God gave man dominion over the Earth, and many Christians take “dominion” to mean in the colonial sense.
I’ve had some leaders who would concede that we are to be good stewards of the earth, so maybe don’t set fire to forests… while still defending the people destroying forests and other ecological structures.
Points at death cult sign.
No no, God placed us here to rule over all the shit he made for us, and promised he wouldn’t destroy the earth, therefore we can be as destructive as we want and everything will turn out fine.