Image from source: Huffington Post
[Excerpt]
5 Facts About Dominionism
. . .The term "Dominionism" was popularized in the 1990s by scholars and journalists, who applied it to conservative Christians seeking political power. It derives from the Book of Genesis, in which God tells Adam and Eve to have "dominion" over the Earth and its animals. "Dominionism" generally describes the belief that Christians are biblically mandated to control all earthly institutions until the second coming of Jesus. . .
Read more at: Huffington Post