Should public schools be mandated to teach the Bible?

  • Yes

  • No

  • Undecided

Wanted to see what the members of this US message board thought about this issue.

I vote yes because the USA is a Christian-majority country and Christianity has profoundly shaped our country. And this is not the same thing as forcing somebody to become a Christian. None of the students are being forced to become Christian. They are simply being taught about a religion that is a major part of American history.
It's in many areas of life. It is truly a part of US culture, even though we are technically not a theocracy.

And here's a critical point that some reactionary folks will never get to, or will not have read this far: I would say the same thing in a Muslim-majority or Jewish-majority country. Meaning, if they mandated teaching about Islam or Judaism in their public schools, I would not oppose it. This is my response to those who will say, "Well, why not teach about Satanism or something like that in public schools?"
Well, atheism has dragged America down. The same has happened in the UK.