Texas passed a law in 2007 that requires its public schools to teach the Bible, and that law is about to take effect for the 2009-2010 school year.
The social studies chair at Whitehouse High School in Texas claims that “The purpose of a course like this isn’t even really to get kids to believe it, per se, it is just to appreciate the profound impact that it has had on our history and on our government.”
But America is not a Christian nation; there is no national religion. Is it right, then, to require kids to learn about the holy book of one particular religion that not everyone practices? Why not also teach the Koran, or the Torah?
I can understand offering a course on religion and its impact on American history (though I would argue such a course should still be optional). But it seems to me that there are limited ways to teach the Bible specifically without teaching students to believe in a certain idea. After all, many people don't believe the Bible to be true, so what are they supposed to take away from this course? Does a mandatory Bible class operate under the assumption that the Bible is the true word of God and something that should not be questioned?
I think even some very religious folks would agree that religious belief is something that should be taught at home, not in a public school.
This sort of law opens up further questions about the role of Christianity in America. Why is it, for instance, that everyone makes a big deal about having a black president, or a woman president, but not a non-Christian president? Could a non-Christian candidate even have a chance of being elected? Why does the president-elect traditionally swear on a Bible at the inauguration ceremony? We are not a Christian nation, but we sure do act like one most of the time.
Just some food for thought.
Source for the quote: "Texas public schools required to teach Bible this year," KLTV, http://www.kltv.com/global/story.asp?s=10933571