Social Question
If you attended public school in America: did any teacher of yours ever say anything anti-God?
Did religion ever come up? Or, politics for that matter, where you felt a teacher was trying to persuade you to think one way or the other?
I can’t think of one instance. The only politics we discussed was like a history lesson about our political system and political parties in government class. Religion, nothing. Maybe the mention of Christmas or a holiday party, but nothing in class where we were taught a specific religious doctrine, and never anything said against a religion.
I just wonder where this talk comes from that public schools are anti-religion. An ex-boyfriend of mine was saying something along these lines to me on Facebook a while back (somehow in the years that have passed he has become extremely right wing on some issues, which still confuses me), and when I asked him, “did any of our teachers in school ever say anything to influence your politics or religion,” he replied, “I’ll plead the 5th to not aggravate my wife and family.” He knows that talk is total bullshit when it comes to the schools we attended. This seemed to come from the claim that “teachers” tend to be more liberal, but I think that statistic refers to university professors anyway.
But I am certainly willing to be corrected. Maybe some communities have things going on in public schools that are anti-religion and trying to steer children to be more liberal? The one thing I can think of, which was not happening when I was in school, is that I have heard people saying schools now teach about gay sex. Do they? Or do they just teach about gay relationships?
What’s your take?