I know that people who believe the media is "liberal" have their sources that they like to reference (which are ALWAYS from far-right organizations with names touting how fair and balanced they are).
But if you really think about what they are implying...
That every corporation and educational institution in America, except for hard-right Christian ones, has somehow conspired to make sexuality, science, and multiculturalism "acceptable" in society against the better judgment of most people...
It's kind of obtuse, right?