According to Newsweek, Christianity is declining in America. Depending on where you stand on the spectrum of religiosity (devout believer vs. atheist), and on which axis (monotheism vs. polytheism), this could be a great thing, a terrible thing, or a piece of non-news--some bit of fluff taking up bandwidth that you don't particularly care about.
I thought I was in the last category. To me, religion has always been a non-issue--I'm a scientist by training, and science and religion, while not strictly incompatible, have their differences, and I'll freely confess my bias toward a rational system of thinking. Religion doesn't interest me (except where it interferes with science), so I tend to ignore it, even though it is apparently very important to a lot of other people.
How important to how many? Well, I don't know, and more to the point, I don't care enough to go Google the answer. The point I'm about to make doesn't need exact numbers:
Could it be that this drop in religiosity is the turning point in people's relationship with the natural world?
Let's not underestimate the importance of this Jewish book in our lives (the Bible's Old Testament is essentially the Jewish Tanakh, of which the Torah is the first five books, and the New Testament--well, Jesus was a Jew). To this day Creationism's bogeymen are still lobbying to have their "point of view" taught as a science (I don't mind if you teach creationism as literature, philosophy, or as part of a theology course, but it's not a science). The Bible is still being misused as the main pretext for denying gays the right to marry--nowhere does the Bible explicitly define marriage as a union between a man and a woman (and you have to wonder what exactly transpired between Moses and Aaron, Peter and Paul). The Good Book was instrumental in shaping the American West, what with Manifest Destiny driving good Christian soldiers onward into the wilderness, and the taming of the "savages" and the landscape.
I doubt that we will ever be rid of every Judeo-Christian presence in our lives--and I don't think that's a laudable goal, either. Man needs religion, as a psychological crutch if nothing else, and if you take away the Bible you'll end up with something else. Worse, probably.
The Christian point of view: the world is there for humans to use as they see fit, God granted dominion to Man, and animals are dumb beasts without souls. Hardly edifying, if you ask me. Yes, Ecclesiastes asks us to be humble and remember that we are all dust, but by and large the Christian Bible asks us to see the world as a gift of God--and relieves us of our responsibility to the environment.
Granted, this "responsibility" is a social construct. We don't really have a responsibility to keep the world in shape. God knows, if walruses were the dominant species, we'd have been f*cked a long time ago. As it is, humans are the dominant species (in terms of effects on the planet, that is--I know we are woefully outnumbered by six-legged creepy-crawlies), and for better or worse, we are the ones calling the shots about where water goes, what gets built on which land, which trees get cut down, which animals get shot and eaten, which plants get put where, and so on.
There's nothing new about that--we've been modifying our environment for ages. What's changed is our awareness of how our modifications affect the environment around us--and, eventually, us again. Evidence is mounting against the Christian view that "God made the world so we could use it," and for the view that we are the stewards of our own future.
The optimist in me likes to think that the decline in religion marks a new type of environmentalism, one that has nothing to do with "living in harmony" and all that hooey, but rather one based on the fact that we're all stuck on this floating rock together, and the Big Guy in the Sky isn't going to hand us a shovel when we're in over our heads in our own sh*t. Admittedly, a ten-percent decline in the number of self-proclaimed Christians isn't going to make a whit of difference in the grand scheme of things, but then again, every little bit helps.