A recent opinion article attributes the decline in Christian affiliation and church attendance, at least in part, to a backlash against the "Christian right", especially among young people.
Recent state-level legislation, such as laws allowing doctors to refuse to treat LGBT patients on the basis of religious conviction, seems to run contrary to the example of Christ himself. The actions of many who claim to be Christians often fail the test of "What Would Jesus Do?" (WWJD), once popularized on everything from t-shirts to jewelry to coffee mugs.
This decline isn't limited to Christianity:
It is worth observing that paganism is a religion, too, but that's not the point here. The author continues with the hope (wishful thinking?) that:
Even in our threads here, people have bemoaned the bad name some Christian groups give to the faith. It is hardly surprising that some people are driven away by that. Is tapping into American youth's desire for social justice the way for Christian churches to be revitalized?
Thoughts?