Fads come and go in America, whether we’re talking about consumer products, hairstyles, or sociopolitical ideas, so it’s reasonable to wonder whether the secular movement might be just another trendy fashion. If we’re considering what’s hot and what’s not in popular culture, the notion of personal secularity is clearly in the former category, with demographic trends breaking in favor of nonbelievers and the nonreligious. But will it last?
For several reasons, it’s hard to see the modern secular movement as a passing phase that will be gone tomorrow. The movement may level off, and even experience ebbs and flows over time, but the emergence of seculars resulting from the modern secular movement is highly unlikely to reverse itself, and the impact of that emergence is likely to be lasting and profound. Here are five factors indicating that the contemporary trend of secularity should have long-term traction:
1. The secular demographic won’t disappear
Secular Americans form a broad tent that includes not just atheists and agnostics, but millions of Americans who are simply not religious. These are good, taxpaying citizens who are generally skeptical of grand theological claims, who wouldn’t dream of spending Sunday morning sitting in church, and who tend to see church-state separation as important. These seculars have always been around, and there is no chance that they will suddenly disappear.

Indeed, if the secular movement seems like a new phenomenon (and it is), that’s because many of these nonreligious and nontheistic Americans have only recently begun to appreciate their secular identity. By remaining in the closet for decades as the religious right grew into a major political force, seculars inadvertently helped create a landscape ripe for anti-intellectualism and disastrous public policy. The modern secular movement can be understood as a response to that mistake, taking root as seculars increasingly realize that they must be “out” and visible in order to fight back against the religious right.
Written by: David Niose. Source article at psychologytoday.com.