You’re right that there has been a shift in religious affiliation among Americans over the past few decades. However, this shift does not appear to have led to a decrease in political engagement among those who identify as evangelical; if anything, it may have had the opposite effect. It’s striking how religion can serve both as a tool for social change and as a way to maintain power within certain groups.

The rise of evangelicalism in America can be seen as part of a larger trend toward the politicization of religion, one that can be observed in other parts of the world as well, where religious leaders use their influence to shape public opinion and policy.

What do you think about this phenomenon? Do you believe religion should play a role in politics, or should the two remain separate? Personally, I don’t think one should interfere with the other: they are distinct spheres of life, and each should stay in its own lane. But what are your thoughts?