Faith, Hope, Love and Marriage

 

It would be a stretch to claim, either in elegy or triumph, that Christianity lies in ruins in the West. But it is undeniable that Americans and especially Western Europeans increasingly regard it with bemusement or hostility, even to the point that some intellectuals deny its founding influence on Western civilization. Whether one thinks Christianity pernicious or socially nourishing, there should be near-universal agreement that its cultural decline in the West has transformed politics, public education, the arts and academe, and—perhaps most prosaically yet also most profoundly—the family.
