The End of White Christian America?

The headline reads: White Christian America ended in the 2010s.[1] As a white evangelical (and male), my first reaction to such a headline, I admit, is to cringe. We hear so much about white privilege, white evangelicals, and white Christians generally. It gets old for me. But, if this time really spells the end […]