At one point, after the bloody civil rights struggles and the legislation that followed, we thought racism was in check. Racists lived under rocks in the scum and only came out in the dark. Now they are emboldened to advertise their racist views, run for office, get elected to positions of power, and run roughshod over those who disagree. What monstrous thing has turned the tide in the wrong direction? What terrible thing has been unleashed? Are we now forevermore a racist nation? Will it get worse and worse? Lynchings again?