The past week has been an immensely clarifying — and profoundly demoralizing — one in American politics. It has demonstrated beyond a shadow of a doubt that the country's foreign policy establishment, along with its leading center-right and center-left politicians and pundits, is hopelessly, perhaps irredeemably, deluded about the role of the United States in the world.
Yep. Read the whole thing.