I keep seeing things in the news that suggest the very foundations upon which America was built are crumbling away. I do not hold romantic notions about America. I do not think it is the greatest nation on earth. I do not think it is free or fair. I do not think it has an unblemished history. But I do think it embodies certain ideals and, in at least some important ways, has given us one of the less awful systems of government and justice in the world.
Now that really does seem to be falling apart. What has always made America work, to the extent that it has worked, has been a fundamental underlying agreement among most Americans of all political persuasions that they share certain values worth defending. That no longer seems to be the case. I find this sad story in the Washington Post to be just one example of an unfortunate trend: even different branches of the justice system are at war with each other. If US Marshals feel they can ignore the orders of a federal judge when they don't like them, the breakdown of civil society may already be beyond repair. The system only works at all because each part of it respects the authority of the others. On this path lies armed revolt, and in a country with as many guns as America has, that is a truly terrifying prospect.

