Conrad Black: The West dominates. Don’t believe lies to the contrary
It has become something of a cliché to assert, as an evident fact to be accepted resignedly, that the West is in decline. But it isn't. The West is essentially the Americas, Central and Western Europe, Israel, Australasia, Japan, South Korea, Taiwan, arguably the Philippines, and beleaguered elements of South Africa. Obviously, some of these places are in better condition than others. A degenerating society is one that has lost the will to defend itself from both external and internal enemies, and in which belief in the value of the society or civilization, and loyalty and pride in the country, have eroded to the point where there is legitimate doubt that they can be sustained under any pressure. No part of the Western world has reached such a condition.