There is no Western imperialism. What people call "Western imperialism" is really the effort of the USA and its allies (not necessarily the entire West) to keep other powers' imperialism, namely Islamic and Communist imperialism, from expanding.
The United Kingdom, France, and other nations had colonies around the globe. They were pushed out by the 1960s, and socialism funded by the Soviets and others filled the void.
Life is no better for the people in those nations. At least the colonies produced something.
I hear Afghanistan is a mess and that some regions have no governance at all. At some point it can hardly be called a country, and a conquering nation could run things better. But that is considered "bad." Yet Japan, once occupied and rebuilt after World War II, is today neither a threat nor managed by the United States.