I’ve noticed a recurring theme in conversations and media where there’s an assumption that everyone around the world aspires to be American or embraces American culture wholeheartedly. While I appreciate the contributions of the U.S. in various fields, this presumption comes off as a bit arrogant.
The only people I have met who love the USA are those who believe you can "get rich" over there. They never actually go, though.
In recent years there have also been some "MAGA" types, along the lines of "enable racism here too!!"