Why do Americans think theirs is the only country in the world?

They think they are the only country that gets cicadas

They think American accents don’t exist because to them it’s just “normal” English

They know nothing about their neighbouring countries or ANY other country at all

They’ll see an online video of people who look just like them, with snow in the background, when it’s July. Obviously it’s Australia!

They literally call their country the land of the free, or go on about the American dream, when every other country knows they are nowhere close to a free country

Was geography not emphasized enough in their educational institutions? Why do they think the world revolves around them?

I’m sorry if you’re an educated American and I grouped you with these idiots

I just haven’t seen any other country seem so self-centered besides North Korea