
What is "the West has fallen"?

An older catchphrase of the alt-right and adjacent groups. It expresses the idea that, to them, the perceived social decay and fall from grace of Western countries is akin to the fall of Rome; that the Americas and Europe are beginning to collapse or lose their former glory.

Paul: The Western countries are declining: crime is rising, whites are being replaced, values are being lost, and cultures are being eradicated.

Thomas: The West has fallen!


