Kind of a vague question. But I guess anyone that responds can state their interpretation.
Edit: I guess I’m asking because everything I’ve learned about America doesn’t seem to match what I was told? Idk how to explain it. It feels like the USA is one event away from a civil war and from turning into a D-class country.
A lot of that is politicians creating an ‘us vs. them’ situation and the news sensationalizing it because it makes people watch, which drives revenue. That said, the Republican party has gone completely off the deep end. I have some friends who are very worried, and others who believe strongly enough in our system of checks and balances that they’re not terribly worried, just irritated and frustrated.