I ask this having been to events featuring national/ethnic dress, food, and other cultural traditions. What can a white American say their culture is? It feels like, for better or worse, it's all been melted together.
Trying to trace back to European roots feels disingenuous because I’ve been disconnected from those roots for a few generations.
This also makes me wonder whether there was any political motive in making white American culture everything and nothing.
Work boots, denim, beer, God, and NASCAR, in reverse order