I’m asking because, with everything happening lately, it feels like every crisis just turns into finger-pointing: “the right” blaming “the left” and vice versa, like two kids arguing over who broke the lamp. It seems like the country is split into just two camps, and I honestly can’t even tell what the American “left” is supposed to look like; the Democratic Party just seems like a rainbow-painted right wing. Do people really not care about what happens to others simply because they’re on a different political side, or is this exaggerated by the internet?

