Is it really true that your universities are woke, or is this just a social media myth?

I’m in Sweden, and my university at least is very old-fashioned: as a student, you don’t talk back. My major (history) is very conservative, and we don’t learn about women’s history. One professor said he wouldn’t mention any women at all because they hadn’t done anything worth mentioning. There’s plenty of racism too. I’m seen as outspoken because I think stolen artifacts should be returned, while my professors think “those people” would ruin the artifacts if we did that. Despite all that, some people over here still talk about wokeness in universities and say the “left wing” shouldn’t dictate education. That’s why I’m asking whether it’s the same in the US.

