Many people assume that England doesn't teach about the Empire, which may have been true in the past, but it's changing now. And where it isn't taught, that's less down to malice and more to do with how early we specialise, so British domestic history gets prioritised first. Believe me, those of us Brits who aren't right wing are very self-deprecating about our Empire, and people know it. If it isn't taught, it's probably not about wanting to hide the negative parts of our country, since history lessons already put the country in a negative light through British domestic history. Of course there may be exceptions where the teachers are right wing.

But I've heard India is having issues with textbooks distorting history, so are your countries all having issues with not teaching the negative parts of their past?
