I have been with The Walking Dead since the very first episode, and I still watch it. These days I download and binge the whole season over a weekend, but I still enjoy the show overall. I know it's had its ups and downs, and there were times it strayed from the books when I wish it hadn't, and vice versa.
That being said, I've had a much harder time of late; it seems even good old Walking Dead land has become ultra woke. It's amazing how many of the current relationships on the show are interracial or same-sex. Obviously those things exist in the real world, and there's no reason to pretend they wouldn't in zombie times.
BUT… this is the issue with so many of the people trying to make the general public "woke" to their plight: they are overdoing it. If you only watched TWD, you would think that over 50% of all relationships in the world are same-sex, when that's not even remotely close to the reality in society at large.
Look, this isn't the end of the world, and it's not even the worst case of this I've seen in a show or movie, but it was just sad to see how it's being driven home more and more in entertainment as if it's the new norm, when it isn't.
Do people not understand that if same-sex couples were the majority, the world would die off? Who is gonna have the babies? Not to mention the stats say that domestic abuse is MUCH higher in same-sex relationships for some reason, but this isn't the place to dig into that.
Either way, in keeping with the theme of this site, I just thought I would post my disappointment in the direction they are going on the interpersonal side of the show.