Get Woke, Go Broke: Hollywood Is Dying And They Deserve It
Hollywood is dying, its various partners are dying, and they brought it on themselves. The entertainment and corporate news industries have long had a cringe-inducing leftist bias, but for many years their propaganda and their motivations remained comparatively subtle. Then something happened. Maybe it was the election of Donald Trump, maybe it was a unified decision within corporate culture to take the mask off completely and reveal the true ugliness underneath, or maybe it was just pure arrogance. Whatever the cause, Hollywood and all the related appendages of the Tinseltown religion suddenly turned openly militant, and the zealotry was palpable.
This is a dynamic that had been developing for some time, but it truly became an international phenomenon from around 2016 onward. It's important to note that around this same time there was a burgeoning realization among conservatives and many moderates that our popular culture had been overrun by people with an agenda, and that they did not have our best interests at heart. We had been lax in our vigilance. Many thought that pop culture was “stuff for children” and that the real fight was in politics. They were wrong.
https://www.zerohedge.com/political/...hey-deserve-it